General

561 results found

  1. It is a pity that we can't see the test description and the attachments during execution.
    We would also like the addition of a button to end the test execution. It feels strange to have to validate the test by going back to its description or to the instances list.

    3 votes

    Hi Benoit,

    Thanks for your feedback.

    Kindly note that you can hover over ‘Test in Test Library’ and see the test description.

    As for the button to end the test execution, our main goal has always been to save testers' time, hence we initially did not plan to add any additional buttons to that screen.

    Having said that, we would recommend using the keyboard shortcuts; see more information here: https://www.practitest.com/help/settings/keyboard-shortcuts/

    Thanks,
    Yaniv

  2. It is very often useful to compare test set contents to extract information such as:
    - tests removed from one test set to another
    - tests added from one test set to another
    - tests that passed in one test set execution and failed in another (a regression, or a test fixed)

    It seems possible to do this today through filters and Excel (macros, formulas), but it would be useful to have it natively supported in PractiTest; a rough API-based sketch of the workaround is shown below.
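    For reference, a minimal sketch of that workaround scripted against the PractiTest REST API v2 instead of Excel. The project id, test set ids and credentials are placeholders, and the "set-ids" parameter and "test-id" attribute names are assumptions to verify against the current API docs:

        import requests

        BASE = "https://api.practitest.com/api/v2"
        AUTH = ("you@example.com", "YOUR_API_TOKEN")   # assumed auth: account email + API token
        PROJECT = 4566                                 # hypothetical project id

        def test_ids_in_set(set_id):
            """Collect the test ids that have instances in the given test set (pagination omitted)."""
            resp = requests.get(f"{BASE}/projects/{PROJECT}/instances.json",
                                params={"set-ids": set_id}, auth=AUTH)
            resp.raise_for_status()
            return {item["attributes"]["test-id"] for item in resp.json()["data"]}

        old_set, new_set = test_ids_in_set(100), test_ids_in_set(101)   # hypothetical test set ids
        print("removed:", sorted(old_set - new_set))   # tests only in the old test set
        print("added:  ", sorted(new_set - old_set))   # tests only in the new test set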

    3 votes
  3. The requirement description is very important for users who want to get an overview of the whole requirement list or to check for duplicates after importing from an Excel file, but currently a user can only see a requirement's description by viewing each requirement's details.

    3 votes
  4. Our automation test execution (ATE) machine can receive remote commands and put them into a queue; commands are then pulled from the queue and processed, such as 'Build', 'Program', 'Run', 'Report'... but I can't find any way to trigger our ATE from the PractiTest server right now.
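    In the meantime, a rough sketch of a polling bridge that could sit next to the ATE machine: it asks PractiTest for instances matching a filter and enqueues the command sequence for each. The "filter-id" parameter, ids and credentials are assumptions/placeholders, and deduplication of already-processed instances is omitted:

        import queue, time, requests

        BASE = "https://api.practitest.com/api/v2"
        AUTH = ("you@example.com", "YOUR_API_TOKEN")   # assumed auth: account email + API token
        PROJECT, FILTER_ID = 4566, 98765               # hypothetical project and filter ids

        commands = queue.Queue()                       # stands in for the ATE command queue

        def poll_practitest():
            """Enqueue the ATE command sequence for every instance returned by the filter."""
            resp = requests.get(f"{BASE}/projects/{PROJECT}/instances.json",
                                params={"filter-id": FILTER_ID}, auth=AUTH)   # assumed parameter
            resp.raise_for_status()
            for inst in resp.json()["data"]:
                for cmd in ("Build", "Program", "Run", "Report"):
                    commands.put((inst["id"], cmd))

        while True:
            poll_practitest()
            while not commands.empty():
                instance_id, cmd = commands.get()
                print(f"processing {cmd} for instance {instance_id}")   # replace with the real ATE handler
            time.sleep(300)                                             # poll every five minutes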

    3 votes
  5. There should be an option that allows the user to clone a test set either as a new Test Set or with the existing test statuses.

    It would help us run the same test set against multiple versions of our product.

    Currently, cloning resets all the test statuses.

    3 votes
  6. NARRATIVE: The "Goto Tests…" field currently only works for test IDs. It would be nice to be able to go to tests by test name as well.

    STEPS TO DUPE:
    1. Go to the Test Library module.
    2. Type an existing test name into the "Goto Tests…" field.
    3. Press Return.

    RESULT: All tests for the currently selected filter are displayed.

    EXPECTED: To go to the test with the matching name.

    3 votes
  7. I want to get the steps of the test cases that are in Automation status (the Automation filter); a rough API sketch is below.
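    A minimal sketch, assuming a tests endpoint that accepts a "filter-id" parameter and a steps endpoint filtered by "test-ids"; both, plus the ids, credentials and attribute names, should be checked against the current API v2 docs:

        import requests

        BASE = "https://api.practitest.com/api/v2"
        AUTH = ("you@example.com", "YOUR_API_TOKEN")        # assumed auth: account email + API token
        PROJECT, AUTOMATION_FILTER = 4566, 12345            # hypothetical project and filter ids

        # Fetch the tests matching the Automation filter (assumed "filter-id" parameter).
        tests = requests.get(f"{BASE}/projects/{PROJECT}/tests.json",
                             params={"filter-id": AUTOMATION_FILTER}, auth=AUTH).json()["data"]

        for test in tests:
            # Fetch each test's steps (assumed steps endpoint with a "test-ids" filter).
            steps = requests.get(f"{BASE}/projects/{PROJECT}/steps.json",
                                 params={"test-ids": test["id"]}, auth=AUTH).json()["data"]
            print(test["attributes"]["name"])
            for step in steps:
                print("  -", step["attributes"]["name"])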

    3 votes
  8. There are many cases where a test such as "Login as..." is simply called by other tests so that it is not repeated in every test. These "Login as" tests are necessary, but not needed in the test sets. Could a small checkbox, "Do not include in Test Set", be added so that when we create a test set the unnecessary tests are either not visible or colour-coded to let the tester know that the test is not needed in the test set? We are currently using "xx" as a prefix in the test name to distinguish these types of tests. Love your product!

    3 votes
  9. Currently I get an automated email with a link to click to download the report. I would like the report to be downloaded automatically.

    E.g., if I could anticipate the link to download the report (as currently included in the email), I could fetch it automatically and hence download the report. You might also provide an API-style interface such that the report is available in Chrome as a page that can be extracted from Chrome as JSON.

    P.S. Not a good solution for me, but still a solution: have you thought about a Google Drive interface for the reports (and exports) to…
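    A minimal sketch of the first idea, assuming the emailed link is a direct, tokenized download URL that can be fetched without an interactive login; the URL and file name are placeholders:

        import requests

        # Placeholder for the download link taken from the automated email.
        REPORT_URL = "https://prod.practitest.com/reports/DOWNLOAD_TOKEN"   # hypothetical URL

        resp = requests.get(REPORT_URL, timeout=60)
        resp.raise_for_status()
        with open("weekly_report.xlsx", "wb") as f:   # extension depends on the report format
            f.write(resp.content)
        print("saved", len(resp.content), "bytes")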

    3 votes
  10. Currently the API v2 platform supports the Delete method for objects such as Steps/Requirements/Tests/TestSets.
    As part of our work we post automated run results to the run data of an instance, and sometimes a huge number of runs accumulates in the history, which slows down data retrieval via the PractiTest platform.
    So it would be nice if we could delete runs from the history, either in bulk or by ID.
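    For illustration, the existing single-entity DELETE pattern the request refers to, followed by what the requested run deletion might look like; the second call does not exist today, and all ids and credentials are placeholders:

        import requests

        BASE = "https://api.practitest.com/api/v2"
        AUTH = ("you@example.com", "YOUR_API_TOKEN")   # assumed auth: account email + API token
        PROJECT = 4566                                 # hypothetical project id

        # Existing pattern: deleting a single entity, e.g. a test, by id.
        requests.delete(f"{BASE}/projects/{PROJECT}/tests/987.json", auth=AUTH).raise_for_status()

        # Requested capability (not available today): deleting an old run from an instance's history.
        # A call shaped like the existing DELETE endpoints might look like this:
        requests.delete(f"{BASE}/projects/{PROJECT}/runs/123456.json", auth=AUTH)   # hypothetical endpoint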

    5 votes
  11. Currently, the test modification email notification does not inform the test admin about the exact changes made to a step's description and expected result. This is a much-needed request. Even the project history does not keep a record of test modifications at the level of step description and step expected result.

    3 votes
  12. PractiTest supports synchronization with YouTrack, in particular for bug creation (from PractiTest to YouTrack).

    There is currently no way to change that call on the PractiTest side.
    YouTrack offers a feature to update an object on an event, such as issue creation.
    I would need to identify that the issue is coming from PractiTest, and then I would be able to make my change on the YouTrack side.

    The signature includes 2 fields which are not filled today, externalId and externalUrl, which in my opinion should be filled when the issue is created from an external tool.

    Could you please fill at…

    3 votes
  13. We would love the ability to run cURL commands from within PractiTest so that we can launch our automation from within a test run.
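    To keep the sketches here in one language, here is the Python equivalent of the kind of cURL call such a step would fire, using a CI job trigger purely as an example; the server URL, job name, token and parameters are all hypothetical:

        import requests

        JENKINS = "https://ci.example.com"   # hypothetical CI server

        # Remote-trigger a parameterized job, roughly what `curl -X POST .../buildWithParameters` would do.
        resp = requests.post(f"{JENKINS}/job/nightly-regression/buildWithParameters",
                             params={"token": "TRIGGER_TOKEN", "SUITE": "smoke"},
                             auth=("ci-user", "CI_API_TOKEN"))
        print(resp.status_code)   # typically 201 when the build is queued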

    3 votes
  14. I would be interested in having a way to group different values of a field used to create a pie chart into one "super value".
    To give an example:
    For the Requirement items, I would like to have a pie chart with only 2 values:
    - Not Covered
    - Covered
    As of today, we have: Not Covered, No Run, Passed, Failed...
    I would like to be able to present a dashboard to my manager with the value "Covered" including all the values except Not Covered.

    3 votes
  15. When creating a bug using the Jira integration, all steps in the test are currently pasted into the Jira description. It would make more sense to paste only the step that failed and the results from that step. This would give the user the information about the actual issue and reduce the amount of editing currently required when using the Jira integration.

    3 votes
  16. 3 votes
  17. Having the logo of my company/project as the favicon, probably next to the name of my project, instead of the suitcase icon.

    3 votes
  18. When you select the traceability of a requirement, you can only add one test at a time; could we have multiple selection, like the "add instances" option on a test run?

    3 votes
  19. We use a tab for each version, and there we have a pie chart for each build of that version.
    At the moment, only 8 pie charts can be created in a tab.

    We would like to be able to add more pie charts, for better visibility of all builds from that version.

    Currently, we can either delete old build pie charts or create a second tab for the same version, neither of which is a good solution. After each version, we send a report with all builds from that version and a guest link for stakeholders to see the progress,…

    3 votes
  20. Add the ability to sort the rows and columns in the tables on the dashboard.
    This would enable the column order in the issues table to follow the lifecycle, i.e. New - Assigned - Fixed - Closed, etc.

    3 votes