General

613 results found

  1. When PractiTest generates a 'detail' report of tests in Excel format, there are two blank columns on the left (A and B) and many blank rows padding the report. These blanks waste space on the screen and on the sheet when printing, and I have to remove them manually every time, which wastes time. Please could you take them out of the default report?
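    Until that happens, a minimal interim workaround (a sketch only, assuming the blanks are always columns A and B plus fully empty rows; the file names are placeholders) is to post-process the exported .xlsx with Python and openpyxl:

        # Sketch: strip the two leading blank columns and fully blank rows
        # from the exported report before printing.
        from openpyxl import load_workbook

        wb = load_workbook("detailed_report.xlsx")
        ws = wb.active

        # Columns A and B are assumed to be entirely empty in the export.
        ws.delete_cols(1, 2)

        # Walk rows bottom-up so deletions do not shift unvisited rows.
        for row in range(ws.max_row, 0, -1):
            if all(cell.value is None for cell in ws[row]):
                ws.delete_rows(row)

        wb.save("detailed_report_clean.xlsx")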

    8 votes
  2. I want to retrieve the steps of the test cases that are in Automation status (the Automation filter).
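    Until this is built in, a rough sketch of pulling those steps over the REST API; the /tests.json filter-id parameter, the /steps.json test-ids parameter, the response shape, and the email + API-token auth are assumptions based on the public v2 API docs, and the ids and credentials below are placeholders:

        # Hypothetical sketch: list tests in the "Automation" filter, then
        # fetch each test's steps.
        import requests

        BASE = "https://api.practitest.com/api/v2"
        PROJECT_ID = 1234                           # placeholder project id
        FILTER_ID = 5678                            # id of the "Automation" filter
        AUTH = ("user@example.com", "api-token")    # email + API token

        tests = requests.get(
            f"{BASE}/projects/{PROJECT_ID}/tests.json",
            params={"filter-id": FILTER_ID},
            auth=AUTH,
        ).json()["data"]

        for test in tests:
            steps = requests.get(
                f"{BASE}/projects/{PROJECT_ID}/steps.json",
                params={"test-ids": test["id"]},
                auth=AUTH,
            ).json()["data"]
            print(test["attributes"]["name"], "-", len(steps), "steps")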

    3 votes
  3. NARRATIVE: The "Goto Tests…" field currently only works for test IDs. It would be nice to be able to go to tests by test name as well.

    STEPS TO DUPE:
    1. Go to the Test Library module.
    2. Type an existing test name into the "Goto Tests…" field.
    3. Press Return.

    RESULT: All tests for the currently selected filter are displayed.

    EXPECTED: To go to the test with the matching name.

    3 votes
  4. There should be an option to allow the user to clone either as a new Test Set or with the existing test statuses.

    It would help us run the same test set against multiple versions of our product.

    Currently, cloning resets all the test statuses.

    3 votes
  5. When using exploratory testing, can we have options to:
    (1) take a screenshot and add comments via the tool itself,
    (2) record all the screen actions as a recording, and
    (3) convert all the actions automatically into steps if you want to make the test a scripted one for later use?

    26 votes
  6. When a project is integrated with a project board in Jira (or, in future, ADO), can we please make sure that:
    (1) all the requirements flow through automatically, without a manual step,
    (2) the requirements are re-synced automatically on a periodic basis, and
    (3) the issues raised in Jira or ADO sync automatically?

    26 votes
  7. Standard .NET test runners can produce .trx files, e.g.

    vstest.console.exe /logger:trx foo.dll

    dotnet vstest --logger:trx foo.dll

    These are just XML files. We currently have to parse them ourselves and perform hundreds of individual uploads, waiting 60 seconds each time the server throttles us with HTTP 429.

    It would be better if .trx files could simply be uploaded to PractiTest to produce a run of a test set, with the parsing done on the server.
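    For reference, this is roughly the client-side work we would like to see moved to the server: parse the .trx (plain XML) and upload result by result, backing off on HTTP 429. The runs.json endpoint, its payload shape, and the hard-coded instance id are assumptions for illustration only; the TRX namespace is the standard VSTest one.

        # Sketch of the current client-side flow the request wants replaced
        # by a single server-side .trx upload. Endpoint, payload details and
        # credentials are placeholders.
        import time
        import xml.etree.ElementTree as ET
        import requests

        NS = {"t": "http://microsoft.com/schemas/VisualStudio/TeamTest/2010"}
        RUNS_URL = "https://api.practitest.com/api/v2/projects/1234/runs.json"
        AUTH = ("user@example.com", "api-token")

        root = ET.parse("foo.trx").getroot()
        results = [(r.get("testName"), r.get("outcome"))
                   for r in root.iterfind(".//t:UnitTestResult", namespaces=NS)]

        for name, outcome in results:
            payload = {"data": {"type": "instances", "attributes": {
                # instance-id would come from a name -> instance mapping;
                # hard-coded here purely for illustration.
                "instance-id": 111,
                "exit-code": 0 if outcome == "Passed" else 1,
            }}}
            while True:
                resp = requests.post(RUNS_URL, json=payload, auth=AUTH)
                if resp.status_code != 429:
                    break
                time.sleep(60)    # the throttling window described above
            print(name, outcome, resp.status_code)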

    13 votes
  8. Our automation test execution (ATE) machine can receive a command remotely and put it into a queue; commands are pulled from the queue and processed, such as 'Build', 'Program', 'Run', 'Report'... but I can't find any way to trigger our ATE from the PractiTest server right now.
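    One possible bridge, sketched under the assumption that PractiTest (or a small intermediate script running alongside it) can issue a plain HTTP POST: the ATE machine exposes a tiny trigger endpoint that drops incoming commands onto its existing queue. The port, payload field, and command names are illustrative only; this is not an existing PractiTest feature.

        # Illustrative ATE-side trigger: an HTTP endpoint feeding the
        # command queue that the existing worker already consumes.
        import json
        import queue
        import threading
        from http.server import BaseHTTPRequestHandler, HTTPServer

        commands = queue.Queue()

        class TriggerHandler(BaseHTTPRequestHandler):
            def do_POST(self):
                length = int(self.headers.get("Content-Length", 0))
                body = json.loads(self.rfile.read(length) or b"{}")
                commands.put(body.get("command", "Run"))   # e.g. Build/Program/Run/Report
                self.send_response(202)
                self.end_headers()

        def worker():
            while True:
                cmd = commands.get()
                print("processing", cmd)    # existing ATE logic would run here

        threading.Thread(target=worker, daemon=True).start()
        HTTPServer(("0.0.0.0", 8085), TriggerHandler).serve_forever()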

    3 votes
  9. The requirement description is very important for reviewing the whole requirement list or checking for duplicates after importing from an Excel file, but currently a user can only see a description by opening each requirement's detail view.

    3 votes
  10. At the moment, when I've created a new test I leave it in 'draft' status until the feature becomes available for me to test. However, I like to have my test sets organised and ready to go. It would be useful to have an option to run only the instances within a test set that have 'ready' status. This would avoid accidentally initiating tests for instances that are not yet ready and are still in draft.

    2 votes
  11. It is often useful to compare the contents of test sets to extract information like:
    - tests removed from one test set to another
    - tests added from one test set to another
    - tests that passed in one test set execution and failed in another (a regression, or a test fixed)

    It seems possible to do this today through filters and Excel (macros, formulas), but it would be useful to have it natively supported in PractiTest; a rough sketch of the comparison follows.
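    A minimal sketch of that comparison, assuming each test set has been exported (or fetched) as a mapping of test id to last run status; the two dictionaries are illustrative data only:

        # Compare two test sets: membership changes plus pass/fail flips.
        set_a = {"T-1": "PASSED", "T-2": "FAILED", "T-3": "PASSED"}
        set_b = {"T-2": "PASSED", "T-3": "FAILED", "T-4": "NO RUN"}

        removed = set_a.keys() - set_b.keys()      # tests dropped in set B
        added = set_b.keys() - set_a.keys()        # tests new in set B
        common = set_a.keys() & set_b.keys()
        regressions = [t for t in common
                       if set_a[t] == "PASSED" and set_b[t] == "FAILED"]
        fixed = [t for t in common
                 if set_a[t] == "FAILED" and set_b[t] == "PASSED"]

        print("removed:", removed, "added:", added)
        print("regressions:", regressions, "fixed:", fixed)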

    3 votes
  12. Each time the Task Board (Kanban) screen opens, the default filter is applied.
    Request: keep the last filter that was set.

    1 vote
  13. It is a pity that we can't see the test description and the attachments during execution.
    Also, please add a button to end the test execution; it feels strange to finish a test by going back to its description or to the instances list.

    3 votes

    Hi Benoit,

    Thanks for your feedback.

    Kindly note that you can hover over ‘Test in Test Library’ and see the test description.

    As for the button to end the test execution, our main goal has always been to save testers' time, hence we initially did not plan to add any additional buttons to that screen.

    Having said that, we would recommend using the keyboard shortcuts; see more information here: https://www.practitest.com/help/settings/keyboard-shortcuts/

    Thanks,
    Yaniv

  14. Add the PractiTest "test id" directly from Jira when I create a new ticket.

    4 votes
  15. Provide the ability to add attachments and/or links at the filter and folder level in the Test Library. This way, attachments that apply to all tests in that filter or folder can be stored once rather than in each test. The same reasoning applies to URLs. It would be a place to store test instructions, data paths, etc.

    3 votes
  16. Hello,
    Attachments are quite painful to add: it is click-click-click... We should be able to add images or documents in the description. Images would appear in the description, so we wouldn't have to click again to see them at full size.

    Additionally, we don't understand why we can't see the description of the test while executing test cases. Is it intentional? Thanks!

    7 votes

    The option to drag & drop files into the description will be supported in the new forms UI we are working on these days. The test set / Issues form already have this option - soon all items will have it as well

  17. Currently, PractiTest schedules all reports (daily, weekly, or monthly) at 6:00 AM, but it would be great if the tool allowed customising the time at which scheduled reports are delivered.

    4 votes
  18. When batch-importing test cases via spreadsheet, PractiTest does not take you to the tests you just imported; it just states whether the operation succeeded or not.

    It would be nice if the imported tests were linked from the result page so we could immediately start working on them. At the moment I have no idea where my imported tests end up, so I often have to search for them manually, which takes time.

    3 votes
  19. Hello,
    The search bar has limited capabilities for now. It would be great to make it less limited, mainly to be able to search for numbers.

    Furthermore, we could avoid the need for ** or "".

    Thanks!

    1 vote
  20. For now, the Mantis integration is one-way, which means that once a test fails, the bug is created in Mantis.
    The problems are:
    1. We have to manually re-link the bug in PractiTest.
    2. No info appears in the report unless we re-create the bug in the Issues module.

    Having these two features would be a great improvement.
    Thanks!

    1 vote