General

566 results found

  1. I would like to see Entity, Field and Filter in the dashboard list. Currently these are not shown, so I cannot tell whether a dashboard item is pulling info from Test, Test Set, Instance, etc. without clicking into each one. Having them in the list would make things easier.

    1 vote
  2. The current ADO integration only syncs the Description section from ADO to PractiTest, but in most cases a Work Item has several information sections depending on its type, such as Acceptance Criteria for requirements, and Repro Steps, System Info and Acceptance Criteria for a bug. It would be great if the sync were not limited to the Description section but was smart enough to sync all the information sections to PractiTest.
    Also, there are other important fields, like Release Date and Priority for requirements, which must be synced…

    33 votes
  3. Currently, when requirements and bugs are synced from ADO, the comments are not synced.

    For requirements, the comments carry valuable information about changes and open questions that a tester needs to know while creating test cases, so it is critical to include the comments as well. Also, if a tester has doubts about a requirement and there is a two-way sync, they can simply post a comment in PractiTest and it should flow to the linked requirement card in ADO.

    For bugs, since the developers won't always have access to PractiTest, they will…

    28 votes
  4. There are many cases where a test such as "Login as…" is only called by other tests, so it is not repeated in every Test. These "Login as" Tests are necessary, but not needed in TestSets. Could a small checkbox, "Do not include in TestSet", be added so that when we create a TestSet the unnecessary tests are either hidden or color coded to let the tester know that Test is not needed in the TestSet? We are currently using "xx" as a prefix in the Test Name to distinguish these types of tests. Love your product!

    3 votes
  5. The test execution progress graph is a really good tool, giving clear info on a project's progress status.
    For now, the time period of this graph is fixed at 28 days.
    Being able to customize this X-axis would be very useful, for instance to study the reasons for slowness in a project afterwards.

    8 votes
  6. You should allow users to add fields at the test step level.

    They should be able to add things like Test Data for a particular step, for example large queries used in it. For us, the Test Precondition, Test Description or Test Name cannot be used for that purpose.

    8 votes
  7. When PractiTest generates a 'detail' report of tests in Excel format, there are two blank columns on the left (A & B) and a lot of blank rows padding the report. These blanks waste space on the screen and on the sheet when printing, and I have to remove them manually every time, which wastes time. Could you please take them out of the default report?

    8 votes
  8. I want to be able to get the steps of the test cases that are in Automation status (Automation filter).

    3 votes
  9. NARRATIVE: The "Goto Tests…" field currently only works for test IDs. It would be nice to be able to go to tests by test name as well.

    STEPS TO DUPE:
    1. Go to the Test Library module.
    2. Type an existing test name into the "Goto Tests…" field.
    3. Press Return.

    RESULT: All tests for the currently selected filter are displayed.

    EXPECTED: To go to the test with the matching name.

    3 votes
  10. There should be an option to let the user clone either as a new Test Set or with the existing test statuses.

    It would help us run the same test set against multiple versions of our product.

    Currently, cloning resets all the test statuses.

    3 votes
  11. When using Exploratory testing, can we have options to:
    (1) take a screenshot and add comments via the tool itself
    (2) record all the screen actions as a recording
    (3) convert all the actions automatically into steps if you want to make the test a scripted one for later use

    26 votes
  12. When a project is integrated with a project board in JIRA or, in future, ADO, can we please make sure that:
    (1) all the requirements flow through automatically, without a manual step
    (2) the requirements are re-synced automatically on a periodic basis
    (3) the issues raised in Jira or ADO sync automatically

    26 votes
  13. Standard .NET test runners can produce .trx files, e.g.

    vstest.console.exe /logger:trx foo.dll

    dotnet vstest --logger:trx foo.dll

    These are just XML files. We currently have to parse them ourselves and perform hundreds of individual uploads, waiting 60 seconds every time the server throttles us with HTTP 429.

    It would be better if .trx files could simply be uploaded to PractiTest to produce a run of a test set, with the parsing done on the server.

    13 votes
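    For reference, a minimal sketch of the workaround this request describes: parse the .trx XML and push each result individually, backing off 60 seconds whenever the server answers HTTP 429. The runs endpoint URL, payload shape, project ID and the test-name-to-instance mapping are assumptions to be verified against the PractiTest API documentation; only the .trx namespace (the standard Visual Studio TeamTest one) and the retry-on-429 pattern come from the request itself.

        # Sketch only: parse a .trx and upload each result, retrying on HTTP 429.
        import time
        import xml.etree.ElementTree as ET

        import requests

        TRX_NS = {"t": "http://microsoft.com/schemas/VisualStudio/TeamTest/2010"}
        # Assumed endpoint and payload -- verify against the PractiTest REST API docs.
        API_URL = "https://api.practitest.com/api/v2/projects/{project_id}/runs.json"
        PROJECT_ID = "4566"  # placeholder

        def parse_trx(path):
            """Yield (test name, outcome) pairs from a .trx results file."""
            root = ET.parse(path).getroot()
            for r in root.findall(".//t:UnitTestResult", TRX_NS):
                yield r.get("testName"), r.get("outcome")  # e.g. "Passed", "Failed"

        def upload_result(session, instance_id, outcome):
            """Create one run; wait 60 s and retry whenever the server throttles."""
            payload = {"data": {"type": "instances",   # assumed payload shape
                                "attributes": {"instance-id": instance_id,
                                               "exit-code": 0 if outcome == "Passed" else 1}}}
            while True:
                resp = session.post(API_URL.format(project_id=PROJECT_ID), json=payload)
                if resp.status_code != 429:
                    resp.raise_for_status()
                    return
                time.sleep(60)  # throttled, as described in the request above

        if __name__ == "__main__":
            session = requests.Session()
            session.auth = ("user@example.com", "API_TOKEN")  # placeholder credentials
            name_to_instance = {"Foo.Tests.LoginTest": 123}   # hypothetical mapping
            for name, outcome in parse_trx("results.trx"):
                if name in name_to_instance:
                    upload_result(session, name_to_instance[name], outcome)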
    Signed in as (Sign out)
  14. Our automation test execution (ATE) machine can receive commands remotely and put them into a queue; commands are then pulled from the queue and processed, such as 'Build', 'Program', 'Run', 'Report'… but I can't find any way to trigger our ATE from the PractiTest server right now.

    3 votes
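    For context, a rough sketch of what the ATE side of such a trigger could look like, following the architecture described in the request (commands received remotely, queued, then pulled and processed). The HTTP port, JSON shape and command names are illustrative assumptions; nothing here reflects an existing PractiTest trigger mechanism.

        # Hypothetical ATE command receiver: enqueue remote commands, process them in a worker.
        import json
        import queue
        import threading
        from http.server import BaseHTTPRequestHandler, HTTPServer

        commands = queue.Queue()                       # remote commands land here
        KNOWN = {"Build", "Program", "Run", "Report"}  # command names from the request

        class TriggerHandler(BaseHTTPRequestHandler):
            def do_POST(self):
                body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
                cmd = json.loads(body or b"{}").get("command")
                if cmd in KNOWN:
                    commands.put(cmd)                  # enqueue; the worker processes it
                    self.send_response(202)
                else:
                    self.send_response(400)
                self.end_headers()

        def worker():
            while True:
                cmd = commands.get()                   # pull commands from the queue
                print(f"processing {cmd}")             # e.g. 'Build', 'Program', 'Run', 'Report'

        if __name__ == "__main__":
            threading.Thread(target=worker, daemon=True).start()
            HTTPServer(("0.0.0.0", 8099), TriggerHandler).serve_forever()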
  15. The requirement description is very important for users who want to review the whole requirement list or check for duplicates after importing from an Excel file, but at the moment the description can only be seen by opening each requirement's detail view.

    3 votes
  16. At the moment, when I've created a new test I leave it in 'draft' status until the feature becomes available for me to test. However, I like to have my test sets organised and ready to go. It would be useful to have an option to run only the instances within a test set that have 'ready' status. This would avoid accidentally initiating tests for instances that are not yet ready and are still in draft.

    2 votes
  17. It is very often useful to compare the contents of test sets to extract information like:
    - tests removed from one test set to another
    - tests added from one test set to another
    - tests that passed in one test set execution and failed in another (a regression, or a test fixed)

    This seems to be possible through filters and Excel (macros, formulas), but it would be useful to have it natively supported in PractiTest.

    3 votes
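    Until such a comparison is natively supported, a minimal sketch of the filter/Excel workaround mentioned above, run over two exported instance lists. The file names, the "Name" and "Last Run Status" column headers, and the PASSED/FAILED status values are assumptions about the export format, not confirmed PractiTest fields.

        # Sketch: diff two exported test set instance lists (CSV) by name and last run status.
        import csv

        def load(path):
            """Map test name -> last run status from an exported instance list."""
            with open(path, newline="", encoding="utf-8") as f:
                return {row["Name"]: row["Last Run Status"] for row in csv.DictReader(f)}

        old, new = load("testset_a.csv"), load("testset_b.csv")

        removed = sorted(set(old) - set(new))   # tests removed from one test set to another
        added = sorted(set(new) - set(old))     # tests added
        common = set(old) & set(new)
        regressions = sorted(t for t in common if old[t] == "PASSED" and new[t] == "FAILED")
        fixed = sorted(t for t in common if old[t] == "FAILED" and new[t] == "PASSED")

        print("removed:", removed)
        print("added:", added)
        print("regressions (passed -> failed):", regressions)
        print("fixed (failed -> passed):", fixed)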
  18. Each time the Task Board (Kanban) screen opens, the default filter is applied.
    Request: keep the last filter that was set.

    1 vote
  19. It is a pity that we can't see the test description and the attachments during execution.
    Please also add a button to end the test execution. It feels strange to validate the test by going back to its description or to the instances list.

    3 votes

    Hi Benoit,

    Thanks for your feedback.

    Kindly note that you can hover over ‘Test in Test Library’ and see the test description.

    As for the button to end the test execution, our main goal has always been to save tester time, hence we initially did not plan to add any additional buttons to that screen.

    Having said that, we would recommend using the keyboard shortcuts; see more information here: https://www.practitest.com/help/settings/keyboard-shortcuts/

    Thanks,
    Yaniv

  20. Currently, not all columns can be sorted or filtered: we can't sort by test ID, linked issues or test duration. Is there any plan for that?

    Furthermore, for the columns we can sort, if we enter a test and come back to the instances list the sorting is lost; it would be nice to keep it.
    Thanks

    2 votes