General

544 results found

  1. Hi Admin,

    While running an instance, it would be better to provide a 'SKIP' status so that we can see how many test cases were skipped during regression testing.

    1 vote
    0 comments
  2. Similar to JIRA headers.
    For example: a dynamic today's date added to the current release.

    1 vote
    0 comments
  3. Hi, while integrating the API into our testing pipeline, we noticed that it cannot create or modify dashboards via the API.

    We suggest supporting this feature.

    1 vote
    0 comments
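
    A purely hypothetical sketch of what a dashboard-creation call could look like if the API gained this capability. No such endpoint exists today (that is the point of the request), so the resource name, payload fields, and auth header below are placeholders, not documented PractiTest API.

        # Hypothetical sketch only: PractiTest's REST API does not currently
        # expose dashboards, so the endpoint, payload fields, and auth header
        # are placeholders illustrating the requested capability.
        import requests

        BASE = "https://api.practitest.com/api/v2"
        PROJECT_ID = 1234                      # placeholder project id

        payload = {
            "data": {
                "type": "dashboards",          # hypothetical resource type
                "attributes": {
                    "name": "Regression overview",
                    "entity": "instances",     # hypothetical: entity the dashboard reports on
                    "filter-id": 42,           # hypothetical: filter to pull data from
                },
            }
        }

        resp = requests.post(
            f"{BASE}/projects/{PROJECT_ID}/dashboards.json",   # hypothetical endpoint
            json=payload,
            headers={"PTToken": "YOUR_API_TOKEN"},             # auth header is an assumption
        )
        resp.raise_for_status()
        print(resp.json())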
  4. Similar to Preconditions in a Test Run, add a Test Data field that shows up while running a test.

    3 votes
  5. If we know a particular release date and want to track test execution toward that date, a burn-up report can help us track progress and highlight risks at a very early stage.
    It should take into account the testers available for execution so that it can work out the required velocity accordingly.

    1 vote
    0 comments
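
    A minimal sketch of the velocity arithmetic such a burn-up view implies, assuming the inputs are simply the remaining instance count, the working days until the release date, and the number of available testers. All numbers are made-up examples, not PractiTest data.

        # Made-up example of the pace calculation behind a burn-up report.
        import math

        remaining_instances = 240   # instances still to execute (example)
        working_days_left = 15      # working days until the release date (example)
        testers_available = 4       # testers assigned to execution (example)

        required_per_day = remaining_instances / working_days_left
        required_per_tester = required_per_day / testers_available

        print(f"Team needs {math.ceil(required_per_day)} instances/day "
              f"({required_per_tester:.1f} per tester) to hit the date.")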
  6. Currently attachments are not synced from ADO requirements or bugs to PractiTest, nor from PractiTest bugs to ADO. It would be great if we could do a two-way sync for requirements as well.
    Big requirements often have attached docs that are essential for a tester to see when creating test cases.
    Bugs in ADO will have screenshots and log files attached, and these need to sync across to PractiTest as well.
    Also, when a bug is created in PractiTest and flows through to ADO, it should have screenshots attached along with other…

    22 votes
    planned  ·  1 comment
  7. I would like to see Entity, Field and Filter in the dashboard list. Currently these are not shown, so I cannot determine whether a dashboard item is pulling info from Test, Test Set, Instance, etc.; I have to click on each one to look. Having it in the list would make things easier.

    1 vote
    0 comments
  8. Currently, when requirements and bugs are synced from ADO, the comments are not synced.

    For requirements, the comments carry valuable information about changes and open questions that a tester needs to know when creating test cases, so it is critical to include the comments as well. Also, with a two-way sync, a tester who has doubts about a requirement could post a comment in PractiTest and have it flow to the linked requirement card in ADO.

    For bugs, since the developers won't always have access to PractiTest, they will…

    28 votes
    planned  ·  0 comments
  9. There are many cases where a test such as "Login as…" is only called by other tests so that it is not repeated in every Test. These "Login as" Tests are necessary, but not needed in the TestSets. Could a small check box, "Do not include in TestSet", be added so that when we create a TestSet these tests are either hidden or color coded to let the Tester know the Test is not needed in the TestSet? We are currently using "xx" as a prefix in the Test Name to distinguish these types of tests. Love your product!

    3 votes
    0 comments
  10. The test execution progress graph is a really good tool, giving clear information on a project's progress status.
    At the moment the time period of this graph is fixed to 28 days.
    Being able to customize this X-axis would be very useful, for instance to study the reasons for slowness in a project afterwards.

    8 votes
    1 comment
  11. You should allow users to add fields at the test step level.

    They should be able to add things like Test Data for a particular step, such as large queries. For us, the Test Precondition, Test Description or Test Name cannot be used for that purpose.

    13 votes
    under review  ·  1 comment
  12. When PractiTest generates a 'detail' report of tests in Excel format, there are two blank columns on the left (A & B) and a lot of blank rows padding the report. These blanks waste space on the screen and sheet when printing, and I have to remove them manually every time, which wastes time. Please could you take them out of the default report?

    8 votes
    1 comment
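
    Until the default export changes, the manual cleanup can be scripted. A workaround sketch, assuming the exported file is a regular .xlsx, the padding cells are truly empty, and pandas (with openpyxl) is available; the file names are placeholders.

        # Workaround sketch: strip fully-empty columns and rows from an
        # exported detail report. File names are placeholders.
        import pandas as pd

        df = pd.read_excel("detail_report.xlsx", header=None)
        df = df.dropna(axis=1, how="all")   # drop empty columns such as A & B
        df = df.dropna(axis=0, how="all")   # drop empty padding rows
        df.to_excel("detail_report_clean.xlsx", index=False, header=False)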
  13. I want to get the steps of test cases which are in Automation status (Automation filter).

    3 votes
    0 comments
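
    One way this can be approximated today is through the REST API: list the tests matched by the Automation filter, then fetch each test's steps. The endpoint paths, the filter-id parameter, and the PTToken auth header below are assumptions to verify against the current PractiTest API documentation; the ids are placeholders.

        # Sketch: pull steps for tests matched by an "Automation" filter.
        # Endpoints, parameters, and auth header are assumptions to verify.
        import requests

        BASE = "https://api.practitest.com/api/v2"
        PROJECT_ID = 1234                          # placeholder
        HEADERS = {"PTToken": "YOUR_API_TOKEN"}    # auth scheme is an assumption
        AUTOMATION_FILTER_ID = 99                  # placeholder filter id

        tests = requests.get(
            f"{BASE}/projects/{PROJECT_ID}/tests.json",
            params={"filter-id": AUTOMATION_FILTER_ID},
            headers=HEADERS,
        ).json()["data"]

        for test in tests:
            steps = requests.get(
                f"{BASE}/projects/{PROJECT_ID}/steps.json",
                params={"test-ids": test["id"]},
                headers=HEADERS,
            ).json()["data"]
            for step in steps:
                print(test["id"], step["attributes"].get("name"))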
  14. NARRATIVE: The "Goto Tests…" field currently only works for test IDs. It would be nice to be able to go to tests by test name as well.

    STEPS TO DUPE:
    1. Go to the Test Library module.
    2. Type an existing test name into the "Goto Tests…" field
    3. Press Return.

    RESULT: All tests for the currently selected filter are displayed.

    EXPECTED: To go to the test with the matching name.

    3 votes
  15. There should be an option to clone a Test Set either as a new Test Set or with the existing test statuses.

    It would help us run the same test set against multiple versions of our product.

    Currently, cloning resets all the test statuses.

    3 votes
    0 comments
  16. Our automation test execution (ATE) machine can receive commands remotely and put them into a queue; commands are then pulled from the queue and processed, e.g. 'Build', 'Program', 'Run', 'Report'... but I can't find any way to trigger our ATE from the PractiTest server right now.

    3 votes
    1 comment
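
    Nothing below is a PractiTest feature; it is only a generic sketch of the receiving side this request describes: a small HTTP endpoint on the ATE machine that accepts one of the listed commands, queues it, and lets a worker process the queue, so an external trigger (ideally PractiTest) would only need to POST the command name.

        # Generic ATE-side receiver sketch: queue remote commands and process them.
        import queue
        import threading
        from http.server import BaseHTTPRequestHandler, HTTPServer

        commands: "queue.Queue[str]" = queue.Queue()
        ALLOWED = {"Build", "Program", "Run", "Report"}

        class CommandHandler(BaseHTTPRequestHandler):
            def do_POST(self):
                length = int(self.headers.get("Content-Length", 0))
                cmd = self.rfile.read(length).decode().strip()
                if cmd in ALLOWED:
                    commands.put(cmd)        # accepted: queued for the worker
                    self.send_response(202)
                else:
                    self.send_response(400)  # unknown command
                self.end_headers()

        def worker():
            while True:
                cmd = commands.get()             # blocks until a command arrives
                print(f"Processing {cmd}...")    # placeholder for the real ATE action

        threading.Thread(target=worker, daemon=True).start()
        HTTPServer(("0.0.0.0", 8080), CommandHandler).serve_forever()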
  17. The requirement description is very important for users who want to overview the whole requirement list or check for duplicates after importing from an Excel file, but currently a user can only see the requirement description by viewing each requirement's detail.

    3 votes
    0 comments
  18. At the moment, when I've created a new test I will leave it in 'draft' status until the feature becomes available for me to test. However, I like to have my test sets organised and ready to go. It would be useful to have an option to only run the instances within a test set that have 'ready' status. This would save accidentally initiating tests for instances that are not yet ready and are still in draft.

    2 votes
    0 comments
  19. It is very often useful to compare test set contents to extract information such as:
    - tests removed from one test set to another
    - tests added from one test set to another
    - tests seen as passed in one test set execution and failed in another test set execution (regression or test fixed)

    This seems to be possible through filters and Excel (macros, formulas), but it would be useful to have it natively supported in PractiTest.

    3 votes
    0 comments
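
    Until it is native, the filters-plus-Excel route can be approximated with a short script over two exported instance lists. The file names and the "Test Id" / "Run Status" column names below are assumptions about what the export contains; adjust them to the actual export layout.

        # Sketch: compare two exported test set instance lists (CSV exports).
        import csv

        def load(path):
            with open(path, newline="", encoding="utf-8") as f:
                return {row["Test Id"]: row.get("Run Status", "")
                        for row in csv.DictReader(f)}

        old = load("testset_A.csv")   # placeholder file names
        new = load("testset_B.csv")

        print("Removed tests:", sorted(set(old) - set(new)))
        print("Added tests:  ", sorted(set(new) - set(old)))
        print("Regressions (PASSED -> FAILED):",
              sorted(t for t in set(old) & set(new)
                     if old[t] == "PASSED" and new[t] == "FAILED"))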
  20. Each time the Task Board (Kanban) screen opens, the default filter is applied.
    Request: keep the last selected filter instead.

    1 vote
    0 comments