

A Space's primary purpose is to collect and track published test Results. Results come from automated tests, pushed with the Testspace client, and/or from manual testing sessions.

The Current tab provides a representation of your test hierarchy based on the most recent complete Result. This provides a snapshot of the current testing status. A Result can contain a single Suite or a large set of folders and Suites; there are no depth or count constraints. Additional metrics such as code coverage, static analysis, defects, etc., can also be collected along with test cases to provide a more comprehensive view of the status.

Current Results

Description of the numbered areas:

  1. Name - The name of the most recent Result.
  2. Duration - The test duration and how long ago the Result completed.
  3. Metric Badges - The set of enabled metric badges.
  4. Build Link - Link to the CI job that maps to the Result.
  5. Map - A quick-navigation view showing the test hierarchy.
  6. Annotations - Ancillary data and information associated with the Result.
  7. Folders - Test items organized within folders.

Each Folder and Suite is displayed with columns showing roll-up totals for its child items:

  • Suites - number of passing and failing Suites
  • Cases - number of passing and failing Test Cases
  • Failures - count of failed test cases (new, tracked, resolved)
  • Duration - total time (if available)
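The roll-up columns above amount to a recursive fold over the result tree. The following is a minimal illustrative sketch, not Testspace's implementation; the node layout and field names are hypothetical:

```python
# Illustrative only: compute per-node roll-up totals over a result hierarchy.
# Node shape ('type', 'cases', 'children', 'duration') is a hypothetical model.

def rollup(node):
    """Return (suites_passed, suites_failed, cases_passed, cases_failed, duration)."""
    if node["type"] == "suite":
        passed = sum(1 for c in node["cases"] if c == "pass")
        failed = len(node["cases"]) - passed
        suite_ok = 1 if failed == 0 else 0  # a suite passes only if all cases pass
        return (suite_ok, 1 - suite_ok, passed, failed, node.get("duration", 0))
    # Folder: sum the roll-ups of all children, element-wise.
    totals = (0, 0, 0, 0, 0)
    for child in node["children"]:
        totals = tuple(a + b for a, b in zip(totals, rollup(child)))
    return totals

root = {
    "type": "folder",
    "children": [
        {"type": "suite", "cases": ["pass", "pass"], "duration": 1.5},
        {"type": "folder", "children": [
            {"type": "suite", "cases": ["pass", "fail"], "duration": 2.0},
        ]},
    ],
}
print(rollup(root))  # (1, 1, 3, 1, 3.5)
```

Because each folder simply sums its children, the totals hold at every level of the hierarchy, which is what makes the column display consistent as you drill down.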

Result Content Types#

A Testspace Result can encompass a wide variety of content, organized hierarchically using folders. This includes not only test content but also items such as code coverage reports, static analysis reports, log files, custom metrics, and links to external content. The contents are represented with the following icons to assist in navigating a Result.

Metric Suite · Metrics Suite

Test Output#

Test Suites use the suite icon followed by a name (e.g. Hello World).

Test Suite

Clicking on a Suite name opens the Suite dialog as shown below. The dialog is used to view and manage status for each test case in the Suite.

Results Suite

Code Coverage#

Code Coverage uses a metrics suite icon followed by the metric's name (e.g. Code Coverage).

Metric Suite

Clicking on the Code Coverage Suite name opens the Suite dialog as shown below. The dialog is used to view code coverage rates for each source file.

Code Coverage Metric Suite

Static Analysis#

Static Analysis uses a metrics suite icon followed by the metric's name (e.g. Static Analysis).

Static Analysis

Clicking on the Static Analysis Suite name opens the Suite dialog as shown below. The dialog is used to view static analysis issues for each source file.

Static Analysis Metric Suite


Issues#

Issues, referenced in the context of a manual testing Session, use a metrics suite icon followed by the metric's name (e.g. Issues).

Clicking on the Issues Suite name opens the Suite dialog as shown below. The dialog presents a listing of the current status of each defect:

Issues Report

Custom Content#

Custom content persisted in a text file (of type .txt, .log, .md or .html), when published, is represented by a suite icon, or by a metrics suite icon if it has custom metrics attached.

Note that, depending on its purpose, a text file can be published either as a separate suite or as an annotation.

External Content#

A URL link to external content can be added. A typical use is to reference information that served as the basis of a custom metric. These suites are also represented by either a suite icon or a metrics suite icon if they have custom metrics attached.


Report Annotations#

Report Annotations make it possible to include supplemental data with a result set you upload. The supplemental data becomes part of your results and can be downloaded for as long as the result set exists. Annotations are well suited for binary data, such as screenshots and media files.

Each annotation is attached to a Folder, Suite, or Test Case in your Results, and can be accessed from the view showing the level where it resides. By default, Annotations are attached to the Root Folder of your results, but you can specify a different node if desired.

Annotations can be attached to the root, a folder, a suite, or a case.

Test Failures#

When test failures occur, they can be deeply nested and widely dispersed throughout a Result hierarchy as shown in the example below.

Test Failure Filtering enables you to find and triage failures quickly:

  • Is the failure new, or did it occur in the previous or other recent builds?
  • Is the case failing intermittently (i.e. flaky) or consistently?
  • How can one tell when failures have been resolved?

Test cases that fail are tracked over subsequent Results and are reported in one of the following states:

  1. New - the first time a test case fails, before it is being tracked.
  2. Consistent - a test case that is consistently failing.
  3. Passing - a tracked test case that is currently passing.
  4. Flaky - a test case that is oscillating between passing and failing.
  5. Resolved - a tracked test case that has passed 5 consecutive times.
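The states above can be sketched as a simple classifier over a case's recent pass/fail history. This is an illustrative paraphrase of the rules described on this page, not Testspace's actual algorithm; the five-result window is taken from the Resolved rule:

```python
# Illustrative only: classify a tracked test case from its outcome history.

def classify(history):
    """history: list of 'pass'/'fail' outcomes, oldest first."""
    latest = history[-1]
    if "fail" not in history[:-1] and latest == "fail":
        return "New"            # first failure; case was not previously tracked
    recent = history[-5:]
    if len(recent) == 5 and all(r == "pass" for r in recent):
        return "Resolved"       # 5 consecutive passes end tracking
    if latest == "pass":
        return "Passing"        # tracked case currently passing
    if all(r == "fail" for r in recent):
        return "Consistent"     # failing in every recent result
    return "Flaky"              # oscillating between passing and failing

print(classify(["pass", "pass", "fail"]))                          # New
print(classify(["fail", "fail", "fail", "fail", "fail"]))          # Consistent
print(classify(["fail", "pass", "fail", "pass", "fail"]))          # Flaky
print(classify(["fail", "pass"]))                                  # Passing
print(classify(["fail", "pass", "pass", "pass", "pass", "pass"]))  # Resolved
```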

Results Failures

The N Failures filter button is shown (only when failures occur) in both the Current and Results tabs, and at every level when traversing the results hierarchy. The Failures tracking (column 4) displays the number of New, Tracked, and Resolved failures.


Clicking on the Failures button reduces the hierarchy of test folders and suites into a single flat view showing only the suites that contain test failures.

  • Types - suites that contain Failed/Errored (default), Failed only, Errored only, or all Tracked cases.
  • Filter:
    • New - failing cases that passed in the previous five consecutive results.
    • Flaky - failing cases that are neither consistently passing nor consistently failing.
    • Consistent - failing cases that are in a consistent failure state.
    • Exempt - failing cases that have been exempted from failing the Health of the results.

The View all… link is used to disable the filtered view.

Failure Filtering

Test Suite#

When a Suite with failures is opened by selecting the Suite name, only failing cases are listed by default, as shown below.

Test Suite Failures

The failure types (column 5) are tracked using the following icons and names, with the most recent result shown as the larger icon on the left:

Test Suite Failures

The View all… link at the bottom of the dialog is used to disable the filtered view.


Exemptions#

Test exemptions allow you to exclude failing cases from the calculation of the pass-fail rate. Exemptions are persisted and applied to each subsequent Result until removed. Removal can be done manually, or happens automatically after the 5th consecutive passing Result. Exempt test cases are not counted in the passed|failed totals.

Exemptions can only be managed from the Current Result view.

To exempt a failing test case, click the exclamation (!) button in its column while on the Current tab. From this dialog you can add a Note for tracking (discussed in the section below) and select the Exempt check-box.

Test Suite Exemption

Exemptions are removed automatically after the 5th consecutive passing result.

Exempt test cases are identified with a yellow circled exclamation symbol (refer to 4 below).

Test Suite Exemption View

The passed|failed state is still shown (refer to 2) but grayed out and struck through. The existence of exempt test cases is indicated next to the passed|failed totals (refer to 5) in all views; however, their status is not added to the totals. Notes added via the triage dialog are displayed under the case name (refer to 1). For a test case that does not have an exemption, the info icon remains gray (refer to 3).


Project Notes are used to track Exemptions and Triage Notes for all Spaces under the Project. A new note is automatically created when an exemption is enabled. Each Exemption and/or Triage Note enabled for that Result is tracked by an additional comment added to the note, and another comment is added when an Exemption is removed. The total number of exempt test cases is summarized in the results notification email.

Results Listing#

The most recent and historical test results are available on the Space's Results tab. In this view, results are listed in date order with the newest first. The Results view shows result summary data; to view pass/fail trends over time, visit the Space's Metrics tab.

Results Listing


Each Result is represented by a row that's divided into the following columns:


Name#

The Result name is shown here as a clickable link.

To the left of each Result is a circle icon and a number in square brackets. The number is the Testspace-assigned upload sequence number of the Result and uniquely identifies the result set.

The circle icon indicates the state of the result set as follows:

  • solid black - most current results; result set is complete
  • solid gray - non-current results; result set is complete
  • hollow - partial (incomplete) results

Note that a Result can be retained indefinitely (see retention policy) using the pin option (see below). When pinned, the circle icon is surrounded by a pin.

Results Pinned


Health#

The Health column provides a health indicator for each published Result:

  • Healthy - 0 nonexempt test failures, with all metric criteria met
  • Unhealthy - 1 or more nonexempt test failures, or unmet metric criteria
  • Invalid - excluded from the calculation of the Health Rate


The text here shows the number of passing and failing Test Suites and Test Cases. The numbers are shown in brackets and color coded, i.e. [passed, failed] (na):

  • passed - Green Pass Count
  • failed - Red Fail/Error Count
  • na - Gray No-data/NA/Skip Count


Failures#

A count of failed test cases in the last five result sets is shown in brackets and color coded, i.e. [new, tracked, resolved]:

  • new - A unique test case regression that was not previously in the tracked state.
  • tracked - A test case that is not a new regression but has had at least one failure in the last 5 results.
  • resolved - A test case that has passed or has been missing from the past five successive results and is now being removed from the tracked state.


Duration#

The duration of the entire test run represented by the result set is shown here, if available. Note that some test frameworks don't include this information in their results.


Date#

This column shows the date the result set was published, along with the username of the uploader.


At the far left end of each result row, a hamburger menu icon is shown on mouse hover. When clicked, the menu offers the following options:

  • Comment - Adds a comment on the result set. You are redirected to a new-comment dialog; on submission your comment is added to a Note specific to that result set (pre-populated with high-level details of the result set).
  • Pin - Pins the result set, keeping its content indefinitely (see retention policy).
  • Rename - Displays a dialog allowing you to rename the result set. This changes the name and/or description shown in the Name column. (Active only for incomplete result sets.)
  • Add - Allows you to manually upload additional results to be added to the result set. (Active only for incomplete result sets.)
  • Complete - Completes an inactive result set without adding any additional results. (Active only for incomplete result sets.)
  • Export - Downloads the Result in Testspace XML format to the host computer.
  • Delete - Deletes the result set (admin only).


Often you will want to start a discussion about a specific result set. To streamline the creation of the Note that originates such a discussion, Testspace can automatically create a Note from a result set, comprising a formatted results summary and a link to the result details.

You can create a Note from a result set either explicitly from the Testspace GUI, or by replying to a results publication notification email.


To create such a Note, navigate to the Space's Results view and click on the hamburger menu (shown on mouse hover) at the left end of the result's row. Select Comment from the menu.

Project Notes New Results

A dialog will be displayed where you can enter text to be added as a Comment to a Note specific to that result set (pre-populated with high-level details of the result set). After submission, you are taken directly to the Note's thread, where you can edit your Comment or add additional Comments if desired.

Results Email#

If you are subscribed to publication notifications, you will receive an email notification when a result set is published. Simply replying to the email adds your reply text as a Comment to a Note specific to that result set (pre-populated with high-level details of the result set).

You can go to the Notes view and confirm that the Note has been added and your Comment appears.

Retention Policy#

Detailed Result content (but not the associated metrics) is periodically recycled based on the following retention policy:

  • Results are kept for 30 days (independent of the count)
  • A minimum of 5 Results are kept (independent of the timestamp)

Note that the retention policy does not apply to explicitly pinned Results.
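Read together, the rules mean a result set is recycled only when it fails all three keep-conditions. A minimal sketch of that logic (illustrative only; the field names `age_days` and `pinned` are hypothetical, not Testspace's data model):

```python
# Illustrative only: decide which result sets are eligible for recycling
# under the stated policy (30-day keep, 5-minimum keep, pins exempt).

def recyclable(results):
    """results: result sets sorted newest first; return indices eligible for recycling."""
    recycle = []
    for i, r in enumerate(results):
        if r["pinned"]:          # pinned Results are never recycled
            continue
        if i < 5:                # 5 minimum to keep, independent of age
            continue
        if r["age_days"] <= 30:  # 30 days to keep, independent of count
            continue
        recycle.append(i)
    return recycle

results = [
    {"age_days": 1,  "pinned": False},
    {"age_days": 10, "pinned": False},
    {"age_days": 40, "pinned": True},
    {"age_days": 45, "pinned": False},
    {"age_days": 50, "pinned": False},
    {"age_days": 60, "pinned": False},
    {"age_days": 90, "pinned": False},
]
print(recyclable(results))  # [5, 6]
```

In this example the five newest result sets are kept regardless of age, the 40-day-old pinned set is kept by its pin, and only the two oldest unpinned sets are eligible for recycling.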


Map#

The Map dialog presents the test hierarchy in a collapsible tree. Exemptions and pass/fail counts are summarized for each suite.

Current Results Map

Double-click on the desired node for quick navigation. Your current location within the tree is shown in the banner at top.