
Push Results

Files produced by the Continuous Integration process are pushed to the Testspace Server using a simple command-line client, available for Windows, Linux, and macOS. Adding the Testspace client to a CI workflow is covered in the next section.

To push content to Testspace, simply provide the client with a list of files and a destination URL:

$ testspace file1 file2 ..

Note: The following examples are used to demonstrate publishing content using different scenarios. It is assumed you have created a Project and a Space and have appropriately configured the Testspace client's destination URL.
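The destination can be stored in the client configuration so it does not need to be repeated on every push. A minimal sketch, assuming hypothetical organization, project, and space names (the exact URL and access token come from your Space's settings page):

```shell
# One-time client setup (hypothetical names; TOKEN is your Testspace access token)
testspace config url "$TOKEN@myorg.testspace.com/myproject/myspace"
```

After this, subsequent `testspace file1 file2 ..` invocations push to the configured Space.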

File Content

The following types of file content are supported, with additional information available here:

Test Output: Widely adopted test framework output formats such as JUnit, NUnit, Visual Studio TRX, etc.
Code Coverage: Common coverage tool output formats such as Cobertura, Clover, Bullseye, Visual Studio Coverage, etc.
Static Analysis: A large set of widely used formats such as Checkstyle, Klocwork, PMD, Visual Studio FxCop & PREfast, etc.
Custom Metrics: A CSV (.csv) file associated with a Text File or a Test Output suite.
Text File: Plain text (.txt or .log extension), HTML (.html), and Markdown (.md) files.

Test Output

To publish test results, simply push an XML output file generated by the test tool. Refer here for the supported test output formats.

$ testspace tests-results.xml

Some test frameworks produce multiple results files that you may want to aggregate into a single Testspace result set. There are several ways to accomplish this, as demonstrated in the examples below.

In each of the following separate examples, the listed result files are aggregated and published as a single result set:

$ testspace myResults1.xml myResults2.xml
$ testspace myResults*.xml
$ testspace /path/to/*.xml ReleaseNotes.txt


To aggregate result files from multiple sub-directories use the double star recursive search notation:

$ testspace /path/to/results/root/**/*.xml

To aggregate result files from sub-directories use the single star search notation:

$ testspace /path/*/results/root/*.xml

Process Output

A step in your automation process (install, build, etc.) may have no test cases associated with it, but you may still want a failure to be noted. The output log and the process return code can be used to report status. The suite status is set by adding the process return code as an integer 'N' in the metadata of a file being pushed to Testspace. Any non-zero integer marks the suite as failed.

Example of pushing a log file and its return code to Testspace:

# In this example $MYPROC is set to the return code of the process that generated myproc.log
# Publish output and exit code
$ testspace "myproc.log{$MYPROC:my description}"
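Capturing the return code is straightforward; a sketch where a stand-in command (failing with exit code 3) replaces a real install or build step:

```shell
# Run a stand-in step that fails with exit code 3, saving its output to a log;
# the trailing 'echo $?' records the step's exit code in MYPROC
MYPROC=$(sh -c 'echo "installing..."; exit 3' > myproc.log 2>&1; echo $?)
echo "MYPROC=$MYPROC"

# Push the log with the captured code as suite status (non-zero => failed):
#   testspace "myproc.log{$MYPROC:my description}"
```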

Code Coverage

To publish code coverage, simply push an XML output file generated by the coverage tool. Refer here for the supported code coverage formats.

$ testspace code-coverage.xml

Third Party

Testspace supports linking with the third-party coverage tools Codecov and Coveralls. Both of these tools report lines covered as a percentage. Testspace uses the minimum configuration set by these tools; to update it, see Codecov configuration or Coveralls configuration. Upon first push with the --link option, Testspace will automatically create a metrics graph and add the third-party tool's badges to your project.

For example:

$ testspace tests-results.xml --link=codecov
$ testspace tests-results.xml --link=coveralls

Example using a private repo:

$ testspace tests-results.xml --link="TOKEN@codecov"

Static Analysis

To publish static analysis results, simply push an XML output file generated by the analysis tool. Refer here for the supported static analysis formats.

$ testspace static-analysis.xml

Some compilers and static analysis tools report results in plain text. The lint {meta-tag} identifies the output as analysis results.

$ testspace "build-output.log{lint}"

Custom Content

Preformatted human-readable output files can be published as custom suites. Four types of output files are supported: .txt, .log, .md, and .html.

A simple example of publishing a log file.

$ testspace myOutput.log

External Content

Any externally hosted content can be published as a linked suite. The URL of the external entity needs to be stored in a .url file, e.g. myLink.url:


With that file in place, it can be published as simply as:

$ testspace myLink.url
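The link file can be generated on the fly. A sketch with a hypothetical URL, using the InternetShortcut.URL form that appears in the CloudWatch example later on this page:

```shell
# Store a hypothetical dashboard URL in a .url link file
echo "InternetShortcut.URL=https://example.com/runs/12345" > myLink.url

# Then push it as a linked suite:
#   testspace myLink.url
```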

Custom Metrics

Most applications produce output files containing ancillary data of interest or importance. If formatted as CSV (.csv) files, you can publish these files as custom metrics associated with test results.

To add a custom metric along with your test results, you need to push two files:

  1. Output file for content reference.
  2. Text file containing comma-separated values (i.e. .csv file).

Each row in the CSV file is a separate metric, with an optional first column of text used as the metric's name. Otherwise the name of the metric defaults to metric-n.




For information on how to customize the user interface associated with the metric refer here.

Log File Metrics

A simple example of publishing a log file with custom metrics.

Contents of the output log file myOutput.log, which contains some simple timing metrics:

The delay before start was 134 ms
The latency from start to actual start was 65 ms
The duration from actual start to stop was 165 ms

Contents of the CSV file myMetrics.csv with the extracted relevant data.

Timing Metrics, 134, 65, 165

Publish the output file myOutput.log with the extracted metrics in myMetrics.csv:

$ testspace test-results*.xml "myOutput.log{myMetrics.csv:my description}"
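The extraction itself can be scripted with standard shell tools; a minimal sketch that recreates the sample log above and writes the single-row CSV:

```shell
# Recreate the sample log shown above
cat > myOutput.log <<'EOF'
The delay before start was 134 ms
The latency from start to actual start was 65 ms
The duration from actual start to stop was 165 ms
EOF

# Pull the millisecond values (one per line) into a single comma-separated row
values=$(grep -oE '[0-9]+' myOutput.log | paste -s -d, - | sed 's/,/, /g')
echo "Timing Metrics, $values" > myMetrics.csv

# Push alongside the test results:
#   testspace test-results*.xml "myOutput.log{myMetrics.csv:my description}"
```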

Test Suite Metrics

A simple example of publishing custom metrics associated with a well known test suite reported from your test session.

This requires knowing the name of the SUITE

If your test session generates ancillary data stored in a CSV file, e.g. myMetrics.csv, that data can be published as custom metrics associated with a well-known test suite reported in that test session, e.g. MySuiteName:

$ testspace test-results*.xml "[MySuiteName].{myMetrics.csv:my-description}"

External Service Metrics

A simple example of publishing a link to externally hosted content with custom metrics.

Assuming AWS CloudWatch Logs is used for your application/service logging, create a link file cloudwatch.url pointing to the related log with an appropriate filter:

echo "InternetShortcut.URL=${BASE_URL}${TASK_ID}:group=$LOG_GROUP;stream=$LOG_STREAM;filter=$EVENT_FILTER" > cloudwatch.url

Create a CSV file cloudwatch.csv with relevant data extracted from the related log:

aws logs filter-log-events --log-group-name "$LOG_GROUP" --log-stream-names "$LOG_STREAM" --filter-pattern "$EVENT_FILTER" > events.json

errors=$(jq '.events | map(select(.message | test("ERROR|FATAL"))) | length' events.json)
warnings=$(jq '.events | map(select(.message | test("WARN"))) | length' events.json)
echo "$warnings,$errors" > cloudwatch.csv

Publish the link file cloudwatch.url with the extracted metrics in cloudwatch.csv:

$ testspace test-results*.xml "cloudwatch.url{cloudwatch.csv:my-description}"


Annotations

The addition of supplemental data files (log files, screen shots, etc.) to your result set is easily done using the plus symbol (+) as a prefix.

Example annotating the root folder:

$ testspace myResults*.xml "+log.txt{this is my log}"

Test Session Annotations

A simple example of annotating a well known test suite reported from your test session.

This requires knowing the name of the SUITE

If your test session generates supplemental data files, e.g. screen.png, those files can be associated with a well-known test suite reported in that test session, e.g. MySuiteName:

$ testspace test-results*.xml "[MySuiteName]+/path/to/screen.png{screen shot}"


Folders

Results can be organized into folders via the client and/or the repo's source structure.


The organization of results into a folder can be done by prefixing the file being pushed with a bracket-surrounded path:

$ testspace "[myFolder]myResults*.xml"

When specifying a sub-folder, use the forward slash (/) as the folder separator:

$ testspace "[myTests/mySubfolder]myResults*.xml"

Similarly, annotations can be associated with any folder:

$ testspace [myFolder]myResults*.xml "[myFolder]+/path/to/screen.png{screen shot}"

For example, when using a matrix, a folder can be used to store the results specific to each matrix entry:

name: CI
on: push
jobs:
  test:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
    steps:
      - name: Publish Results to Testspace
        run: testspace "[ ${{ matrix.os }} ]results.xml"

Repo Source

Using your test source directory to organize your results can significantly improve their presentation. With a path/to {metadata} descriptor, the Testspace client can be directed to follow the source tree organization:

$ testspace "[Folder1]myResults*.xml{path/to/test-source}"

Using the {path/to/test-source} report descriptor allows automatic organization of the results based on the test source directory structure.

Note that if your source location is not in a subdirectory of the current directory, you may need to specify either an absolute path (e.g. {/path/to/test-source}) or a '..' relative path (e.g. {../path/to/test-source}).

For example, using the following hypothetical directory structure and assuming you are running from the mydir directory:


The test source directory can be provided as {metadata} and the Testspace Client will organize all results based on this hierarchy.

$ testspace "[Tests]test/reports/results*.xml{test}"

Content List

Instead of listing the files to aggregate explicitly on the command line you can provide them via a "content-list" file.

For example having a content-list.txt file:


When specifying a file containing a list of files use the at symbol (@):

$ testspace @content-list.txt
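For instance, a content-list file could be generated from existing result files. A sketch with hypothetical file names, assuming one entry per line:

```shell
# Create two placeholder result files (hypothetical content)
printf '<testsuite/>\n' > myResults1.xml
printf '<testsuite/>\n' > myResults2.xml

# List them in a content-list file, one entry per line (assumed format)
printf '%s\n' myResults1.xml myResults2.xml > content-list.txt

# Push everything listed:
#   testspace @content-list.txt
```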

Note: if a file is missing, it will be captured as an error annotation in the results root folder.

Incremental Pushing

You can invoke the Testspace client multiple times and aggregate all of the results using the how parameter (start, add, finish). This can be useful when the tests run in multiple independent environments.

$ testspace myResults1.xml "space-name?start#my-results"
$ testspace myResults2.xml "space-name?add#my-results"
$ testspace "space-name?finish#my-results"
$ testspace "[Folder1]/path/to/*.xml" "space-name?start#my-results"
$ testspace "[Folder2]/path/to/*.xml" "space-name?add#my-results"
$ testspace "space-name?finish#my-results"