Push Data

Files – output from the Continuous Integration process – are pushed to the Testspace Server using a simple command line client; Windows, Linux, and macOS versions are available. Information on downloading each version can be found under the reference section of this guide. Adding the Testspace client to the CI workflow is covered in the next how-to section.

The following types of file content are supported:

Type              Description
Tests             Widely adopted test framework output formats such as JUnit, NUnit, Visual Studio TRX, xUnit.net, etc.
Code Coverage     Common coverage tool output formats such as Cobertura, Clover, Bullseye, Visual Studio Coverage, etc.
Static Analysis   A large set of widely used formats such as Checkstyle, Klocwork, PMD, Visual Studio FxCop & PREfast, etc.
Custom Metric     A CSV (.csv) file associated with a Text File.
Text File         Plain text (.txt or .log extension), HTML (.html), and Markdown (.md) files.
Note: The following examples demonstrate publishing test content in different scenarios. It is assumed you have created a Project and a Space and have configured the Testspace client appropriately. Use the testspace config command to confirm your settings:
domain: username:password@my-org.testspace.com
project: my-project
space: 

To push content to Testspace, simply provide the client with a list of files:

testspace file1.xml file2.xml "space-name"
testspace Tests.xml Coverage.xml Analysis.xml Text.txt "space-name"
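
In a CI job the Space name is often derived from an environment variable. A minimal sketch, assuming the CI system exposes the branch name as $BRANCH_NAME (the variable name is illustrative):

testspace Tests.xml Coverage.xml "$BRANCH_NAME"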

Content List

Instead of listing the files to aggregate explicitly on the command line, you can provide them via a "content-list" file.

Note: if a file is missing, it will be captured as an error annotation in the Schema's root folder.

For example, given a content-list.txt file:

Tests.xml
Coverage.xml
Analysis.xml
Text.txt

When specifying a file containing a list of files, use the at symbol (@):

testspace @content-list.txt "space-name"
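
The content-list file can also be generated on the fly. A minimal sketch using the standard find utility to collect every XML report under a hypothetical reports directory:

find reports -name "*.xml" > content-list.txt
testspace @content-list.txt "space-name"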

Tests

Some test frameworks produce multiple results files that you may want to aggregate into a single Testspace result set. There are several ways to accomplish this, as demonstrated in the examples below, which assume the default Project URL has been configured.

In these examples the result files are aggregated and published as a single result set:

testspace myResults1.xml myResults2.xml "space-name"
testspace myResults*.xml "space-name"
testspace /path/to/*.xml ReleaseNotes.txt "space-name"

To aggregate result files from multiple sub-directories, use the double star recursive search notation:

testspace /path/to/results/root/**/*.xml "space-name"
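
For instance, given the hypothetical layout below, the double star pattern above picks up results from every module in a single push:

/path/to/results/root/moduleA/results.xml
/path/to/results/root/moduleB/unit/results.xml
/path/to/results/root/moduleB/integration/results.xml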

To aggregate result files from a single level of sub-directories, use the single star search notation:

testspace /path/*/results/root/*.xml "space-name"

Organizing Results

Using the {/path/to/test-source} report descriptor allows automatic organization of the results based on the test source directory structure.

Organizing results into folders, and into a hierarchy based on the test source directory structure, is done as described below.

testspace "[Folder1]myResults*.xml{/path/to/test-source}" "space-name"

Folders

For example:

testspace "[myTests]myResults*.xml" "space-name"
testspace "[myTests]myResults*.xml" "space-name"

When specifying sub-folders, use the forward slash (/) as the folder separator:

testspace "[myTests/mySubfolder]myResults*.xml" "space-name"

Source Directory

Check out this example under the Tests folder.

Organizing your results by test source directory can make them significantly easier to navigate. Consider the following hypothetical directory structure, and assume you are running from the mydir directory:

mydir
  src
  test
    controller
      test1.php
    model
      test2.php
      test3.php
    view
      testN.php
    reports
      results1.xml
      results2.xml
      coverage.xml

The test source directory can be provided as {metadata} and the Testspace Client will organize all results based on this hierarchy.

For example:

testspace "[Tests]test/reports/results*.xml{test}" "space-name"

Metrics

Testspace has built-in support for tracking quality metrics of interest to software development teams. This includes the typical ones such as test cases, code coverage, and static analysis. 1) Testspace also has the ability to handle Custom Metrics defined by users to track additional items of importance.

Test Metrics

When a Space is created, the Metrics tab by default has graphs for both Test Suites and Test Cases. In addition, a badge is created with the percentage of Test Cases passing. The Pass/Fail threshold for the Test Cases contribution to a Space's Health is that all non-exempt test cases must be passing. 2)

Metric Name   Threshold            Badge Name
Test Suites   N/A                  N/A
Test Cases    non-exempt passing   test

Code Coverage

Code Coverage Metric graph(s) are created when a code coverage file 3) is pushed for the first time. The coverage metric names are listed in the table below; keep in mind that different coverage tools refer to the same measurement by different names, such as function versus method coverage. The default threshold for determining Space Health for all Coverage Metrics is 50%, but this is configurable to a value of your choosing for each type of Coverage Metric.

For example:

testspace tests.xml coverage.xml "space-name"

Metric Name                             Threshold   Badge Name
Code Coverage (lines) or sequences      50%         coverage
Code Coverage (methods) or functions    50%         coverage(fx)
Code Coverage (branches) or decisions   50%         coverage(?:)

Static Analysis

Static Analysis Metrics are likewise created when the first result containing this information is pushed. 4) By default the Static Analysis Metric does not contribute to a Space's Health, but is set to informational. The Health contribution for Static Analysis is configurable based on whether you have High, Medium, or Low severity issues.

Metric Name                Severity         Threshold       Badge Name
Static Analysis (issues)   Medium or High   Informational   analysis

For example:

testspace static-analysis.xml [Tests]tests.xml coverage.xml "space-name"

Note: Some compilers and static analysis tools report results in plain text. The lint {meta-tag} identifies the output as analysis results.

testspace "build-output.log{lint}" [Tests]tests.xml coverage.xml "space-name"

Custom Metrics

Most applications produce output files containing ancillary data of interest or importance. With Testspace you can publish these files with the ancillary data handled as custom metrics. 5) Four output file types are supported: .txt, .log, .md, and .html, with the data of interest passed in as {metadata} using a separate .csv file.

Below is a simple example of publishing a log file with custom metrics.

Contents of the output log file myOutput.log, which contains some simple timing metrics:

The delay before start was 134 ms 
The latency from start to actual start was 65 ms
The duration from actual start to stop was 165 ms

Contents of the file myMetrics.csv with the relevant data extracted:

Timing Metrics, 134, 65, 165

The following example publishes the output file myOutput.log, with the extracted metrics in myMetrics.csv and a description, to a folder labeled Timing.

testspace [Tests]results*.xml "[Timing]myOutput.log{myMetrics.csv:my-description}" "space-name"
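
The CSV itself can be generated from the log as part of the job. A minimal sketch, assuming the log lines keep the "N ms" format shown above (the extraction pattern is illustrative):

# Pull the millisecond values out of myOutput.log and join them as one CSV row
values=$(grep -o '[0-9][0-9]* ms' myOutput.log | awk '{printf "%s%s", sep, $1; sep=", "}')
echo "Timing Metrics, $values" > myMetrics.csv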

Additional Options

Named Results

You can name the result set by specifying the name parameter. This requires appending the hash symbol (#) to the Space name, followed by the name text.

For example:

testspace myResults1.xml myResults2.xml "space-name#my-results"
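
A common convention is to name the result set after the CI build. A sketch, assuming the CI system provides $BUILD_NUMBER (the variable name is illustrative):

testspace myResults*.xml "space-name#build-$BUILD_NUMBER"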

Adding Annotations

Supplemental data files (log files, screenshots, etc.) are easily added to your result set using the add symbol (+) as a prefix; see the Testspace Client reference.

For example:

testspace myResults*.xml "+log.txt{this is my log}" "space-name"
testspace myResults*.xml "+/path/to/screen.png{screen shot}" "space-name"
testspace myResults*.xml "[myCoverage]coverage.xml" "[myCoverage]+coverage.xml{raw coverage xml report}" "space-name"

Failure from Log

Some steps in your automation process (install, build, etc.) have no test cases associated with them, but you may still want a failure to be noted. Such a suite has a test case added only in the case of a failure, which makes it best suited to non-test processes. The suite status is set by adding the process return code, as an integer N, as metadata to a log file being pushed to Testspace. Any non-zero integer will add a failed test case to the suite.

testspace content.txt{N:metrics.csv:description}

Example of pushing a log file and its return code to Testspace:

# In this example $MYPROC is set to the return code of the process that generated myproc.log
# Publish output and exit code
testspace "myproc.log{$MYPROC:my description}" My-Space

Incremental Results

You can invoke the Testspace Client multiple times and aggregate all of the results by using the how parameter (start, add, finish). This can be useful when the tests run on multiple machines.

For example:

testspace myResults1.xml "space-name?start#my-results"
testspace myResults2.xml "space-name?add#my-results"
testspace "space-name?finish#my-results"
testspace "[Folder1]/path/to/*.xml" "space-name?start#my-results"
testspace "[Folder2]/path/to/*.xml" "space-name?add#my-result"
testspace "space-name?finish#my-results"

How To

1) How to update and manage Standard Metrics
2) For information on how to manage exemptions see how to Exempt Failing Test Cases
3) For a list of supported code coverage file formats see Code Coverage Formats
4) For a list of supported static analysis file formats see Static Analysis Formats
5) A detailed example of using and creating Custom Metrics
