Push Results
Files generated by the Continuous Integration process are pushed to the Testspace Server using a simple command-line client, available for Windows, Linux, and macOS. Adding the Testspace client to a CI workflow is covered in the next section.
To push content to Testspace, simply provide the client with a list of files and a destination URL:
$ testspace file1 file2 .. my-access-token:@my-org.testspace.com/my-project/my-space
Note: The following examples demonstrate publishing content in different scenarios. It is assumed you have created a Project and a Space and have appropriately configured the Testspace client's destination URL.
File Content
The following types of file content are supported, with additional information available here:
Type | Description |
---|---|
Test Output | Widely adopted test framework output formats such as JUnit, NUnit, Visual Studio TRX, xUnit.net, etc. |
Code Coverage | Common coverage tool output formats such as Cobertura, Clover, Bullseye, Visual Studio Coverage, etc. |
Static Analysis | A large set of widely used formats such as Checkstyle, Klocwork, PMD, Visual Studio FxCop & PREfast, etc. |
Custom Metrics | A CSV (.csv) file associated with a Text File or a Test Output suite. |
Text File | Plain text (.txt or .log extension), HTML (.html), and Markdown (.md) files. |
Test Output
To publish test results, simply push an XML output file generated by the test tool. Refer here for the supported test output formats.
$ testspace tests-results.xml
Some test frameworks produce multiple results files that you may want to aggregate into a single Testspace result set. There are several ways to accomplish this, as demonstrated in the examples below.
In each of the following examples, the listed result files are aggregated and published as a single result set:
$ testspace myResults1.xml myResults2.xml
$ testspace myResults*.xml
$ testspace /path/to/*.xml ReleaseNotes.txt
Sub-directories
To aggregate result files from multiple sub-directories use the double star recursive search notation:
$ testspace /path/to/results/root/**/*.xml
To aggregate result files from sub-directories use the single star search notation:
$ testspace /path/*/results/root/*.xml
Process Output
A step in your automation process (install, build, etc.) may have no test cases associated with it, yet you may still want a failure to be noted. The output log and the process return code can be used to report status. The suite status is set by adding the process return code as an integer 'N' in the metadata of a file being pushed to Testspace; any non-zero integer marks the suite as failed.
Example of pushing a log file and its return code to Testspace:
# In this example $MYPROC is set to the return code of the process that generated myproc.log
# Publish output and exit code
$ testspace "myproc.log{$MYPROC:my description}"
Code Coverage
To publish code coverage, simply push an XML output file generated by the coverage tool. Refer here for the supported code coverage formats.
$ testspace code-coverage.xml
Third Party
Testspace supports using the third-party coverage tools Coveralls.io and Codecov.io. Both of these tools report lines covered as a percentage. Testspace uses the minimum configuration set by these tools; to change it, see the Codecov configuration or Coveralls configuration. Testspace will automatically create a metrics graph and add the third-party tool's badges to your project upon the first push with the --link option.
For example:
$ testspace tests-results.xml --link=codecov
$ testspace tests-results.xml --link=coveralls
Example using a private repo:
$ testspace tests-results.xml --link="TOKEN@codecov"
Static Analysis
To publish static analysis results, simply push an XML output file generated by the analysis tool. Refer here for the supported static analysis formats.
$ testspace static-analysis.xml
Some compilers and static analysis tools report results in plain text. The lint {meta-tag} identifies the output as static analysis results.
$ testspace "build-output.log{lint}"
Custom Content
Preformatted human-readable output files can be published as custom suites. Four types of output files are supported: .txt, .log, .md, and .html.
A simple example of publishing a log file.
$ testspace myOutput.log
External Content
Any externally hosted content can be published as a linked suite. The URL of the external entity needs to be stored in a .url file, e.g. myLink.url:
InternetShortcut.URL=https://domain.com/whatever
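In a CI step, such a file can be generated with a simple redirect, e.g.:
$ echo "InternetShortcut.URL=https://domain.com/whatever" > myLink.url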
With that in place, it can be published as simply as:
$ testspace myLink.url
Custom Metrics
Most applications produce output files containing ancillary data of interest or importance. If formatted as CSV (.csv) files, you can publish these files as custom metrics associated with test results.
To add a custom metric along with your test results, you need to push two files:
- An output file for content reference.
- A text file containing comma-separated values (i.e. a .csv file).
Each row in the CSV file is a separate metric, with an optional first column of text used as the metric's name. Otherwise, the name of the metric defaults to metric-n.
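For example, in the following hypothetical CSV the first row would be published as a metric named 'Throughput', while the second row, lacking a text first column, would fall back to a default metric-n name:
Throughput, 120, 134, 156
85, 90, 95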
For information on how to customize the user interface associated with the metric refer here.
Log File Metrics
A simple example of publishing a log file with custom metrics.
Contents of the output log file myOutput.log, which contains some simple timing metrics:
The delay before start was 134 ms
The latency from start to actual start was 65 ms
The duration from actual start to stop was 165 ms
Contents of the CSV file myMetrics.csv with the relevant data extracted:
Timing Metrics, 134, 65, 165
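If helpful, the CSV can be generated from the log itself; a minimal sketch, assuming the exact log format shown above:
# Pull the three millisecond values out of the log and write a single named metric row
values=$(grep -Eo '[0-9]+ ms' myOutput.log | awk '{printf ", %s", $1}')
echo "Timing Metrics$values" > myMetrics.csv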
Publish the output file myOutput.log with the extracted metrics in myMetrics.csv:
$ testspace test-results*.xml "myOutput.log{myMetrics.csv:my description}"
Test Suite Metrics
A simple example of publishing custom metrics associated with a well-known test suite reported from your test session. This requires knowing the name of the suite.
If your test session generates ancillary data stored in a CSV file, e.g. myMetrics.csv, that data can be published as custom metrics associated with a well-known test suite reported in that test session, e.g. MySuiteName:
$ testspace test-results*.xml "[MySuiteName].{myMetrics.csv:my-description}"
External Service Metrics
A simple example of publishing a link to externally hosted content with custom metrics.
Assuming AWS CloudWatch Logs is used for your application/service logging, create a link file cloudwatch.url to the related log with an appropriate filter:
BASE_URL="https://us-west-2.console.aws.amazon.com/cloudwatch/home?region=us-west-2#logEventViewer"
LOG_GROUP="your-group-name"
LOG_STREAM="your-stream-name"
EVENT_FILTER="?ERROR?FATAL?WARN"
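# TASK_ID is assumed to be set elsewhere (e.g. exported by the CI environment)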
echo "InternetShortcut.URL=${BASE_URL}${TASK_ID}:group=$LOG_GROUP;stream=$LOG_STREAM;filter=$EVENT_FILTER" > cloudwatch.url
Create a CSV file cloudwatch.csv with the relevant data extracted from the related log:
aws logs filter-log-events --log-group-name "$LOG_GROUP" --log-stream-names "$LOG_STREAM" --filter-pattern "$EVENT_FILTER" > events.json
errors=$(cat events.json | jq '.events | map(select(.message | test("ERROR|FATAL"))) | length')
warnings=$(cat events.json | jq '.events | map(select(.message | test("WARN"))) | length')
echo "$warnings,$errors" > cloudwatch.csv
Publish the link file cloudwatch.url with the extracted metrics in cloudwatch.csv:
$ testspace test-results*.xml "cloudwatch.url{cloudwatch.csv:my-description}"
Annotations
Supplemental data files (log files, screenshots, etc.) can easily be added to your result set using the plus symbol (+) as a prefix.
Example annotating the root folder:
$ testspace myResults*.xml "+log.txt{this is my log}"
Test Session Annotations
A simple example of annotating a well-known test suite reported from your test session. This requires knowing the name of the suite.
If your test session generates supplemental data files, e.g. screen.png, those files can be associated with a well-known test suite reported in that test session, e.g. MySuiteName:
$ testspace test-results*.xml "[MySuiteName]+/path/to/screen.png{screen shot}"
Folders
Results can be organized into folders via the client and/or the repo's source structure.
Custom
Results can be organized into a folder by prefixing the file being pushed with a bracket-surrounded path:
$ testspace "[myFolder]myResults*.xml"
When specifying a sub-folder, use the forward slash (/) as the folder separator.
$ testspace "[myTests/mySubfolder]myResults*.xml"
Similarly, annotations could be associated with any folder:
$ testspace [myFolder]myResults*.xml "[myFolder]+/path/to/screen.png{screen shot}"
For example, when using a build matrix, a folder can be used to store the results specific to each matrix entry:
name: CI
on: push
jobs:
  test:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
    steps:
      ..
      - name: Publish Results to Testspace
        run: testspace "[ ${{ matrix.os }} ]results.xml"
Repo Source
Using your test source directory to organize your results can significantly improve their organization. With a {path/to/test-source} metadata tag, the Testspace client can be directed to follow the source tree organization:
$ testspace "[Folder1]myResults*.xml{path/to/test-source}"
Using the {path/to/test-source} report descriptor allows automatic organization of the results based on the test source directory structure.
Note: if your source location is not in a subdirectory of the current directory, you may need to specify either an absolute path (e.g. {/path/to/test-source}) or a '..' relative path (e.g. {../path/to/test-source}).
For example, using the following hypothetical directory structure and assuming you are running from the mydir directory:
mydir
  src
  test
    controller
      test1.php
    model
      test2.php
      test3.php
    view
      testN.php
    reports
      results1.xml
      results2.xml
      coverage.xml
The test source directory can be provided as {metadata}, and the Testspace client will organize all results based on this hierarchy.
$ testspace "[Tests]test/reports/results*.xml{test}"
Content List
Instead of listing the files to aggregate explicitly on the command line, you can provide them via a "content-list" file. For example, given a content-list.txt file:
Tests.xml
Coverage.xml
Analysis.xml
Text.txt
$PATH_TO/thisfile.txt
When specifying a file containing a list of files, use the at symbol (@):
$ testspace @content-list.txt
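The content list itself can also be generated dynamically; a minimal sketch using find (the reports path is hypothetical):
$ find /path/to/reports -name "*.xml" > content-list.txt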
Note: if a file is missing, it will be captured as an error annotation in the results root folder.
Incremental Pushing
You can invoke the Testspace client multiple times and aggregate all of the results using the how parameter (start, add, finish). This can be useful when the tests run on multiple independent environments. For example, starting a result set, adding to it, and then finalizing it:
$ testspace myResults1.xml "space-name?start#my-results"
$ testspace myResults2.xml "space-name?add#my-results"
$ testspace "space-name?finish#my-results"
$ testspace "[Folder1]/path/to/*.xml" "space-name?start#my-results"
$ testspace "[Folder2]/path/to/*.xml" "space-name?add#my-result"
$ testspace "space-name?finish#my-results"