
Output Results for Continuous Integration Systems

You can create model tests that are compatible with continuous integration (CI) systems such as Jenkins®. To create CI-compatible results, run your Simulink® Test™ files using MATLAB® Unit Test.

To run CI-compatible tests, follow this general procedure (a minimal code sketch follows the list):

  1. Create a test suite from the MLDATX test file.

  2. Create a test runner.

  3. Create plugins for the test output or coverage results.

    • For test outputs, use the TAPPlugin or XMLPlugin.

    • For model coverage, use the ModelCoveragePlugin and CoberturaFormat. When collecting model coverage in Cobertura format:

      • Only top model coverage is reflected in the Cobertura XML.

      • Only model Decision coverage is reflected, and it is mapped to Condition elements in Cobertura XML.

  4. Create CI-compatible output formats or streams, such as a TAP file stream or the Cobertura format.

  5. Add the output formats or streams to the plugins.

  6. Add the plugins to the test runner.

  7. Run the test.
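
For example, this minimal sketch maps the steps to code for test outputs in JUnit-style XML, which CI systems such as Jenkins can consume. The file names myTests.mldatx and myTestResults.xml are placeholders for your own test file and output file.

import matlab.unittest.TestRunner
import matlab.unittest.plugins.XMLPlugin

% 1. Create a test suite from the MLDATX test file (placeholder name).
suite = testsuite('myTests.mldatx');

% 2. Create a test runner with no default plugins.
runner = TestRunner.withNoPlugins;

% 3-5. Create a test output plugin that produces JUnit-style XML for CI.
xmlFile = 'myTestResults.xml';
xmlPlugin = XMLPlugin.producingJUnitFormat(xmlFile);

% 6. Add the plugin to the test runner.
addPlugin(runner,xmlPlugin)

% 7. Run the test.
results = run(runner,suite);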

Test a Model for Continuous Integration Systems

This example shows how to test a model, publish Test Manager results, and output results in TAP format with a single execution.

You use MATLAB® Unit Test to create a test suite and a test runner, and customize the runner with these plugins:

  • matlab.unittest.plugins.TestReportPlugin, which produces a MATLAB Test Report

  • sltest.plugins.TestManagerResultsPlugin, which adds Test Manager results to the MATLAB Test Report

  • matlab.unittest.plugins.TAPPlugin, which outputs results in TAP format for use with CI systems

The test case creates a square wave input to a controller subsystem and sweeps through 25 iterations of parameters a and b. The test compares the alpha output to a baseline with a tolerance of 0.0046. The test fails on those iterations in which the output exceeds this tolerance.

1. Open the Simulink® Test™ test file.

testfile = fullfile('f14ParameterSweepTest.mldatx');
sltest.testmanager.view;
sltest.testmanager.load(testfile);

2. In the Test Manager, configure the test file for reporting.

Under Test File Options, select Generate report after execution. The section expands, displaying several report options. For more information, see Save Reporting Options with a Test File.

3. Create a test suite from the Simulink® Test™ test file.

import matlab.unittest.TestSuite

suite = testsuite('f14ParameterSweepTest.mldatx');

4. Create a test runner.

import matlab.unittest.TestRunner

f14runner = TestRunner.withNoPlugins;

5. Add the TestReportPlugin to the test runner.

The plugin produces a MATLAB Test Report named F14Report.pdf.

import matlab.unittest.plugins.TestReportPlugin

pdfFile = 'F14Report.pdf';
trp = TestReportPlugin.producingPDF(pdfFile);
addPlugin(f14runner,trp)

6. Add the TestManagerResultsPlugin to the test runner.

The plugin adds Test Manager results to the MATLAB Test Report.

import sltest.plugins.TestManagerResultsPlugin

tmr = TestManagerResultsPlugin; 
addPlugin(f14runner,tmr)

7. Add the TAPPlugin to the test runner.

The plugin outputs to the F14Output.tap file.

import matlab.unittest.plugins.TAPPlugin
import matlab.automation.streams.ToFile

tapFile = 'F14Output.tap';
tap = TAPPlugin.producingVersion13(ToFile(tapFile));
addPlugin(f14runner,tap)

8. Run the test.

Several iterations fail because the signal-baseline difference exceeds the tolerance.

result = run(f14runner,suite);
Generating test report. Please wait.
    Preparing content for the test report.
    Adding content to the test report.
    Writing test report to file.
Test report has been saved to:
 /tmp/Bdoc24b_2679053_1875055/tp232742f2/simulinktest-ex40056435/F14Report.pdf

A single execution of the test runner produces two reports:

  • A MATLAB Test Report that contains Test Manager results.

  • A TAP format file that you can use with CI systems.
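
When a CI system runs MATLAB noninteractively, you can also make the overall job status reflect the test results. As a minimal sketch, asserting on the result object raises an error when any test failed, which the CI job then reports as a failure:

% Raise an error if any test in the results failed (fails the CI job).
assertSuccess(result);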

% Clear results and close the Test Manager.
sltest.testmanager.clearResults
sltest.testmanager.clear
sltest.testmanager.close

Model Coverage Results for Continuous Integration

This example shows how to generate model coverage results for use with continuous integration. Coverage is reported in the Cobertura format. You run a Simulink® Test™ test file using MATLAB® Unit Test.

1. Import classes and create a test suite from the test file AutopilotTestFile.mldatx.

import matlab.unittest.TestRunner

aptest = sltest.testmanager.TestFile('AutopilotTestFile.mldatx');
apsuite = testsuite(aptest.FilePath);

2. Create a test runner.

trun = TestRunner.withNoPlugins;

3. Set the coverage metrics to collect. This example uses decision coverage. In the Cobertura output, decision coverage is listed as condition elements.

import sltest.plugins.coverage.CoverageMetrics

cmet = CoverageMetrics('Decision',true);

4. Set the coverage report properties. This example produces a file R13Coverage.xml in the current working folder. Ensure your working folder has write permissions.

import sltest.plugins.coverage.ModelCoverageReport
import matlab.unittest.plugins.codecoverage.CoberturaFormat

rptfile = 'R13Coverage.xml';
rpt = CoberturaFormat(rptfile)
rpt = 
  CoberturaFormat with no properties.

5. Create a model coverage plugin. The plugin collects the coverage metrics and produces the Cobertura format report.

import sltest.plugins.ModelCoveragePlugin

mcp = ModelCoveragePlugin('Collecting',cmet,'Producing',rpt)
mcp = 
  ModelCoveragePlugin with properties:

    RecordModelReferenceCoverage: '<default>'
                 MetricsSettings: [1x1 sltest.plugins.coverage.CoverageMetrics]
             ScopeToRequirements: 0

6. Add the coverage plugin to the test runner.

addPlugin(trun,mcp)

% Turn off command line warnings:
warning off Stateflow:cdr:VerifyDangerousComparison
warning off Stateflow:Runtime:TestVerificationFailed

7. Run the test.

APResult = run(trun,apsuite)
APResult = 
  TestResult with properties:

          Name: 'AutopilotTestFile > Basic Design Test Cases/Requirement 1.3 Test'
        Passed: 0
        Failed: 1
    Incomplete: 0
      Duration: 0.7437
       Details: [1x1 struct]

Totals:
   0 Passed, 1 Failed, 0 Incomplete.
   0.7437 seconds testing time.

8. Reenable warnings.

warning on Stateflow:cdr:VerifyDangerousComparison
warning on Stateflow:Runtime:TestVerificationFailed
