Manage Design Artifacts for Analysis in the Model Maintainability Dashboard

When you develop software units and components using Model-Based Design, use the Model Maintainability Dashboard to assess the status and quality of your models. The Model Maintainability Dashboard analyzes the design artifacts in your project and provides detailed metric measurements of the size, architecture, and complexity of those artifacts.

Model Maintainability Dashboard showing results for unit cc_ControlMode

Each metric in the dashboard measures a different aspect of the quality of your design and reflects guidelines in industry-recognized software development standards, such as ISO 26262. To monitor the maintainability of your models in the Model Maintainability Dashboard, maintain your artifacts in a project and follow the considerations in this topic. For more information on using the Model Maintainability Dashboard, see Monitor the Complexity of Your Design Using the Model Maintainability Dashboard.

Manage Artifact Files in a Project

To analyze your design artifacts in the Model Maintainability Dashboard, store your design and testing artifacts in a MATLAB® project. The artifacts that the model maintainability metrics analyze include:

  • Simulink® models

  • Libraries that the models use

  • Stateflow® charts

  • MATLAB code

For information on how the dashboard traces dependencies between project files, see Digital Thread.
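For example, to make these artifacts available for analysis, open the project that contains them. A minimal sketch, assuming a project folder named cc_CruiseControl (a hypothetical name):

    % Open the project that contains the design artifacts
    % (the project folder name is a hypothetical example)
    proj = openProject("cc_CruiseControl");

    % Display the paths of the files that the project manages
    disp([proj.Files.Path]')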

When your project contains many models and model reference hierarchies, you can configure the dashboard to recognize the different testing levels of your models. You can specify which entities in your software architecture are units or higher-level components by labeling them in your project and configuring the Model Maintainability Dashboard to recognize the labels. The dashboard organizes your models in the Artifacts panel according to their testing levels and the model reference hierarchy. For more information, see Categorize Models in a Hierarchy as Components or Units.
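For example, a minimal sketch that labels a model file as a unit by using the project API. The category name ModelRole, the label names, and the file path are hypothetical placeholders for the names that you configure in the dashboard:

    proj = currentProject;

    % Create a label category and labels for the testing levels
    % ("ModelRole", "Unit", and "Component" are hypothetical names)
    category = createCategory(proj,"ModelRole");
    createLabel(category,"Unit");
    createLabel(category,"Component");

    % Label a model file as a unit
    modelFile = findFile(proj,"models/cc_ControlMode.slx");
    addLabel(modelFile,"ModelRole","Unit");

After you label the files, configure the dashboard to recognize the labels, as described in Categorize Models in a Hierarchy as Components or Units.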

Trace Artifacts to Units and Components

To determine which artifacts are in the scope of a unit or component, the dashboard analyzes the traceability links between the artifacts, software units, and components in the project. The Project panel lists the units, organized by the components that reference them.

Project panel showing units under a component

When you select a unit or component in the Project panel, the Artifacts panel shows the artifacts that trace to the selected unit or component. Traced artifacts include:

  • Functional Requirements

  • Design Artifacts

  • Tests

  • Test Results

Artifacts panel showing traced artifacts for a unit

To see the traceability paths that the dashboard found, click the Trace View button in the toolstrip. For more information, see Explore Traceability Information for Units and Components.

In the Artifacts panel, the folder Trace Issues contains unexpected requirement links, requirement links that are broken or not supported by the dashboard, and artifacts that the dashboard cannot trace to a unit or component. To help you identify the type of tracing issue, the folder Trace Issues contains subfolders for Unexpected Implementation Links, Unresolved and Unsupported Links, Untraced Tests, and Untraced Results. For more information, see Resolve Missing Artifacts, Links, and Results.

If an artifact returns an error during traceability analysis, the Artifacts panel includes the artifact in an Errors folder. Use the traceability information in these folders to check whether the artifacts trace to the units or components that you expect. To see details about the warnings and errors that the dashboard finds during artifact analysis, click Artifact Issues in the toolstrip. For more information, see View Artifact Issues in Project.

Functional Requirements

The folder Functional Requirements shows requirements of Type Functional that are either implemented by or upstream of the unit or component.

When you collect metric results, the dashboard analyzes only the functional requirements that the unit or component directly implements. The folder Functional Requirements contains two subfolders to help identify which requirements are implemented by the unit or component, or are upstream of the unit or component:

  • Implemented — Functional requirements that are directly linked to the unit or component with a link Type of Implements. The dashboard uses these requirements in the metrics for the unit or component.

  • Upstream — Functional requirements that are indirectly or transitively linked to the implemented requirements. The dashboard does not use these requirements in the metrics for the unit or component.

If a requirement does not trace to a unit or component, it appears in the Trace Issues folder. If a requirement does not appear in the Artifacts panel when you expect it to, see Requirement Missing from Artifacts Panel.

Use Requirements Toolbox™ to create or import the requirements in a requirements file (.slreqx).
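For example, a minimal sketch that creates a requirements file and adds one functional requirement by using the Requirements Toolbox API. The file name, requirement ID, and summary text are hypothetical:

    % Create a requirement set and add a functional requirement
    % (names and text are hypothetical examples)
    rs = slreq.new("cc_ControlModeReqs");
    req = add(rs,"Id","R1","Summary","Switch control modes on driver input");
    req.Type = "Functional";
    save(rs);

You can also import requirements from external documents by using the slreq.import function.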

Design Artifacts

The folder Design shows:

  • The model file that contains the block diagram for the unit or component.

  • Models that the unit or component references.

  • Libraries that are partially or fully used by the model.

  • Data dictionaries that are linked to the model (see the example after this list).
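For example, a minimal sketch that creates a data dictionary and links it to a unit model. The dictionary and model names are hypothetical:

    % Create a data dictionary and link it to the unit model
    % (file and model names are hypothetical examples)
    dictObj = Simulink.data.dictionary.create("cc_data.sldd");
    load_system("cc_ControlMode");
    set_param("cc_ControlMode","DataDictionary","cc_data.sldd");
    save_system("cc_ControlMode");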

Tests

The folder Tests shows tests and test harnesses that trace to the selected unit.

When you collect metric results for a unit, the dashboard analyzes only the unit tests. The folder Tests contains subfolders to help identify whether a test is a unit test and which test harnesses trace to the unit:

  • Unit Tests — Tests that the dashboard considers as unit tests. A unit test directly tests either the entire unit or lower-level elements in the unit, like subsystems. The dashboard uses these tests in the metrics for the unit.

  • Others — Tests that trace to the unit but that the dashboard does not consider as unit tests. For example, the dashboard does not consider tests on a library to be unit tests. The dashboard does not use these tests in the metrics for the unit.

  • Test Harnesses — Test harnesses that trace to the unit or lower-level elements in the unit. Double-click a test harness to open it.

If a test does not trace to a unit, it appears in the Trace Issues folder. If a test does not appear in the Artifacts panel when you expect it to, see Test Missing from Artifacts Panel. For troubleshooting tests in metric results, see Fix a test that does not produce metric results.

Create tests by using Simulink Test™.
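For example, a minimal sketch that creates a test file and adds a simulation test case for a unit model by using the Simulink Test API. The file, test case, and model names are hypothetical:

    % Create a test file and add a simulation test case
    % (paths and names are hypothetical examples)
    tf = sltest.testmanager.TestFile("tests/cc_ControlMode_tests.mldatx");
    ts = getTestSuites(tf);
    tc = createTestCase(ts,"simulation","Mode switching test");

    % Point the test case at the unit model and save the file
    setProperty(tc,"Model","cc_ControlMode");
    saveToFile(tf);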

Test Results

When you collect metric results for a unit, the dashboard analyzes only the test results from unit tests. The folder Test Results contains subfolders to help identify which test results are from unit tests:

  • The subfolders for Model, SIL, and PIL contain results from normal mode, software-in-the-loop (SIL), and processor-in-the-loop (PIL) unit test simulations, respectively. The dashboard uses these results in the metrics for the unit.

    The following types of test results are shown:

    • Saved test results — results that you have collected in the Test Manager and have exported to a results file.

    • Temporary test results — results that you have collected in the Test Manager but have not exported to a results file. When you export the results from the Test Manager, the dashboard analyzes the saved results instead of the temporary results. Additionally, the dashboard stops recognizing the temporary results when you close the project or close the result set in the Simulink Test Result Explorer. If you want to analyze the results in a subsequent test or project session, export the results to a results file, as shown in the example at the end of this section.

  • Others — Results that are not simulation results, are not from unit tests, or are only reports. For example, a results file that contains only a report does not include simulation data that the dashboard can analyze. The dashboard does not use these results in the metrics for the unit.

If a test result does not trace to a unit, it appears in the Trace Issues folder. If a test result does not appear in the Artifacts panel when you expect it to, see Test Result Missing from Artifacts Panel. For troubleshooting test results in dashboard metric results, see Fix a test result that does not produce metric results.
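For example, this sketch runs the test files that are loaded in the Test Manager and exports the results so that the dashboard can analyze them in later sessions. The results file path is a hypothetical example:

    % Run the test files that are currently loaded in the Test Manager
    results = sltest.testmanager.run;

    % Export the results to a results file so that the dashboard
    % can analyze them in a later project session
    sltest.testmanager.exportResults(results, ...
        "results/cc_ControlMode_results.mldatx");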

Trace Issues

The folder Trace Issues shows artifacts that the dashboard has not traced to any units or components. Use the folder Trace Issues to check if artifacts are missing traceability to the units or components. The folder Trace Issues contains subfolders to help identify the type of tracing issue:

  • Unexpected Implementation Links — Requirement links of Type Implements for a requirement of Type Container or Type Informational. The dashboard does not expect these links to be of Type Implements because container requirements and informational requirements do not contribute to the Implementation and Verification status of the requirement set that they are in. If a requirement is not meant to be implemented, you can change the link type. For example, you can change a requirement of Type Informational to have a link of Type Related to.

  • Unresolved and Unsupported Links — Requirements links that are either broken in the project or not supported by the dashboard. For example, if a model block implements a requirement, but you delete the model block, the requirement link is now unresolved. The dashboard does not support traceability analysis for some artifacts and some links. If you expect a link to trace to a unit or component and it does not, see the troubleshooting solutions in Resolve Missing Artifacts, Links, and Results.

  • Untraced Tests — Tests that execute on models or lower-level elements, like subsystems, that are not on the project path.

  • Untraced Results — Results that the dashboard cannot trace to a test. For example, if a test produces a result, but you delete the test, the dashboard cannot trace the results to the test.

The dashboard does not support traceability analysis for some artifacts and some links. If an artifact is untraced when you expect it to trace to a unit or component, see the troubleshooting solutions in Resolve Missing Artifacts, Links, and Results.

Artifact Errors

The folder Errors appears if artifacts returned errors when the dashboard performed artifact analysis. These are some errors that artifacts might return during traceability analysis:

  • An artifact returns an error if it has unsaved changes when traceability analysis starts.

  • A test results file returns an error if it was saved in a previous version of Simulink.

Open these artifacts and fix the errors. The dashboard shows a banner at the top to indicate that the artifact traceability shown in the Project and Artifacts panels is outdated. Click the Trace Artifacts button on the banner to refresh the data in the Project and Artifacts panels.

Artifact Issues

To see details about artifacts that cause errors, warnings, and informational messages during analysis, click the Artifact Issues button in the toolstrip. You can sort the messages by their type: Error, Warning, and Info.

The messages show:

  • Modeling constructs that the dashboard does not support

  • Links that the dashboard does not trace

  • Test harnesses or cases that the dashboard does not support

  • Test results missing coverage or simulation results

  • Artifacts that return errors when the dashboard loads them

  • Information about model callbacks that the dashboard deactivates

  • Files that have path traceability issues

  • Artifacts that are not on the path and are not considered during tracing

Collect Metric Results

The Model Maintainability Dashboard can collect metric results for each unit and component listed in the Project panel. Each metric measures a different aspect of the maintainability of your design and reflects guidelines in industry-recognized software development standards, such as ISO 26262. For more information about the available metrics and the results that they return, see Model Maintainability Metrics.

As you edit and save the artifacts in your project, the dashboard detects the changes. If your changes might affect the metric results, the dashboard shows a warning banner at the top to indicate that the results are stale, and affected widgets show a gray staleness icon. To update the results, click the Collect button on the warning banner to re-collect the metric data and update the stale widgets with data from the current artifacts. To collect metrics for every unit and component in the project, click Collect > Collect All.
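If you prefer to collect metrics programmatically, you can use the metric API from Simulink Check. A minimal sketch, assuming the project is already open; it collects every metric that the engine reports as available and generates an HTML report (the report file name is a hypothetical example):

    % Create a metric engine for the current project
    metric_engine = metric.Engine();

    % Collect all metrics that the engine makes available
    metricIds = getAvailableMetricIds(metric_engine);
    execute(metric_engine, metricIds);

    % Retrieve the results and generate an HTML report
    results = getMetrics(metric_engine, metricIds);
    generateReport(metric_engine, "Type", "html-file", ...
        "Location", fullfile(pwd, "MetricResultsReport.html"));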

See Also

Related Topics