
recordMetrics

Record metric values in experiment results table and training plot

Since R2021a

    Description

    recordMetrics(monitor,step,metricName=metricValue) records the specified metric value for a trial in the Experiment Manager results table and training plot.

    recordMetrics(monitor,step,metricName1=metricValue1,...,metricNameN=metricValueN) records multiple metric values for a trial.


    recordMetrics(monitor,step,metricStructure) records the metric values specified by the structure metricStructure.


    Examples


    Use an experiments.Monitor object to track training progress, display information and metric values in the experiment results table, and produce training plots for custom training experiments.

    Before starting the training, specify the names of the information and metric columns of the Experiment Manager results table.

    monitor.Info = ["GradientDecayFactor","SquaredGradientDecayFactor"];
    monitor.Metrics = ["TrainingLoss","ValidationLoss"];

    Specify the horizontal axis label for the training plot. Group the training and validation loss in the same subplot.

    monitor.XLabel = "Iteration";
    groupSubPlot(monitor,"Loss",["TrainingLoss","ValidationLoss"]);

    Specify a logarithmic scale for the loss. You can also switch the y-axis scale by clicking the log scale button in the axes toolbar.

    yscale(monitor,"Loss","log")

    Update the values of the gradient decay factor and the squared gradient decay factor for the trial in the results table.

    updateInfo(monitor, ...
        GradientDecayFactor=gradientDecayFactor, ...
        SquaredGradientDecayFactor=squaredGradientDecayFactor);

    After each iteration of the custom training loop, record the value of training and validation loss for the trial in the results table and the training plot.

    recordMetrics(monitor,iteration, ...
        TrainingLoss=trainingLoss, ...
        ValidationLoss=validationLoss);

    Update the training progress for the trial based on the fraction of iterations completed.

    monitor.Progress = 100 * (iteration/numIterations);

    Use a structure to record metric values in the results table and the training plot.

    structure.TrainingLoss = trainingLoss;
    structure.ValidationLoss = validationLoss;
    recordMetrics(monitor,iteration,structure);

    Input Arguments


    monitor — Experiment monitor
    experiments.Monitor object

    Experiment monitor for the trial, specified as an experiments.Monitor object. When you run a custom training experiment, Experiment Manager passes this object as the second input argument of the training function.
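    A custom training function that receives this monitor might look like the following sketch. The function name, the hyperparameter structure params, the iteration count, and the placeholder loss computation are illustrative assumptions, not part of this reference.

    ```matlab
    % Sketch of a custom training function for Experiment Manager.
    % Experiment Manager passes the hyperparameter structure as the
    % first input and the experiments.Monitor object as the second.
    function output = trainingFunction(params,monitor)
        % Declare metrics and axis label before the loop.
        monitor.Metrics = "TrainingLoss";
        monitor.XLabel = "Iteration";

        numIterations = 100;                 % illustrative value
        for iteration = 1:numIterations
            trainingLoss = rand;             % placeholder computation
            recordMetrics(monitor,iteration,TrainingLoss=trainingLoss);
            monitor.Progress = 100*(iteration/numIterations);
        end
        output = [];
    end
    ```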

    step — Custom training loop step
    numeric scalar | dlarray object

    Custom training loop step, such as the iteration or epoch number, specified as a numeric scalar or dlarray object. Experiment Manager uses this value as the x-coordinate in the training plot.

    metricName — Metric name
    string | character vector

    Metric name, specified as a string or character vector. This name must be an element of the Metrics property of the experiments.Monitor object monitor.

    Data Types: char | string

    metricValue — Metric value
    numeric scalar | dlarray object

    Metric value, specified as a numeric scalar or dlarray object. Experiment Manager uses this value as the y-coordinate in the training plot.

    metricStructure — Metric names and values
    structure

    Metric names and values, specified as a structure. The field names must be elements of the Metrics property of the experiments.Monitor object monitor and can appear in any order in the structure.

    Example: struct(TrainingLoss=trainingLoss,ValidationLoss=validationLoss)

    Data Types: struct

    Tips

    • Information and metric columns both display values in the results table for your experiment. In addition, the training plot shows a record of the metric values. Use information columns for text and for numeric values that you want to display in the results table but not in the training plot.

    • Use the groupSubPlot function to define your training subplots before calling the recordMetrics function.
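    The ordering in the second tip can be sketched as follows; the metric names and loop variables are illustrative placeholders.

    ```matlab
    % Declare the metrics and group them into one subplot BEFORE the
    % training loop starts ...
    monitor.Metrics = ["TrainingLoss","ValidationLoss"];
    groupSubPlot(monitor,"Loss",["TrainingLoss","ValidationLoss"]);

    % ... then record values inside the loop. Calling groupSubPlot
    % first ensures both metrics share the same "Loss" axes.
    for iteration = 1:numIterations
        recordMetrics(monitor,iteration, ...
            TrainingLoss=trainingLoss, ...
            ValidationLoss=validationLoss);
    end
    ```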

    Version History

    Introduced in R2021a