Control the epochs while training a neural network
I am trying to train a backpropagation (BP) neural network with the code below. I want the training to run for 1000 epochs. However, when net.trainParam.goal = 0 is achieved, the training process stops, after far fewer than 1000 epochs.
How can I set the parameters so that the network trains for the full 1000 epochs? I want to plot MSE vs. epoch.
Thanks!
%%%%%%%%%%%%%%%%%%%%
clear all; close all; clc;

% Number of inputs (n), outputs (r) and neurons in hidden layer (m)
n = 1; r = 1; m = 12;

% Number of training values
epochs = 1000;

% Input value range
x_min = -1; x_max = 1;
for k = 1:n
    x_train(k,:) = x_min + (x_max - x_min)*rand(epochs,1);
end

% Desired values for random vector x_train
y_des = 2*x_train.^2 + 1;

net = feedforwardnet(m);
net.trainParam.epochs = 1000;  % maximum number of epochs
net.trainParam.show   = 10;    % show frequency
net.trainParam.goal   = 0;     % objective MSE

net = train(net, x_train, y_des);
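For reference, a minimal sketch of how the MSE-per-epoch plot can be obtained, assuming the standard toolbox behaviour where train returns a training record tr with fields tr.epoch and tr.perf. Relaxing the other stopping criteria (min_grad, max_fail, and the validation split) lets training run closer to the full 1000 epochs, although the algorithm can still stop early for its own reasons (e.g. trainlm's mu limit):

net = feedforwardnet(m);
net.trainParam.epochs   = 1000;  % maximum number of epochs
net.trainParam.goal     = 0;     % performance goal (never reached exactly)
net.trainParam.min_grad = 0;     % do not stop on a small gradient
net.trainParam.max_fail = Inf;   % do not stop on validation failures
net.divideFcn = 'dividetrain';   % use every sample for training (no validation/test split)

[net, tr] = train(net, x_train, y_des);   % tr is the training record

plot(tr.epoch, tr.perf);                  % training MSE per epoch
xlabel('Epoch'); ylabel('MSE'); grid on;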
0 Comments
Accepted Answer
Greg Heath
5 Sep 2012
The purpose of training is to reduce the MSE to a reasonably low value in as few epochs as possible. When training runs long enough, the plot of MSE asymptotically flattens into a horizontal line at MSE = 0.
Therefore, your request makes no sense to me.
In fact, if the training target is standardized (zero-mean/unit-variance rows) via the functions zscore or mapstd, there is no practical reason to reduce the MSE below ~mean(var(transpose(target)))/100.
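As an illustration, a minimal sketch of turning that rule of thumb into a training goal for the example above (y_des and net come from the question's code; MSEgoal is just an illustrative variable name):

MSEgoal = mean(var(y_des')) / 100;   % ~1% of the mean variance of the target rows
net.trainParam.goal = MSEgoal;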
Hope this helps.
Greg
1 Comment
More Answers (0)