
An error occurred while simulating the model with the reinforcement learning agent

16 views (last 30 days)
Atriya Biswas on 18 Oct 2019
Edited: Walter Roberson on 31 Jan 2022
I'm using the Reinforcement Learning Toolbox to train the energy-management controller of a hybrid electric vehicle (HEV) model. The RL agent takes 5 (five) continuous state variables as input and outputs 3 continuous action variables, so I have used a DDPG agent, since DDPG can handle both continuous states and actions. My environment comprises the vehicle plant model, a reference drive-speed profile, and the immediate cost calculation.
For reference, I followed the "rlwatertank" example from the RL Toolbox help guide, since a DDPG agent is used in that example too.
Before training my agent, I made sure the whole model (agent and environment) runs without any errors or warnings with the command sim('QLearningEMSdiscreteDelay'). It ran smoothly. But when I try to train the agent through the training options, MATLAB gives the following error:
Error using rl.train.seriesTrain (line 16)
An error occurred while simulating "QLearningEMSdiscreteDelay" with the agent "agent"
Error in rl.train.TrainingManager/train (line 244)
rl.train.seriesTrain(this);
Error in rl.train.TrainingManager/run (line 150)
train(this);
Error in rl.agent.AbstractAgent/train (line 54)
TrainingStatistics = run(trainMgr)
The only differences between my model and the "rlwatertank" example are the following:
  1. My environment has a time series of reference speed, compared to a single water-level value in the "rlwatertank" example
  2. There is no randomization of this reference time series at the start of each episode
  3. The time step of my agent and environment is 0.1 s, compared to 1 s in the "rlwatertank" example
Why can't the agent run the simulation during training when it can be simulated manually?
Any help will be appreciated.
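For context, a minimal sketch of how an environment interface like the one described above might be wired up, following the rlwatertank pattern. The agent block path and the spec names below are assumptions based on the description (5 observations, 3 actions), not the actual model:

```matlab
% Hypothetical sketch of the observation/action interface described above,
% modeled on the rlwatertank example. The agent block path is an assumption.
mdl = 'QLearningEMSdiscreteDelay';
agentBlk = [mdl '/RL Agent'];      % assumed name of the RL Agent block

obsInfo = rlNumericSpec([5 1]);    % five continuous state variables
obsInfo.Name = 'observations';
actInfo = rlNumericSpec([3 1]);    % three continuous action variables
actInfo.Name = 'actions';

% Create the Simulink environment used for both sim() and train()
env = rlSimulinkEnv(mdl, agentBlk, obsInfo, actInfo);
```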
2 Comments
Emmanouil Tzorakoleftherakis on 25 Oct 2019
Can you share the full error message? The script where you set everything up would be helpful too.
Thanks!
Hajar hammouti on 5 Nov 2019
I was wondering if you have solved this error. I get the same error when training my qAgent, and I don't know where it comes from.


Answers (2)

Atriya Biswas on 5 Nov 2019
Edited: Walter Roberson on 31 Jan 2022
Now my agent is able to train the Simulink model, but training stops after only one episode. The trainingOptions in the rlwatertank example are the same as mine, yet my training stops after a single episode. Can anybody help with this?
Here is the code for the agent options and training options, respectively:
% DDPG agent options
agentOpts = rlDDPGAgentOptions(...
    'SampleTime',Ts,...
    'TargetSmoothFactor',1e-3,...
    'DiscountFactor',1.0, ...
    'MiniBatchSize',64, ...
    'ExperienceBufferLength',1e6);
agentOpts.NoiseOptions.Variance = 0.3;
agentOpts.NoiseOptions.VarianceDecayRate = 1e-5;
agentEMS = rlDDPGAgent(actor,critic,agentOpts);

% Training options
maxepisodes = 20;
maxsteps = ceil(Tf/Ts);
trainOpts = rlTrainingOptions(...
    'MaxEpisodes',maxepisodes, ...
    'MaxStepsPerEpisode',maxsteps, ...
    'ScoreAveragingWindowLength',20,...
    'Verbose', false, ...
    'Plots','training-progress',...
    'StopTrainingCriteria','AverageReward',...
    'StopTrainingValue',1000);
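Once the options above are defined, training is launched with train(). A sketch, assuming the environment object env was created with rlSimulinkEnv as in the rlwatertank example:

```matlab
% Assuming 'env' is the Simulink environment object for the model,
% train the DDPG agent with the options defined above.
trainingStats = train(agentEMS, env, trainOpts);
```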
1 Comment
Thoriq Fauzan on 17 Aug 2020
Excuse me, has anyone found the cause of this yet? I have a very similar problem.



Atriya Biswas on 5 Nov 2019
I contacted MATLAB technical support and sent them the whole model so they could reproduce the error and perform a root-cause analysis. They found the source of the unexpected error.
According to MathWorks, "The error is due to a bug in Simulink R2019a Configuration Parameters which do not recognize variables for Simulation start time". In my Simulink model, the variable 'sim_start' was used as the simulation start time, and that was the source of the error. MathWorks suggested using the numerical value 0 as the start time instead of the 'sim_start' variable.
If you are using MATLAB R2019a, it is advised not to use a variable for the simulation start time.
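A sketch of the suggested workaround: it simply replaces the workspace variable with a literal 0 in the model's configuration parameters. The set_param call targets the model-level 'StartTime' parameter:

```matlab
% Workaround sketch: hard-code the simulation start time to 0 instead of
% referencing a workspace variable such as sim_start (R2019a bug).
mdl = 'QLearningEMSdiscreteDelay';
load_system(mdl);
set_param(mdl, 'StartTime', '0');   % previously set to 'sim_start'
save_system(mdl);
```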

Release

R2019a
