Saving a Trained RL Agent after Training
Hi All,
I trained an RL agent and the environment output was acceptable. My plan was to validate the agent in the simulation after training finished, using the following code.
Because I was concerned that rerunning the script to call the 'sim' function would restart training, I manually set the IsDone flag in the simulation to 1 (previously 0 to permit training) and also commented out the call to the 'train' function.
%trainingStats = train(agentSS,env,trainingOpts)
rng(0)
simOptions = rlSimulationOptions('MaxSteps',maxsteps);
experience = sim(env,agentSS,simOptions);
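Rather than commenting the training line in and out, a common pattern from the Reinforcement Learning Toolbox examples is to gate training with a logical flag and save/load the agent around it. This is a sketch only; it assumes the variable names (agentSS, env, trainingOpts) and file name ("initialAgent.mat") from the snippets in this post:

```matlab
% Gate training with a flag so rerunning the script cannot retrain by accident
doTraining = false;   % set to true only when you want to (re)train

if doTraining
    trainingStats = train(agentSS,env,trainingOpts);
    save("initialAgent.mat","agentSS")      % persist the trained agent
else
    load("initialAgent.mat","agentSS")      % restore the trained agent
end

rng(0)
simOptions = rlSimulationOptions('MaxSteps',maxsteps);
experience = sim(env,agentSS,simOptions);
```

With this structure the trained agent survives any number of simulation-only runs, because the script never calls 'train' unless the flag is flipped.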
There was no output from the simulation and no warnings. I then reset the IsDone flag back to 0 and reran the script; now the output was 0 on all scopes.
Did I lose the trained agent data when I set the IsDone flag to 1 after training?
My next step was to try to save the trained agent by adding the following code from the documentation, but still no joy. My thought is that I have overwritten and lost the trained data!
save("initialAgent.mat","agentSS")
load('initialAgent.mat')
rng(0)
simOptions = rlSimulationOptions('MaxSteps',maxsteps);
experience = sim(env,agentSS,simOptions);
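One way to confirm that the save/load round trip actually works is to clear the agent from the workspace before reloading it. This sketch assumes the same file and variable names used above; loading into an output struct makes it explicit which variables come back from disk:

```matlab
save("initialAgent.mat","agentSS")   % write the trained agent to disk
clear agentSS                        % remove it from the workspace

S = load("initialAgent.mat");        % load returns a struct of saved variables
agentSS = S.agentSS;                 % recover the agent explicitly
```

If `S.agentSS` exists after this, the file on disk holds a complete copy of the agent, so rerunning the script cannot destroy it as long as the save is not overwritten with an untrained agent.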
How can I ensure the trained agent is saved automatically via 'rlTrainingOptions' after training has completed, for example when maxepisodes is reached? I don't want to make the same mistake again.
Is this correct?
trainingOpts = rlTrainingOptions(...
'MaxEpisodes',maxepisodes, ...
'MaxStepsPerEpisode',maxsteps, ...
'StopTrainingCriteria','AverageReward',...
'StopTrainingValue',-100,...
'ScoreAveragingWindowLength',100,...
'SaveAgentCriteria',"EpisodeCount",...
'SaveAgentValue',maxepisodes,...
'SaveAgentDirectory',"savedAgents")
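If those options take effect as documented, the toolbox writes MAT files into the SaveAgentDirectory as the criterion is met. A hedged sketch of reloading such a file (the file-name pattern Agent<EpisodeNumber>.mat and the saved variable name saved_agent are taken from the rlTrainingOptions documentation and should be checked against the actual contents of the folder):

```matlab
% Reload an auto-saved agent from the directory given in rlTrainingOptions.
% Assumes a file such as savedAgents/Agent<episode>.mat exists after training.
agentFile = fullfile("savedAgents","Agent" + maxepisodes + ".mat");
S = load(agentFile);
agentSS = S.saved_agent;             % variable name used by the auto-save

rng(0)
simOptions = rlSimulationOptions('MaxSteps',maxsteps);
experience = sim(env,agentSS,simOptions);
```

Inspecting `dir("savedAgents")` after training shows exactly which episodes were saved and what the files are named.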
Thanks
Patrick