ExperienceBuffer has 0 Length when I load a saved agent and continue training in reinforcement learning
Hi all,
I'm trying to train a saved agent further. In the training options of this saved agent, SaveExperienceBufferWithAgent is set to true. But when I load the saved_agent and inspect the ExperienceBuffer property, I notice that its Length is 0. I looked for documentation on this property, but there is no information on it. If I stop a training run and directly check the Length property of the agent in the workspace, it has some nonzero value.
My question is: what does this "Length" mean? If it is 0 and I continue training with the saved agent as described in https://de.mathworks.com/matlabcentral/answers/495436-how-to-train-further-a-previously-trained-agent?s_tid=answers_rc1-2_p2_MLT , does training really continue with the saved agent and its saved experience buffer?
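Here is roughly what I am doing, as a minimal sketch (env, trainOpts, and the MAT-file name are placeholders from my setup):

% Load the agent that was saved during the previous training run.
loaded = load('savedAgent.mat');          % placeholder file name
saved_agent = loaded.saved_agent;
disp(saved_agent.ExperienceBuffer.Length) % this prints 0 for me

% Continue training with the loaded agent, as in the linked answer.
trainOpts = rlTrainingOptions('MaxEpisodes', 500);
trainingStats = train(saved_agent, env, trainOpts);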

Yours
Comments: 0
Accepted Answer
Takeshi Takahashi
20 April 2021
A Length of 0 means there isn't any experience in this buffer. I think the experience buffer wasn't saved because of this bug. Please set agent.AgentOptions.SaveExperienceBufferWithAgent = true immediately before saving the agent.
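For example, a minimal sketch of this workflow (agent, env, trainOpts, and the file name are placeholders; ResetExperienceBufferBeforeTraining is assumed to be available on the agent's options, as it is for buffer-based agents such as DQN and DDPG in these releases):

% Set the option right before saving so the buffer is stored with the agent.
agent.AgentOptions.SaveExperienceBufferWithAgent = true;
save('agentWithBuffer.mat', 'agent');

% Later, reload the agent and confirm the buffer was kept.
loaded = load('agentWithBuffer.mat');
agent = loaded.agent;
disp(agent.ExperienceBuffer.Length)   % should now be nonzero

% Keep the stored experiences when training resumes.
agent.AgentOptions.ResetExperienceBufferBeforeTraining = false;
trainingStats = train(agent, env, trainOpts);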
Comments: 2
Dmitriy Ogureckiy
12 January 2023
Can I ask: are the network weights saved when the agent is saved between simulations?
More Answers (0)