Load data into experience buffer: DDPG agent

I am using Reinforcement Learning Toolbox version 1.1 with MATLAB R2019b and the DDPG agent to design a controller. Is there a way to load data (state, action, reward, next state) collected from real experiments into the experience buffer before starting training?

Answers (2)

JiaZheng Yan on 31 Mar 2020

1 vote

I found a way to expose the Memory of the experience buffer.
You can open the file "ExperienceBuffer.m", which is in "...\Matlab\toolbox\rl\rl\+rl\+util".
In this file, you can change the property attributes of the variable Memory so that it becomes accessible.
Then set:
agentOpts.SaveExperienceBufferWithAgent = true;
agentOpts.ResetExperienceBufferBeforeTraining = false;
After training, you can access the data in "agent.ExperienceBuffer.Memory".
This also means that you can modify and reuse the training data.
I hope this method works for you : )
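The approach above can be sketched end to end as follows (a hedged sketch: `env`, `trainOpts`, and the `actor`/`critic` representations are assumed to exist already; the option names are from `rlDDPGAgentOptions` in R2019b):

```matlab
% Keep the buffer when the agent is saved, and do not clear it
% when training resumes, so past experiences carry over.
agentOpts = rlDDPGAgentOptions;
agentOpts.SaveExperienceBufferWithAgent = true;
agentOpts.ResetExperienceBufferBeforeTraining = false;

agent = rlDDPGAgent(actor, critic, agentOpts);  % actor/critic assumed created
trainingStats = train(agent, env, trainOpts);

% After training (and after making Memory accessible as described above),
% each cell holds one tuple: {state, action, reward, next state, is_done}.
exp = agent.ExperienceBuffer.Memory{1};
```

Note that `ExperienceBuffer.Memory` is not a documented public property in R2019b, which is why the edit to ExperienceBuffer.m is needed in the first place.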

8 comments

Ao Liu on 3 Jul 2020
Hi, I am using the 2020a toolbox and ran into the same problem. Changing it the way you described does not seem to work. Is there a solution? Thanks!
I tried it again: setting the property attributes of the Memory variable in either of the two places shown makes it visible and callable.
Remember to add the following lines to the code where you set the agent's training options:
agentOpts.SaveExperienceBufferWithAgent = true; % (this line is the key)
agentOpts.ResetExperienceBufferBeforeTraining = false;
After training completes, you should be able to see the Memory element.
The length of Memory may affect how the data is displayed; sampling one element shows the details:
a = agent.ExperienceBuffer.Memory{1} % inspect one element
The fields represent (state, action, reward, next state, is_done).
Save this agent and load it at the next training session, and you can train the agent repeatedly.
Fabian Hart on 21 Jul 2020
Thanks for your answer!
Unfortunately, I have problems writing to MATLAB system files. When I try to do that, the following message appears:
"Error writing ExperienceBuffer.m .... Access denied"
Could you please tell me how you managed that? (Windows 10)
JiaZheng Yan on 23 Jul 2020
Sorry, could you provide a more detailed description or a screenshot of the error? I have never encountered it myself.
(I guess it's a file path problem)
zhou jianhao on 25 Oct 2021
Hi, Jiazheng.
Your method works, thanks a lot!
One thing still bothers me: is there any way to access the memory buffer during training, for example to implement prioritized experience replay?
Hope to hear from you soon.
Thanks!
Regards!
zhou jianhao on 25 Oct 2021
I have to say that, for now, the RL toolbox in MATLAB is easy to use but hard to tune to satisfactory performance, still far behind the Python ecosystem.
Hello, I can use this method to view the data in the buffer, but how can I modify or delete the data in the buffer?
Arman Ali on 1 Aug 2022
Have you found the answer? If yes, please share it.


Priyanshu Mishra on 26 Feb 2020

0 votes

Hi Daksh,
You may find the following link useful.

2 comments

Daksh Shukla on 26 Feb 2020
Hello Priyanshu,
Thanks for your response.
However, the link does not exactly resolve my problem. It talks about running many initial simulations and saving the agent with its experience buffer, but what I would like to do is use data from "real experiments", NOT simulations, and add this data to the experience buffer (replay memory) to kick-start DDPG learning.
From all my reading and attempts to access the experience buffer in MATLAB, it seems the experience buffer object is a hidden property, so I cannot upload data to it directly from an external source.
I would really appreciate it if you could let me know a direct way to upload data to the experience buffer, if there is one.
Any updates on this? @Daksh Shukla?
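For readers on newer releases: from around R2022a, Reinforcement Learning Toolbox documents an `rlReplayMemory` object with an `append` function, which supports exactly this kind of pre-loading without touching hidden properties (a hedged sketch; the struct field names below follow that newer documentation, and `obsInfo`/`actInfo`/the measured signals are assumed to come from your environment and experiments):

```matlab
% Build a standalone replay memory and append one real-world transition.
buffer = rlReplayMemory(obsInfo, actInfo, 1e6);

exp.Observation     = {obs};      % state measured in the real experiment
exp.Action          = {act};      % action that was applied
exp.Reward          = rew;        % observed reward
exp.NextObservation = {nextObs};  % resulting state
exp.IsDone          = 0;          % 1 if the episode terminated here
append(buffer, exp);
```

In those releases a pre-filled buffer can then be assigned to the agent's `ExperienceBuffer` property before calling `train`; check the documentation of your own release, since none of this exists in R2019b.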


Category: Programming
Release: R2019b
Asked: 23 Feb 2020
Last comment: 20 Dec 2023
