I am getting an error when trying to train an RL agent in MATLAB (R2024a)
Views: 28 (last 30 days)
I am trying to run the following example in MATLAB:
openExample('rl/TrainTD3AgentForPMSMControlExample')
It gives me the following error:
Error in 'mcb_pmsm_foc_sim_RL/Current Control/Input Scaling/ Calculate Position and Speed/Speed Measurement': Failed to evaluate mask initialization commands.
out = nestedRunEpisode(policy);
result = run_internal_(this);
result = run_(this);
trainResult = run(trainer);
result = run_(this);
trainingResult = run(tm);
Caused by:
Cannot change property 'Enabled' of 'mcb_pmsm_foc_sim_RL/Current Control/Input Scaling/ Calculate Position and Speed/Speed Measurement' while simulation is running
Comments: 0
Answers (1)
Sreeram
15 Nov 2024 at 6:27
This looks like a bug to me. However, here is a workaround to unblock this:
Replace the "Speed Measurement" block in 'mcb_pmsm_foc_sim_RL/Current Control/Input Scaling/ Calculate Position and Speed' with the "Speed Measurement" block from 'Motor Control Blockset HDL Support/Sensor Decoders'.
Make sure to set all the block parameters of the replacement to exactly the same values as those of the original "Speed Measurement" block before commenting the original out.
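If you prefer to script the swap, here is a minimal MATLAB sketch of the same idea. The library path used in add_block ('mcbhdlsupport/Sensor Decoders/Speed Measurement') and the name of the new block are assumptions, not confirmed paths; check the Library Browser entry mentioned above for the real path, and rewire the block's signals manually afterwards.
% NOTE: block and library paths below are assumptions; verify them in your model.
mdl    = 'mcb_pmsm_foc_sim_RL';
oldBlk = [mdl '/Current Control/Input Scaling/ Calculate Position and Speed/Speed Measurement'];
load_system(mdl);
% Record the dialog parameters of the original block so the replacement
% can be configured identically.
origParams = get_param(oldBlk, 'DialogParameters');
% Comment out the original block instead of deleting it.
set_param(oldBlk, 'Commented', 'on');
% Add the HDL-support variant next to the original (library path assumed).
newBlk = [mdl '/Current Control/Input Scaling/ Calculate Position and Speed/Speed Measurement HDL'];
add_block('mcbhdlsupport/Sensor Decoders/Speed Measurement', newBlk);
% Copy every dialog parameter that both blocks share.
names = fieldnames(origParams);
for k = 1:numel(names)
    try
        set_param(newBlk, names{k}, get_param(oldBlk, names{k}));
    catch
        % Skip parameters that are read-only or absent on the new block.
    end
end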
I hope this helps!
Comments: 0