Is the error stated below a result of using a different version of MATLAB than the one the code in question was created with?

Views: 8 (last 30 days)
To preface this question: I tried to follow the steps in the video below and use an RL algorithm to simulate a walking robot in MATLAB R2020a.
After downloading the resources from the link in the video's description ( https://www.mathworks.com/matlabcentral/fileexchange/64227-matlab-and-simulink-robotics-arena-walking-robot?s_eid=PSM_15028 ), I tried to run the "createDDPGNetworks" file as he does at 7:07 in his video, and I got the error:
Unrecognized function or variable 'numObs'.
Error in createDDPGNetworks (line 12)
imageInputLayer([numObs 1 1],'Normalization','none','Name', 'observation')
While others have used his model with plenty of success, a few others and I have gotten this same error. I was wondering if this is a result of him using MATLAB R2019a while I am using MATLAB R2020a.

Answers (2)

Yahya Madhi, 27 Sep 2020
Hello Sammy,
Not sure if you have resolved the issue, but if you have not, follow these steps:
  1. Run the script startupWalkingRobot.m
  2. Open the simulink model named walkingRobotRL2D.slx
  3. Run the script robotParametersRL.m
  4. In the script createDDPGNetworks.m edit line 53 to critic = rlQValueRepresentation(criticNetwork,observationInfo,actionInfo,'Observation',{'observation'},'Action',{'action'},criticOptions);
  5. Also in the script named createDDPGNetworks.m, edit line 85 to actor = rlDeterministicActorRepresentation(actorNetwork,observationInfo,actionInfo,'Observation',{'observation'},'Action',{'ActorTanh1'},actorOptions);
  6. Then open the script titled createWalkingAgent2D.m, select the appropriate Speedup Options for your system (lines 6-8), and run that script.
Hope this helps.
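For readability, the line edits in steps 4 and 5 written out as MATLAB with line continuations (these are the same calls given in the steps above; the variable names come from the example's own scripts):

```matlab
% createDDPGNetworks.m, line 53 — critic, using the newer representation API
critic = rlQValueRepresentation(criticNetwork, observationInfo, actionInfo, ...
    'Observation', {'observation'}, 'Action', {'action'}, criticOptions);

% createDDPGNetworks.m, line 85 — actor, using the newer representation API
actor = rlDeterministicActorRepresentation(actorNetwork, observationInfo, actionInfo, ...
    'Observation', {'observation'}, 'Action', {'ActorTanh1'}, actorOptions);
```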

Cam Salzberger, 13 Jul 2020
Hello Sammy,
If you notice the workspace before Sebastian runs the script, it already has many variables defined. The createDDPGNetworks script is making use of some of those variables when setting up its neural networks. If you check out the createWalkingAgent2D (or 3D, I'm not sure which he was using), you can see that numObs is defined there.
-Cam
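In other words, createDDPGNetworks only works once the workspace is populated by the parent script. A minimal sketch of what has to exist beforehand (the numeric values below are placeholders, not the real ones — check createWalkingAgent2D for how they are actually set):

```matlab
% Hypothetical sketch: define the sizes createDDPGNetworks expects before
% calling it. The real values are set in createWalkingAgent2D (or 3D).
numObs = 31;  % placeholder: number of observation signals from the model
numAct = 6;   % placeholder: number of action signals (joint torques)

createDDPGNetworks;  % now imageInputLayer([numObs 1 1], ...) can resolve numObs
```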
  Comments: 2
Sammy Rossberg, 14 Jul 2020
Hello Cam,
Since I asked this question I have defined a few variables and solved a few errors, including the one above.
Unfortunately, now when I try running either createWalkingAgent2D or createWalkingAgent3D, I get errors referring back to createDDPGNetworks. The error I get can be seen below:
Error using rlRepresentation (line 70)
rlRepresentation will be removed in a future release. Unable to automatically convert rlRepresentation to new representation
object. Use the new representation objects rlValueRepresentation, rlQValueRepresentation, rlDeterministicActorRepresentation, or
rlStochasticActorRepresentation instead.
Error in createDDPGNetworks (line 51)
critic = rlRepresentation(criticNetwork,criticOptions, ...
Error in createWalkingAgent2D (line 31)
createDDPGNetworks;
I went through the MATLAB page that explains these actors, and from what I understood I should be able to replace "rlRepresentation" with something like "rlValueRepresentation" while leaving "rlRepresentationOptions" where it appears. However, when I do that, I still get errors saying:
Error using rlValueRepresentation (line 43)
Too many input arguments.
Error in createDDPGNetworks (line 51)
critic = rlValueRepresentation(criticNetwork,criticOptions, ...
Hopefully this can be easily solved and there aren’t more errors to follow. Thank you for your help so far.
- Sammy
Cam Salzberger, 14 Jul 2020
I'm sorry, but I'm not very experienced with the Reinforcement Learning Toolbox. However, if you do a search through the release notes for the removed function, it links to recommendations for how to replace the functionality. Hopefully this will be able to get you started.
-Cam
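For reference, the migration the release notes describe amounts to two changes for this script: a DDPG critic takes both observations and actions, so rlQValueRepresentation (not rlValueRepresentation) is the replacement, and the rlRepresentationOptions object moves to the end of the argument list rather than following the network. A sketch, assuming the variable names already used in createDDPGNetworks:

```matlab
% Old (removed in later releases): options came right after the network
% critic = rlRepresentation(criticNetwork, criticOptions, ...)

% New: pass the environment's info objects, name the network's input layers
% explicitly, and put the options object last.
critic = rlQValueRepresentation(criticNetwork, observationInfo, actionInfo, ...
    'Observation', {'observation'}, 'Action', {'action'}, criticOptions);
```

The "Too many input arguments" error above is consistent with keeping criticOptions in its old position: in the newer call signature the options object is the final argument, after the name-value channel pairs.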
