How can I use RL Agent with PX4 Host Target?

5 views (last 30 days)
Unmanned Aerial and Space Systems
Answered: Ankur Bose on 24 Jan 2023
Hi, I have a question about Pixhawk and the Reinforcement Learning Toolbox. I want to use PX4 Host Target to run my RL training algorithm before deploying it to the real Pixhawk board. But when I start training, MATLAB shows the following warnings. How can I use an RL Agent and PX4 Host Target together?
-----------------------------------------------------------------------------------------------------------------------
Warning: The px4.internal.block.Subscriber System object has private or protected properties, but does not implement both
the saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
Warning: The px4.internal.block.Subscriber System object has private or protected properties, but does not implement both
the saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
Warning: The px4.internal.block.PWM System object has private or protected properties, but does not implement both the
saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
Warning: The px4.internal.block.Subscriber System object has private or protected properties, but does not implement both
the saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
Warning: The px4.internal.block.Publisher System object has private or protected properties, but does not implement both the
saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
-----------------------------------------------------------------------------------------------------------------------
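For context, these warnings refer to the matlab.System save/load interface: a System object with private or protected properties is expected to override saveObjectImpl and loadObjectImpl so that its full state survives save, load, and clone while the object is locked. The PX4 blocks named in the warnings are internal MathWorks classes and cannot be edited by the user, but the pattern the warning asks for looks like this in a user-authored System object (the class and property names here are hypothetical, purely for illustration):

```matlab
classdef MySubscriber < matlab.System
    % Hypothetical System object illustrating the save/load pattern the
    % warning refers to. Not the actual px4.internal.block.Subscriber.

    properties (Access = private)
        Topic = ''     % private state that save/clone must preserve
        MsgCount = 0
    end

    methods (Access = protected)
        function s = saveObjectImpl(obj)
            % Let the base class save public properties and locked state,
            % then add the private properties explicitly.
            s = saveObjectImpl@matlab.System(obj);
            s.Topic = obj.Topic;
            s.MsgCount = obj.MsgCount;
        end

        function loadObjectImpl(obj, s, wasLocked)
            % Restore the private properties, then run the base-class load.
            obj.Topic = s.Topic;
            obj.MsgCount = s.MsgCount;
            loadObjectImpl@matlab.System(obj, s, wasLocked);
        end
    end
end
```

Because the warning only says the copy "may not" be exact, it is generally benign on its own; as noted in the answer below, it is unlikely to be the cause of the RL training failure.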
5 comments
Ankur Bose on 20 May 2022
I don't think these warnings are responsible for the RL issue you are facing. I suggest reaching out to MathWorks Tech Support: https://www.mathworks.com/support/contact_us.html
Unmanned Aerial and Space Systems (edited 20 May 2022)
I have been trying to figure out this problem for three weeks, but I haven't gotten any feedback from the contact email.


Answers (1)

Ankur Bose on 24 Jan 2023
Manually closing this question, as the user has been recommended to reach out to MathWorks Tech Support.
