Using Reinforcement Learning Agent with PX4 in ROS/Gazebo for Iris Drone - PID Gain Issue?
30 views (last 30 days)
Hi everyone,
I have trained a reinforcement learning (RL) agent using the UAV Toolbox's multirotor model in MATLAB/Simulink, and the training was successful. The agent can effectively control the multirotor in the simulation environment.
Now, I am trying to deploy the same RL agent to control an Iris drone in ROS/Gazebo with PX4. The configuration of the Iris drone is set to default. However, when I attempt to control the drone using the RL agent, it fails to perform as expected.
I suspect that the issue might be related to the PID settings on the Iris drone in PX4. Do I need to tune the PID gains, or are there other factors that could be affecting the agent's performance in the new environment? Has anyone encountered a similar issue when transitioning from MATLAB to PX4/ROS?
Any guidance on adjusting the PID gains or other relevant tips would be greatly appreciated!
Thanks in advance for your help!
0 Comments

Answers (1)
Kothuri
17 Oct 2024
Hi Gaurav,
I understand that you are facing an issue while deploying the RL agent developed using the UAV Toolbox's multirotor model in MATLAB/Simulink to control an Iris drone in ROS/Gazebo with PX4.
You can follow the below steps:
- The default PID gains on the Iris drone in PX4 may not align with the dynamics learned by the RL agent in Simulink. You may need to tune them manually to better match the expected performance; in PX4 the relevant parameters are, for example, the rate-controller gains (MC_ROLLRATE_P/I/D, MC_PITCHRATE_P/I/D, MC_YAWRATE_P/I/D) and the attitude gains (MC_ROLL_P, MC_PITCH_P, MC_YAW_P), which you can change through QGroundControl or the PX4 shell.
- Adjust the PID gains incrementally, focusing on one axis at a time (roll, pitch, yaw, and throttle), so you can see the impact of each parameter.
- Ensure that the dynamics of the multirotor model in MATLAB/Simulink closely match those of the Iris drone in PX4. You might need to adjust the model parameters to better reflect the real drone's behaviour; a quick check is to fly the same maneuver in both environments and compare the logged responses (see the log-comparison sketch below).
- Consider the impact of sensor noise and communication delays in the ROS/Gazebo environment, which might not be present in the Simulink model.
- Retrain the RL agent with domain randomization techniques to make it more robust to variations in the plant and environment (see the ResetFcn sketch below).
- Ensure that the control loop timing in ROS matches the sample time the RL agent was trained with. Latency or timing mismatches can degrade performance (see the fixed-rate loop sketch below).
- Use Gazebo to simulate various scenarios and validate the RL agent's performance before deploying it on the actual hardware.
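For the dynamics check, one option is to fly the same step or setpoint change on the Iris in Gazebo, read the attitude back from the PX4 ulog, and overlay it with the response of your Simulink model. The sketch below is only illustrative: the file name "iris_flight.ulg", the logged Simulink signal simRoll (assumed to be a timeseries, e.g. from a To Workspace block), and the choice of the roll axis are all assumptions you would replace with your own log, signals, and axes.

% Sketch: compare the roll response logged from PX4/Gazebo with the Simulink model.
ulog = ulogreader("iris_flight.ulg");                      % placeholder log file from the Gazebo flight
att  = readTopicMsgs(ulog, "TopicNames", {'vehicle_attitude'});
tt   = att.TopicMessages{1};                               % timetable of vehicle_attitude messages
t    = seconds(tt.Properties.RowTimes - tt.Properties.RowTimes(1));
eul  = quat2eul(double(tt.q));                             % [yaw pitch roll] in radians

plot(t, rad2deg(eul(:,3)), "DisplayName", "PX4/Gazebo roll"); hold on
plot(simRoll.Time, rad2deg(simRoll.Data), "DisplayName", "Simulink roll");  % placeholder Simulink log
xlabel("Time (s)"); ylabel("Roll (deg)"); legend show
% Large, persistent differences here point to plant-model mismatch that PID
% retuning alone is unlikely to fix.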
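For domain randomization, a common pattern with the Reinforcement Learning Toolbox is to randomize plant parameters in the environment's ResetFcn so each training episode sees slightly different dynamics. In the sketch below, the model name uavTrainingModel, the agent block path, the workspace variables uavMass and uavArmLength, and their nominal values are all placeholders; obsInfo and actInfo are the specifications you already created for training.

% Sketch: randomize plant parameters at the start of every training episode.
env = rlSimulinkEnv("uavTrainingModel", "uavTrainingModel/RL Agent", ...
                    obsInfo, actInfo);
env.ResetFcn = @(in) localRandomizePlant(in);

function in = localRandomizePlant(in)
    % Perturb the nominal values by +/-20% (uniform); widen or tighten the
    % ranges to cover the mismatch you expect between Simulink and the Iris.
    in = setVariable(in, "uavMass",      1.50*(0.8 + 0.4*rand));
    in = setVariable(in, "uavArmLength", 0.25*(0.8 + 0.4*rand));
end

Retraining with perturbations like these usually costs a little nominal performance but tends to transfer much better to the PX4/Gazebo plant.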
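For the loop timing, if you run the agent from MATLAB through the ROS Toolbox, a Rate object keeps the control loop at the sample time the agent was trained with. Everything in this sketch is an assumption about your setup: the 100 Hz rate, the MAVROS topics, and the helper functions buildObservation and fillCommand, which you would write to map between ROS messages and the agent's observation/action vectors. Note that mavros_msgs message types may require custom message generation (rosgenmsg) in MATLAB.

% Sketch: run the trained agent in a fixed-rate loop against MAVROS.
rosinit("localhost")                                % connect to the ROS master running Gazebo/PX4 SITL
odomSub = rossubscriber("/mavros/local_position/odom", "nav_msgs/Odometry");
cmdPub  = rospublisher("/mavros/setpoint_raw/attitude", "mavros_msgs/AttitudeTarget");

r = rosrate(100);                                   % match the agent's training sample time (assumed 100 Hz)
reset(r);
while true
    odomMsg = receive(odomSub, 0.5);                % latest state estimate from MAVROS
    obs     = buildObservation(odomMsg);            % placeholder: ROS message -> observation vector
    act     = getAction(agent, {obs});              % RL Toolbox returns the action in a cell array
    cmdMsg  = rosmessage(cmdPub);
    cmdMsg  = fillCommand(cmdMsg, act{1});          % placeholder: action vector -> setpoint message
    send(cmdPub, cmdMsg);
    waitfor(r);                                     % blocks so the loop runs at a steady rate
end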
You can refer to the MathWorks documentation for the UAV Toolbox, Reinforcement Learning Toolbox, and ROS Toolbox for more information.
0 Comments