I am getting this error when I try to train a TD3 RL agent.
Thank you,
Apoorv Pandey

1 Comment

Emmanouil Tzorakoleftherakis on 24 Mar 2023
If you share a reproduction model, it would be easier to debug.


Accepted Answer

Cris LaPierre on 24 Mar 2023


When defining your rlQValueFunction, include the ActionInputNames and ObservationInputNames name-value pairs.
% Observation path layers
obsPath = [featureInputLayer( ...
        prod(obsInfo.Dimension), ...
        Name="netObsInput")
    fullyConnectedLayer(16)
    reluLayer
    fullyConnectedLayer(5,Name="obsout")];

% Action path layers
actPath = [featureInputLayer( ...
        prod(actInfo.Dimension), ...
        Name="netActInput")
    fullyConnectedLayer(16)
    reluLayer
    fullyConnectedLayer(5,Name="actout")];

%<snip>

critic = rlQValueFunction(net, ...
    obsInfo,actInfo, ...
    ObservationInputNames="netObsInput", ...
    ActionInputNames="netActInput")
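For context, the snipped section has to merge the two input paths into the single network `net` that rlQValueFunction receives. A minimal sketch of one common way to do that, assuming the layer names ("obsout", "actout") from the code above and a single Q-value output:

```matlab
% Common path: concatenate the two 5-element path outputs, then map to one Q-value
commonPath = [concatenationLayer(1,2,Name="concat")
    reluLayer
    fullyConnectedLayer(1)];

% Assemble the graph and connect the observation and action paths
net = layerGraph(obsPath);
net = addLayers(net,actPath);
net = addLayers(net,commonPath);
net = connectLayers(net,"obsout","concat/in1");
net = connectLayers(net,"actout","concat/in2");
net = dlnetwork(net);
```

With the network built this way, the input layer names passed as ObservationInputNames and ActionInputNames match "netObsInput" and "netActInput" exactly, which is what the name-value pairs rely on.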

2 Comments

Apoorv Pandey on 27 Mar 2023
I have used the exact same code as mentioned in the link and am still getting the error. Please help.
Cris LaPierre on 27 Mar 2023
Please share your data and your code. You can attach files using the paperclip icon. If it's easier, save your workspace variables to a MAT-file and attach that.


More Answers (0)

