Using tapped delays when training an artificial neural network for dynamic system modeling
Hello, I'm trying to use an artificial neural network to build a model of this system: ![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1390249/image.png)
From what I understand so far, I should first record the system's response to an arbitrary input and use that data to train my network. After discretization, the second-order difference equation below would describe the dynamic behaviour of the system:
![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1390254/image.png)
Now, what is the proper way to implement these delays in my neural network? Should I simply feed the delayed outputs and inputs of the plant as inputs to the network, or is there an easier way to achieve this?
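Both approaches work. You can build the delayed regressors yourself and train a plain feedforward network, but the Deep Learning Toolbox already provides tapped delay lines through `narxnet`, which is designed exactly for this input-output structure. A minimal sketch, assuming the recorded plant input and output live in vectors `uData` and `yData` (hypothetical names) and that a second-order model, i.e. two input delays and two feedback delays, is wanted:

```matlab
% Sketch only: assumes Deep Learning Toolbox, and that uData/yData are
% column vectors of recorded plant input/output samples.
u = num2cell(uData');   % input sequence as a 1xN cell array
y = num2cell(yData');   % output sequence as a 1xN cell array

% 1:2 input delays and 1:2 feedback delays correspond to a second-order
% difference equation; 10 hidden neurons is an arbitrary starting point.
net = narxnet(1:2, 1:2, 10);

% preparets shifts the sequences and fills the initial delay states,
% so you never assemble the delayed regressors by hand.
[Xs, Xi, Ai, Ts] = preparets(net, u, {}, y);
net = train(net, Xs, Ts, Xi, Ai);

% During training the true past outputs are fed back (open loop, also
% called series-parallel). For multi-step simulation of the trained
% model, close the feedback loop:
netc = closeloop(net);
```

Equivalently, you could construct a feature matrix whose columns are u(k-1), u(k-2), y(k-1), y(k-2) and train an ordinary `fitnet` to predict y(k); `narxnet` just automates that bookkeeping and handles the open-loop/closed-loop conversion for you.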
Answers (0)