Transition probability matrix for a Markov chain
How do I handle a transition probability matrix for a Markov chain when the sum of a row is not equal to one?
0 Comments
Accepted Answer
Ameer Hamza
8 Oct 2020
If you just want to make each row sum to one, you can normalize each row by its row sum:
M                       % original transition matrix
M_new = M ./ sum(M, 2)  % divide each row by its row sum
I am not sure if this is the theoretically correct way to solve this problem.
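For reference, a minimal sketch of how this normalization behaves on a made-up 3-state matrix (the values in M below are hypothetical; M ./ sum(M, 2) relies on implicit expansion, available in R2016b and later, so on older releases use bsxfun(@rdivide, M, sum(M, 2))):
% Hypothetical 3-state matrix whose rows do not sum to 1
M = [0.2 0.3 0.4;
     0.5 0.5 0.5;
     0.1 0.1 0.1];

% Divide each row by its row sum (implicit expansion, R2016b or later)
M_new = M ./ sum(M, 2);

% Check: every row of M_new now sums to 1
disp(sum(M_new, 2))

% Note: a row whose sum is 0 would produce NaN and needs separate handling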
4 Comments
More Answers (0)
See Also
Categories
Learn more about Markov Chain Models in Help Center and File Exchange