How to simulate a basic Markov chain
Hi,
I'm fairly new to MATLAB. Would anybody be able to show me how to simulate a basic discrete-time Markov chain?
Say, for example, I have a transition matrix with three states, A, B and C. How could I simulate, say, 20 steps starting from state A?
     A   B   C
A   .3  .2  .5
B   .2  .1  .7
C   .1  .5  .4
Any help would be greatly appreciated.
Regards
John
Answers (1)
Doug Hull on 12 October 2012
Edited: Doug Hull on 12 October 2012
Are you looking to do a simple matrix multiply?
v = [1 0 0]
m = [0.3 0.2 0.5; 0.2 0.1 0.7; 0.1 0.5 0.4]
v = v * m
You can also do this in a loop.
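For instance, the same 20-step update can be written as a loop that applies one transition per iteration (a minimal sketch; the variable names just follow the ones above):
m = [0.3 0.2 0.5; 0.2 0.1 0.7; 0.1 0.5 0.4];  % transition matrix
v = [1 0 0];                                  % start in state A
for k = 1:20
    v = v * m;                                % one step of the chain
end
disp(v)                                       % distribution over A, B, C after 20 steps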
If you want to square the matrix element by element:
m = m.^2
but more likely you wish to square the matrix itself:
m = m^2
You can do this for higher powers:
m = m^20
And putting it together:
>> v = v*m^20
v =
0.1652 0.3217 0.5130
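Note that v*m^20 gives the probability distribution over the states after 20 steps, not a random trajectory. If you instead want to simulate one sample path of 20 steps, as the question asks, one common approach (a sketch, not part of the original answer) is to draw each next state from the current row of the transition matrix using cumsum and rand:
m = [0.3 0.2 0.5; 0.2 0.1 0.7; 0.1 0.5 0.4];  % rows sum to 1
states = 'ABC';
path = zeros(1, 21);
path(1) = 1;                                  % start in state A
for k = 2:21
    r = rand;                                 % uniform draw in (0,1)
    path(k) = find(cumsum(m(path(k-1), :)) >= r, 1);  % sample next state
end
disp(states(path))                            % one random 20-step realization, e.g. 'AACBC...'
Each call produces a different realization; averaging many such paths recovers the distribution that v*m^20 computes exactly.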