
Need help in making a Markov Probability Matrix

2 views (last 30 days)
Ashwin Chettiar on 28 Aug 2020
Edited: Ashwin Chettiar on 6 Sep 2020
Consider a new particle, located on an 8 × 8 grid. The particle begins in the bottom left-hand corner, which is given the label square 1. The particle moves randomly, either horizontally or vertically. This system has 64 different states, corresponding to the particle being in each of the 64 squares. Use for loops in MATLAB to efficiently create the Markov transition matrix (you are not expected to create a 64 × 64 matrix by hand!).
So far, I have been able to create the state vector and a blank probability matrix. The problem I am having is that I do not understand how to use for loops to generate a Markov matrix that satisfies the constraints I have been given.
My code so far:
% Create a state vector s with 64 different states
s = zeros(1,64);
s(1,1) = 1; % the particle starts in square 1 (the bottom left-hand corner)
% Create a blank 64x64 probability matrix P
P = zeros(64,64);
% Use for loops to create the Markov probability matrix

Accepted Answer

Pratyush Roy on 31 Aug 2020
Assuming random values for the transition probabilities, the following snippet might be helpful for defining a Markov probability matrix using for loops:
% Create a state vector s with 64 different states
s = zeros(1,64);
s(1,1) = 1;
% Create a blank 64x64 probability matrix P
P = zeros(64,64);
% Use for loops to create the Markov probability matrix
for i = 1:64
    for j = 1:64
        P(i,j) = rand; % random number drawn from a uniform distribution on (0,1)
    end
    P(i,:) = P(i,:)/sum(P(i,:)); % normalize row i so that its entries sum to 1
end
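To sanity-check the result, every row of P should sum to 1 (up to floating-point error); for example:

max(abs(sum(P,2) - 1)) % should be very close to 0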
You can also refer to the documentation for defining a Markov probability matrix without using for loops:
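For instance, one loop-free way to build the same kind of random row-stochastic matrix (a rough sketch, relying on implicit expansion, available since R2016b) is:

P = rand(64,64); % 64x64 matrix of uniform random values on (0,1)
P = P./sum(P,2); % divide each row by its row sum so every row sums to 1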
3 Comments
Jeremy Jenkins on 5 Sep 2020
Hi Ashwin, I was wondering how you varied your method to account for the specific probabilities?
Ashwin Chettiar on 6 Sep 2020
Edited: Ashwin Chettiar on 6 Sep 2020
By making a very specific probability matrix using for loops.
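For reference, here is a minimal sketch of that idea. It assumes the squares are numbered 1 to 64 row by row starting from the bottom left-hand corner, that the particle moves one square up, down, left or right with equal probability among the moves available from its current square, and that P(i,j) is the probability of moving from square i to square j:

P = zeros(64,64);
for i = 1:64
    col = mod(i-1,8) + 1;     % column of square i on the 8x8 grid
    row = floor((i-1)/8) + 1; % row of square i (row 1 is the bottom row)
    moves = [];               % squares reachable from square i
    if col > 1, moves(end+1) = i-1; end % move left
    if col < 8, moves(end+1) = i+1; end % move right
    if row > 1, moves(end+1) = i-8; end % move down
    if row < 8, moves(end+1) = i+8; end % move up
    P(i,moves) = 1/numel(moves); % equal probability for each available move
end

With this convention, each row of P sums to 1, and s*P^n (with s the row state vector from the question) gives the probability of finding the particle in each square after n moves.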


More Answers (0)

Categories
Find more on Markov Chain Models in Help Center and File Exchange

Release
R2020a
