
Error in the code for Markov chain

1 view (last 30 days)
dpr on 4 Mar 2012
I am trying to simulate a Markov chain with the following code. I keep getting this error:
In an assignment A(I) = B, the number of elements in B and I must be the same.
Error in ==> Markov_Chain at 61
Chain(1) = cur_state;
How can I solve it? Thanks.
function [Chain] = Markvok_Chain( P, initial_state, n )
P = [ 0.85 0.15; 0.05 0.95 ];
initial_state = [0.4 0.6];
n = 20;
sz = size(P);
cur_state = round(initial_state);
% Verify that the input parameters are valid
if (sz(1) ~= sz(2))
    error('Markov_Chain: Probability matrix is not square');
end
num_states = sz(1);
if (cur_state < 1) | (cur_state > num_states)
    error('Markov_Chain: Initial state not defined in P')
end
for i = 1:num_states
    if (sum(P(i,:)) ~= 1)
        error('Markov_Chain: Transition matrix is not valid')
    end
end
% Create the Markov Chain
Chain(1) = cur_state;
for i = 1:n
    cur_state = Rand_Vect(P(cur_state,:), 1);
    Chain(i) = cur_state;
end

Accepted Answer

Razvan on 4 Mar 2012
I guess you want a matrix Chain which, in the end, has some probabilities on each row. You are trying to put a vector cur_state into the single element Chain(1); that is why you get the error. Change that line to Chain(1, 1:2) = cur_state; and do the same two lines below: Chain(i, 1:2) = cur_state;
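To make the size mismatch concrete, here is a small runnable MATLAB sketch (with assumed example values, not the poster's full function or the Rand_Vect helper) showing why Chain(1) = cur_state fails and how the row-indexed assignment suggested above works:

cur_state = round([0.4 0.6]);    % yields the 1-by-2 row vector [0 1]
% Chain(1) = cur_state;          % error: 1 element on the left, 2 on the right
Chain(1, 1:2) = cur_state;       % OK: both sides hold 2 elements
for i = 1:3
    Chain(i, 1:2) = cur_state;   % the same change applied inside the loop
end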

More Answers (0)
