Monte Carlo simulation of a linear regression model with a lagged dependent variable
I have a linear regression model with a lagged dependent variable: y_t = beta_0 + beta_1 * y_{t-1} + u_t. The initial value is y_0 = 2, and I know the true coefficients beta_0 = 2 and beta_1 = 1. How can I perform a Monte Carlo simulation that estimates the bias of the OLS coefficient estimates?
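A minimal sketch of one way to do this in MATLAB, assuming illustrative values for the sample size T, the number of replications R, and the error standard deviation sigma (none of these are given in the question; the true coefficients beta_0 = 2, beta_1 = 1 and the starting value y_0 = 2 are taken from it). Each replication simulates a sample path, runs OLS of y_t on a constant and y_{t-1}, and stores the estimates; the bias is the average estimate minus the true value. Note that with beta_1 = 1 the process is nonstationary (a random walk with drift), so the usual finite-sample results for stationary autoregressions do not apply directly.

rng(0);                          % for reproducibility
beta0 = 2;  beta1 = 1;           % true coefficients (from the question)
y0    = 2;                       % initial value (from the question)
T     = 100;                     % sample size (assumed)
R     = 5000;                    % Monte Carlo replications (assumed)
sigma = 1;                       % std. dev. of u_t (assumed)

estimates = zeros(R, 2);         % columns: [beta0_hat, beta1_hat]
for r = 1:R
    % simulate one sample path of length T
    y = zeros(T, 1);
    y(1) = beta0 + beta1*y0 + sigma*randn;
    for t = 2:T
        y(t) = beta0 + beta1*y(t-1) + sigma*randn;
    end
    % OLS of y_t on a constant and y_{t-1}
    Y = y(2:end);
    X = [ones(T-1, 1), y(1:end-1)];
    estimates(r, :) = (X \ Y)';
end

bias = mean(estimates) - [beta0, beta1];
fprintf('Bias of beta0_hat: %.4f\n', bias(1));
fprintf('Bias of beta1_hat: %.4f\n', bias(2));

To study how the bias behaves, you could repeat the experiment for several values of T (and of beta_1, if you also want stationary cases with |beta_1| < 1) and plot the bias against the sample size.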
Comments: 0
Answers (0)
See Also
Categories
Find more on Linear and Nonlinear Regression in Help Center and File Exchange