- Define your log-linearized regression model as you have described.
- Specify the log-likelihood function, assuming the error is normal, as log(normpdf(ln(y), alpha + Beta*ln(x), sigma)).
- Specify the prior distribution using the 'bayeslm' function.
- Run maximum likelihood estimation to obtain the parameters.
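The steps above can be sketched in MATLAB roughly as follows. This is a minimal illustration, not a definitive implementation: the simulated data, the true parameter values, and the prior settings ('Mu', 'V') are my assumptions, and 'bayeslm'/'estimate' require the Econometrics Toolbox.

```matlab
% Simulated data for illustration: ln(y) = alpha + Beta*ln(x) + e
rng(1);                               % reproducibility
n   = 100;
lnx = randn(n,1);                     % log of the regressor (assumed)
lny = 1 + 0.5*lnx + 0.2*randn(n,1);   % alpha = 1, Beta = 0.5 (assumed)

% Conjugate normal-inverse-gamma prior on [alpha; Beta] and sigma^2;
% one predictor plus an intercept, so Mu is 2-by-1 and V is 2-by-2
PriorMdl = bayeslm(1,'ModelType','conjugate', ...
    'Mu',[0;0],'V',10*eye(2));        % weakly informative prior (assumed)

% Posterior distribution given all observations at once
PosteriorMdl = estimate(PriorMdl,lnx,lny);
```

With a conjugate prior, 'estimate' returns the posterior analytically; for a custom likelihood you would instead use 'bayeslm' with 'ModelType','custom' and supply your own log-likelihood function.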
Estimate Linear regression, then estimate normal learning model and see how parameters update over time
I want to estimate a log-linearized regression (ln(y) = alpha + Beta*ln(x) + e), and then see how particular parameters (alpha, Beta) update over time given observations via a normal Bayesian learning model. I am new to normal learning models, and use MATLAB infrequently.
Do I need to run maximum likelihood on a log-likelihood function and then run 'bayeslm', or do I run 'bayeslm'/'empiricalblm' and then 'estimate' for the posterior?
Additionally, do I set up a log-likelihood, a prior, and then estimate, or just the log-likelihood and then define the functions?
I have read through some of the MathWorks documentation, but would like verification of this process before proceeding. Thank you!
Answers (1)
Balaji
22 September 2023
Hi Joshua
I understand that you want to estimate a log-linearized regression model using a normal Bayesian learning approach in MATLAB. For this, you can follow the steps listed above.
For more information, I suggest you refer to the MathWorks documentation for the 'bayeslm' function.
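To see the parameters update over time, you can feed the observations in sequentially and reuse each posterior as the prior for the next batch; with a conjugate model, 'estimate' supports this pattern directly. Below is a self-contained sketch under assumed values: the simulated data, the prior settings, and the batch size of 10 are all illustrative choices, not part of the original question.

```matlab
% Simulated data: ln(y) = alpha + Beta*ln(x) + e (illustrative values)
rng(1);
n   = 100;
lnx = randn(n,1);
lny = 1 + 0.5*lnx + 0.2*randn(n,1);   % alpha = 1, Beta = 0.5 (assumed)

% Conjugate prior, so each posterior can serve as the next prior
Mdl = bayeslm(1,'ModelType','conjugate','Mu',[0;0],'V',10*eye(2));

% Normal Bayesian learning: update in blocks of 10 observations and
% watch the posterior coefficient means move toward the true values
for t = 1:10:n
    idx = t:min(t+9,n);
    Mdl = estimate(Mdl,lnx(idx),lny(idx),'Display',false);
    fprintf('After %3d obs: alpha = %.3f, Beta = %.3f\n', ...
        idx(end),Mdl.Mu(1),Mdl.Mu(2));
end
```

Because the conjugate posterior is in the same family as the prior, processing the data in blocks like this yields the same final posterior as a single call on the full sample.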
Hope this helps.
Thanks,
Balaji