Shannon entropy (information theory)

Saravanan Mani on 3 Jul 2019
Commented: Akira Agata on 4 Jul 2019
I want to calculate the Shannon entropy. X transmits a random binary sequence (e.g. 1000110010) and Y receives it (e.g. 1000100010) with a 2% error probability. Could someone explain how I can calculate the Shannon entropy?
1 Comment
Akira Agata on 4 Jul 2019
Do you mean 'channel capacity' based on the Shannon-Hartley theorem, assuming a 2% BER?
You don't need the received binary sequence Y to calculate the Shannon entropy; it is determined by the probabilities of '0' and '1' in the transmitted binary sequence.
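
Not part of the original comment, but here is a minimal MATLAB sketch of what it describes, assuming the transmitted sequence X is stored as a 0/1 vector named x (the variable names are illustrative). The last three lines only apply if the 2% figure is meant as the bit-error rate of a binary symmetric channel, as the comment suggests.

    x  = [1 0 0 0 1 1 0 0 1 0];                   % transmitted binary sequence from the question
    p1 = mean(x == 1);                            % estimated probability of '1'
    p0 = 1 - p1;                                  % estimated probability of '0'
    p  = [p0 p1];
    p  = p(p > 0);                                % avoid 0*log2(0) when a symbol never occurs
    H  = -sum(p .* log2(p))                       % Shannon entropy of X in bits per symbol

    ber = 0.02;                                   % assumed bit-error rate (crossover probability)
    Hb  = -ber*log2(ber) - (1-ber)*log2(1-ber);   % binary entropy of the crossover probability
    C   = 1 - Hb                                  % BSC capacity, about 0.86 bits per channel use

For the example sequence above, p0 = 0.6 and p1 = 0.4, so H is about 0.97 bits per symbol.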


Answers (0)
