Shannon entropy (information theory)
Views: 6 (last 30 days)
I want to calculate the Shannon entropy. X transmits a random binary sequence (e.g. 1000110010) and Y receives it (e.g. 1000100010) with an error probability of 2%. Could someone explain to me how I can calculate the Shannon entropy?
1 Comment
Akira Agata
4 July 2019
Do you mean 'Channel capacity' based on the Shannon-Hartley theorem assuming 2% BER?
You don't need the received binary sequence Y to calculate the Shannon entropy; it is determined by the probabilities of '0' and '1' in the transmitted binary sequence.
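To make the two quantities in this thread concrete, here is a minimal sketch (in Python rather than MATLAB, and not from the original thread) that computes the empirical Shannon entropy of the transmitted sequence from the frequencies of '0' and '1', and separately the capacity of a binary symmetric channel with a 2% crossover probability, which is what the 2% error rate describes. The function names and the example sequence are illustrative assumptions.

```python
import math

def shannon_entropy(bits):
    """Entropy in bits/symbol from the empirical probabilities of '0' and '1'.

    Only the transmitted sequence is needed, as noted in the comment above.
    """
    n = len(bits)
    p1 = bits.count('1') / n
    p0 = 1 - p1
    # Skip zero-probability symbols to avoid log2(0).
    return -sum(p * math.log2(p) for p in (p0, p1) if p > 0)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel: C = 1 - H2(p),
    where H2 is the binary entropy function and p is the bit-error rate."""
    h2 = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1 - h2

x = '1000110010'              # example transmitted sequence from the question
print(shannon_entropy(x))     # entropy of the source, in bits per symbol
print(bsc_capacity(0.02))     # channel capacity at 2% BER, in bits per use
```

With four ones in ten symbols, the source entropy is H = -(0.4 log2 0.4 + 0.6 log2 0.6), a little under 1 bit/symbol; the received sequence Y never enters that calculation, only the channel-capacity one via the 2% error rate.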
Answers (0)