I've managed to understand the amount of information and entropy, but I don't understand the formula in the source coding theorem.
Does it mean I can think of the average code length L as equal to the entropy H?
I am referring to this site
http://sun.ac.jp/prof/hnagano/houkoku/h24information-04.html
For a single source S, I think it is correct to interpret the theorem as saying that the shortest achievable average code length L equals the entropy H(S).
More precisely, the theorem states that H(S) ≤ L < H(S) + 1: the average code length L of any uniquely decodable code is at least H(S), and a code with L less than H(S) + 1 always exists. By encoding longer blocks of source symbols, the per-symbol average length can be brought arbitrarily close to H(S).
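If it helps, here is a minimal Python sketch of that bound (the 4-symbol distribution `probs` is a made-up example, not taken from the linked page): it builds a binary Huffman code, computes H(S) and L, and checks that H(S) ≤ L < H(S) + 1.

```python
import heapq
import math

def entropy(probs):
    """Shannon entropy H(S) in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs.values() if p > 0)

def huffman_code_lengths(probs):
    """Binary Huffman code: return {symbol: code length in bits}."""
    # Heap entries: (probability, unique tiebreaker, [(symbol, depth), ...]).
    heap = [(p, i, [(sym, 0)]) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, leaves1 = heapq.heappop(heap)  # two least probable subtrees
        p2, _, leaves2 = heapq.heappop(heap)
        # Merging pushes every leaf in both subtrees one level deeper.
        merged = [(sym, d + 1) for sym, d in leaves1 + leaves2]
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return dict(heap[0][2])

# Hypothetical 4-symbol source (an assumed example for illustration).
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
lengths = huffman_code_lengths(probs)
H = entropy(probs)
L = sum(probs[s] * lengths[s] for s in probs)  # average code length
print(f"H(S) = {H:.3f} bits, L = {L:.3f} bits")
assert H <= L < H + 1  # the source coding theorem bound
```

Because this particular distribution is dyadic (every probability is a power of 1/2), Huffman coding hits the lower bound exactly, and the script prints L = H(S) = 1.750 bits.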