I managed to understand self-information and entropy, but I don't understand the formula in the source coding theorem.
Does it mean I can think of the average code length L as being equal to the entropy H?
I am referring to this site
http://sun.ac.jp/prof/hnagano/houkoku/h24information-04.html
information-theory
> Do you mean I can think of it as the average code length L = entropy (H)?
For a single source S, it is more accurate to say that the entropy H(S) is the lower limit of the average code length L: no uniquely decodable code can have an average length below H(S), and an optimal code comes within one bit of it. The source coding theorem states

H(S) ≤ L < H(S) + 1

so L = H(S) holds exactly only in special cases (for example, when every symbol probability is a power of 1/2); in general, entropy is the bound that the shortest average code length approaches.
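You can check the inequality numerically. The sketch below (my own illustration, not from the linked page) computes H(S) for an assumed symbol distribution and compares it with the average length L of a binary Huffman code, which is optimal among symbol-by-symbol codes:

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(S) in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Code lengths of a binary Huffman code for the given probabilities."""
    # Heap items: (probability, unique tie-breaker, symbol indices in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        # Merging two subtrees adds one bit to every symbol inside them.
        for s in syms1 + syms2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return lengths

# Example distribution (assumed for illustration)
probs = [0.5, 0.25, 0.15, 0.1]
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(f"H(S) = {H:.3f} bits, L = {L:.3f} bits")
```

For this distribution H(S) ≈ 1.743 bits while L = 1.75 bits, so H(S) ≤ L < H(S) + 1 holds, and L sits just above the entropy rather than equal to it.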
2022-09-29 21:33
Any answers or tips would be appreciated.