I managed to understand the concepts of information content and entropy, but I don't understand the formula in the source coding theorem.
Does it mean I can think of the average code length L as equal to the entropy H?
I am referring to this site:
http://sun.ac.jp/prof/hnagano/houkoku/h24information-04.html
information-theory
For a single source S, interpreting the entropy H(S) as the "shortest" achievable average code length L is essentially correct, with one caveat: the source coding theorem gives the inequality

H(S) ≤ L < H(S) + 1

for an optimal (e.g. Huffman) code. Entropy is a lower bound that no uniquely decodable code can beat; L equals H(S) exactly only when every symbol probability is a negative power of 2. Otherwise, L can still be brought arbitrarily close to H(S) by encoding blocks of n symbols at a time, since the per-symbol overhead then shrinks to less than 1/n.
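This is easy to check numerically. The sketch below (not from the linked page; the probability values are my own example) builds a Huffman code for a small source, computes its average code length L, and compares it with the entropy H(S) to confirm H(S) ≤ L < H(S) + 1:

```python
import heapq
import math

def entropy(probs):
    """H(S) = -sum p * log2(p) over the symbol probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Return the Huffman codeword length of each symbol.

    Each heap entry is (probability, unique id, list of symbol indices
    in that subtree); merging two subtrees adds 1 to the depth (code
    length) of every symbol they contain.
    """
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    next_id = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next_id, s1 + s2))
        next_id += 1
    return lengths

# Example source: probabilities are NOT powers of 1/2, so L > H(S).
probs = [0.5, 0.25, 0.15, 0.10]
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(f"H(S) = {H:.4f}, L = {L:.4f}")   # H(S) <= L < H(S) + 1

# Dyadic probabilities (all powers of 1/2): here L equals H(S) exactly.
dyadic = [0.5, 0.25, 0.125, 0.125]
H2 = entropy(dyadic)
L2 = sum(p * l for p, l in zip(dyadic, huffman_lengths(dyadic)))
print(f"H(S) = {H2:.4f}, L = {L2:.4f}")  # both 1.75
```

The first source gives L = 1.75 bits against H(S) ≈ 1.743 bits, so the bound holds with a small gap; the second (dyadic) source hits the bound exactly.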