For half-precision and single-precision floating-point numbers, how many decimal digits can I express?
The number I want to represent has about 5 decimal digits (999.99).
Is half-precision acceptable, or is single-precision required? I don't know how to decide.
For half-precision, the significand is 10 bits, and since 10 bits can express numbers up to 1024 (2^10), can't five decimal digits be expressed with half-precision? Or is that idea wrong?
For half-precision, thanks to the implicit leading bit (the so-called "hidden bit"), you gain 1 bit and get an effective 11 bits of significand precision. But 11 bits can only represent integers up to 2048, and log10(2048) ≈ 3.3, so the conclusion is about 3 decimal digits. Single precision has a 24-bit effective significand, or about 7 decimal digits, so to keep 5 significant digits like 999.99 you need single precision.
(See the "decimal digits" column in the table of basic formats in the IEEE 754 article on Wikipedia.)
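To check this concretely, here is a minimal sketch assuming NumPy's float16/float32 types (not part of the original answer); the digit counts follow from bits × log10(2):

```python
import math

import numpy as np

# Effective significand bits: 10 stored + 1 implicit for float16,
# 23 stored + 1 implicit for float32.
for name, bits in [("float16", 11), ("float32", 24)]:
    digits = bits * math.log10(2)  # decimal digits of precision
    print(f"{name}: 2**{bits} = {2 ** bits}, ~{digits:.2f} decimal digits")

# Rounding 999.99 shows the difference directly.
x = 999.99
print(np.float16(x))  # 1000.0 -> the 5th significant digit is lost
print(np.float32(x))  # 999.99 -> survives (stored as roughly 999.98999)
```

Near 1000, the spacing between adjacent half-precision values is 0.5, so 999.99 rounds to 1000.0, which is exactly the 3-digit limit described above.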