import decimal
A=100000000000000000000000
B=1
print(decimal.Decimal(A/B))
Running this code prints 99999999999999991611392. What is causing this error?
In Python 3, the `/` operator always performs floating-point division, unlike Python 2. For the limitations of floating-point arithmetic, see https://docs.python.org/3/tutorial/floatingpoint.html.
In [1]: 10/3
Out[1]: 3.3333333333333335
So the solution is to keep the division in integer arithmetic. Using the `//` operator as shown below performs integer (floor) division, so no precision is lost.
In [1]: import decimal
   ...: A = 100000000000000000000000
   ...: B = 1
   ...: print(decimal.Decimal(A // B))
100000000000000000000000
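When the divisor does not divide evenly, `//` discards the fractional part. A minimal sketch of an alternative, using the standard `decimal` module: convert both integers to `Decimal` *before* dividing, so the quotient is computed in decimal arithmetic (the values `A` and `B` below are chosen for illustration).

```python
import decimal

A = 10**23
B = 3

# Floor division stays in exact integer arithmetic but drops the remainder
print(A // B)

# Converting to Decimal first keeps the fractional part, computed to the
# current context precision (28 significant digits by default)
print(decimal.Decimal(A) / decimal.Decimal(B))
```

Note that the conversion must happen before the division: `decimal.Decimal(A / B)` would already have gone through a float and lost precision.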
© 2024 OneMinuteCode. All rights reserved.