Hello, I'm a student studying Python. I made a stopwatch program that runs in the cmd window. The time is displayed to hundredths of a second, like '02:03:12.23'.
import time

class Timer:
    def __init__(self):
        self._hour = 0
        self._minute = 0
        self._second = 0
        self._hundi_second = 0

    def run(self):
        while self._hour < 3:
            the_time = "{:02}".format(self._hour) + ":" + "{:02}".format(self._minute) \
                + ":" + "{:02}".format(self._second) + "." + "{:02}".format(self._hundi_second)
            # '\r' moves the cursor back so the same line is rewritten;
            # flush=True makes the time actually appear on every tick
            print(the_time + '\r', end="", flush=True)
            self._hundi_second += 1
            if self._hundi_second == 100:
                self._second += 1
                self._hundi_second = 0
            if self._second == 60:
                self._minute += 1
                self._second = 0
            if self._minute == 60:
                self._hour += 1
                self._minute = 0
            time.sleep(0.01)
        print("Ending the program after 3 hours. Thank you.")

if __name__ == "__main__":
    new = Timer()
    new.run()
It ran, which was nice, but the joy was short-lived... I think it runs a little slow. (crying) When I measured it against my phone, it was about 10 seconds slow over 30 seconds. Come to think of it, I think it's because the computation in the loop takes time on top of the sleep. How should I implement it the way I want? Please give me some advice. (^^)
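One way to check that guess is to time a single pass of the loop body. A minimal Python 3 sketch (the measuring code is illustrative, not part of the program above):

import time

start = time.perf_counter()
# mimic one loop body: format the time string, print it, then sleep
the_time = "{:02}:{:02}:{:02}.{:02}".format(0, 0, 0, 0)
print(the_time + '\r', end="", flush=True)
time.sleep(0.01)
print()
print("one pass took {:.5f} s".format(time.perf_counter() - start))

On a typical machine this prints a bit more than 0.01 s, and that extra amount accumulates on every tick.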
How about this one? I'm not great at coding either, so I just roughly put it together... By the way, this is Python 2.x!
At first I also tried it like you said:
print "{:02}:{:02}:{:02}.{:02}".format(self._hour, self._minute, self._second, self._hundi_second)
but it was still a little slow... so I did it like the code below!
import time
import datetime
import os

def cls():
    os.system('cls' if os.name == 'nt' else 'clear')

default = time.time()
while True:
    ts = time.time() - default + 3600*15
    st = datetime.datetime.fromtimestamp(ts).strftime("%H:%M:%S.%f")
    cls()
    print st
I added cls() so it prints the time, erases the screen, and prints it again.
time.time() returns the current epoch time, so I subtracted the starting value to begin counting at 00:00:00.00, but the output started at 09:00:00.00 (fromtimestamp() converts to local time, which is UTC+9 here). So I added 3600*15: since 9 + 15 = 24, the hours wrap around to 00!
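As a side note, the offset is only needed because fromtimestamp() converts to local time. A small alternative sketch (in Python 3 syntax) that formats the elapsed seconds as UTC instead, so no manual offset is required:

import time
import datetime

default = time.time()
ts = time.time() - default
# utcfromtimestamp(0) is 1970-01-01 00:00:00 UTC, so the display
# starts at 00:00:00.00 with no hour correction
st = datetime.datetime.utcfromtimestamp(ts).strftime("%H:%M:%S.%f")
print(st)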
But %f gives six digits of microseconds! Just slice it:
while True:
    ts = time.time() - default + 3600*15
    st = datetime.datetime.fromtimestamp(ts).strftime("%H:%M:%S.%f")
    st = st[0:-4]
    cls()
    print st
If you do it like this, it prints as 00:00:00.00! If you slice off 3 instead, it's 00:00:00.000.
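For reference, a quick illustration of the slicing (the sample value is made up):

st = "00:00:00.123456"   # the shape strftime("%H:%M:%S.%f") produces
print(st[:-4])           # 00:00:00.12  (hundredths)
print(st[:-3])           # 00:00:00.123 (milliseconds)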
It's such an interesting question that I'm writing an answer.
You already know the exact cause of the problem.
I think the answer above will solve it.
It's true that it slowed down because of that. On top of it, time.sleep() itself is not precise, which makes the error bigger. Anyway, this doesn't matter much: depending on your computer, it can run fast or slow. You said it was about 10 seconds slow per 30 seconds, right? Then you can add 10 seconds every 30 seconds. If it runs fast, take time out instead. You could also add 5 seconds every 15 seconds. And please accept some error.
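As a rough sketch of that kind of manual correction, with made-up measurements and a first-order estimate (it won't be exact; as the answer says, some error has to be accepted):

import time

# made-up measurements: suppose the phone showed 40 s of real time
# while the program's display only reached 30 s
measured_real = 40.0
measured_shown = 30.0

# each intended 0.01 s tick really took about 0.01 * real/shown seconds,
# so the estimated overhead per tick is:
overhead = 0.01 * (measured_real / measured_shown - 1.0)
sleep_time = max(0.01 - overhead, 0.0)  # sleep less so a tick lands near 0.01 s

print("corrected sleep per tick: {:.5f} s".format(sleep_time))
# inside the counting loop, time.sleep(sleep_time) would replace time.sleep(0.01)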
What follows is just side talk. Timers and clocks use different error-correction methods.
To adjust a computer's clock, you can take the UTC time from outside and overwrite the local clock.
A timer is different: even computers from the same maker with the same specs drift slightly differently. There's an oscillator in the computer that drives processing speed and timekeeping, and it has its own error. On top of that, every program running in memory adds its own inaccuracy. The error will be smaller on fast, precise computers.
On slow computers, the error will increase. ^^ They say there are computers with nanosecond-level precision; with one of those there would be very little error.
If you use the C language, the error will shrink a little.
In conclusion, a timer has to correct its error. If you want to make it universal, so anyone can use it, you can also let the user set the error before using it. Alternatively, show lower precision, down to 1/10 of a second.
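For the lower-precision idea, the same slicing trick from the answer above works: cutting five characters from the %f part leaves one fractional digit (a Python 3 sketch with a made-up timestamp):

import datetime

st = datetime.datetime.utcfromtimestamp(83.456789).strftime("%H:%M:%S.%f")
print(st[:-5])   # 00:01:23.4, tenths of a second only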
There's a line of code that stops the program for 0.01 seconds. The problem is that the loop that is supposed to count up every 0.01 seconds also stops for 0.01 seconds, so each tick takes 0.01 seconds plus the time spent formatting and printing.
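Building on that diagnosis, one way around it is to stop counting sleeps altogether and recompute the display from the elapsed wall-clock time on every pass; then sleep() only controls how often the screen refreshes, not the accuracy. A minimal Python 3 sketch (not the asker's original code; time.monotonic avoids jumps if the system clock is adjusted):

import time

start = time.monotonic()
while True:
    elapsed = time.monotonic() - start      # true elapsed seconds, immune to loop overhead
    if elapsed >= 3 * 3600:                 # stop after 3 hours, like the original
        break
    hundredths = int(elapsed * 100)
    h, rem = divmod(hundredths, 360000)     # 100 * 60 * 60 hundredths per hour
    m, rem = divmod(rem, 6000)              # 100 * 60 hundredths per minute
    s, cs = divmod(rem, 100)
    print("{:02}:{:02}:{:02}.{:02}".format(h, m, s, cs) + '\r', end="", flush=True)
    time.sleep(0.01)                        # refresh rate only; errors no longer accumulate
print()
print("Ending the program after 3 hours. Thank you.")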