```python
n = int(input("user input :"))
numlist = [2]
listnum = 1

def numberOfPrime(n):
    for i in range(3, n):
        if (n % i == 0):
            numlist.append(n)
            listnum += 1
            return numlist
            return listnum
        else:
            return print("Output X because it is not a prime number")

print(numberOfPrime(n))
```
If n is a prime number, I want to add it to numlist; otherwise I'm trying to make the def say it is not. What's the problem?
```python
n = int(input("user input :"))
numlist = [2]
listnum = 1

def numberOfPrime(n):
    global numlist
    global listnum
    for i in range(3, n):
        if (n % i == 0):
            numlist.append(n)
            listnum += 1
            return numlist, listnum
        else:
            return "Output X because it is not a prime number"

print(numberOfPrime(n))
```
Syntactically, this is about as much revision as is needed. It does not mean the algorithm is correct.
If you do not use the keyword global, the listnum inside the function is treated as a local variable, and an error (UnboundLocalError) occurs because it is used in a calculation before it has been initialized.
Interestingly, even if you omit global numlist, numlist is still understood as the global variable.
In numlist.append(n), numlist is only read (as an object that already exists), so when the name is not found in the local scope, Python looks it up in the global scope.
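The lookup rules above can be demonstrated with a short, self-contained snippet (the names `items` and `count` are my own, chosen to mirror `numlist` and `listnum`):

```python
count = 0
items = [2]

def mutate_ok():
    # Only reads the global name 'items' and mutates the object in place,
    # so no 'global' declaration is needed.
    items.append(5)

def rebind_fails():
    # 'count += 1' assigns to 'count', so Python treats 'count' as a local
    # variable for the whole function; reading it before any local
    # assignment raises UnboundLocalError.
    try:
        count += 1
    except UnboundLocalError as e:
        return type(e).__name__

mutate_ok()
print(items)           # [2, 5]
print(rebind_fails())  # UnboundLocalError
```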
If you instead write numlist = numlist + [n] (or numlist += [n]), this results in the same error as listnum += 1, because the assignment rebinds the name and makes it local.
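A minimal sketch of that difference, including the global fix (my own illustration, not part of the original answer):

```python
numlist = [2]

def extend_without_global(n):
    try:
        numlist = numlist + [n]   # assignment makes 'numlist' local here
    except UnboundLocalError as e:
        return type(e).__name__

def extend_with_global(n):
    global numlist                # declare that we rebind the global name
    numlist = numlist + [n]
    return numlist

print(extend_without_global(3))  # UnboundLocalError
print(extend_with_global(3))     # [2, 3]
```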
© 2024 OneMinuteCode. All rights reserved.