Google Colab crashes after running out of RAM
The code is:
all_data = pd.get_dummies(all_data)
all_data.head()
When I run it, the runtime suddenly runs out of RAM and crashes.
Is there any way to avoid the crash, such as raising the RAM limit?
For what it's worth, I am currently subscribed to Colab Pro+.
google-colaboratory
Unlike the free tier, Colab Pro and Pro+ give you access to high-memory runtimes, although the memory is still not unlimited.
Reference: https://colab.research.google.com/notebooks/pro.ipynb
According to that page, to take advantage of the extra memory you select "High-RAM" in the Runtime shape dropdown under Runtime > Change runtime type.
The page above also contains code for checking whether you are on a high-RAM runtime, along with some caveats, so it is worth reading first.
As shown there, you can confirm that you are using a high-RAM runtime simply by running the following cell; the value of ram_gb tells you exactly how much RAM is available.
from psutil import virtual_memory

# Total RAM of the runtime, in gigabytes
ram_gb = virtual_memory().total / 1e9
print('Your runtime has {:.1f} gigabytes of available RAM\n'.format(ram_gb))

if ram_gb < 20:
    print('Not using a high-RAM runtime')
else:
    print('You are using a high-RAM runtime!')
If you confirm that you are on a high-RAM runtime and the figure is over 50, then you are running out of memory even with more than 50 GB available. In that case it is worth questioning whether memory is really the bottleneck, and it would help to add the data and the surrounding code to the question.
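As an aside, a common cause of this kind of blow-up is one-hot encoding a high-cardinality column, since pd.get_dummies creates one new column per distinct category. Below is a minimal sketch, using a hypothetical stand-in for the all_data frame from the question, of two things you could check: how much memory the frame actually occupies, and how much smaller the result becomes when you ask get_dummies for sparse output.

import numpy as np
import pandas as pd

# Hypothetical stand-in for the questioner's all_data: one high-cardinality
# categorical column and one numeric column.
rng = np.random.default_rng(0)
all_data = pd.DataFrame({
    'category': rng.integers(0, 1_000, size=50_000).astype(str),
    'value': rng.normal(size=50_000),
})

# How much memory does the raw frame occupy?
print('raw frame: {:.1f} MB'.format(
    all_data.memory_usage(deep=True).sum() / 1e6))

# Dense one-hot encoding: one column per category, mostly zeros.
dense = pd.get_dummies(all_data)
print('dense dummies: {:.1f} MB'.format(
    dense.memory_usage(deep=True).sum() / 1e6))

# sparse=True stores the indicator columns as SparseArrays, which only
# keep the non-zero entries and can be far smaller.
sparse = pd.get_dummies(all_data, sparse=True)
print('sparse dummies: {:.1f} MB'.format(
    sparse.memory_usage(deep=True).sum() / 1e6))

If the downstream code cannot handle sparse columns, another option is to restrict the encoding to the columns that actually need it via the columns= argument of get_dummies.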