For example, when reading data locally:

from skimage import io

def dataLoadBatch(self, num_samples):
    X = []
    Y = []
    for h in range(num_samples):
        # Read each serially numbered image and the first channel of its label map.
        I = io.imread("data/images/salMap_{:05d}.jpg".format(h))
        X.append(I)
        labels = io.imread("data/salMap/salMap_{:05d}.jpg".format(h))[:, :, 0]
        Y.append(labels)
    return X, Y
To do this in Colab, it seems tedious to upload the data to Drive first. How can Colab automatically load a local dataset with serially numbered files like the one above?
Tags: python, google-cloud, google-colaboratory
The official Colaboratory sample "External data: Drive, Sheets, and Cloud Storage" is helpful.
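That sample covers mounting Google Drive directly, which is often the simplest route if the dataset already lives there. A minimal sketch, assuming the conventional /content/drive mount point (the dataset path is illustrative):

from google.colab import drive

# Mounting prompts for authorization once, then exposes Drive as a normal directory.
drive.mount('/content/drive')

# Files on Drive can then be read with ordinary paths, e.g.:
# io.imread('/content/drive/My Drive/data/images/salMap_00000.jpg')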
If you are uploading files from your local machine without going through Google Drive, you can upload multiple files via the file-selection dialog as follows:
from google.colab import files
uploaded = files.upload()
The uploaded files are also saved in the current directory of the Colab runtime, so from that point on they can be handled as if they were local files.
If you want to see a list of the uploaded files, you can run !ls.
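Since the dialog drops everything into the current working directory, the loop from the question only needs its paths flattened. A sketch, assuming the serially numbered files were selected in the dialog (num_samples is a hypothetical count):

from skimage import io

num_samples = 10  # hypothetical: however many files were uploaded
X = []
for h in range(num_samples):
    # files.upload() saved the files flat in the current directory,
    # so the serial-number pattern is read without a subdirectory prefix.
    X.append(io.imread("salMap_{:05d}.jpg".format(h)))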
However, if you upload files this way, you will find it troublesome to sort them into directories afterwards. For this reason, I personally think it is easier to manage the dataset by placing it on Google Drive and accessing it via PyDrive or google-drive-ocamlfuse, as sketched below. Qiita's "How to load files in Google Colaboratory" is also helpful.
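A minimal PyDrive sketch of that Drive-based route, using the standard Colab authentication flow (the file ID is a hypothetical placeholder, and this is not code from the linked article):

from google.colab import auth
from oauth2client.client import GoogleCredentials
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

# Authenticate the Colab user and hand the resulting credentials to PyDrive.
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
gdrive = GoogleDrive(gauth)

# 'FILE_ID' is a placeholder for the Drive ID of one dataset file.
f = gdrive.CreateFile({'id': 'FILE_ID'})
f.GetContentFile('salMap_00000.jpg')  # download it into the local filesystem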