How do I get Google Colab to automatically load training data sets?

Asked 2 years ago, Updated 2 years ago, 81 views

For example, this is how I read the data when working locally:

from skimage import io  # assuming skimage's io module, which provides imread

def dataLoadBatch(self, num_samples):
    X = []
    Y = []
    for h in range(0, num_samples):
        # Read each serially numbered image and the first channel of its label map
        I = io.imread("data/images/salMap_{:05d}.jpg".format(h))
        X.append(I)
        labels = io.imread("data/salMap/salMap_{:05d}.jpg".format(h))[:, :, 0]
        Y.append(labels)
    return X, Y

Doing this in Colab seems tedious because I would have to upload the data to Drive first. How can I get Colab to automatically load local data sets whose file names are serially numbered like the above?

python google-cloud google-colaboratory

2022-09-30 18:17

1 Answer

The official Colaboratory sample "External data: Drive, Sheets, and Cloud Storage" is helpful.
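Among other things, that notebook shows how to mount your Drive into the Colab file system so files can be read with ordinary paths. A minimal sketch (the mount point /content/drive is just the conventional choice):

from google.colab import drive

# Mounting prompts for authorization on the first run; afterwards the Drive
# contents appear under the mount point and can be read like local files.
drive.mount('/content/drive')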

If you want to upload files from your local machine without going through Google Drive, you can upload multiple files at once from a file-selection dialog like this:

from google.colab import files
uploaded = files.upload()

The uploaded files are saved in the current working directory of the Colab runtime, so from that point on you can treat them just like local files.

If you want to see a list of the uploaded files, run !ls.
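For example, the serially numbered images from the question could be uploaded and then read back by file name. A minimal sketch, assuming skimage is used for reading as in the question's code:

from google.colab import files
from skimage import io  # assuming skimage, as in the question

# Select the salMap_00000.jpg, salMap_00001.jpg, ... files in the dialog.
uploaded = files.upload()

# files.upload() also writes each file into the current directory,
# so the images can be re-read by their file names.
images = [io.imread(name) for name in sorted(uploaded.keys())]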

However, if you upload files this way, you will find it troublesome to sort them into directories later. For this reason, I personally think it is easier to manage the data set by placing it on Google Drive and accessing it via PyDrive or google-drive-ocamlfuse. The Qiita article "How to load files in Google Colaboratory" is also helpful.
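With PyDrive, the access pattern looks roughly like the following. This is only a sketch: 'FILE_ID' stands for the ID taken from the file's sharing link on Drive, and 'salMap_00000.jpg' is just the example file name from the question.

from google.colab import auth
from oauth2client.client import GoogleCredentials
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

# Authenticate the Colab user and hand the credentials to PyDrive.
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)

# Download a single Drive file (identified by its ID) into the runtime's
# current directory, after which it can be read like a local file.
f = drive.CreateFile({'id': 'FILE_ID'})
f.GetContentFile('salMap_00000.jpg')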


2022-09-30 18:17
