I'd like to save a lot of images for machine learning, but is there any way to save images on the web at once or efficiently?
I'm using a Mac.
The question is broad, so this answer will be correspondingly general.
Generally speaking, if you know where an image is (its URL), you can save it. So if you can find the URL of each image you want, you can download them one after another.
There are many ways to get those URLs. For example, Flickr has a searchable API, so you can use that. If you want to collect the images displayed on an arbitrary site, there are various ways to extract the URLs of the images you want from its pages.
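As one illustration of "extracting image URLs from a page," here is a minimal sketch using only the Python standard library. The page URL and the HTML snippet are made-up examples; in practice you would fetch the page first (e.g. with urllib.request) and feed its HTML to the parser.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class ImageURLCollector(HTMLParser):
    """Collect the src of every <img> tag, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                # Resolve relative paths like "dog.png" against the page URL
                self.urls.append(urljoin(self.base_url, src))

# Hypothetical page and HTML, for illustration only
html = '<html><body><img src="/photos/cat.jpg"><img src="dog.png"></body></html>'
collector = ImageURLCollector("https://example.com/gallery/")
collector.feed(html)
print(collector.urls)
# → ['https://example.com/photos/cat.jpg', 'https://example.com/gallery/dog.png']
```

For real sites, keep in mind that some load images via JavaScript, in which case the URLs will not appear in the raw HTML, and always check the site's terms of use before bulk-downloading.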
There are also any number of ways to write the saving step in each language: simply call wget or curl, or, in Python for example, call urllib.request.urlretrieve in a loop. Those are the approaches that come to mind.
However, if you want to try machine learning, it is usually best to start with a published dataset, since the images have already been selected, checked, and annotated where necessary. If you are following some introductory text, it will often recommend a dataset. MNIST is a famous one. Google has also published the Open Images dataset (https://research.googleblog.com/2016/09/introducing-open-images-dataset.html), and English Wikipedia has a list of datasets for machine-learning research.
Generally speaking, you will want a book that explains how to implement this and how it behaves.
There is a book on exactly this topic, so I will mention it. I haven't read it myself, so I can't vouch for its contents, and there are other books on the same theme as well.
© 2024 OneMinuteCode. All rights reserved.