I built a crawler in Python using Selenium and have been using it.
The implementation is structured like this:

def function_name(url):
    # function contents
url = 'www.siteurl.com'
function_name(url)
I'm running it like this. How can I crawl multiple pages at the same time?
I called function_name(url) with different addresses, but the calls ran one after another. I want to crawl in parallel, not sequentially.
python function concurrent-execution crawling selenium
A very simple approach: if you need to crawl URLs 0 through 100, you can just run several copies of the script at once.
Copy your source code into separate scripts and change only the page range in each one:
browser 1 handles pages 0-25, browser 2 handles 26-50, and so on.
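The same idea can also live in a single script using Python's standard concurrent.futures module, which runs the crawl function in several threads at once. This is only a sketch: function_name below is a stand-in for the question's Selenium function (in the real version each worker should create, use, and quit its own WebDriver, since a driver instance must not be shared between threads), and the URL pattern is hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

def function_name(url):
    # Stand-in for the Selenium crawl in the question.
    # In the real script: create a dedicated webdriver.Chrome() here,
    # load the page, extract the data, then call driver.quit().
    return f"crawled {url}"

def main():
    # Hypothetical URL list: pages 0 to 100, as in the answer above.
    urls = [f"https://www.siteurl.com/page/{i}" for i in range(101)]

    # Four worker threads crawl the list concurrently instead of
    # sequentially; map() preserves the input order of the results.
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(function_name, urls))
    return results

if __name__ == "__main__":
    print(f"{len(main())} pages crawled")
```

Threads work well here because Selenium crawling is I/O-bound (waiting on page loads); if the per-page processing were CPU-heavy, ProcessPoolExecutor would be the drop-in alternative.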
© 2024 OneMinuteCode. All rights reserved.