The following script, which has worked well so far, now raises an error:
import mechanize

b = mechanize.Browser()
b.set_handle_robots(False)
b.set_handle_refresh(False)
url = "http://info.finance.yahoo.co.jp/history/?code=9783"
h = b.open(url)
Other URLs (for example, http://www.google.com) work. Why?
The error is as follows:
File "/usr/local/lib/python2.7/dist-packages/mechanize/_mechanize.py", line 203, in open
    return self._mech_open(url, data, timeout=timeout)
File "/usr/local/lib/python2.7/dist-packages/mechanize/_mechanize.py", line 255, in _mech_open
    raise response
mechanize._response.httperror_seek_wrapper: HTTP Error 999: Unable to process request at this time -- error 999
This resolved itself on its own, but since I looked into it, I will write down what I found.
According to "Rate Limits for Yahoo Finance (999 error codes)?" on the YDN Forums, there are reports of 999 errors being returned when the same IP address makes around 400 requests per hour.
Throttling the number of requests per hour should avoid the error.
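As a sketch of that throttling idea: a small helper that enforces a minimum interval between consecutive requests. The 9-second interval is an assumption derived from the reported 400-requests-per-hour limit (3600 s / 400 ≈ 9 s); Yahoo does not document the exact threshold.

```python
import time


class Throttle:
    """Enforce a minimum delay between consecutive requests.

    400 requests/hour is roughly one request every 9 seconds, so
    min_interval=9.0 should stay under the limit reported on the
    YDN Forums (the exact limit is an assumption, not documented).
    """

    def __init__(self, min_interval=9.0):
        self.min_interval = min_interval
        self._last = None  # time of the previous request, or None

    def wait(self):
        """Sleep just long enough to respect min_interval, then record the time."""
        now = time.monotonic()
        if self._last is not None:
            remaining = self.min_interval - (now - self._last)
            if remaining > 0:
                time.sleep(remaining)
        self._last = time.monotonic()


# Hypothetical usage with the mechanize browser from the question:
# throttle = Throttle()
# for code in ["9783", "7203"]:          # stock codes are placeholders
#     throttle.wait()
#     h = b.open("http://info.finance.yahoo.co.jp/history/?code=%s" % code)
```

The first call to `wait()` returns immediately; subsequent calls block only for whatever portion of the interval has not already elapsed.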
© 2024 OneMinuteCode. All rights reserved.