The following error occurred when I ran v3.py from deep-learning-models (pretrained models):
Using TensorFlow backend.
Traceback (most recent call last):
  File "acception_v3.py", line 400, in <module>
    model = InceptionV3(include_top=True, weights='imagenet')
  File "v3.py", line 386, in InceptionV3
    model.load_weights(weights_path)
  File "/home/pi/.virtualenvs/cv/lib/python3.5/site-packages/keras/engine/topology.py", line 2646, in load_weights
    raise ImportError('`load_weights` requires h5py.')
ImportError: `load_weights` requires h5py.
I was able to confirm that h5py appears in the pip list, but when I checked it in the Python interactive shell, I found that
>>> import h5py
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/pi/.virtualenvs/cv/lib/python3.5/site-packages/h5py/__init__.py", line 26, in <module>
    from . import _errors
ImportError: libhdf5_serial.so.100: cannot open shared object file: No such file or directory
and it could not be imported.
I reinstalled it with the commands below, but the result was the same.
pip3 uninstall h5py
pip3 install --no-cache-dir h5py
How can I import h5py?
python raspberry-pi tensorflow keras
Based on the error, the libhdf5-serial shared library is missing, so try installing it.
On Ubuntu (or Raspbian, since this is a Raspberry Pi), it can be installed with the following command:
sudo apt-get install libhdf5-serial-dev
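To confirm whether the loader can now see the HDF5 library before retrying the import, here is a minimal diagnostic sketch; it assumes the library's base name is `hdf5_serial` (matching the `libhdf5_serial.so.100` in the traceback), which may differ on other distributions:

```python
import ctypes.util

# find_library("hdf5_serial") searches for libhdf5_serial.so on Linux
# and returns None when the dynamic linker cannot locate it.
lib = ctypes.util.find_library("hdf5_serial")
if lib is None:
    print("libhdf5_serial not found - install libhdf5-serial-dev first")
else:
    print("found:", lib)
```

Once the shared object is present, the existing h5py install should import without a rebuild; if it still fails, reinstalling with `pip3 install --no-cache-dir h5py` forces a fresh build against the newly installed library.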