I'm a beginner teaching myself deep learning.
Is my understanding of the three learning methods in the title correct?
Suppose there are 1000 training images and 20 epochs.
·Online learning - the weights are updated one image at a time, so there are 1000 updates within one epoch
·Batch learning - all the data is used in a single update, so there is one update within one epoch
·Mini-batch learning - with a batch size of 10, the 1000 training images are divided into 100 sets, so there are 100 updates within one epoch
I also understand that one epoch means one complete pass of training over all the training data.
Thanks in advance.
Strictly speaking, batch, online, and mini-batch refer to when the network parameters (weights) are updated.
Batch: train on all the data, then update once (the order of the data has no effect, but memory usage tends to be large)
Online: update after every single training example (later examples influence the weights more than earlier ones)
Mini-batch: update after every N examples (a common heuristic is N around 10 times the number of classes you want to classify)
The network parameters (weights) start out random and are updated (trained) so as to reduce the loss; the result is output that reflects the characteristics of the training data.
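To make "when the weights are updated" concrete, here is a minimal NumPy sketch on a toy linear least-squares model (the function names `updates_per_epoch` and `sgd_epoch` are mine, not from any library). The only thing that distinguishes batch, online, and mini-batch learning here is the batch size:

```python
import numpy as np

def updates_per_epoch(n_samples, batch_size):
    """Number of weight updates in one epoch.
    batch_size == n_samples -> batch learning (1 update)
    batch_size == 1         -> online learning (n_samples updates)
    otherwise               -> mini-batch learning
    """
    return n_samples // batch_size

def sgd_epoch(w, X, y, batch_size, lr=0.01):
    """One epoch of gradient descent on a linear model y ~ X @ w.
    The weights are updated once per batch, so the update schedule
    (batch / online / mini-batch) is controlled by batch_size alone."""
    n = len(X)
    order = np.random.permutation(n)  # shuffling matters for online/mini-batch
    n_updates = 0
    for start in range(0, n, batch_size):
        idx = order[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(Xb)  # gradient of mean squared error
        w = w - lr * grad                          # one parameter update
        n_updates += 1
    return w, n_updates

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = X @ np.array([1.0, 2.0, 3.0, 4.0, 5.0])
w0 = np.zeros(5)

for bs, name in [(1000, "batch"), (1, "online"), (10, "mini-batch")]:
    _, n_updates = sgd_epoch(w0, X, y, bs)
    print(name, n_updates)  # batch 1, online 1000, mini-batch 100
```

Running it shows 1, 1000, and 100 weight updates per epoch for batch, online, and mini-batch learning respectively, matching the counts above.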
Batch learning: train using all the sample data at once
Online learning: pick one sample at random and train on it
Mini-batch learning: with a mini-batch size of 10, pick 10 samples at random and train on them
Suppose there are 1000 samples in total:
Batch learning: 1 epoch after 1 update
Online learning: 1 epoch after 1000 updates
Mini-batch learning: 1 epoch after 100 updates
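The counts above follow from simple arithmetic, sketched here (variable names are mine for illustration):

```python
n_samples = 1000       # total training samples
minibatch_size = 10    # chosen mini-batch size

# updates per epoch = samples per epoch / samples per update
batch_updates = n_samples // n_samples         # all 1000 samples in one update
online_updates = n_samples // 1                # one sample per update
minibatch_updates = n_samples // minibatch_size  # 10 samples per update

print(batch_updates, online_updates, minibatch_updates)  # 1 1000 100
```

So in all three cases one epoch still sees every sample exactly once; only the number of weight updates differs.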
© 2024 OneMinuteCode. All rights reserved.