When I train two neural networks (NN), A and B, with different configurations simultaneously on the same training data (the same batches), B's inference accuracy is much better than A's in the first half of training, but B's rate of improvement then slows and its lead in inference accuracy is gradually erased by A.
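For context, this is roughly the setup I mean (a minimal sketch only, assuming PyTorch, synthetic data, and two small MLPs that differ in hidden width and learning rate; not my actual models):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_mlp(hidden):
    return nn.Sequential(nn.Linear(10, hidden), nn.ReLU(), nn.Linear(hidden, 2))

# Two networks with different configurations (hypothetical sizes / learning rates).
net_a = make_mlp(hidden=64)
net_b = make_mlp(hidden=16)
opt_a = torch.optim.SGD(net_a.parameters(), lr=0.01)
opt_b = torch.optim.SGD(net_b.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# Synthetic classification data, split into train and test.
x = torch.randn(1200, 10)
y = (x[:, 0] + x[:, 1] > 0).long()
x_train, y_train = x[:1000], y[:1000]
x_test, y_test = x[1000:], y[1000:]

def accuracy(net, x, y):
    with torch.no_grad():
        return (net(x).argmax(dim=1) == y).float().mean().item()

for epoch in range(50):
    # Feed the SAME batches to both networks at every step.
    for i in range(0, len(x_train), 100):
        xb, yb = x_train[i:i + 100], y_train[i:i + 100]
        for net, opt in ((net_a, opt_a), (net_b, opt_b)):
            opt.zero_grad()
            loss_fn(net(xb), yb).backward()
            opt.step()
    if epoch % 10 == 0:
        print(epoch, "A:", accuracy(net_a, x_test, y_test),
              "B:", accuracy(net_b, x_test, y_test))
```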
machine-learning deep-learning
I think it is common to judge whether a model is overfitting by looking at its performance on test data.
I can't say for sure, but the questioner's understanding of "overfitting" seems to be off. (Judging from the question, it is the difference in learning rates that appears to affect how training progresses.)
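As a rough illustration of what I mean by judging from test data (a sketch only; the per-epoch accuracy histories are assumed to be whatever you log yourself, and the window/heuristic is arbitrary):

```python
def is_overfitting(train_acc_history, test_acc_history, window=5):
    """Crude heuristic: training accuracy still improving while test accuracy declines."""
    if len(train_acc_history) < 2 * window:
        return False  # not enough epochs logged yet to compare
    recent_train = sum(train_acc_history[-window:]) / window
    earlier_train = sum(train_acc_history[-2 * window:-window]) / window
    recent_test = sum(test_acc_history[-window:]) / window
    earlier_test = sum(test_acc_history[-2 * window:-window]) / window
    return recent_train > earlier_train and recent_test < earlier_test

# Example: pass the per-epoch accuracies you record for one network.
# is_overfitting(train_acc_history=[...], test_acc_history=[...])
```

The point is that overfitting is about the gap between training and test performance of one model over time, which is a separate question from which of two differently configured networks learns faster.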