0
Could these lead to a neural network not learning?
[The reason I put the question in a code is that I was unable to post it here; it kept saying "No Internet Connection" even though I had one.] https://code.sololearn.com/cTUxt4liiloE/?ref=app
5 answers
+ 1
In the real world such small differences really don't matter. There is a limited space for tuning a neural network anyway. So, in order not to waste time and resources, you stop training when the accuracy is "business viable", i.e. when the revenue from further training is less than the cost associated with it.
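For illustration, a minimal sketch of that kind of stopping rule, assuming TensorFlow/Keras; the toy data, the tiny model, and the `min_delta`/`patience` values are illustrative assumptions, not anything from this thread:

```python
import numpy as np
import tensorflow as tf

# Toy data: 10 features, binary label (illustrative only).
x = np.random.rand(1000, 10).astype("float32")
y = (x.sum(axis=1) > 5).astype("float32").reshape(-1, 1)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Stop once validation accuracy improves by less than 0.1% for 3 epochs in a row,
# i.e. once further training is no longer "worth it".
stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_accuracy", min_delta=0.001, patience=3, restore_best_weights=True
)
model.fit(x, y, validation_split=0.2, epochs=100, callbacks=[stop], verbose=0)
```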
+ 1
Alexander Thiem A dynamic learning rate is surely something worth considering when fine-tuning the model. In the end, though, it's always a matter of balance between the time/cost spent and the revenue gained :)
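If you want to try a dynamic learning rate without writing the logic yourself, Keras ships a ready-made callback for it; a minimal sketch, assuming a compiled `model` as above and purely illustrative factor/patience values:

```python
import tensorflow as tf

# Halve the learning rate whenever the validation loss has not improved
# for 2 epochs, but never go below 1e-6 (numbers are illustrative).
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(
    monitor="val_loss", factor=0.5, patience=2, min_lr=1e-6
)
# model.fit(x, y, validation_split=0.2, epochs=50, callbacks=[reduce_lr])
```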
0
Kuba Siekierzyński Wouldn't it be ideal to have a learning rate that increases when the loss doesn't change, or changes only slightly, and that, when the loss increases, undoes the last learning step and gets smaller? And once the result is good enough, training stops?
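For illustration, the scheme described above is close to the classic "bold driver" heuristic; here is a minimal sketch on a toy quadratic loss (the loss, the 0.5/1.1 factors and the thresholds are illustrative assumptions, not anything from this thread):

```python
import numpy as np

def loss(w):
    return float(np.sum((w - 3.0) ** 2))  # toy loss, minimum at w = 3

def grad(w):
    return 2.0 * (w - 3.0)

w = np.zeros(2)
lr = 0.01
prev_loss = loss(w)

for step in range(1000):
    w_new = w - lr * grad(w)          # try one gradient step
    new_loss = loss(w_new)

    if new_loss > prev_loss:          # loss got higher:
        lr *= 0.5                     # ...shrink the learning rate
        continue                      # ...and discard the step ("go backwards")
    if prev_loss - new_loss < 1e-6:   # loss barely changed:
        lr *= 1.1                     # ...try a bigger learning rate

    w, prev_loss = w_new, new_loss
    if prev_loss < 1e-8:              # result is good enough: stop
        break

print(f"stopped after {step} steps, lr={lr:.4f}, loss={prev_loss:.2e}")
```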
0
Kuba Siekierzyński Thank you for the good answer.