0

Could these lead to a Neural Network not learning?

[I put the question in a code because I was unable to post it here; the app kept saying No Internet Connection even though I had one.] https://code.sololearn.com/cTUxt4liiloE/?ref=app

14th Jun 2020, 8:29 AM
Mustafa K.
5 Answers
+ 1
In the real world such small differences really don't matter. There is limited room for tuning a neural network anyway. So, in order not to waste time and resources, you stop the training when the accuracy results are "business viable" - that is, when the revenue from further training would be less than the cost associated with it.
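In practice that usually means early stopping. A minimal sketch with Keras' EarlyStopping callback (the model, data and thresholds below are made-up placeholders, not taken from the linked code):

import numpy as np
import tensorflow as tf

# toy data: 1000 samples, 20 features, binary labels
x = np.random.rand(1000, 20).astype("float32")
y = (x.sum(axis=1) > 10).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(20,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# stop once val_loss has not improved for 3 epochs in a row, and keep the best weights
stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=3, restore_best_weights=True)

model.fit(x, y, validation_split=0.2, epochs=100, callbacks=[stop])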
15th Jun 2020, 10:43 AM
Kuba Siekierzyński
+ 1
Alexander Thiem A dynamic learning rate is certainly worth considering when fine-tuning the model. In the end, though, it's always a matter of balancing the time/cost spent against the revenue gained :)
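For reference, Keras ships a built-in callback that lowers the learning rate whenever the monitored metric stops improving; a rough sketch, again with placeholder model and data rather than the code from the question:

import numpy as np
import tensorflow as tf

x = np.random.rand(500, 10).astype("float32")   # toy inputs
y = (x.mean(axis=1) > 0.5).astype("float32")    # toy binary labels

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# halve the learning rate whenever val_loss has not improved for 2 epochs
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(
    monitor="val_loss", factor=0.5, patience=2, min_lr=1e-6)

model.fit(x, y, validation_split=0.2, epochs=30, callbacks=[reduce_lr])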
15th Jun 2020, 5:47 PM
Kuba Siekierzyński
15th Jun 2020, 10:31 AM
Mustafa K.
0
Kuba Siekierzyński Wouldn't it be ideal to have a learning rate that increases when the loss doesn't change (or changes only slightly), and, when the loss increases, the last learning step is undone and the learning rate is reduced? And once the result is good enough, training stops?
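A minimal sketch of such a scheme - close to the classic "bold driver" heuristic - on a toy one-parameter loss, with plain Python and made-up constants:

def loss(w):
    return (w - 3.0) ** 2            # toy loss, minimum at w = 3

def grad(w):
    return 2.0 * (w - 3.0)           # its gradient

w, lr = 0.0, 0.01                    # initial parameter and learning rate
prev_loss = loss(w)

for step in range(1000):
    w_new = w - lr * grad(w)         # take one gradient step
    new_loss = loss(w_new)
    if new_loss > prev_loss:         # loss went up: discard the step ("go backwards")
        lr *= 0.5                    # and shrink the learning rate
        continue
    if prev_loss - new_loss < 1e-6:  # loss barely changed: grow the rate a bit
        lr *= 1.1
    w, prev_loss = w_new, new_loss
    if prev_loss < 1e-8:             # "good enough" -> stop
        break

print(step, w, lr, prev_loss)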
15th Jun 2020, 4:56 PM
Alexander Thiem
0
Kuba Siekierzyński Thank you for the good answer
15th Jun 2020, 6:00 PM
Alexander Thiem