I'm learning a new concept in machine learning. Can you help me figure out which one is the better choice and why?
Which method would you prefer among L2 regularization, dropout, and early stopping to deal with the overfitting issue of a deep learning model?
3 Answers
Well, I don't know much even though I'm in my fourth year, but as far as I understand:
There are two ways to approach an overfit model:
> Reduce overfitting by training the network on more examples.
> Reduce overfitting by changing the complexity of the network (a rough sketch of this second option follows below).
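To illustrate the second point, here is a minimal sketch in PyTorch (the layer sizes are made-up placeholders, not recommendations): reducing the network's capacity can be as simple as using fewer and narrower hidden layers.

```python
import torch.nn as nn

# A larger model that may overfit a small dataset (sizes are illustrative).
big_model = nn.Sequential(
    nn.Linear(100, 512), nn.ReLU(),
    nn.Linear(512, 512), nn.ReLU(),
    nn.Linear(512, 10),
)

# A reduced-capacity version of the same idea: fewer and narrower hidden
# layers, which is one way of "changing the complexity" of the network.
small_model = nn.Sequential(
    nn.Linear(100, 64), nn.ReLU(),
    nn.Linear(64, 10),
)
```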
Without knowing your model, it's hard to give proper advice. But early stopping tends to be something of a universal weapon of choice for handling overfitting. Combinations of methods are not uncommon, though.
This might be helpful: https://machinelearningmastery.com/introduction-to-regularization-to-reduce-overfitting-and-improve-generalization-error/
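If it helps, here is a minimal sketch of early stopping in PyTorch. It assumes `model`, `train_loader`, `val_loader`, and `loss_fn` already exist, and the patience and learning-rate values are arbitrary placeholders.

```python
import copy
import torch

def train_with_early_stopping(model, train_loader, val_loader, loss_fn,
                              max_epochs=100, patience=5, lr=1e-3):
    """Stop training once validation loss hasn't improved for `patience` epochs."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    best_val = float("inf")
    best_state = None
    epochs_without_improvement = 0

    for epoch in range(max_epochs):
        # One pass over the training data.
        model.train()
        for x, y in train_loader:
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()

        # Evaluate on the held-out validation set.
        model.eval()
        val_loss, n_batches = 0.0, 0
        with torch.no_grad():
            for x, y in val_loader:
                val_loss += loss_fn(model(x), y).item()
                n_batches += 1
        val_loss /= max(n_batches, 1)

        if val_loss < best_val:
            best_val = val_loss
            best_state = copy.deepcopy(model.state_dict())
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break  # validation loss stopped improving -- stop early

    if best_state is not None:
        model.load_state_dict(best_state)  # restore the best weights seen
    return model
```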
What's your model?
- L2 regularization (the ridge penalty) shrinks the weights toward zero; in linear models this is ridge regression and it also helps with collinearity.
- Dropout randomly drops neurons during training, which effectively makes the model simpler and discourages co-adaptation.
- Early stopping halts training once validation performance stops improving, because the model typically generalizes best after a limited number of iterations (a combined sketch follows below).
What's your case?
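To make the first two bullets concrete, here is a rough PyTorch sketch (layer sizes, the dropout probability, and the `weight_decay` value are placeholder choices, not recommendations). The optimizer's `weight_decay` argument applies an L2 penalty on the weights, and `nn.Dropout` randomly zeroes activations during training.

```python
import torch
import torch.nn as nn

# Dropout layers randomly zero a fraction of activations during training,
# which discourages co-adaptation of neurons (they are no-ops in eval mode).
model = nn.Sequential(
    nn.Linear(100, 128), nn.ReLU(), nn.Dropout(p=0.5),
    nn.Linear(128, 64), nn.ReLU(), nn.Dropout(p=0.5),
    nn.Linear(64, 10),
)

# weight_decay adds an L2 penalty on the weights (the "ridge" idea),
# shrinking them toward zero and reducing effective model complexity.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
```

Early stopping (the third bullet) is usually layered on top of this at training-loop level, as in the sketch in the previous answer, so in practice the three techniques are often combined rather than chosen exclusively.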