
How to use ReLU?

I am trying to make a simple neural network in C++. After a while I noticed that there are different activation functions. I want to use ReLU, but I have no idea how to make the neural network non-linear. In the first layer I have a linear activation function, in the second ReLU, and in the third a linear one again. I only get linear functions (except at 0 and below). What am I doing wrong? Thanks for answering.

22nd Jan 2020, 12:11 PM
ld programs
2 Answers
+ 2
Have you used sigmoid?
29th Jan 2020, 10:57 PM
George Ryan
+ 2
R(z) = max(0, z), that's ReLU. When graphed, ReLU is a straight line on the positive axis and zero for negative inputs. It is not differentiable at 0. For more, go to https://medium.com/@kanchansarkar/relu-not-a-differentiable-function-why-used-in-gradient-based-optimization-7fef3a4cecec
29th Jan 2020, 10:57 PM
George Ryan