Neural Network Questions
I am REALLY struggling to learn about neural networks. Here are some questions that seem to have no simple answer: What is the dot product? What does sigmoid do? What is a node? What is error? What is bias? What is logistic regression? How do you know which activation function to use? It has been a very long time with no progress. This is so stupidly hard for me to get, everyone seems to understand it but me, and I don't know what to do.
15 Replies
What you should be looking for is a basic statistics course, not Python.
The problem here seems to be that you keep looking at the complicated stuff and never find the basic stuff. You could find the answers to some of these questions just by googling the exact question as phrased in your post. For example, google "what is the dot product" and it will explain what the dot product is and how matrices are multiplied with it. Matrices are used to manipulate large amounts of data at once by stuffing it all into one large array-like structure and performing operations such as the dot product on it.
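Here's a rough sketch in plain Python of what the dot product of two lists of numbers boils down to (all the numbers are made up for illustration):

inputs = [0.5, -1.0, 2.0]    # hypothetical input values
weights = [0.8, 0.3, -0.5]   # hypothetical weights, one per input

# dot product: multiply each pair together, then add everything up
dot = sum(i * w for i, w in zip(inputs, weights))
print(dot)  # 0.5*0.8 + (-1.0)*0.3 + 2.0*(-0.5), which is about -0.9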
The sigmoid function, again, is something you can google. Assuming x is the input variable: for values of x near 0 it returns something close to 0.5, for large positive values of x it returns values close to 1, and for large negative values of x it returns values close to 0. In other words, it squashes the range from negative infinity to positive infinity down to the range from 0 to 1.
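As a minimal sketch (plain Python, just the standard math module), the sigmoid function itself is one line:

import math

def sigmoid(x):
    # squashes any real number into the range (0, 1)
    return 1 / (1 + math.exp(-x))

print(sigmoid(-5))  # roughly 0.007, close to 0
print(sigmoid(0))   # exactly 0.5
print(sigmoid(5))   # roughly 0.993, close to 1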
[continued]
A node is a neuron in a neural network.
Error is how far a neural network deviates from the correct answer when answering a question.
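As a tiny made-up example, if the correct answer is 1.0 and the network outputs 0.8, one common way to measure the error is the squared difference:

target = 1.0                    # the correct answer (made up)
output = 0.8                    # what the network produced (made up)
error = (target - output) ** 2  # squared error, about 0.04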
Bias is a measure of how biased a node is towards outputting 1 or 0. A node's output is calculated like this: the node has a number of inputs stored as a 1D matrix (basically an array). Each input has a corresponding weight, and these weights are stored as another 1D matrix (another array). Each input is multiplied by its weight and the results are summed up; taking the dot product of the inputs and the weights does all of this in one operation. Then the bias is added to that dot product. Finally the sigmoid function is applied, so the output is a value in the range of 0 to 1 instead of anywhere from negative infinity to positive infinity.
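Putting all of that together, a single node's output can be sketched in plain Python like this (the numbers are made up, and the sigmoid is the one from above):

import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

inputs = [0.5, -1.0, 2.0]    # the node's inputs (made up)
weights = [0.8, 0.3, -0.5]   # one weight per input (made up)
bias = 0.1                   # the node's bias (made up)

# dot product of inputs and weights, plus the bias
z = sum(i * w for i, w in zip(inputs, weights)) + bias

# apply the activation function so the output lands between 0 and 1
output = sigmoid(z)          # sigmoid(-0.8) is roughly 0.31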
[continued #2]
I forgot what logistic regression is,
but it might be related to backpropagation, which is the process of learning for a neural network.
The activation function I've been talking about is the sigmoid function, and it is the most common activation function. There are other activation functions for specific purposes, but I don't know any of them off the top of my head.
For further understanding I recommend the book (technically a website) at www.neuralnetworksanddeeplearning.com, as it introduces the reader to neural networks, assumes no prior knowledge of them, and needs no advanced math (that of course depends on your definition of advanced math; familiarity with matrices is recommended but not required. If you want, think of them as arrays, or sometimes arrays of arrays, aka 2D arrays, and think of the arithmetic operations, including the dot product, as functions you apply to these arrays that return either a number or another array). If you have trouble with something, for example some mathematical symbol you don't recognize, you should google it instead of throwing up your hands and saying you understand nothing. As surprising as it may be, you actually aren't expected to know everything, and it's totally normal to not understand something or to have questions. Just make sure to ask google things like "what is this weird Z symbol in math" (the symbol I'm mentioning here is the capital sigma, the symbol for taking the sum of something, such as all the elements in a matrix). I recommend the website "Math is Fun" for help with all things math related, such as summation.
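For example, the capital sigma just means "add these up"; in Python terms it's nothing scarier than a sum (a quick sketch with a made-up matrix):

matrix = [[1, 2, 3],
          [4, 5, 6]]

# the capital sigma over every element of the matrix is just a sum
total = sum(sum(row) for row in matrix)  # 21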
Jason Stone Thank you so much! I kinda feel guilty now; you took so much time helping me and writing such detailed responses, you didn't have to.
çĆĆŻÄÄșÄĆĆ ÄÞðĂȘĆ I feel you. Understanding ML, and especially neural networks, is pretty tough for me too. I had to learn about matrices, derivatives, etc. from scratch. All of those fancy mathematical signs meant nothing to me. I am not fully there yet, but after taking some online courses it is starting to make sense. As always, I would recommend using different sources, especially online courses, to learn it. The link from Jason Stone also seems nice. It seems a bit outdated, but that won't matter if you just want to learn the basics.
The logistic function is actually the sigmoid function. So I think you could say that logistic regression is what happens in the individual neurons of the network. You probably don't hear that term much because other activation functions like ReLU and tanh are more widely used (that's at least what the MITx course on ML says). While a perceptron does a hard classification (1 or 0), the sigmoid (or logistic regression) returns a likelihood between 0 and 1 (see Koder King's post).
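To make that difference concrete, here's a rough sketch (the weighted sum z is made up):

import math

def perceptron(z):
    # hard classification: the output is exactly 0 or 1
    return 1 if z > 0 else 0

def logistic(z):
    # soft classification: the output is a likelihood between 0 and 1
    return 1 / (1 + math.exp(-z))

z = 0.4                 # some hypothetical weighted sum plus bias
print(perceptron(z))    # 1
print(logistic(z))      # roughly 0.60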
çĆĆŻÄÄșÄĆĆ ÄÞðĂȘĆ don't worry, it was fun! I enjoy helping others. And make sure to check out that link; in my opinion it'll be essential for understanding neural nets, and it was quite an interesting read for me.
Koder King in order to understand this, one needs to know what in the world a logistic function is, or what dependent variables vs. independent variables means. I don't know these things.
Koder King ok. I probably won't though, unless I need to explain it to someone, since I'm not really that interested in learning new things about neural nets at the moment. Instead, I think I'll go play some Terraria now :D
Jason Stone I've actually never seen the sigmoid activation function used. The most common one I've seen is probably ReLU, or Rectified Linear Unit, but I've also seen SELU (Scaled Exponential Linear Unit), softmax, and hyperbolic tangent.
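For reference, rough sketches of a few of those in plain Python (simplified and purely illustrative; SELU is left out because it needs two specific constants):

import math

def relu(x):
    # ReLU: keep positive values, clip negatives to 0
    return max(0.0, x)

def tanh(x):
    # hyperbolic tangent: squashes values into the range (-1, 1)
    return math.tanh(x)

def softmax(xs):
    # softmax: turns a list of scores into probabilities that sum to 1
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]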
çĆĆŻÄÄșÄĆĆ ÄÞðĂȘĆ LOL If you use TensorFlow, you don't really have to know that much about them, especially with Keras, because you just tell it which one to use and it does all the work for you. In Keras, it looks something like this:
tf.keras.layers.Dense(128, activation=tf.nn.relu)
So it's really not THAT bad, especially because the TensorFlow tutorials are pretty helpful.
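A minimal sketch of how that Dense line fits into a whole Keras model (the layer sizes, loss, and optimizer here are just placeholder choices, not anything specific from this thread):

import tensorflow as tf

# a tiny fully connected network: you name the activations, Keras does the rest
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation=tf.nn.relu),
    tf.keras.layers.Dense(10, activation=tf.nn.softmax),
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# model.fit(x_train, y_train, epochs=5)  # then train it on your own data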
I've never tried TensorFlow.js, but from what I read on TensorFlow's website, it seems like most of the training and such should really be done in Python, and TensorFlow.js is mainly for adding pretrained models to your web page/app. I didn't really look into it though, so I could be wrong. I would definitely recommend looking more into the Keras API, because it's extremely simple and quite powerful, too.
Rora well, perhaps sigmoid isn't the most common, but it's the simplest. At the very least it's the one I was introduced to first.
Rora I have absolutely no idea what any of those are. Those terms sound scary just by their names.
Rora If you look at my profile you'll see I spend way too much time using JavaScript. Do you think it's worth settling on TensorFlow.js once I understand neural networks?