+ 4
Machine learning with linear regression help
I would like to refine my linear regression algorithm by using the error. I'm curious if anyone knows how to properly feed the error back into the equation to reduce the overall prediction error. Here's my code: https://code.sololearn.com/cAsD4be53sX9/?ref=app
4 Answers
+ 4
I could be wrong, but after looking at your code I think you are doing something different: it seems you are determining a regression line from a set of random points.
When you build a machine learning model, the idea is that you have a fixed set of data (inputs and outputs) and you want to teach your algorithm how to reach those outputs by trial and error. For a regression line this would be something like
y = a*x + b
where x is the input variable, y is the output, and a and b are the parameters (or weights) that your learning algorithm has to guess.
I have never done this myself and I am pretty new to the topic, but that's how I see it...
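That said, the usual trial-and-error method I've read about is gradient descent: you measure the prediction error and nudge a and b a little in the direction that shrinks it, over and over. Here's a minimal sketch in Python (the data points and the learning rate are just made up for illustration, not taken from your code):

# Rough sketch: fitting y = a*x + b with gradient descent.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.2, 5.9, 8.1, 9.8]  # made-up data, roughly y = 2x

a, b = 0.0, 0.0       # initial guesses for the weights
learning_rate = 0.01  # how strongly each error nudges the weights
n = len(xs)

for step in range(1000):
    # Prediction error for each point: (a*x + b) - y
    errors = [(a * x + b) - y for x, y in zip(xs, ys)]
    # Gradients of the mean squared error with respect to a and b
    grad_a = 2.0 / n * sum(e * x for e, x in zip(errors, xs))
    grad_b = 2.0 / n * sum(errors)
    # Feed the error back: move the weights against the gradient
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b

print("a =", a, "b =", b)  # should end up close to a = 2, b = 0

The key part is the last two lines of the loop: the error is "fed back" by subtracting a small multiple of the gradient from each weight.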
+ 3
Have you checked the lessons about machine learning? They have excellent example code for linear regression; it's in Python, but you should still be able to see the concept.
https://www.sololearn.com/learn/744/?ref=app
https://www.sololearn.com/learn/716/?ref=app
+ 2
This code does something a bit different, but maybe you'll find a hint in it:
https://code.sololearn.com/ct7dYXGF9B1X/?ref=app
+ 1
Help needed please