3.4: Linear Regression with Gradient Descent - Intelligence and Learning
In this video I continue my Machine Learning series and attempt to explain Linear Regression with Gradient Descent.
My Video explaining the Mathematics of Gradient Descent: • 3.5: Mathematics of Gr...
This video is part of session 3 of my Spring 2017 ITP "Intelligence and Learning" course (github.com/shiffman/NOC-S17-2...)
Support this channel on Patreon: / codingtrain
To buy Coding Train merchandise: www.designbyhumans.com/shop/c...
Send me your questions and coding challenges!: github.com/CodingTrain/Rainbo...
Contact:
Twitter: / shiffman
The Coding Train website: thecodingtrain.com/
Links discussed in this video:
Session 3 of Intelligence and Learning: github.com/shiffman/NOC-S17-2...
Nature of Code: natureofcode.com/
kwichmann's Linear Regression Diagnostics: kwichmann.github.io/ml_sandbo...
Linear Regression on Wikipedia: en.wikipedia.org/wiki/Linear_...
Source Code for all the Video Lessons: github.com/CodingTrain/Rainbo...
p5.js: p5js.org/
Processing: processing.org
For More Coding Challenges: • Coding Challenges
For More Intelligence and Learning: • Intelligence and Learning
Help us caption & translate this video!
amara.org/v/7Yh8/
📄 Code of Conduct: github.com/CodingTrain/Code-o...
Comments: 172
This is the amount of enthusiasm I need from my professor. Keep up the good work, sir!
You are a really great teacher. Watching you, we feel that you rediscover what you already know along with us! I think it is the perfect way to teach people!
Got stuck on gradient descent in the Andrew Ng Coursera course, so as always, I'm back here for more digestible explanations. Love your teaching style!
Excellent. Love the way you present - enthusiastic, excited, but totally at ease.
Hey, I am watching your channel for the first time and I am amazed how good you explain things! I am a teacher myself and I find you very inspiring!
2:35 spoiler for Avengers: Infinity War
Great videos Daniel! Thank you! I started an AI course at college this semester (it's almost ending now), and this helped me to settle what I was studying. Keep it up!
man, this series with both board and coding together is really the best from yt, congrats
Really awesome video! Thank you for making machine learning and math so much fun!!
Dan, I love how you get so excited to explain things... so much to say! 😅 Super cute. Plus so informative. I'm glad I found this channel.
Keep up the good work. Your teaching is the best, especially when it comes to complicated topics.
This is the most intuitive explanation of linear regression. Thank you sir!
It's incredible when you display the error and guess values. My next try is to make a learning rate that changes depending on the digits after the decimal point. This tutorial is awesome!!
How awesome is this explanation, theory + programming is the way to go Coding train
Thank you Dan. Really you made this topic so easy to understand. Keep up the good work.
Thank you for this. I was taking a Coursera course on machine learning and got stuck on week one (incredibly frustrating!!) because half the math instructions didn't make sense. I had no idea it was so simple! I just passed week one. Thank you.
Thank you so much! I've been wanting to go over statistics to start diving into ml and mv you've just made my day!
@TheCodingTrain
7 years ago
I'm so glad to hear, thank you!
Thank you for making these! Very informative!
@TheCodingTrain
7 years ago
You're welcome!
@st101k
7 years ago
I agree ;)
This was a great visual representation of SGD, thank you!
You single handedly made me go into cs. Thank you for your inspiration.
Great videos Dan keep up the good work. The code really helps getting a handle on the theory.
@TheCodingTrain
7 years ago
That's great to hear.
Awesome, cool... What a teaching style! I really love it. You made my day by explaining linear regression with a simple story. Really love you, man!
Shiffman is always nice man. Love you Guru !
Hey Dan, thank you so much for making all these videos (: You're amazing!
This channel is really an amazing place to learn advanced programming algorithms. Thank you for the videos, Mr. Shiffman.
@TheCodingTrain
7 years ago
Thank you!
You're really amazing! Thank you so much. Really enjoyed the way you explain things.
You are really an incredibly awesome teacher, sir!!!!... There are no words to say...
Great videos! You are good at making videos by just being yourself and explaining in the best way possible. :))
Sooo impressed by the white board being magically erased! I watched the live stream and thought it would be a total disaster; well, I'm beyond impressed - some fine editing there! :) Loving the ML series so far Dan.
Dude thank you so much for the intuition! many ppl don't bother going through that
the snap was cool ..... but we saw the truth in livestream lol😁
It worked yay haha. I was waiting for it. I was watching at the time though.
you're the boss. Very good explanation, loved it!
That is an awesome use of DOM man!
Dan is wearing a funky t-shirt! looks good!
You are hilarious man! Best teacher on youtube for machine learning
thank you for showing me how to implement multivariable calculus in programming!
Very interesting !
I must say i like the way you teach . You're a nice man God bless .
you are like my coding guru lol thanks so much mr dan for your help!
All videos by you are rocking
Hi, I love your videos...I think they are amazing! I'm Italian and don't understand many words😕 you are great!
@TheCodingTrain
7 years ago
Thank you! I need to get more language subtitles!
great explanation
Nice tutorial channel!
you are a good man. thank u
Thank you for your awesome and easy to understand explanations! :) But I have a question regarding the code from 18:08 Why can we see the line moving instead of being just in its final position? So far, as I can see it in the code, the drawline() method is called after the gradientDescent() method. What am I missing here?
Hi Dan, I really enjoy your videos. I'm a self-taught programmer and your videos give really good insight into different kinds of algorithms. Maybe nice to know... I'm actually a railtrack (P-Way) engineer and we use, for example, the least squares method quite a lot. Keep up the great work! PS: If you're interested in some actual train datasets (from the Dutch Rail Network), leave a message.
@TheCodingTrain
7 years ago
Oh yes, that could be good!
Very helpful
Wow! Machine learning! You gave an understanding of how these things work and how to write them line by line, without a package like TensorFlow. XD Wow, thank you!
You need separate learning rates for m and b. Then set the learning rate for b higher than the one for m so it would rotate faster, but move up and down slower.
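This suggestion can be sketched as follows (a hypothetical variant, not from the video — the video uses a single learning_rate; the variable names and rates here are illustrative):

```javascript
// Gradient descent step with separate learning rates for the slope (m)
// and the intercept (b). A larger rate for b makes the line shift up
// and down faster than it rotates.
let m = 0;
let b = 0;
const lrM = 0.01; // slower: controls rotation of the line
const lrB = 0.05; // faster: controls vertical shift

function trainPoint(x, y) {
  const guess = m * x + b;
  const error = y - guess;
  m = m + error * x * lrM;
  b = b + error * lrB;
}

// One pass over a tiny normalized dataset
const data = [[0.1, 0.2], [0.5, 0.55], [0.9, 0.85]];
for (const [x, y] of data) trainPoint(x, y);
```

With both rates tuned this way, b responds to the error five times more strongly than m does per step.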
Great videos! :D You are the best! Do you recommend going with "Intelligence and Learning" sessions after p5.js introduction for someone who wants to get into Machine learning?
GREAT Video .. Thanks a Lot
@jasdeepsinghgrover2470
7 years ago
Had a small doubt: shouldn't the change in slope be error / x instead of error * x, as it is rise / run?
Are you going over gradient descent because it's used by the back propagation algorithms for neural networks? Because I can't wait to watch you do stuff with NN's.
@TheCodingTrain
7 years ago
That's right!
I love this video. Great, thanks! Nice logo on your shirt.
wonderful. nobody can teach better than you.
@TheCodingTrain
7 years ago
Thank you so much!
Really superb explanation of Gradient Descent. Is there any book you refer to or would suggest for Machine Learning?
Would you have two separate learning rates for m and b? Seems like weighting the slope change higher could be beneficial.
Hi Dan, great video. I have watched most of your videos, and I would be glad if you could make a video about addEventListener and its advantages and disadvantages over onclick, onblur, onmouseover... Thank you in advance.
So my guess at an explanation of these lines:

m = m + (error * x) * learning_rate;
b = b + (error) * learning_rate;

First line: think about the question "when I change m, how does that affect y?". This is what calculus is used for, more specifically differentiation. The answer to the question is written in math as dy/dm, if our line expression is defined as y = m * x + b. Then dy/dm = D(m * x + b, m) = x. This is why the error should be multiplied by x. For the second line, same thing! How does y change when changing b? dy/db = D(m * x + b, b) = 1. We could multiply error by 1, or leave it out as Shiffman did. What does the D function do? It differentiates the expression with respect to the second parameter passed. To calculate this you can either use a calculator, use a lookup table of rules, or derive the answer yourself following the proof.
@troatie
7 years ago
This isn't quite right, I don't think? Shouldn't you divide by x? Let's say your error was 1, so you want to change y by 1. If you change m by 1, you'll get a change of x out of that; if you change m by 1/x, you'll get the 1 that you want. Or maybe written out:

e1 = y - m1 * x - b1
e2 = y - (m1 + m_change) * x - b1

If you want e2 to be 0, then you get:

0 = y - m1 * x - m_change * x - b1
  = y - m1 * x - b1 - m_change * x
  = e1 - m_change * x

m_change = e1 / x
@Contradel
7 years ago
I'm not sure I'm following you. But if, for one of the datapoints, the error is 1, you want to adjust the parameters (m and b) a small amount (learning_rate), weighted by error, so that for all your datapoints you get closer to a best fit.
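One way to settle the debate in this thread is to check numerically that the gradient of the *squared* error with respect to m is proportional to error * x (not error / x). A small sketch (values are arbitrary, not from the video):

```javascript
// Squared error of the line's prediction at a single point (x, y).
function squaredError(m, b, x, y) {
  const err = y - (m * x + b);
  return err * err;
}

const m = 0.3, b = 0.1, x = 0.8, y = 0.7;
const error = y - (m * x + b); // 0.7 - 0.34 = 0.36

// Analytic gradient: d/dm (y - m*x - b)^2 = -2 * error * x
const analytic = -2 * error * x;

// Finite-difference approximation of the same gradient
const h = 1e-6;
const numeric =
  (squaredError(m + h, b, x, y) - squaredError(m - h, b, x, y)) / (2 * h);
```

The two values agree, which is why the update step moves m in the direction of error * x; the constant factor 2 (and the sign) get absorbed into the learning rate.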
Would it be possible to apply a PID control scheme to the learning rate, so it would accelerate the learning process?
So the steer on the graph would be the vertical line between Yguess and Yactual, as a difference?
Velocity in this example doesn't mean speed, but instead means heading?
Where is the code?? I am not finding it on GitHub.
So the so-called steer is the delta of the weights, i.e., the change of the weights in each iteration/epoch?
Can anyone help me out: in what way should I solve a system of overdetermined nonlinear equations?
Hello, there is a translation of your description and your title into French. I live in France and I can't disable this; how do I do that, please?
That "come back to me" ... hahahahaha
Awesome
Why do you multiply error * x by learning_rate?
what is the best book for machine learning?
Is the desired velocity given? When I already know which direction my target is, why would I choose the other side and steer? Could someone please shed some light on desired velocity.
Hey, nice video. Could you explain why you normalize the values between 0 and 1 and what it does? I tried not normalizing them and I got some really wacky results using gradient descent, even though it worked fine with the Ordinary Least Squares method. Do you know why that happens?
@gonengazit
7 years ago
Julian atlasovich but it didn't work without normalization
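One likely reason for the wacky results: the gradient step for m scales with x, so a fixed learning rate that is stable for inputs in [0, 1] can overshoot and diverge for large raw values (e.g. pixel coordinates in the hundreds). A min-max normalization sketch (a hypothetical helper, not the video's exact code):

```javascript
// Min-max normalization: maps each value into [0, 1] so a single fixed
// learning rate behaves consistently regardless of the input scale.
function normalize(values) {
  const min = Math.min(...values);
  const max = Math.max(...values);
  return values.map(v => (v - min) / (max - min));
}

const xs = normalize([100, 300, 500]); // e.g. raw pixel coordinates
```

After fitting on normalized data, predictions have to be mapped back to the original range before drawing.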
Is what you're describing here effectively a Kalman filter?
Make a video on lasso regression without a library, as you did for linear regression.
Your snap has inspired Thanos :D
Is the cost function in this video mean squared error?
Dear sir, if you give any suggestion to understand the formula "DELTA_m = error * x", I will be very grateful.
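One way to see where DELTA_m = error * x comes from (a sketch; the constant factors are absorbed into the learning rate):

```latex
E = (y - \hat{y})^2, \qquad \hat{y} = m x + b
```

Differentiating the error with respect to the slope:

```latex
\frac{\partial E}{\partial m} = -2\,(y - \hat{y})\,x = -2\,\mathrm{error}\cdot x
```

Gradient descent moves m opposite to the gradient, so with learning rate \(\eta\):

```latex
\Delta m = -\eta\,\frac{\partial E}{\partial m} \;\propto\; \mathrm{error}\cdot x
```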
Which platform is he using?
cool video
Hey Dan! I really like your videos, but sometimes you seem so lonely in that studio. :D Wouldn't be something like a co-op coding challenge awesome?
@TheCodingTrain
7 years ago
Hah, love this idea!
@BinaryReader
7 years ago
great stuff Dan, this stuff is invaluable for anyone starting out in ML. top stuff.
@stefanoslalic2199
6 years ago
can you host me?
Hey! Great video! But... how is it possible that the line is self-adjusting, according to the code?
I think you are awesome 😊😊
I would love to see the snapping of the fingers live :pp
Would you please elaborate the implementation of Gradient Descent Algorithm using vectorization method in python?
@TheCodingTrain
4 years ago
Our Coding Train Discord is a great place to get help with coding questions! discord.gg/hPuGy2g - The Coding Train Team
Sir, I wanted what they seem to have
0:05 Hahahah my complete life in 1 question
I need code for linear regression for n variables in Java.
Brother, do you have any Slack channel or Discord?
The concepts in this are very similar to the perceptron model
What about multiple regression?
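Multiple (multivariable) regression follows the same pattern as the video's update: one weight per input plus a bias, each weight nudged by error times its own input. A sketch in plain JavaScript (illustrative data and names, not from the video):

```javascript
// Gradient descent for y = w[0]*x0 + w[1]*x1 + b (two input variables).
const lr = 0.1;
let w = [0, 0];
let b = 0;

function trainPoint(xs, y) {
  const guess = w[0] * xs[0] + w[1] * xs[1] + b;
  const error = y - guess;
  // Each weight's gradient is proportional to error * its input.
  w = w.map((wi, i) => wi + error * xs[i] * lr);
  b = b + error * lr;
}

// Fit a tiny dataset generated from y = 0.5*x0 + 0.25*x1 + 0.1
const data = [
  [[0, 0], 0.1], [[1, 0], 0.6], [[0, 1], 0.35], [[1, 1], 0.85],
];
for (let epoch = 0; epoch < 2000; epoch++) {
  for (const [xs, y] of data) trainPoint(xs, y);
}
```

With noise-free data like this, the weights converge to the generating coefficients; real data would only approach a best fit.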
Well explained. It would be nice to see the code. Can't find it on GitHub.
@TheCodingTrain
6 years ago
github.com/CodingTrain/website/tree/master/Courses/intelligence_learning/session3 (Need to figure out a way for things to be more findable!)
Great video!! Where can I get the code?
@TheCodingTrain
A year ago
Apologies that it is missing, please file an issue here! github.com/CodingTrain/thecodingtrain.com/issues
Really rookie right now... Gotta progress fast!
Maybe it would be cool if you made an AI for a simple game like noughts and crosses with a minimax algorithm
For a more complete and in depth discussion of Linear Regression with Gradient Descent check out Professor Andrew Ng of Stanford series of machine learning videos: kzread.info/dash/bejne/goSA0dJtfJXLd84.html
Could you please share this gradient descent code with me?
How old are you, and how old were you when you started programming?
Hey, thanks for the awesome video. I don't understand: why not calculate the correct line directly?
@DaSodaPopCop
6 years ago
The reason is that he is not simply writing a program that finds the correct line. He is specifically writing this program in a way that implements and showcases the idea of gradient-based learning. Calculating the line directly would be the most efficient way to write this program, but that's not the point of the video. There will be instances with much higher-dimensional data where iterative learning is much more efficient than a direct solution, such as in a Neural Network.
@DaSodaPopCop
6 years ago
look at 19:14 for his explanation
@blackdedo93
6 years ago
Makes sense, thanks. But can you give examples or a reference on why I would need this learning process?
PID ? As always thx Dan...
You did the snap even before Thanos did =D