Squared error of regression line | Regression | Probability and Statistics | Khan Academy
Courses on Khan Academy are always 100% free. Start practicing-and saving your progress-now: www.khanacademy.org/math/stat...
Introduction to the idea that one can find a line that minimizes the squared distances to the points
Watch the next lesson: www.khanacademy.org/math/prob...
Missed the previous lesson?
www.khanacademy.org/math/prob...
Probability and statistics on Khan Academy: We dare you to go through a day in which you never consider or use probability. Did you check the weather forecast? Busted! Did you decide to go through the drive through lane vs walk in? Busted again! We are constantly creating hypotheses, making predictions, testing, and analyzing. Our lives are full of probabilities! Statistics is related to probability because much of the data we use when determining probable outcomes comes from our understanding of statistics. In these tutorials, we will cover a range of topics, some which include: independent events, dependent probability, combinatorics, hypothesis testing, descriptive statistics, random variables, probability distributions, regression, and inferential statistics. So buckle up and hop on for a wild ride. We bet you're going to be challenged AND love it!
About Khan Academy: Khan Academy offers practice exercises, instructional videos, and a personalized learning dashboard that empower learners to study at their own pace in and outside of the classroom. We tackle math, science, computer programming, history, art history, economics, and more. Our math missions guide learners from kindergarten to calculus using state-of-the-art, adaptive technology that identifies strengths and learning gaps. We've also partnered with institutions like NASA, The Museum of Modern Art, The California Academy of Sciences, and MIT to offer specialized content.
For free. For everyone. Forever. #YouCanLearnAnything
Subscribe to KhanAcademy’s Probability and Statistics channel:
/ @khanacademyprobabilit...
Subscribe to KhanAcademy: kzread.info_...
Comments: 41
Mind blown. I wish I had seen this at the beginning of the semester!
This video series is completely amazing, thank you!
Omg, I wish I had you as a professor!
wonderful...thanks for the clear explanation!
Thanks so much for making these kinds of videos!!!
Great video! Thank you!
Just out of curiosity, Mr. Khan, do you yourself review how to do these on your own before you show us? Like, do you have to get ready, or does it just come off the top of your head? Be honest now..... ;) math is magic!
@Inquiett Agree with you that squaring always gives a positive value, so the sum will not be zero. Using the absolute value is also possible, and it's actually used as well. However, another benefit of squaring is that it emphasizes larger differences (which is both good and bad).
@ahmadhuseynli2073
6 years ago
Ahmed Omara thanks for the comment. I was gonna post it as a question, and I already saw your comment as an answer :)
@TheMustufa123
4 years ago
Another thing is that the derivative of the absolute value function is not continuous (it has a kink at zero), which is why we can't use it when minimizing the error.
Thank You
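The two points in this thread can be checked numerically (a hypothetical example, not from the video): squaring keeps every term positive, and it weights large misses far more heavily than the absolute value does.

```python
# Hypothetical errors of a candidate line at four data points.
errors = [0.5, -0.5, 3.0, -3.0]

raw = sum(errors)                       # 0.0 -- signs cancel out
absolute = sum(abs(e) for e in errors)  # 7.0
squared = sum(e * e for e in errors)    # 18.5 -- the two big misses
                                        # dominate: 9 + 9 vs 0.25 + 0.25
```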
Thank you very much 💗💗💗💗💗💗
Simplest explanation of Mean Squared Error
Why do we square the errors?
The 'm' and 'b' variables of the regression line can also be solved for using linear algebra.
@MO-xi1kv
4 years ago
A much cleaner, if more abstract, way to go about it.
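As a sketch of the linear-algebra route mentioned above (illustrative code, not from the video): for a single predictor, the normal equations form a 2x2 system that can be solved directly, e.g. by Cramer's rule.

```python
# Hypothetical sketch: solve the normal equations for y = m*x + b.
#   (sum x^2)*m + (sum x)*b = sum x*y
#   (sum x)*m  + n*b        = sum y
def fit_line(xs, ys):
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = sxx * n - sx * sx           # determinant of the 2x2 system
    m = (sxy * n - sx * sy) / det
    b = (sxx * sy - sx * sxy) / det
    return m, b

m, b = fit_line([0, 1, 2], [1, 3, 5])  # points lying on y = 2x + 1
```

This gives the same m and b as minimizing the squared error with calculus, since the normal equations are exactly the conditions that the partial derivatives vanish.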
Thank you for this video. Why is the squared error a property that determines how good a line is?
"Minimizes my probability of a mistake" I see what you did there. :D
@thechosenone2004B
1 year ago
lol
@dalcde Yes, it's the same thing. Bet you've finished the stats class already. :D
Why are we adding the squared errors in order to generate the error function? Is there any strong reason behind that? Thanks
Is this the method of least squares?
Why are we taking the square at 05:16?
@Tibetan-experience
5 years ago
Can you see the points above the line and below the line? The change in y for those points gives you both positive and negative values. We need to see how well the line fits the graph, so that the y-distance from all the points is at a minimum. For that, negative values don't make sense; that's why you square all the changes in y and then sum them over the points.
@ThePritt12
5 years ago
@@Tibetan-experience This does not explain taking the square (one could take the absolute value instead). He takes the square because he wants to derive the sum of "squared error". That simple.
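To make "how well the line fits" concrete (an illustrative sketch with made-up points, not from the video), the sum of squared errors ranks candidate lines: the smaller the SSE, the better the fit.

```python
# Made-up data points scattered near y = 2x + 1.
points = [(0, 1.2), (1, 2.9), (2, 5.1)]

def sse(m, b):
    """Sum of squared vertical errors of the line y = m*x + b."""
    return sum((y - (m * x + b)) ** 2 for x, y in points)

sse(2, 1)   # small -- this line hugs the data
sse(1, 0)   # much larger -- a visibly worse fit
```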
Why do we take the vertical distance, and why not the perpendicular distance?
@leojin5151
2 years ago
same question..
The only thing that I don't get is why it's squared... why not just take the distance of the point to the line? The only reason I can come up with is that it removes the negative sign for each distance?
Thank you for this video, but I have a doubt. Error 1 is (mx1+b)-y1, as you taught, but is error 2 y2-(mx2+b), since these points lie on opposite sides of the line y=mx+b? Kindly help me.
@ahmadomara
8 years ago
The difference is squared, so it's the same whether you write (mx+b)-y or y-(mx+b).
@jakerfle
3 years ago
Yeah, technically error 1 = (mx1+b) - y1, but because it is going to be squared later on anyway, it doesn't matter.
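A quick numeric check of the replies above (made-up numbers): the sign convention inside the error term washes out once you square.

```python
# Hypothetical line y = 2x + 1 and point (3, 5).
m, b, x1, y1 = 2.0, 1.0, 3.0, 5.0

e_a = ((m * x1 + b) - y1) ** 2  # (7 - 5)^2 = 4.0
e_b = (y1 - (m * x1 + b)) ** 2  # (5 - 7)^2 = 4.0
e_a == e_b                      # True -- identical either way
```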
Don't they use a similar type of method to this (linear regression) in machine learning?
@CHOSO93
5 years ago
indeed.
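Indeed, linear regression with a squared-error loss is a standard machine-learning model. A minimal, hypothetical sketch (made-up data, plain Python rather than an ML library) of fitting m and b by gradient descent on the sum of squared errors:

```python
# Made-up data lying exactly on y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

m, b, lr = 0.0, 0.0, 0.02  # initial guesses and a small learning rate
for _ in range(5000):
    # Gradients of sum((m*x + b - y)^2) with respect to m and b.
    gm = sum(2 * (m * x + b - y) * x for x, y in zip(xs, ys))
    gb = sum(2 * (m * x + b - y) for x, y in zip(xs, ys))
    m -= lr * gm
    b -= lr * gb
# m and b converge toward the true slope 2 and intercept 1.
```

Libraries typically solve the same minimization in closed form (as in the video's calculus approach), but gradient descent is how the idea scales to bigger models.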
woooow
This method seems less useful than taking the perpendicular distance to the line. The math is easier to work out this way, though.
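For a concrete comparison (an illustrative sketch, not from the video): the vertical error used here is |y0 - (m*x0 + b)|, while the perpendicular distance divides that by sqrt(m^2 + 1), so the two only coincide for horizontal lines.

```python
import math

# Distances from a point (x0, y0) to the line y = m*x + b.
def vertical_distance(m, b, x0, y0):
    return abs(y0 - (m * x0 + b))

def perpendicular_distance(m, b, x0, y0):
    # Rewrite the line as m*x - y + b = 0 and use the point-line formula.
    return abs(m * x0 - y0 + b) / math.sqrt(m * m + 1)

# For the line y = x and the point (0, 2):
vertical_distance(1, 0, 0, 2)       # 2.0
perpendicular_distance(1, 0, 0, 2)  # 2/sqrt(2), about 1.414
```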
getting confused between error and residual
Very discouraged, at no fault of the instructor or video at all. I have an extremely difficult time with numbers. It takes me a very long time to comprehend formulas and their explanations. With that being said, I'm still very confused about how to work the formula to determine the squared error.
Completely lost me at 3:40. UPDATE: I see now. m = slope. So in y = 3x + 4, 3 is the slope.