What is Least Squares?

Science & Technology

A quick introduction to Least Squares, a method for fitting a model, curve, or function to a set of data.
TRANSCRIPT
Hello, and welcome to Introduction to Optimization. This video provides a basic answer to the question, what is Least Squares?
Least squares is a technique for fitting an equation, line, curve, function, or model to a set of data. This simple technique has applications in many fields, from medicine to finance to chemistry to astronomy. Least squares helps us represent the real world using mathematical models.
Let’s take a closer look at how this works. Imagine that you have a set of data points in x and y, and you want to find the line that best fits the data. This is also called regression. For a given x, you have the y value from your data and the y value predicted by the line. The difference between these values is called the error, or the residual.
A residual is calculated between each data point and the line. To make every error value positive, each residual is usually squared. The squared residuals are then summed to give the total error between the data and the line: the sum of squared errors.
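The residual and sum-of-squared-errors calculation described above can be sketched in a few lines of Python (the data points and candidate line here are invented for illustration):

```python
# Minimal sketch: residuals and the sum of squared errors (SSE) for a
# candidate line y = 2x against a small invented data set.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]

predicted = [2.0 * x for x in xs]        # the line's prediction at each x
residuals = [y - p for y, p in zip(ys, predicted)]
sse = sum(r ** 2 for r in residuals)     # squaring makes every error positive

print(residuals)  # [0.1, -0.1, 0.2, -0.2] (up to float rounding)
print(sse)        # about 0.10
```

A smaller SSE means the candidate line sits closer to the data overall.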
One way to think of least squares is as an optimization problem. The sum of squared errors is the objective function, and the optimizer is trying to find the slope and intercept of the line that best minimizes the error.
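The optimization view above can be made concrete with a minimal sketch that minimizes the sum of squared errors over the slope and intercept by gradient descent. The data, initial guess, and learning rate are all invented for illustration; a production solver would use the closed-form solution or a library routine instead.

```python
# Minimal sketch: fit y = m*x + b by minimizing the sum of squared
# errors (the objective function) with gradient descent.
# The invented data lies exactly on y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

m, b = 0.0, 0.0   # initial guess for slope and intercept
lr = 0.02         # learning rate (step size), chosen small enough to converge

for _ in range(5000):
    # Gradient of SSE = sum((m*x + b - y)^2) with respect to m and b.
    grad_m = sum(2 * (m * x + b - y) * x for x, y in zip(xs, ys))
    grad_b = sum(2 * (m * x + b - y) for x, y in zip(xs, ys))
    m -= lr * grad_m
    b -= lr * grad_b

print(m, b)  # approaches m = 2, b = 1
```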
Here we’re trying to fit a line, which makes this a linear least-squares problem. Linear least squares has a closed-form solution and can be solved via a system of linear equations. It’s worth noting that other equations, such as parabolas and higher-degree polynomials, can also be fit using linear least squares, as long as the model is linear in the parameters being optimized.
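For the straight-line case, the closed-form solution can be written directly from the means of the data; a minimal sketch with invented data:

```python
# Minimal sketch: closed-form linear least squares for y = m*x + b.
# The invented data lies exactly on y = 2x + 1, so the fit recovers
# m = 2 and b = 1 without any iteration.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]

n = len(xs)
x_mean = sum(xs) / n
y_mean = sum(ys) / n

# Slope: covariance of x and y divided by the variance of x.
m = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) \
    / sum((x - x_mean) ** 2 for x in xs)
b = y_mean - m * x_mean   # the best-fit line passes through the means

print(m, b)  # 2.0 1.0
```

The same closed-form idea generalizes to parabolas and polynomials via the normal equations, since those models are still linear in their coefficients.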
Least squares can also be used with nonlinear curves and functions to fit more complex data. In this case, the problem can be solved using a dedicated nonlinear least-squares solver or a general-purpose optimization solver.
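As one sketch of the nonlinear case, SciPy's `curve_fit` iteratively minimizes the sum of squared residuals; the exponential model and noiseless data below are invented for illustration:

```python
# Sketch: nonlinear least squares with scipy.optimize.curve_fit.
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    # Nonlinear in the parameter b, so there is no closed-form solution.
    return a * np.exp(b * x)

x = np.linspace(0.0, 2.0, 20)
y = model(x, 2.0, 0.5)   # synthetic, noiseless data from a = 2, b = 0.5

# curve_fit starts from the initial guess p0 and refines it iteratively.
params, _ = curve_fit(model, x, y, p0=[1.0, 1.0])
print(params)  # close to [2.0, 0.5]
```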
To summarize, least squares is a method often used for fitting a model to data. Residuals express the error between the current model fit and the data. The objective of least squares is to minimize the sum of squared errors across all the data points to find the best fit for a given model.

Comments: 29

  • @randolfshemhusain2298 (4 months ago)

    Visual learning makes things so much better.

  • @mymanager3662 (2 years ago)

    excellent animation and explanation, simple to follow and understand!

  • @MultiJx2 (a year ago)

    compact and thorough at the same time. thanks !

  • @SwanPrncss (a year ago)

    Omg, your explanation is better than other youtube videos and my teacher because I'm a visual learner.

  • @DiditCoding (a year ago)

    Noice

  • @raminbohlouli1969 (10 months ago)

    Simple yet extremely informative👍

  • @tonmoysharma5758 (a year ago)

    Excellent video and also quite easy to understand

  • @plep-m555ww (7 months ago)

    Great video and helpful channel! Khan academy and the organic chemistry guy are getting old and less helpful as school curriculums develop. Super grateful for these simple, direct explanations

  • @anshisingh1915 (a year ago)

    lovely brooo, such good animation, now i have the concept in my head.

  • @mhd112211 (4 months ago)

    Thanks a lot, I have the curse of being a visual learner and this was amazing.

  • @VictoriaOtunsha (a year ago)

    Thank you for simplifying this

  • @funfair-bs7wf (a year ago)

    This is a great little video !

  • @ernstuzhansky (a year ago)

    Excellent! Thank you.

  • @mesfindelelegn1165 (a month ago)

    clear and brief idea

  • @tomerweinbach4059 (a year ago)

    great explanation!

  • @rohiniharidas (2 years ago)

    All videos are excellent

  • @fabianb.7429 (a year ago)

    Just perfect. Thanks

  • @yoshitha12 (a year ago)

    Thank you... ❤

  • @ahmedshalaby9343 (8 months ago)

    in 2 mins just you explained everything

  • @Keyakina (a year ago)

    But residual != error?

  • @Jkauppa (2 years ago)

    fit a function f(x) to data with normal noise, f(x) can be a line, or a polynomial, etc, includes outlier handling, least squares is very sus

  • @Jkauppa (2 years ago)

    line with normal noise is a better answer than just a line

  • @Jkauppa (2 years ago)

    constant std additive normal noise assumed

  • @Jkauppa (2 years ago)

    think like you are removing a base line function from the data points, either linear (like pca, f(x)=kx+c) or polynome (nonlinear pca, f(x)=...+ax^2+bx+c), then checking if the noise is from a normal distribution, ie, trying to make the noise after removing the base line as normal as possible, if you do linear, the noise might not be normal, so you get only a partial pca component fit, kinda

  • @alice20001 (a year ago)

    Why use the squares instead of the absolute values?

  • @laraelnourr (a year ago)

    because they are easier to compute and deal with mathematically. But we can use absolute values too!

  • @faheemrasheed9967 (a year ago)

    because it gives a clearer picture: if we have an error of 0.1 and we square it, we get 0.01, which is kind of scaled.

  • @AchiragChiragg (6 months ago)

    @faheemrasheed9967 Actually it's the other way around: squaring can amplify the outliers and let them influence the final fit, so absolute values can be the more robust choice.

  • @rudypieplenbosch6752 (a year ago)

    no example, pretty useless
