Linear Systems of Equations, Least Squares Regression, Pseudoinverse

Science and Technology

This video describes how the SVD can be used to solve linear systems of equations. In particular, it is possible to solve nonsquare systems (overdetermined or underdetermined) via least squares regression and the pseudoinverse.
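
A minimal NumPy sketch of the idea (illustrative only; the matrices are made up and this is not code from the video or the book): solve an overdetermined system Ax = b in the least-squares sense via the SVD-based pseudoinverse and check it against NumPy's built-in solver.

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 3))   # overdetermined: more equations than unknowns
b = rng.standard_normal(100)

# Pseudoinverse route: A = U @ diag(s) @ Vt (economy SVD), so x = V @ Sigma^-1 @ U^T @ b
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_svd = Vt.T @ np.diag(1.0 / s) @ U.T @ b

# Same answer from the built-in least-squares solver
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_svd, x_lstsq))  # True
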
These lectures follow Chapter 1 from: "Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control" by Brunton and Kutz
Amazon: www.amazon.com/Data-Driven-Sc...
Book Website: databookuw.com
Book PDF: databookuw.com/databook.pdf
Brunton Website: eigensteve.com
This video was produced at the University of Washington

Comments: 90

  • @dragoncurveenthusiast
    @dragoncurveenthusiast • 3 years ago

    This is so cool! When I started this lecture series in order to understand PCA better, I had no idea it would also relate to least squares regression! This blew my mind! Thank you so much for making these. They must be a lot of work, but they are so appreciated!

  • @neoblackcyptron
    @neoblackcyptron • 3 years ago

    Thank you for explaining hard-to-grasp concepts in a filtered, simple manner for us to understand. Your lectures are a great complement to Prof. Strang's; both are high-quality content.

  • @philmccavity
    @philmccavity • 6 months ago

    Exceptionally clear explanation, crisp hand-written notes, wonderful!

  • @leo1fun
    @leo1fun • 3 years ago

    I'm always impressed by how clean the board is, so it looks like there's nothing at all.

  • @macmos1
    @macmos1 • 4 years ago

    please keep going with the numerical linear algebra/numerical analysis/scientific computation/applied math stuff thanks :)

  • @chengkeattan6571
    @chengkeattan6571 • 2 years ago

    Thank you very much for the clear explanation of pseudo-inverse.

  • @pranjalsahu
    @pranjalsahu • 3 years ago

    Excellent Lecture ! So clear to understand! Thank You !

  • @idkravitz
    @idkravitz • 4 years ago

    Am I right that you write on real glass in front of the camera and the image is just mirrored in editing? If so, it's brilliant.

  • @bastianian2939
    @bastianian2939 • 3 years ago

    Damn i just concluded he was an expert at writing mirrored

  • @sihebi973
    @sihebi973 • 3 years ago

    The real video will look like this: www.mirrorthevideo.com/watch?v=PjeOmOz9jSY He is also using his left hand.

  • @PunmasterSTP
    @PunmasterSTP • 1 year ago

    Pseudoinverse? More like "Super videos for us!" Thank you so much for making all of them.

  • @3d_chip
    @3d_chip • 4 days ago

    My god, you just explained what my professor is trying to explain for 5 lectures

  • @mauzaomin3872
    @mauzaomin3872 • 4 years ago

    Thank you, Steve, for the video. We make the assumption that it is an economy SVD at 6:28. Then how can we guarantee that V multiplied by V* becomes the identity matrix, especially for the underdetermined system?

  • @zachfang4424
    @zachfang4424 • 3 years ago

    Thank you for the awesome materials; everything is well explained! One question I have: how do we calculate the inverse of the singular-value matrix, which is non-square? Isn't that back to the problem we had in the first place, i.e. inverting the non-square A matrix?

  • @reihanehvafadar6139
    @reihanehvafadar6139 • 4 years ago

    Can't thank you enough for sharing your knowledge with the entire world.

  • @amodamatya
    @amodamatya • 3 years ago

    Thank you Professor for this valuable lecture

  • @u2coldplay844
    @u2coldplay844 • 3 years ago

    I love your lectures! They are so clear and save us from a mist of information. I noticed that you wrote that the Sigma matrix is invertible; if A is not invertible, shouldn't the Sigma matrix also be non-invertible, and hence need a pseudoinverse of its own? Thank you for the clarification.

  • @Atlas-ck9vm
    @Atlas-ck9vm • 4 years ago

    Absolutely Great Content

  • @1985lama
    @1985lama • 2 years ago

    In the case of an overdetermined matrix X, why is V V^T equal to the identity, given that we are using economy matrices?

  • @mikatshow3932
    @mikatshow3932 • 2 years ago

    Great explanation, sir!

  • @FelipeCondo
    @FelipeCondo • 3 years ago

    Thanks for the video, Professor. Could you help me with something, please? I am trying to fit a sinusoidal surface that depends on (x, y, t), but in my case b is not a vector but a matrix. What can I do in this case? Thank you.

  • @ehkim1977
    @ehkim1977 • 3 years ago

    This is the nicest lecture I've ever seen. I want to recommend it for a basic engineering graduate class!! :-) Thank you so much~ I'll buy the book! ^-^

  • @Eigensteve
    @Eigensteve • 3 years ago

    Awesome, thanks so much, and hope you like the book!

  • @amaniarman460
    @amaniarman460 • 3 years ago

    am I just missing something, why is n

  • @utatistics9293
    @utatistics9293 • 4 years ago

    Having second thoughts about doing a master's because your videos are just too helpful.

  • @patf9770
    @patf9770 • 3 years ago

    with all the amazing resources on the internet, it seems like higher ed is turning into mostly gatekeeping

  • @giorgoschristopoulos9087
    @giorgoschristopoulos9087 • 3 years ago

    In an underdetermined case, if you use the economy SVD, is V*V' equal to an identity?

  • @JohnJTraston
    @JohnJTraston • 1 year ago

    Yeah. This is a very good and useful lecture.

  • @georgeyu7987
    @georgeyu7987 • 4 years ago

    I know this is just about notation, but I think the majority of linear algebra texts use m by n rather than n by m. It's sometimes a little confusing here...

  • @annflintoft9222
    @annflintoft9222 • 1 year ago

    I wonder if Steve Brunton could cover the nontrivial solutions of Ax = 0 for under/overdetermined systems using the SVD.

  • @jay89boy
    @jay89boy • 3 years ago

    Great content !

  • @EngineeringChampion
    @EngineeringChampion • 3 years ago

    Since A has fewer rows than columns, then A is an m×n matrix with m

  • @ahsanahmed2505
    @ahsanahmed2505 • 2 years ago

    so, he made a mistake?

  • @FortranCastle
    @FortranCastle • 9 months ago

    @@ahsanahmed2505 He is using n by m in the videos and in his book instead of the usual convention of m by n which is quite confusing and contrary to most linear algebra resources.

  • @noahbarrow7979
    @noahbarrow7979 • 2 years ago

    knowing that the pseudoinverse exists makes me feel really powerful

  • @julianschmitz9525
    @julianschmitz9525 • 1 month ago

    great video

  • @drditup
    @drditup • 3 years ago

    I'm amazed. This is so clearly explained!!

  • @ahsanahmed2505
    @ahsanahmed2505 • 2 years ago

    I am confused about the dimensions of the A matrix. Shouldn't an overdetermined system have m > n?

  • @Tyokok
    @Tyokok • 2 years ago

    Thanks for the great video! One question about 7:50: if you have a zero singular value, how do you compute Sigma inverse? Thank you!

  • @diogenescruz-figueroa2719
    @diogenescruz-figueroa2719 • 1 year ago

    This might be a year too late, but the Moore-Penrose inverse satisfies (AB)+ = B+A+. So I think there was a mistake: it should have been Sigma+, not Sigma^{-1}, once you separate it from U and V. Since U and V are unitary, they are invertible, and thus their Moore-Penrose inverse is the regular inverse. As for Sigma, since it only has elements on the diagonal, its Moore-Penrose inverse is just its transpose with the reciprocal of each nonzero element (a "d" on the diagonal becomes "1/d").
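
    A small NumPy sketch of the Sigma-plus construction described in the comment above (toy, rank-deficient matrix; purely illustrative): only the nonzero singular values are reciprocated, the zeros are left alone, and the rectangular shape is transposed.

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((5, 3)) @ np.diag([1.0, 2.0, 0.0])   # last column is zero, so A is rank deficient

    U, s, Vt = np.linalg.svd(A)          # full SVD: U is 5x5, Vt is 3x3, s has 3 entries
    Sigma_plus = np.zeros((3, 5))        # transposed shape of the 5x3 Sigma
    tol = 1e-12
    for i, si in enumerate(s):
        if si > tol:
            Sigma_plus[i, i] = 1.0 / si  # invert nonzero singular values, leave zeros as zeros

    A_plus = Vt.T @ Sigma_plus @ U.T     # Moore-Penrose pseudoinverse assembled from the SVD
    print(np.allclose(A_plus, np.linalg.pinv(A)))   # True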

  • @surajchess3114
    @surajchess3114 • 2 years ago

    If we have the x and b vectors, how do we find the A matrix?

  • @ankitsinha4124
    @ankitsinha4124 • 3 years ago

    Man. It was really awesome..

  • @maettu102
    @maettu102 • 3 years ago

    Thanks for the video, it was really nice! There are two points that seem important to me. The inverse of Σ is actually not always computable (if there exists a singular value = 0), so the nicer expression would be Σ+, where Σ+ is the matrix in which every nonzero singular value is inverted but the zeros are left as they are. And why not follow the usual convention for naming matrices? Normally a matrix is called an m×n matrix; it seems you use an n×m matrix here, which I think is a bit confusing at first glance.

  • @drickquantum5216
    @drickquantum5216 • 2 years ago

    Thank you for the singular-value-matrix-plus (Σ+) thing. Now I understand the video.

  • @sexwax4191
    @sexwax4191 • 2 years ago

    The inverse is not defined by inverting every element, but by the requirement that its product with the matrix yields the identity matrix. In other words, if S_inverse is the inverse of S, then S*S_inverse = 1, where 1 is the identity matrix. Therefore the inverse is computable even with zeros on the diagonal.

  • @eddiechen6389
    @eddiechen6389 • 2 years ago

    Wonderful lecture

  • @wojtekskaba9757
    @wojtekskaba9757 • 3 years ago

    Why, in the definition of A-dagger, do you put Sigma-inverse rather than Sigma-dagger? Sigma is non-square and has only a pseudoinverse rather than an inverse.

  • @deathtrick
    @deathtrick • 2 years ago

    How does this board work? Are you writing in the opposite direction?

  • @ukaszorpik3913
    @ukaszorpik3913 • 2 years ago

    Damn, thank you, I finally got it thanks to you after 3h research xD

  • @michaellewis7861
    @michaellewis7861 • 4 years ago

    Wouldn’t the error be ||X-x dagger||_2 not just ||x dagger||_2 in itself?

  • @drickquantum5216
    @drickquantum5216 • 2 years ago

    I get a bit lost when he constructs the left pseudoinverse. I cannot grasp why the singular-value matrix is guaranteed to have an inverse.

  • @chengeng6472
    @chengeng6472 • 4 years ago

    Thanks for your video. However, I'm confused about the proof of the theorem (why the norm is minimized); can you give a simple proof? Thanks.

  • @sergiohuaman6084
    @sergiohuaman6084 • 3 years ago

    I believe it has to do with the Eckart-Young theorem. You can check it out on KZread.

  • @uralmutlu4320
    @uralmutlu4320 • 7 months ago

    I have a set of 3D positions and vectors and am trying to find their intersection, so I linearized the equations as written in academic papers, but I don't get the results and I don't have a clue what could be wrong. I have Ax=b and I tried: 1) (AA')^-1*A'*b, 2) pinv(A)*b, 3) A\b, 4) the SVD method; I get the same results with all of them. I checked my values 100 times. The vectors and directions are correct; I calculated them manually and confirmed them with the Matlab code.
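
    For reference, a toy NumPy check (random data, not the poster's problem) that the normal equations, pinv, and the SVD route all return the same least-squares solution when A is tall with full column rank; if they agree with each other but not with the expected answer, the problem usually lies in how A and b were assembled.

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((50, 4))   # tall system, full column rank
    b = rng.standard_normal(50)

    x_normal = np.linalg.solve(A.T @ A, A.T @ b)     # normal equations: (A'A)^-1 A' b
    x_pinv = np.linalg.pinv(A) @ b                   # Moore-Penrose pseudoinverse
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    x_svd = Vt.T @ ((U.T @ b) / s)                   # economy SVD route

    print(np.allclose(x_normal, x_pinv), np.allclose(x_pinv, x_svd))   # True True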

  • @amirkhazama7464
    @amirkhazama7464 • 10 months ago

    What if we had a singular value of 0? Can't that happen when A has dependent columns? In that case we wouldn't have Sigma inverse, correct? Do we then first drop some columns of A so they are all independent?

  • @julijangrajfoner1730
    @julijangrajfoner1730 • 1 month ago

    How can sigma be invertible when it's a rectangular matrix?

  • @subhadeepbej3241
    @subhadeepbej3241 • 3 years ago

    Excellent

  • @kangningwei934
    @kangningwei934 • 1 year ago

    Thanks a lot

  • @mrgledjan3091
    @mrgledjan3091 • 5 months ago

    Is it not Sigma^-1 but Sigma^-T, i.e. the transpose of the inverse?

  • @michalwisler9616
    @michalwisler9616 • 4 years ago

    Hi Sir, I am quite curious about the name of the 'transparent' blackboard. I want to buy one; where can I get it? Thank you.

  • @ahmaddarawshi91
    @ahmaddarawshi91 • 4 years ago

    I think this is not a "transparent blackboard". It is just glass and behind him is a black wall.

  • @hanyingjiang6864
    @hanyingjiang6864 • 3 years ago

    @@ahmaddarawshi91 I'm always curious how he writes. He must write all stuff in a reverse direction - I mean, b would look like d from his perspective.

  • @ahmaddarawshi91
    @ahmaddarawshi91 • 3 years ago

    @@hanyingjiang6864 he writes normally as he would write on a whiteboard but then the video is flipped (reflected) digitally.

  • @puneetgupta22
    @puneetgupta22 • 1 year ago

    Forget about everything else: what visual setup do you use to write and record?

  • @Trubripes
    @Trubripes • 4 months ago

    LOL, so the Moore-Penrose pseudoinverse is just inverting one SVD term at a time, and since U and V are orthogonal they are just transposed. Well, that saved me a lot of effort looking into where it comes from.

  • @lukf8347
    @lukf8347 • 3 years ago

    Hello, thank you for this nice video series. It is so helpful, and I use it with your book for my master's thesis. But while going through the equations, one question popped into my head: at 7:32 you use the inverse of Sigma, but for the SVD Sigma is not a square matrix; it is an n×m matrix and as such not invertible (in the "classic" sense of invertible matrices). I understand that with the economy SVD this matrix would be an m×m matrix, but I don't understand it for the n×m case. Is there a video or a page in the book where this case is discussed? Other than that, thank you very much for saving my master's degree :D

  • @sexwax4191
    @sexwax4191 • 2 years ago

    Sigma might be an n×m matrix, but not all rows and columns are nonzero. If you remove all the zero columns and rows you get a square matrix. So it's essentially a square matrix padded with zeros.

  • @Catwomen4512
    @Catwomen4512 • 2 years ago

    He replied to another person saying "Usually we will invert the first "m x m" sub-block, which is square, and then only use the first "m" columns of "U". Or, we could be even more aggressive and only invert the first "r x r" sub-block of Sigma, and only use the first "r" columns of "U" and "V", where r is much less than m." However, if you just consider when n > m, then we would use the economy SVD (seen in his next video), so Sigma would also be a square matrix.

  • @DiegoAndresAlvarezMarin
    @DiegoAndresAlvarezMarin • 1 year ago

    I had the same question
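
    A short NumPy illustration of the point discussed in this thread (toy matrix, purely illustrative): with the economy SVD of a tall A, Sigma shrinks to a small square diagonal matrix, which has an ordinary inverse as long as no singular value is zero.

    import numpy as np

    A = np.random.default_rng(3).standard_normal((6, 2))           # tall n x m matrix, n > m

    U_full, s, Vt = np.linalg.svd(A)                                # full SVD
    U_hat, s_hat, Vt_hat = np.linalg.svd(A, full_matrices=False)    # economy SVD

    print(U_full.shape, s.shape, Vt.shape)   # (6, 6) (2,) (2, 2): the full Sigma would be 6x2
    print(U_hat.shape)                       # (6, 2): economy U keeps only the first m columns
    Sigma_hat = np.diag(s_hat)               # 2x2, invertible whenever no singular value is 0
    print(np.linalg.inv(Sigma_hat))          # a plain inverse exists for the economy Sigma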

  • @JoaoBarbosa-pq5pv
    @JoaoBarbosa-pq5pv • 3 years ago

    Extremely nice lectures Steve Brunton, thank you very much for all the effort of creating and sharing them! do any of you - or anyone that reads this :) - know of any reference that explores the math of why you get min |x|2 in the underdetermined case? thank you in advance!

  • @Andres186000
    @Andres186000 • 1 year ago

    This might be late. The reason for this min |x|2 is that any other solution would be this x hat plus something in the null-space of A. That addition would be orthogonal to this x hat and thus only be able to increase the magnitude of x hat. I am essentially reading this right out of pages 404 to 405 of Gilbert Strang's "Introduction to Linear Algebra" fourth edition.
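
    A quick NumPy demonstration of that argument (toy underdetermined system, illustrative only): adding a null-space component to the pseudoinverse solution still solves Ax = b, but it can only increase the norm.

    import numpy as np

    rng = np.random.default_rng(4)
    A = rng.standard_normal((2, 5))   # underdetermined: 2 equations, 5 unknowns
    b = rng.standard_normal(2)

    x_min = np.linalg.pinv(A) @ b     # minimum-norm solution
    U, s, Vt = np.linalg.svd(A)
    n = Vt[-1]                        # a direction in the null space of A (A @ n is ~0)
    x_other = x_min + 3.0 * n         # another exact solution of Ax = b

    print(np.allclose(A @ x_min, b), np.allclose(A @ x_other, b))   # True True
    print(np.linalg.norm(x_min) < np.linalg.norm(x_other))          # True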

  • @somethingnew7538
    @somethingnew7538 • 3 years ago

    Great!!

  • @michaelvandeborne9382
    @michaelvandeborne9382 • 4 years ago

    Sir, when solving for x, I understand that A-dagger cannot be a true inverse of A, but since the SVD is an exact equality for A, and since multiplying by U transpose, Sigma inverse, and V is again an exact operation, I don't understand where the approximation step was introduced in the calculation.

  • @Eigensteve
    @Eigensteve • 4 years ago

    Great question. I address this exact question in the first 2 minutes of the next video: kzread.info/dash/bejne/YmaFpdaBfZTXkpc.html

  • @michaelvandeborne9382
    @michaelvandeborne9382 • 4 years ago

    Indeed, thanks :)

  • @nami1540
    @nami1540 • 2 years ago

    Thanks, just had this in mind as well
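
    For anyone landing here later, a small NumPy check of the same point (toy tall matrix, illustrative only): every SVD step is exact, but A-dagger is only a left inverse of a tall A; A @ A-dagger merely projects onto the column space of A, and that projection is where the least-squares approximation enters.

    import numpy as np

    rng = np.random.default_rng(5)
    A = rng.standard_normal((8, 3))   # tall, overdetermined, full column rank
    A_dag = np.linalg.pinv(A)

    print(np.allclose(A_dag @ A, np.eye(3)))   # True: A_dag is a left inverse of A
    print(np.allclose(A @ A_dag, np.eye(8)))   # False: A @ A_dag is only a rank-3 projection

    b = rng.standard_normal(8)
    x = A_dag @ b
    print(np.allclose(A @ x, b))               # False in general: b need not lie in col(A)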

  • @DiegoAndresAlvarezMarin
    @DiegoAndresAlvarezMarin • 1 year ago

    Are you using the economy version of the SVD? Otherwise you are not able to take the inverse of Sigma, which is an n×m matrix. EDIT: Yes, see the next video in this list.

  • @mmarva3597
    @mmarva3597 • 3 years ago

    Wow!! Thank you very much.

  • @harsha123144
    @harsha123144 • 1 year ago

    Hi sir, can you solve one linear regression problem using the SVD and upload it to KZread? Please, it would help me a lot, and others as well!

  • @Eigensteve
    @Eigensteve • 1 year ago

    Check the playlist, I already have an example

  • @harsha123144
    @harsha123144 • 1 year ago

    @@Eigensteve Respected sir, can you send me the link? Sorry for troubling you...!

  • @alexyang6755
    @alexyang6755 • 3 years ago

    I FOUND GOLD.

  • @user-or7ji5hv8y
    @user-or7ji5hv8y • 3 years ago

    I don't know why this was never explained when I took econometrics.

  • @periklisdrakousis6537
    @periklisdrakousis6537 • 2 years ago

    Why does U transpose times U cancel to the identity? Don't they result in a square matrix???

  • @fununterhaltung6556
    @fununterhaltung6556 • 1 year ago

    Because U is an orthogonal matrix.
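
    A quick NumPy check of this (toy matrix, illustrative only): for the economy SVD of a tall A, U^T U is the small identity even though U itself is not square.

    import numpy as np

    A = np.random.default_rng(6).standard_normal((7, 3))    # tall matrix
    U_hat, s, Vt = np.linalg.svd(A, full_matrices=False)    # economy SVD: U_hat is 7x3

    print(np.allclose(U_hat.T @ U_hat, np.eye(3)))   # True: U^T U is the 3x3 identity
    print(np.allclose(U_hat @ U_hat.T, np.eye(7)))   # False: U U^T is only a projection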

  • @arjunkc3227
    @arjunkc3227 • 6 months ago

    This is cool, but the computational cost of computing the SVD is a nightmare.

  • @tanjamikovic2739
    @tanjamikovic2739 • 1 year ago

    Does this guy actually write backwards?

  • @forooghfarajzade8206
    @forooghfarajzade8206 • 1 year ago

    when you say thank you i emphasize thank you!!!! (not me) :)))

  • @hyperduality2838
    @hyperduality2838 • 4 years ago

    Under is dual to over, left is dual to right, up is dual to down, in is dual to out. Thesis is dual to anti-thesis -- The Generalized or time independent Hegelian dialectic. Alive is dual to not alive -- Schrodinger's/Hegel's cat. Duality creates reality.

Next