Linear Algebra: Singular Value Decomposition (Full lecture)
Comments: 50
@amirhosseindaraie5622 3 years ago
Dear Dr. Hower, you are an amazing teacher. I enjoy watching your lectures.
@DrValerieHower
3 years ago
Hi, thank you. Perhaps you answered your own question. Thank you for the feedback.
@Spacexioms A year ago
This is the greatest video I’ve seen on SVD.
@DrValerieHower
A year ago
Thank you so much!!
@masakmemasak319 3 years ago
The best lecture on SVD for me right now! You mention every step clearly, in detail and in flow; it really helps me, as I didn't cover this topic during my bachelor's. Thank you, Dr! :D
@DrValerieHower
3 years ago
Thank you so much for your feedback!
@spencerfradkin3762 3 years ago
Dr. Hower, my name is Spencer, and I was a student in your Calc 2 class at FAU a couple of summers ago. Your Calc 2 class was my favorite class in undergrad. I was searching YouTube for SVD videos for a graduate class I'm taking, and I can't believe I came across your channel. This video is exactly what I needed, and it's explained as well as your Calc 2 lectures. Thanks!!!
@DrValerieHower
3 years ago
Spencer! It is so wonderful to hear from you. I really appreciate your feedback and hope everything is going well for you. :)
@matthewchunk3689 4 years ago
Excellent topic. Thanks!
@bashiruddin3891 3 years ago
The sort of lecture we wish we could have found at the start of the semester. Thanks a lot!
@DrValerieHower
3 years ago
Thanks so much for your feedback.
@menugrg3708 3 years ago
The best explanation of Singular Value Decomposition. Many thanks, Dr. Hower!
@DrValerieHower
3 years ago
You are very welcome. I appreciate the feedback :)
@BharathSaiS 2 years ago
The first time you see a math teacher with a smile..
@DrValerieHower
2 years ago
:)
@mustafizurrahman5699 3 months ago
Splendid video on SVD
@DrValerieHower
3 months ago
Thank you!
@redouaneabegar5490 3 years ago
Best numerical application of SVD I've ever found on YouTube. Thank you, ma'am!
@DrValerieHower
3 years ago
Thank you so much. I appreciate your feedback!
@MiguelSantos-vi3gi A month ago
Love your class and attitude
@DrValerieHower
A month ago
Thank you so much!!!
@TheTacticalDood 4 years ago
Thanks, very nice lecture!
@electrocrats1100 2 years ago
Nice explanation, ma'am. Respect from India!
@DrValerieHower
2 years ago
Thank you for your feedback!
@amshudharvadla6482 A year ago
It's amazing 👏 Love from India
@DrValerieHower
A year ago
Thank you!
@donaldduck4042 3 years ago
The best lecture on YouTube so far.
@DrValerieHower
3 years ago
Thank you!
@moin2163 A year ago
Keep up the energy! Thank you for this video.
@DrValerieHower
A year ago
You are welcome. Thank you for the comment :)
@rexmagat4051 8 months ago
Thanks, Doctor. Great!
@DrValerieHower
8 months ago
You are welcome! Thank you for the comment :)
@rafaeljabbour2502 6 months ago
At 39:00 you used ker(row(V)) to find the third unit vector. Is it always the case that you can use that, or do you sometimes need the Gram–Schmidt process?
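A note on the question above (a reader-style sketch, not the lecture's own method): when the vectors found so far are orthonormal and only one direction is missing, a unit vector spanning the kernel of the matrix whose rows are those vectors always completes the orthonormal set; Gram–Schmidt is the more general tool when the candidate vectors are not yet orthogonal. A small NumPy sketch with made-up vectors (not the ones from the lecture):

```python
import numpy as np

# Two made-up orthonormal vectors in R^3, standing in for the
# columns of V found so far.
v1 = np.array([1.0, 2.0, 2.0]) / 3.0
v2 = np.array([2.0, 1.0, -2.0]) / 3.0

# The missing direction spans the kernel of the matrix whose rows are
# v1, v2; the right singular vector for the zero singular value gives it.
rows = np.vstack([v1, v2])
_, _, Wt = np.linalg.svd(rows)
v3 = Wt[-1]                      # unit vector, orthogonal to v1 and v2

assert abs(v1 @ v3) < 1e-10 and abs(v2 @ v3) < 1e-10
assert abs(np.linalg.norm(v3) - 1.0) < 1e-10
```
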
@mihirparab6620 3 years ago
Cool and awesome teaching ma'am👍Thank you very much
@DrValerieHower
3 years ago
You are welcome! Thank you for your feedback.
@rashmisharma5821 2 years ago
amazingly elegant :)
@DrValerieHower
2 years ago
Thank you!
@ungarlinski7965 A year ago
@21:00 So if I wanted my Sigma matrix to have its columns switched so that it was no longer diagonal, with the sigma1 and sigma2 values on the off-diagonal, could I just use this same procedure and solve for the new U? I know this would no longer be the SVD, but I'm curious about this kind of decomposition too.
@DrValerieHower
A year ago
I'll speak to the 2x2 case, in which Sigma is square. Yes, correct: we would not have the SVD. But you can swap the two columns of Sigma so that 0s lie along the diagonal, sigma1 is in the (1,2) entry, and sigma2 is in the (2,1) entry. It is still the case that Transpose(Sigma) times Sigma is diagonal and is similar to Transpose(A) times A. You can take U and V orthogonal, but take care with the order of the columns.
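The reply above can be checked numerically: for a permutation matrix P we have Sigma V^T = (Sigma P)(V P)^T, so swapping the columns of Sigma while permuting the columns of V the same way leaves the product unchanged. A NumPy sketch with a made-up 2x2 matrix:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])             # made-up 2x2 example
U, s, Vt = np.linalg.svd(A)
Sigma = np.diag(s)

P = np.array([[0.0, 1.0],
              [1.0, 0.0]])             # permutation swapping the two columns
Sigma_swapped = Sigma @ P              # sigma1, sigma2 now sit off-diagonal
V_swapped = Vt.T @ P                   # permute V's columns to match

# Same matrix A, just no longer the standard SVD:
assert np.allclose(U @ Sigma_swapped @ V_swapped.T, A)
```
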
@MegalaSrinivasan 7 days ago
Thank you so much, ma'am.
@DrValerieHower
7 days ago
You are welcome!!
@madhumitanath3275 2 years ago
Excellent explanation ma'am.
@DrValerieHower
2 years ago
Thanks so much!
@climitod8524 A month ago
31:50 I am very confused how you got v here. I computed ker(A^T A − 9I), and likewise for 81. I got the same vectors as you, but you associated them differently; I ordered them the way the identity matrix suggests, since that's the order that makes sense and how it's defined in the text, with v1 matching the first eigenvalue.
@climitod8524
A month ago
Is it because we switch the order of the eigenvalues? I'm really confused.
@DrValerieHower
A month ago
Careful with your notation: a kernel is a subspace, which would not be a two-element set. But to answer your question, each vector in V is a unit (hence nonzero) eigenvector of A^T A. The order comes from looking at Sigma: we put the singular values along the diagonal of Sigma in nonincreasing order, and the vectors in V must match that order. The vector you ask about is an eigenvector of A^T A with eigenvalue 81, hence it corresponds to singular value sqrt(81) = 9, so we put it in the first column of V.
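The ordering convention in the reply above can be verified numerically: eigendecompose A^T A, sort the eigenvalues in nonincreasing order, reorder the columns of V to match, and the square roots agree with the singular values NumPy reports. A sketch with a made-up matrix:

```python
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0, 7.0, -2.0]])             # made-up example matrix

eigvals, eigvecs = np.linalg.eigh(A.T @ A)   # eigh returns ascending order
order = np.argsort(eigvals)[::-1]            # nonincreasing, to match Sigma
V = eigvecs[:, order]                        # columns of V in matching order
singular_values = np.sqrt(np.clip(eigvals[order], 0.0, None))

# The sorted square roots agree with the singular values NumPy reports.
k = min(A.shape)
assert np.allclose(singular_values[:k], np.linalg.svd(A, compute_uv=False))
```
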
@russellsharpe2883 жыл бұрын
Thank you for this clear exposition. However, I thought you were going to prove that every matrix has such a decomposition. But at 12:45 you simply assume it does, and then derive facts about Sigma and V based on this assumption. That is not sufficient to show that such a decomposition exists, is it?
@DrValerieHower
3 years ago
Hi. My discussion is constructive, meaning I discuss how to find the decomposition. The Spectral theorem states that a matrix is orthogonally diagonalizable if and only if it is symmetric. I show A^TA is symmetric. Sigma and V come from its orthogonal diagonalization.
@russellsharpe288
3 years ago
@@DrValerieHower Thanks. I think the penny has dropped now. As you say, the Spectral Theorem gives A*A = VDV*, and then you construct columns of U from the normalised nonzero A-images of the columns of V (this is the bit I was somehow missing). These U-columns are orthogonal because the V-columns are (and using A*A= VDV*), and we can further extend to a complete orthogonal basis if necessary (as you do in fact in the penultimate example). Then sigma(i).u(i) = Av(i) pretty much by construction and US=AV falls out immediately. Got it. I was confused because I am working my way through Axler's book, and he has a complicated proof of Polar Decomposition from which he deduces SVD as a corollary, but in fact I now see it is much easier the other way around. Thanks again for your help on this. Great channel.
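The construction summarized in this exchange — V and the singular values from the spectral decomposition of A^T A, then u_i = A v_i / sigma_i, extending U to a full orthonormal basis when needed — can be sketched in NumPy as a sanity check (a sketch under the assumptions above, not production code; `svd_via_spectral` is a made-up name):

```python
import numpy as np

def svd_via_spectral(A):
    """Constructive SVD, as in the lecture: Sigma and V come from the
    spectral decomposition of A^T A; each u_i is the normalized image
    A v_i / sigma_i; U is extended to a full orthonormal basis if needed."""
    m, n = A.shape
    eigvals, eigvecs = np.linalg.eigh(A.T @ A)      # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]               # nonincreasing singular values
    s = np.sqrt(np.clip(eigvals[order], 0.0, None))
    V = eigvecs[:, order]

    tol = 1e-6 * max(s[0], 1.0)                     # treat tiny values as zero
    cols = [A @ V[:, i] / s[i] for i in range(min(m, n)) if s[i] > tol]
    if cols:
        U = np.column_stack(cols)
        if U.shape[1] < m:                          # extend to a basis of R^m
            _, _, Wt = np.linalg.svd(U.T)           # trailing rows span col(U)-perp
            U = np.hstack([U, Wt[U.shape[1]:].T])
    else:
        U = np.eye(m)                               # A == 0: any orthonormal basis

    Sigma = np.zeros((m, n))
    np.fill_diagonal(Sigma, s[:min(m, n)])
    return U, Sigma, V

A = np.array([[4.0, 11.0, 14.0],
              [8.0, 7.0, -2.0]])                    # made-up 2x3 example
U, Sigma, V = svd_via_spectral(A)
assert np.allclose(U @ Sigma @ V.T, A)
assert np.allclose(U.T @ U, np.eye(2)) and np.allclose(V.T @ V, np.eye(3))
```
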
Ma'am, you're the cutest.