Solving Systems of Differential Equations with Eigenvalues and Eigenvectors
Science & Technology
We now show how to solve a generic matrix system of linear ordinary differential equations (ODEs) using eigenvalues and eigenvectors. This is one of the most powerful techniques in linear systems theory, with applications in stability theory and control.
Code examples are given in Python and MATLAB.
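As a rough sketch of what the Python portion of the lecture covers (chapters at 2:58 and 4:40), here is how the eigendecomposition solution x(t) = T e^(Dt) T^(-1) x(0) can be assembled with NumPy/SciPy. The matrix `A` and initial condition `x0` below are made-up examples, not necessarily the ones used in the video:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical example system x' = A x (eigenvalues -1 and -2)
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
x0 = np.array([1.0, 0.0])

# Eigendecomposition: columns of T are eigenvectors, eigvals fill the diagonal of D
eigvals, T = np.linalg.eig(A)

def solve(t):
    # x(t) = T e^(Dt) T^(-1) x(0), with e^(Dt) built elementwise on the diagonal
    return T @ np.diag(np.exp(eigvals * t)) @ np.linalg.inv(T) @ x0

# Cross-check against the full matrix exponential e^(At) x(0)
t = 1.5
assert np.allclose(solve(t).real, expm(A * t) @ x0)
```

The same check works for any diagonalizable `A`; `scipy.linalg.expm` serves only as an independent reference here.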
Playlist: • Engineering Math: Diff...
Course Website: faculty.washington.edu/sbrunto...
@eigensteve on Twitter
eigensteve.com
databookuw.com
This video was produced at the University of Washington
%%% CHAPTERS %%%
0:00 Overview and Recap of Eigenvalues and Eigenvectors
2:58 Eigenvalues in Matlab
4:40 Eigenvalues in Python
5:55 Setting up the Problem
15:25 The Full Solution
17:00 Intuitive Interpretation
Comments: 50
You know that it is going to be a great lecture series when Eigensteve is teaching you about eigenvalues and eigenvectors
This is truly a piece of art. Years of frustration healed in three videos. A hero of education.
I'm going to speak for everyone here: we all really appreciate the effort you put into these videos, especially considering they are probably at best tertiary to your teaching and research. Thank you, Steve.
@macmos1
A year ago
definitely
So grateful for KZread and EigenSteve! I barely earned a physics degree back in the '90s, but never really internalized math. Thanks to Dr. Kutz's lecture on the SVD, and your series on the same, I have been hooked on math videos for years now. In particular, thank you for not skipping steps and for spelling things out for the non-geniuses among us. After all, if we didn't need things spelled out, we wouldn't need someone to show us in the first place. After many years of struggle, I am finally starting to understand the language of math. Thanks to KZread, you have left a legacy for thousands to benefit from. I know it takes patience to keep from skipping steps… I can see you struggle… but from your KZread students' point of view it's well worth it!
When I learned this the first time my advanced control systems professor absolutely butchered the explanation. He robbed me of discovering such an amazing discovery about eigenvalues and eigenvectors. I never understood how or why this works. Thank you for explaining so clearly, Steve, this series is amazing!
What I like about these videos is that they are much better and concise version of the material I learned in college. My notes weren't very good so I'm really glad there is a quick and easily accessible reference to this kind of material.
It was just under 50 years ago that I covered this. On the Monday, one lecturer boringly introduced us to eigenvalues; on the Thursday, a control engineer took us from the "equations of motion" of a plane through matrix representation and on to eigenvalues. I've never forgotten the second approach. Your videos are great.
This lecture is soooo good. I never really understood it when I was self-studying this from an ODE book recently. Now I have a clear and intuitive understanding. Thanks!
@whootoo1117
A year ago
It is time to get stirred though 😂😂
My top list of KZread math teachers: 1) Linear Algebra - Pavel Grinfeld, 2) Engineering Math - Steve Brunton, 3) Diff Geo - Keenan Crane
Absolutely amazing. Thanks for the video!
Very understandable explanation, amazing how you can simplify things by taking a detour into eigenspaces.
I feel old; those kinds of lectures were years ago for me. Great video!
One of the best professors in the world!!! thanks!
The last explanation was AMAZING!
Beautiful. Really wish the best for you!!! Thank you for sharing such wonderful knowledge
This is so beautiful. Now I can tell where eigenvalues and eigenvectors are actually used.
Thank you Dr. Brunton, amazing series! Just a side note: I believe it would have been nice if you had shown how the D power series at 14:59 goes to e^(Dt), by showing that each diagonal entry of the matrices in the power series forms its own scalar power series. This would prove why e^(Dt) equals a matrix with exponentials on the diagonal.
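The point in this comment can be checked numerically: for a diagonal D, the matrix power series I + Dt + (Dt)²/2! + … acts independently on each diagonal entry, each of which is a scalar exponential series, so e^(Dt) = diag(e^(d1 t), e^(d2 t), …). A quick sketch (the eigenvalues below are made up):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical diagonal matrix of eigenvalues
D = np.diag([2.0, -1.0, 0.5])
t = 0.7

# The matrix exponential of a diagonal matrix is elementwise exp on the diagonal
assert np.allclose(expm(D * t), np.diag(np.exp(np.diag(D) * t)))

# Partial sums of the power series I + Dt + (Dt)^2/2! + ... converge to e^(Dt)
S = np.zeros_like(D)
term = np.eye(3)
for k in range(1, 25):
    S += term                 # add (Dt)^(k-1) / (k-1)!
    term = term @ (D * t) / k # next term of the series
assert np.allclose(S, expm(D * t))
```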
THIS VIDEO IS REALLY REALLY COOL!
Really cool! I had never learned this in my life; my university taught it to me in a very strange way, and the way you teach is much simpler and more intuitive than just doing automatic calculations with ready-made formulas. Incredible video.
Amazing!
Steve, thank you for your work; I'm eternally thankful for this video series. The most difficult part of this topic for me is the notorious Jordan form for non-diagonalizable matrices. It would be most kind of you to make a video explaining that topic.
@user-wz7ch8vf2p
A year ago
I'm a donkey. There is already a video that elaborates on this🙃
Suggestion: You could include the lecture number in the video title.
@TNTsundar
A year ago
Your comment is 11 days old when the video was uploaded only 4 hours back? 😮
@AbrarShaikh2741
A year ago
@@TNTsundar That is because the entire video playlist is available on the channel, but the individual publish dates are in future time (t). To know when the next video will come out, you will have to solve the y(t) [youtube] diff eqn, or you can change the coordinate system by going to the playlist vector and watch all the videos from the future.
@AbrarShaikh2741
A year ago
Huge thanks, sir @Steve Brunton. Future generations of engineers will have real motivation to study eigenvalues and other fun stuff if they don't waste their time on PubG or TikTok. I wish these lectures had existed during my college days. All I heard in class for 4 semesters was the Cauchy-Riemann theorem and eigenvalues and eigenvectors, without actually understanding the crux of the matter.
thanks a lot
It was one of a kind 🎉
I know this has been mirrored since the beginning, but it just occurred to me to look up your university page, and confirm that your hair is in fact flipped here. :)
Very good
Thanks for this excellent lecture! I wonder: what is the physical meaning of a system of ODEs whose matrix is not diagonalizable? After all, not all matrices can be transformed into a diagonal matrix via eigenvalues and eigenvectors.
When is the book coming out ?
Good lecture, but what about using this solution when we have boundary conditions and not initial conditions?
It’s raining Eigen vectors! 😂
How different is it for discrete systems? And how does the limit dt -> 0 give the same results as a continuous system? Also, I have noticed the general solution for differential equations is of the form y = c1*v1*exp(lambda1*t) + c2*v2*exp(lambda2*t), while the P matrix is represented by {v1, v2}; where is inv(P)?
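On the last question in this comment: the coefficients c1, c2 in the sum form are exactly the entries of P^(-1) x(0), which is where the inverse appears. A small sketch with a made-up matrix:

```python
import numpy as np

# Hypothetical example system with distinct eigenvalues 2 and -1
A = np.array([[2.0, 0.0],
              [1.0, -1.0]])
x0 = np.array([1.0, 1.0])

lam, P = np.linalg.eig(A)       # columns of P are the eigenvectors v1, v2
c = np.linalg.solve(P, x0)      # c = P^(-1) x(0): coefficients of the sum form

t = 0.5
# Sum form: x(t) = c1*v1*exp(lam1*t) + c2*v2*exp(lam2*t)
x_sum = sum(c[i] * P[:, i] * np.exp(lam[i] * t) for i in range(2))
# Matrix form: x(t) = P e^(Dt) P^(-1) x(0)
x_mat = P @ np.diag(np.exp(lam * t)) @ np.linalg.inv(P) @ x0
assert np.allclose(x_sum, x_mat)
```

So the two notations agree: writing the initial condition in the eigenvector basis is precisely the multiplication by inv(P).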
Steve, I really like your content, but I think the text for the equations and the font size of the code are too small, even when watching on a big monitor...
Hmm, but what if the coefficients are not just simple constants but functions of `t` instead? Then we won't have simple numbers in our matrix `A`, but functions of `t` :q Which I suppose means that the directions of the eigenvectors will be moving (rotating) with time, perhaps even along with their centres (fixed points), am I right? How can we deal with such equations?
Fascinating stuff! 😂
Why is only one T and T inverse pair canceled when matrix A is squared?
@APaleDot
A year ago
Matrix multiplication is not commutative. You cannot rearrange the terms in a product as you please. So only the terms which touch each other directly will cancel.
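This cancellation can be illustrated numerically: A² = (T D T⁻¹)(T D T⁻¹) = T D² T⁻¹, because only the adjacent T⁻¹T pair collapses to the identity; the outer factors cannot be reordered. The example matrix below is hypothetical:

```python
import numpy as np

# Hypothetical matrix with distinct eigenvalues (1 and 3)
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
eigvals, T = np.linalg.eig(A)
D = np.diag(eigvals)
Tinv = np.linalg.inv(T)

# The inner T^(-1) T cancels, leaving T D^2 T^(-1)
assert np.allclose(A @ A, T @ D @ D @ Tinv)
# But matrix products cannot be freely reordered: this rearrangement differs
assert not np.allclose(A @ A, T @ T @ D @ D @ Tinv @ Tinv)
```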
Note for thinking: I think we don't have to expand e^(At). We can just directly transform z(t) = e^(Dt) z(0) into x(t) = T e^(Dt) T^(-1) x(0), since x = Tz and thus T^(-1) x = z. Note 2: Oh, and the professor forgot to show us why x(t) = e^(At) x(0) = T e^(Dt) T^(-1) x(0) is a solution of x' = Ax. Just calculate x' and Ax, using the fact that A = T D T^(-1), and we see it's the solution! Very trivial, but for logical completeness :)
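The check this commenter suggests can be carried out numerically: differentiate x(t) = T e^(Dt) T^(-1) x(0) by a finite difference and compare with A x(t). The matrix below is a made-up oscillator, not one from the video:

```python
import numpy as np

# Hypothetical oscillator x' = A x with eigenvalues ±2i
A = np.array([[0.0, 1.0],
              [-4.0, 0.0]])
x0 = np.array([1.0, 0.0])
eigvals, T = np.linalg.eig(A)
Tinv = np.linalg.inv(T)

def x(t):
    # x(t) = T e^(Dt) T^(-1) x(0); the imaginary parts cancel for real A, x0
    return (T @ np.diag(np.exp(eigvals * t)) @ Tinv @ x0).real

# Central finite difference approximates x'(t); it should match A x(t)
t, h = 0.8, 1e-6
dxdt = (x(t + h) - x(t - h)) / (2 * h)
assert np.allclose(dxdt, A @ x(t), atol=1e-5)
```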
@huangzhao-shen4514
A year ago
This does confuse me. I think x(t) = e^(At) works only if A is diagonal.
@huangzhao-shen4514
A year ago
kzread.info/dash/bejne/gWxpsbl7epXHqdI.html This video answered my question.
@APaleDot
A year ago
You do need to expand for the proof. How else would you show that e^(At) = Te^(Dt)T^-1?
@starriet
A year ago
@@APaleDot Thanks for the comment. Well I think it depends on how we think. If we want to express the solution of z'=Dz as the form of "matrix exponential", z=e^(Dt)z(0), then it includes the concept of the expansion of e^(Dt) which is simpler than expanding e^(At) since D is diagonal, and we don't have to directly show that e^(At) = Te^(Dt)T^-1, anyway. But A=TDT^-1 and everything is connected so... it's up to our viewpoint :)
No It Doesn't
At 11:40, does the multiplication between matrices follow transpose(A)*A instead of A*A? I wonder if A*A is valid.