Least Squares Approximations
Description: We can't always solve Ax=b, but we use orthogonal projections to find the vector x such that Ax is closest to b.
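The idea in the description can be sketched numerically. This is a minimal illustration (not from the video itself), assuming the standard normal-equations approach: the best x̂ minimizes ‖Ax − b‖ and satisfies AᵀA x̂ = Aᵀb, so A x̂ is the orthogonal projection of b onto the column space of A.

```python
import numpy as np

# Overdetermined system: 3 equations, 2 unknowns, so Ax = b has no exact solution.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Solve the normal equations  A^T A x_hat = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# A @ x_hat is the orthogonal projection of b onto col(A), so the
# residual b - A @ x_hat is orthogonal to every column of A.
residual = b - A @ x_hat
print(x_hat)            # best-fit coefficients
print(A.T @ residual)   # ~ [0, 0]: residual is perpendicular to col(A)
```

For larger or ill-conditioned problems, `np.linalg.lstsq(A, b, rcond=None)` solves the same minimization more stably than forming AᵀA explicitly.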
This video is part of a Linear Algebra course taught at the University of Cincinnati.
Comments: 11
Dude, you are honestly a godsend. This helped so much, thank you.
This helps so much with my understanding, nice explanation, thank you!
Thank you for explaining the underlying mechanics of the process! This was super helpful!
Hey Dr. Bazett, Thank you so much for this series. I'm a second-year mathematics student at Imperial College and this has really helped to summarise a lot of the material that I have learnt this year! Really appreciate how clear and well-explained this course was so again thank you!
You are an amazing teacher. Thank you for saving my time and getting straight to the point💖💖
Is orthogonality needed in the expansion in (1) below the LSA? If the set {a_i} is not orthogonal, then I don't think the right-hand side is the orthogonal projection of v onto W.
At 6:50, do the vectors a_i need to be orthogonal? Why or why not?
What does the hat on x mean?
Here's the follow-up for people who ended up here without following the playlist: kzread.info/dash/bejne/lHiJ1Y-aopScf6w.html