Least Squares Approximations

Description: We can't always solve Ax = b exactly, but we can use orthogonal projection to find the vector x̂ for which Ax̂ is as close as possible to b.
This video is part of a Linear Algebra course taught at the University of Cincinnati.
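The idea in the description can be sketched numerically. The following is a minimal illustration (not from the video): an overdetermined system with no exact solution, solved via the normal equations AᵀA x̂ = Aᵀb; the data points are an assumed example, and NumPy is assumed available.

```python
import numpy as np

# Overdetermined system Ax = b with no exact solution:
# fit a line y = c0 + c1*t through three non-collinear points (0,6), (1,0), (2,0).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Normal equations: A^T A x_hat = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# A @ x_hat is the orthogonal projection of b onto Col(A),
# so the residual b - A @ x_hat is orthogonal to every column of A.
residual = b - A @ x_hat
print(x_hat)           # [ 5. -3.]
print(A.T @ residual)  # [0. 0.] up to rounding
```

The same x̂ comes out of `np.linalg.lstsq(A, b, rcond=None)`, which is the numerically safer routine in practice since it avoids forming AᵀA explicitly.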
BECOME A MEMBER:
►Join: kzread.info/dron/9rTsvTxJnx1DNrDA3Rqa6A.htmljoin
MATH BOOKS & MERCH I LOVE:
► My Amazon Affiliate Shop: www.amazon.com/shop/treforbazett

Comments: 11

  • @elizabethautumn630
    5 years ago

    Dude, you are honestly a godsend. This helped so much, thank you.

  • @vp9387
    5 years ago

    This helps so much with my understanding, nice explanation, thank you!

  • @NguyenTran-sn8vt
    1 year ago

    Thank you for explaining the underlying mechanics of the process! This was super helpful!

  • @tarunmistry1153
    1 year ago

    Hey Dr. Bazett, Thank you so much for this series. I'm a second-year mathematics student at Imperial College and this has really helped to summarise a lot of the material that I have learnt this year! Really appreciate how clear and well-explained this course was so again thank you!

  • @codercodes7538
    2 years ago

    You are an amazing teacher. Thank you for saving my time and getting straight to the point 💖💖

  • @NguyenHoang-jw8yu
    3 years ago

    Is orthogonality needed in the expansion in (1) below the LSA? If the set {a_i} is not orthogonal, then I don't think the right-hand side is the orthogonal projection of v onto W.

  • @alaindevos4027
    4 years ago

    At 6:50: do the vectors a_i need to be orthogonal? Why or why not?
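
    The two comments above raise the same point, and it can be checked numerically. This is an assumed illustration (not from the video): the coefficient formula (v·aᵢ)/(aᵢ·aᵢ) aᵢ recovers the orthogonal projection onto W only when the aᵢ are orthogonal.

    ```python
    import numpy as np

    v = np.array([3.0, 4.0, 5.0])  # vector to project onto the xy-plane W

    # Orthogonal basis for W: the coefficient formula gives the true projection [3, 4, 0].
    a1, a2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
    proj = (v @ a1) / (a1 @ a1) * a1 + (v @ a2) / (a2 @ a2) * a2

    # Non-orthogonal basis for the SAME plane: the same formula now fails.
    b1, b2 = np.array([1.0, 0.0, 0.0]), np.array([1.0, 1.0, 0.0])
    wrong = (v @ b1) / (b1 @ b1) * b1 + (v @ b2) / (b2 @ b2) * b2
    # wrong = [3, 0, 0] + 3.5*[1, 1, 0] = [6.5, 3.5, 0], not [3, 4, 0]
    ```

    With a non-orthogonal basis one must instead solve the normal equations (or orthogonalize first, e.g. via Gram-Schmidt), which is exactly what the least squares setup does.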

  • @cegalo12
    3 years ago

    What does the hat on x mean?

  • @bodhi_db
    4 years ago

    Here's the follow up for the people who ended up here without following the playlist: kzread.info/dash/bejne/lHiJ1Y-aopScf6w.html