Normed Linear Spaces | Introduction, L1 and L2 Norms

In this video, we introduce norms and normed linear spaces (normed vector spaces). These feature heavily in data science and machine learning applications, so we use an example from data science to highlight how normed linear spaces are applied.
Chapters
0:00 - Introduction
0:35 - Normed linear space (definition)
2:43 - Example
6:13 - Formal definition of a normed linear space
8:13 - Comparison of normed linear spaces and metric spaces
11:18 - L1 and L2 norms
The product links below are Amazon affiliate links. If you buy certain products on Amazon soon after clicking them, I may receive a commission. The price is the same for you, but it does help to support the channel :-)
The Approximation Theory series is based on the book "Approximation Theory and Methods" by M.J.D. Powell:
amzn.to/4fd13aM
Errata and Clarifications:
Strictly speaking, at 5:41, Yhat should equal a[1,2,3] + b[1,1,1], where a and b are arbitrary real constants.
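
Written out in full (with a and b ranging over the real numbers, as clarified in the comments below), the corrected equation describes a plane of candidate approximations:

```latex
\hat{Y} \;=\; a\begin{pmatrix}1\\2\\3\end{pmatrix} + b\begin{pmatrix}1\\1\\1\end{pmatrix},
\qquad a,\, b \in \mathbb{R},
```

so the set of all such Yhat is a two-dimensional subspace (a plane) sitting inside R^3.
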
This video was made using:
Animation - Apple Keynote
Editing - DaVinci Resolve
Mic - Shure SM58 amzn.to/3gNC801
Supporting the Channel:
If you would like to support me in making free mathematics tutorials then you can make a small donation over at
www.buymeacoffee.com/DrWillWood
Thank you so much, I hope you find the content useful.

Comments: 43

  • @maniam5460 · 3 years ago

    These videos are very helpful and they deserve more recognition from the YouTube algorithm

  • @DrWillWood · 3 years ago

    It honestly makes me so happy to hear that people find these videos useful! Thank you!

  • @stevenschilizzi4104 · 2 years ago

    These presentations are wonderfully clear and pedagogical. I will get my students to use them to get a better understanding of regressions than just blindly pushing buttons, as they are too often taught. I must say I also love that (northern?) British accent! Thanks so much for putting it all together.

  • @hugeturnip3520 · 1 year ago

    You deserve wayyyyyy more attention; these videos are insanely good

  • @navjotsingh2251 · 1 year ago

    I would honestly love to see you do more on approximation theory. You are the best in this field. Your videos are awesome.

  • @ALEX-us8fx · 3 years ago

    Very good job! The norm ||y' - y||, where y' is your channel and y is the vector of YouTube numerical analysis videos, should be a minimum! :)

  • @DrWillWood · 3 years ago

    Thanks a lot! :-D

  • @patmichel4724 · 4 months ago

    I discovered this channel today and it's an amazing one! I wonder why I never saw it before!

  • @robin2080 · 1 year ago

    I took this class with Bamberg in 2016; first time I'm actually understanding this lol

  • @guilhemescudero9114 · 2 years ago

    You did an amazing job!

  • @hey.guitarbjorn · 11 months ago

    Great explanation, thank you!

  • @dariosilva85 · 10 months ago

    Wow, you are an awesome teacher. Please make more videos. Thanks

  • @meccamiles7816 · 2 years ago

    I love your content. Thank you for the wonderful visuals.

  • @DrWillWood · 2 years ago

    Thanks a lot!

  • @robmarks6800 · 3 years ago

    Would be nice to see some Hilbert spaces, Fourier transforms

  • @olympunk4212 · 2 years ago

    Honestly, I tried to understand from various places, but you have done an amazing job explaining this. I hope you upload more content

  • @DrWillWood · 2 years ago

    Thanks a lot! Glad it was helpful. Of course, plenty more vids to come :-)

  • @olympunk4212 · 2 years ago

    I had a doubt: around 5:40, are we saying that the set consisting of (1,2,3) and (1,1,1) forms a basis for that 2-D vector space? If yes, then how?

  • @DrWillWood · 2 years ago

    Yeah, although the vectors (1,2,3) and (1,1,1) are 3-D vectors, we only have two of them. So taking any linear combination a(1,2,3) + b(1,1,1) for arbitrary constants a and b will form a 2-D plane. The plane will exist inside a 3-D vector space (as shown graphically in green at 5:40), but the plane itself is 2-D. Hope that helps!
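
    A quick numerical check of this, as a sketch assuming NumPy is available: stacking (1,2,3) and (1,1,1) as the columns of a matrix and asking for its rank confirms that their span is 2-D.

```python
import numpy as np

# Columns are the two spanning vectors from the video: (1,2,3) and (1,1,1).
A = np.array([[1, 1],
              [2, 1],
              [3, 1]])

# Rank 2 means the set {a*(1,2,3) + b*(1,1,1) : a, b real} is a 2-D plane in R^3.
print(np.linalg.matrix_rank(A))  # -> 2
```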

  • @olympunk4212 · 2 years ago

    @@DrWillWood Yup, I got it. Thanks a lot. Also, I'm not really sure how the best-fit line works, but I think you are taking any two points and forming the line equation, right? So shouldn't the best-fit line be y = x? Like, why are we getting y hat = 3.1 for x = 3, and not 3?

  • @DrWillWood · 2 years ago

    I guess I didn't specify what the line of best fit is in this case; I only tried to look at the problem of finding a line of best fit through the eyes of an NLS. In this case, if you have N data points, it's equivalent to finding the distance from the data Y to the set of all the approximations you could make of it (the approximations are constrained to a subset of this space, e.g. in this case constrained to a straight line). I don't think I was very clear about the motivation looking back, so let me have another go! Let's say you have (x,y) data (1,0), (2,3), (3,2). A straight line could never go through these points, so Yhat could never equal Y (Y in this case is [0,3,2], the vector of y values). This is equivalent to the picture of Y being outside the Yhat plane, since it's outside the set of data which can be perfectly fit by a straight line! I should say too that we don't need to worry about the x values, because they're the same in the Y and Yhat cases. If we had data points (1,2), (2,4), (3,6), then these could be fit by the straight line y = 2x, or in the form of the vector equation at 5:30, Yhat = 2[1,2,3], and Yhat = Y. :-)
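
    As a small sketch of that calculation (assuming NumPy; the variable names are just for illustration), least squares picks the Yhat in the plane a[1,2,3] + b[1,1,1] that minimises the L2 norm ||Y - Yhat|| for the data (1,0), (2,3), (3,2):

```python
import numpy as np

x = np.array([1, 2, 3])
Y = np.array([0, 3, 2])          # the data vector Y from the example above

# Design matrix with columns (1,2,3) and (1,1,1), so Yhat = a*x + b*(1,1,1).
A = np.column_stack([x, np.ones_like(x)])

# Least squares chooses (a, b) to minimise the L2 norm ||Y - A @ (a, b)||.
(a, b), *_ = np.linalg.lstsq(A, Y, rcond=None)
Yhat = A @ np.array([a, b])

print(a, b)                      # slope and intercept of the best-fit line
print(Yhat)                      # the closest point to Y within the plane
print(np.linalg.norm(Y - Yhat))  # the minimised L2 distance from Y to the plane
```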

  • @vaishnav4035 · 3 years ago

    Thank you sir ❤️😊

  • @syedfaizan5841 · 2 years ago

    Well done

  • @anthonybernstein1626 · 10 months ago

    Ah, so that's the L2 norm! Thank you!

  • @VolumetricTerrain-hz7ci · 2 months ago

    There are little-known ways to visualize subspaces, or vector spaces. You can stretch the width of the x axis, for example, in the right line of a 3D stereo image, and also get depth, as shown below. L R |____| |______| TIP: To get the 3D depth, close one eye and focus on either the left or the right line, and then open it. This is because the z axis uses x to get depth. Which means that you can get double depth in the image... 4D depth??? :O P.S. You're a good teacher!

  • @thecarlostheory · 2 years ago

    Hello. First of all, I want to say that your first 2 videos, now that I've seen them, are amazing. I'm looking forward to seeing the rest. One particular thing I liked a lot is the animations. How do you do them?

  • @DrWillWood · 2 years ago

    Thanks a lot! All the animations are made in Apple Keynote, which has lots of functionality for manipulating and animating shapes like arrows, curves, squares, etc.

  • @thecarlostheory · 2 years ago

    @@DrWillWood wow. Amazing!

  • @user-wu8rq7jn8p · 5 months ago

    Who are u man? U gooood ❤

  • @PS-dw5qo · 3 years ago

    How does the equation at 5:40 span a plane? Is this not a line for a given a and b? Thanks for the video.

  • @DrWillWood · 3 years ago

    Excellent spot, you're right! Thanks a lot for pointing that out, I'll pin a correction to the top :-)

  • @PS-dw5qo · 3 years ago

    Maybe you meant something else with the equation, since it makes sense that y hat is found in a plane. I was unsure myself : )

  • @PS-dw5qo · 3 years ago

    That y hat is rotated in the direction of the vector (1,1,1) also makes sense, since this is, presumably, the first column vector in the independent variable X and its column space spans the plane in which y hat is found. I’d be grateful for a clarification about the statement regarding the vector equation spanning a plane at 5:22.

  • @PS-dw5qo · 3 years ago

    Maybe it should be a(1,2,3)+b(1,1,1) where a and b are any real numbers?

  • @PS-dw5qo · 3 years ago

    One more small detail, but at 7:37 it says V × V = ||x - y||. Isn't the left-hand side a set, whereas the right-hand side is a real number?
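
    Presumably the slide intends the metric induced by the norm, which written out in full is

```latex
d : V \times V \to \mathbb{R}, \qquad d(x, y) = \lVert x - y \rVert,
```

    so V × V is the domain of the distance function d, and ||x - y|| is the real number it returns for a particular pair (x, y).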

  • @abhinavkumarverma1017 · 2 years ago

    Hello. Thank you for all the effort you are putting in. Can you provide an example of a metric space that is not a normed linear space? This would clarify the difference. Thanks in advance.

  • @Evan490BC · 2 years ago

    This is easy to see: a metric (distance) does not necessarily need to be defined by a norm.

  • @-minushyphen1two379 · 11 months ago

    This example was the one shown in the video, but I'll repeat it here: consider the unit disc in the plane. It is a metric space, since every two points in it have a well-defined distance from each other, but it is not a normed linear space, since you can scale a vector and it will go outside the disc.
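
    A small sketch of that example, assuming NumPy and the usual Euclidean distance restricted to the disc:

```python
import numpy as np

def in_unit_disc(v):
    """True if a point of the plane lies in the closed unit disc."""
    return np.linalg.norm(v) <= 1.0

def d(u, v):
    """Euclidean distance; restricting it to points of the disc still gives a metric."""
    return np.linalg.norm(np.asarray(u) - np.asarray(v))

u = np.array([0.6, 0.0])
v = np.array([0.0, 0.8])

print(d(u, v))               # a perfectly good distance between two points of the disc
print(in_unit_disc(2 * u))   # False: scaling leaves the disc, so it is not a vector space
```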

  • @austinbristow5716 · 2 years ago

    Why is it that the set of all possible Y hat forms a 2-D vector space rather than a 3-D vector space?

  • @austinbristow5716 · 2 years ago

    Is it because of the constraint of Y hat being composed of the linear regression of the set of Y? Does this lower the dimension from 3-D to 2-D? If so, why? Do constraints like these, in general, reduce the dimension of a vector space?

  • @anthonybernstein1626 · 10 months ago

    @@austinbristow5716 A line is fully determined by just 2 values: its slope and y-intercept, so it doesn't have enough "degrees of freedom" to form a 3-D vector space.