Generalizing the right angle.

🌟Support the channel🌟
Patreon: / michaelpennmath
Merch: teespring.com/stores/michael-...
My amazon shop: www.amazon.com/shop/michaelpenn
🟢 Discord: / discord
🌟my other channels🌟
Course videos: / @mathmajor
non-math podcast: / @thepennpavpodcast7878
🌟My Links🌟
Personal Website: www.michael-penn.net
Instagram: / melp2718
Randolph College Math: www.randolphcollege.edu/mathem...
Research Gate profile: www.researchgate.net/profile/...
Google Scholar profile: scholar.google.com/citations?...
🌟How I make Thumbnails🌟
Canva: partner.canva.com/c/3036853/6...
Color Palette: coolors.co/?ref=61d217df7d705...
🌟Suggest a problem🌟
forms.gle/ea7Pw7HcKePGB4my5

Comments: 124

  • @mrnanisissa
    @mrnanisissa 1 year ago

    The example at 12:30 is not really an inner product. It can happen that ⟨f,f⟩ = 0 but f is not the zero function. For example, you can take the function that is 0 everywhere on [a,b) and 1 at b. This function is integrable on [a,b] and satisfies ⟨f,f⟩ = 0, but it's not the zero function. If you want ⟨·,·⟩ to actually be an inner product, you need to take the quotient of the space V with respect to the equivalence relation "equal almost everywhere". However, the idea of generalizing the concept of orthogonality can go very far and leads to wonderful concepts like Hilbert spaces. These abstract concepts are the bricks of Fourier series, with which many concrete problems can be solved. When I studied these topics I was impressed by all of this.
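
The failure of positive-definiteness described above can even be seen numerically: a quadrature routine never samples the single point where f is nonzero, so the computed ⟨f,f⟩ comes out to 0 (a sketch; the interval [0, 1] stands in for [a, b]):

```python
from scipy.integrate import quad

# f is 0 on [0, 1) and 1 only at the endpoint x = 1: not the zero
# function, yet <f, f> = integral of f(x)^2 over [0, 1] is 0, so the
# raw integral pairing is not positive-definite on this space.
def f(x):
    return 1.0 if x == 1.0 else 0.0

norm_sq, _ = quad(lambda x: f(x) * f(x), 0, 1)
print(norm_sq)  # 0.0
```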

  • @BrollyyLSSJ
    @BrollyyLSSJ 1 year ago

    It's also sufficient to restrict to the subspace of continuous functions on [a, b], but you do lose completeness this way.

  • @mrnanisissa
    @mrnanisissa 1 year ago

    @@BrollyyLSSJ Yeah, that's true

  • @Alex_Deam
    @Alex_Deam 1 year ago

    @@BrollyyLSSJ What's meant by completeness?

  • @mrnanisissa
    @mrnanisissa 1 year ago

    @@Alex_Deam The fact that every Cauchy sequence is convergent

  • @BrollyyLSSJ
    @BrollyyLSSJ 1 year ago

    @@Alex_Deam What @Mr.Nani said. As an example, consider the sequence of continuous functions on [-1, 1]: f_k(x) = {0 if -1

  • @nullsharp4610
    @nullsharp4610 1 year ago

    This was wild! I've always been given the idea of orthogonality from the top and then down - it's really enlightening and useful to see it from the other direction. In particular, it was amazing to see that the triangle directly led to the dot product. I can't believe I've never seen that before. And of course, naturally in the case of diff eqs and the like, polynomials and trig functions for Fourier series. Love it. I ran into Hermite polynomials while looking into interpolation with derivatives - incredible!

  • @pyropulseIXXI
    @pyropulseIXXI 1 year ago

    The triangle doesn't lead directly to the dot product. The dot product is a definition, and what you just saw can motivate someone to define the dot product as such. But that doesn't mean the triangle led directly to the dot product.

  • @RexxSchneider
    @RexxSchneider 1 year ago

    Spoiler alert. For those who want to check their homework. Here's my working to find the family of matrices B that are orthogonal to A = ( 1 3 || 0 2) (I'm using || to indicate a new line inside the matrix): B is a 2x2 matrix, so we can set B = ( a b || c d ). Then we calculate transpose of A multiplied by B: (1 0 || 3 2) ( a b || c d ) = ( a b || 3a+2c 3b+2d ) and then its trace is a + 3b+2d which we set equal to 0. That gives a single condition, for example a = -3b-2d. So B = ( -3b-2d b || c d ) where b, c, d are free values.
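
The worked solution above is easy to spot-check numerically; here is a small sketch (the particular values of b, c, d are arbitrary choices, not from the comment):

```python
import numpy as np

# A = (1 3 || 0 2); the claim is that B = (-3b-2d, b || c, d)
# satisfies <A, B> = tr(A^T B) = 0 for any free values b, c, d.
A = np.array([[1.0, 3.0], [0.0, 2.0]])
b, c, d = 2.0, -5.0, 7.0  # arbitrary free values
B = np.array([[-3*b - 2*d, b], [c, d]])
print(np.trace(A.T @ B))  # 0.0
```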

  • @kaay8983
    @kaay8983 1 year ago

    With a little more work you can also find the orthogonal decomposition of this 3-dimensional space, for example: (0,2;2/sqrt(3),-3), (2,0;-2*sqrt(3),-1) and (3,-1;sqrt(3),0)

  • @maxmustermann3938
    @maxmustermann3938 9 months ago

    c d eez nuts

  • @alienmoonstalker
    @alienmoonstalker 1 year ago

    This is nice. I like how you tied it into Legendre poly's. I did some work projecting droplet perimeters onto the Legendre poly's and fit the coefficients to a general linear model to predict new shapes.

  • @kkanden
    @kkanden 1 year ago

    great video! i love your algebra videos, they're really interesting!

  • @abrahammekonnen
    @abrahammekonnen 1 year ago

    Wow that was an incredibly dense video. Thank you. I'll definitely be rewatching it.

  • @manucitomx
    @manucitomx 1 year ago

    This was a very good video. I would have liked a few ideas of applications. I quite like the call outs. Thank you, professor.

  • @weinihao3632
    @weinihao3632 1 year ago

    That was a very nice presentation! Could you also introduce the complex inner product and elaborate on the use of the bilinear form as a broadened definition of the inner product/measure of orthogonality for instance when used in the context of an equipped/rigged Hilbert space?

  • @charlievane
    @charlievane 1 year ago

    Thanks

  • @dirk_math6794
    @dirk_math6794 1 year ago

    Maybe it's a good idea to connect the subject of this video to Lebesgue integration as it is a good stepping stone towards Fourier analysis etc.

  • @SurfinScientist
    @SurfinScientist 1 year ago

    Orthogonality plays an important role in engineering. For example, the correlation between two time series can be defined as an inner product, and if this is 0, then the time series are defined to be orthogonal. Two noise signals are an example of orthogonal signals. Another example is Orthogonal Frequency-Division Multiplexing (OFDM), which is used in your smart phone.

  • @sharpnova2
    @sharpnova2 1 year ago

    rofl, so cute when engineers try to wax mathematical. it's like a baby's first words.

  • @1080lights
    @1080lights 1 year ago

    @@sharpnova2 thanks, Mr. BS Mathematics. Can I get fries with my order?

  • @agustinmiranda3989
    @agustinmiranda3989 1 year ago

    @@sharpnova2 This is a very toxic comment, SurfinScientist is just sharing some knowledge from their field.

  • @mathadventuress
    @mathadventuress 1 year ago

    @@sharpnova2 they're applying math. The fuck is your problem? Get back to fast food

  • @charleyhoward4594
    @charleyhoward4594 1 year ago

    "then the time series are defined to be orthogonal" -- what does it mean that "time series are orthogonal"?

  • @psychSage
    @psychSage 1 year ago

    26:02 b, c, d in R, matrix with such elements: (-3b -2d, b, c, d)

  • @benjaminbrat3922
    @benjaminbrat3922 1 year ago

    [Comment for the engagement algorithm] love it! I never realized this simple transition from Pythagoras to inner product of vectors. For the homework, I thought you would be asking about the Pauli spin matrix, or will that be the exam ? ;)

  • @gregsarnecki7581
    @gregsarnecki7581 1 year ago

    Good use of the board tap at ~9:30!

  • @ahmedamin1557
    @ahmedamin1557 1 year ago

    Finally A Nice Explanation 😍😍😍

  • @edwardlulofs444
    @edwardlulofs444 1 year ago

    Fun, thanks

  • @audreychambers3155
    @audreychambers3155 1 year ago

    Oh, that's why we care about the inner product. It always felt like it came out of nowhere.

  • @garrethutchington1663
    @garrethutchington1663 1 year ago

    I read Gram-Schmidt as Granny Smith.

  • @seslocrit9365
    @seslocrit9365 1 year ago

    Could you do more on deriving the Chebyshev/Hermite polynomials using Gram-Schmidt?

  • @henrymarkson3758
    @henrymarkson3758 1 year ago

    That was an action packed video. Linear Algebra in a minute

  • @Noam_.Menashe
    @Noam_.Menashe 1 year ago

    Could you in theory find a weight such that the orthogonal polynomials will be some unrelated polynomials, say Bernoulli or Stirling? I do understand that it's probably hard if not impossible to calculate it, but is the existence of one sure?

  • @abrahammekonnen
    @abrahammekonnen 1 year ago

    15:25 If you didn't want to start with the assumption that the orthogonal function had to be linear, you'd have to use a series representation of a polynomial function where n ranges over the integers. Then you'd have to work through a bunch of special cases.

  • @SuperYoonHo
    @SuperYoonHo 1 year ago

    awesome

  • @Happy_Abe
    @Happy_Abe 1 year ago

    Why did the answer need to be linear? Couldn’t it have an arbitrary dimension?

  • @goodplacetostop2973
    @goodplacetostop2973 1 year ago

    26:09

  • @angeldude101
    @angeldude101 1 year ago

    In a matrix of the form (a b || c d), c can be any constant, while the other 3 must obey the property a + 3b + 2d = 0. This equation says little more than the definition of the inner product itself, but it is nicely represented as an equation for a plane in 3D space.

    One thing I'm a little sad wasn't mentioned is how by asserting the Pythagorean theorem, you automatically disqualify hyperbolic and spherical geometry. In the latter, it's completely legal to make an equilateral triangle using 3 right angles. In hyperbolic spacetime, if you try to find the "inner product" of a light-like velocity or momentum vector with itself, you get a value of 0 despite the momentum itself not being the 0 vector. In addition, while it's likely impossible in physical spacetime, it's also mathematically possible to describe what would likely be tachyons by using a momentum vector with a _negative_ "inner product" with itself.

  • @angeldude101
    @angeldude101 1 year ago

    @@angelmendez-rivera351 So everything we do with inner products and orthogonality can't actually apply to physical spacetime? That seems like a serious deficiency. It doesn't seem right to try to generalize things and then still exclude very useful applications that are missing only 1 arbitrary property.

  • @violetsaathoff7111
    @violetsaathoff7111 1 year ago

    I feel like I'm back in quantum, except that I wish I'd seen this first!

  • @maxmusterman3371
    @maxmusterman3371 1 year ago

    What do you do for physical fitness? Are you only climbing, or also in the gym?

  • @abrahammekonnen
    @abrahammekonnen 1 year ago

    24:48 The question here is why spend time even calculating all the entries in the matrix. We should probably just calculate the diagonal entries and add them up.

  • @abrahammekonnen
    @abrahammekonnen 1 year ago

    Lol as soon as I say it you mention it. Great minds think alike ;)

  • @klikkolee
    @klikkolee 1 year ago

    Can a vector space have multiple distinct products defined for it which meet the definition of an inner product? And can two functions which both meet the definition of an inner product disagree on whether a pair of vectors are orthogonal?

  • @erikstanton3908
    @erikstanton3908 1 year ago

    This was interesting. How could we generalize orthogonality to non-Euclidean geometries?

  • @jamesmosher6912
    @jamesmosher6912 1 year ago

    Isn’t the distance formula derived from the Pythagorean theorem?

  • @cH3rtzb3rg
    @cH3rtzb3rg 1 year ago

    Once you define what a scalar product is, you can easily show that it defines a norm via |v| = sqrt(⟨v,v⟩), and then you can trivially prove Pythagoras' theorem: |a|² + |b|² = |a-b|² + 2⟨a,b⟩, i.e., |a|² + |b|² = |a-b|² iff ⟨a,b⟩ = 0.
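
Written out, the expansion behind that claim is (a standard derivation, not from the video):

```latex
\|a-b\|^2 = \langle a-b,\, a-b \rangle
          = \|a\|^2 - 2\langle a,b\rangle + \|b\|^2 ,
\qquad\text{so}\qquad
\|a\|^2 + \|b\|^2 = \|a-b\|^2 \iff \langle a,b\rangle = 0 .
```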

  • @minwithoutintroduction
    @minwithoutintroduction 1 year ago

    A wonderful reminder, as usual.

  • @CM63_France
    @CM63_France 1 year ago

    I agree with you @James Mosher, the Pythagorean theorem is not really a "good place to start" 😄

  • @oyamonad4179
    @oyamonad4179 1 year ago

    Nice

  • @blurrydeer3694
    @blurrydeer3694 1 year ago

    I kinda want to see how this works out in characteristic 2 (I mean, I'm too lazy to work it out for myself... though maybe that would be a good exercise)

  • @pyropulseIXXI
    @pyropulseIXXI 1 year ago

    In characteristic 2, 2 = 0, so 2(ac+bd) = 0 holds automatically; ac+bd does not have to equal 0 for the vectors to be orthogonal. This means that any values of a, b, c, and d would lead to a 90 degree angle in characteristic 2.
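
That degeneracy can be checked exhaustively over GF(2) (a small sketch; the condition 2(ac+bd) = 0 is the one derived from the Pythagorean setup in the video):

```python
# Over a field of characteristic 2 (here GF(2) = {0, 1}), the factor
# of 2 kills the cross term, so the Pythagorean condition
# 2*(a*c + b*d) = 0 holds for every pair of vectors (a, b), (c, d).
pairs = [(a, b, c, d) for a in (0, 1) for b in (0, 1)
         for c in (0, 1) for d in (0, 1)]
assert all((2 * (a*c + b*d)) % 2 == 0 for a, b, c, d in pairs)
print(f"all {len(pairs)} pairs satisfy the condition")
```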

  • @CTJ2619
    @CTJ2619 1 year ago

    how does this relate to tensors?

  • @vinuthomas7193
    @vinuthomas7193 1 year ago

    I want to make sure I'm not misunderstanding: is the idea that "orthogonal" for lines can be different from "perpendicular" (where, in the example, I would have expected b=-1/6)

  • @Eye-vp5de
    @Eye-vp5de 1 year ago

    It depends on the definition of inner product. In the video it was defined with an integral, which doesn't seem to have a nice geometrical meaning to me, but you can define the inner product as the usual inner product of vectors which lie on the lines to get orthogonality = perpendicularity

  • @MGSchmahl
    @MGSchmahl 10 months ago

    Why is the 1st degree Laguerre polynomial written as -(-x-1) instead of just x+1?

  • @Alex_Deam
    @Alex_Deam 1 year ago

    14:56 Think there's a mistake here, because if you integrate (4-6x)*(1+6x^2), you will also get 0 i.e. 1+6x^2 (and multiples thereof) is an example of non-linear polynomial that's orthogonal to 4-6x.
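
The counterexample claimed here checks out symbolically (a sketch, assuming the video's inner product ⟨f,g⟩ = ∫₀¹ f(x)g(x) dx):

```python
import sympy as sp

# Under <f, g> = integral of f*g over [0, 1], the quadratic 1 + 6x^2
# is orthogonal to 4 - 6x, so the orthogonal complement of 4 - 6x is
# not confined to linear polynomials.
x = sp.symbols('x')
ip = sp.integrate((4 - 6*x) * (1 + 6*x**2), (x, 0, 1))
print(ip)  # 0
```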

  • @vinvic1578
    @vinvic1578 1 year ago

    Why would that be a mistake? The task is to find a polynomial that's orthogonal to 4-6x. In the video he finds a first degree one, and you find a second degree one. But because R[X] is an infinite dimensional space, every polynomial will have an infinite number of linearly independent orthogonal polynomials. You can therefore find polynomials of any degree that are orthogonal to it :) Does that make sense or did I misunderstand your point?

  • @Alex_Deam
    @Alex_Deam 1 year ago

    @@vinvic1578 It's a mistake because Michael says that "we're going to assume it's linear because if you work through it with a polynomial of arbitrary degree, you'll see it has to be linear" which is what I did do and found it wasn't true. It doesn't have to be linear.

  • @vinvic1578
    @vinvic1578 1 year ago

    @@Alex_Deam Oh yes, sorry! That makes sense. Yeah, I'm a bit confused as to why he said that now! Thanks for clearing it up for me :)

  • @polyhistorphilomath
    @polyhistorphilomath 9 months ago

    @@vinvic1578 probably because the orthonormal bases enumerated are straightforwardly independent if the initial basis vectors are of strictly increasing degree. IIRC -1, ax+b, cx^2+hx+k showed up at least 3 or 4 times in the examples.

  • @bentationfunkiloglio
    @bentationfunkiloglio 1 year ago

    Interesting

  • @byronwatkins2565
    @byronwatkins2565 1 year ago

    Since matrices are not commutative, it is not obvious that ⟨A,B⟩ = ⟨B,A⟩ in this case.

  • @APaleDot
    @APaleDot 1 year ago

    In order for it to be an inner product, ⟨A,B⟩ must equal ⟨B,A⟩. If the given operation does not always satisfy that condition, it is not an inner product. Since he states that tr(A^T B) _is_ an inner product, it must satisfy that condition, although it is a good exercise to check for yourself.

  • @byronwatkins2565
    @byronwatkins2565 1 year ago

    @@APaleDot Frobenius said it before he did. Neither of these facts makes it obvious that the statement is true. A better demonstration would have addressed this unexpected result explicitly... however briefly.

  • @APaleDot
    @APaleDot 1 year ago

    @@byronwatkins2565 Like I said, it's good to check for yourself, but you can't expect a lecturer to prove every little statement they make, especially when he's just running down a list of examples. There's nothing wrong with simply stating that such an operation is an inner product, which necessarily includes the symmetry condition. There is such a thing as an irrelevant detail.

  • @bjornfeuerbacher5514
    @bjornfeuerbacher5514 1 year ago

    That ⟨A,B⟩ = ⟨B,A⟩ in that case is rather obvious - one only has to know that tr(A^T) = tr(A) and that (A^T B)^T = B^T A. However, what is not obvious (at least to me) is that ⟨v,v⟩ is always non-negative in this case and that ⟨v,v⟩ = 0 only for v = 0.
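
For the non-negativity part, note that tr(A^T A) is just the sum of the squares of the entries of A (the squared Frobenius norm), which a quick numerical sketch confirms:

```python
import numpy as np

# (A^T A)_{jj} = sum_i A_{ij}^2, so tr(A^T A) is the sum of all
# squared entries of A; it is therefore >= 0, and 0 only when A = 0.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
print(np.trace(A.T @ A), np.sum(A**2))  # the two values agree
```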

  • @vicesimum_phi8123
    @vicesimum_phi8123 1 year ago

    interesting

  • @playgroundgames3667
    @playgroundgames3667 1 year ago

    The Pythagorean theorem is only for right angled triangles.

  • @lucachiesura5191
    @lucachiesura5191 1 year ago

    Something about Fourier...

  • @BridgeBum
    @BridgeBum 1 year ago

    The flow chart mentions a process for generating families of functions, but what are those family definitions and how are they used? Recreational math fan here.

  • @bjornfeuerbacher5514
    @bjornfeuerbacher5514 1 year ago

    The families are defined exactly as given in the flowchart: start with the scalar product given there, do a Gram-Schmidt orthonormalization of the polynomials (starting with x^0, then x^1, then x^2 and so on). The polynomials you will get by this process are exactly these families of functions. (There are other ways to define them, too, but that's one way to define them.)
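
As a concrete sketch of that process (assuming the Legendre setup ⟨f,g⟩ = ∫₋₁¹ f g dx from the video's flowchart), Gram-Schmidt on 1, x, x² reproduces the first Legendre polynomials up to scaling:

```python
import sympy as sp

x = sp.symbols('x')

def ip(f, g):
    # the Legendre inner product: integrate f*g over [-1, 1]
    return sp.integrate(f * g, (x, -1, 1))

basis = []
for p in [sp.Integer(1), x, x**2]:
    for q in basis:
        p = p - ip(p, q) / ip(q, q) * q  # remove the part parallel to q
    basis.append(sp.expand(p))

print(basis)  # [1, x, x**2 - 1/3]
```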

  • @BridgeBum
    @BridgeBum 1 year ago

    @@bjornfeuerbacher5514 Cool, ok. But what separates these starting values or these families from any other starting values? That is, why are they interesting?

  • @bjornfeuerbacher5514
    @bjornfeuerbacher5514 1 year ago

    @@BridgeBum I don't know about the Chebyshev polynomials, but all the other ones are important in physics: the Legendre polynomials for the spherical harmonics, which are used to describe many spherically symmetric systems (among other things, the angular dependence of electromagnetic waves produced by antennas and the angular dependence of atomic orbitals); the Hermite polynomials for the harmonic oscillator in quantum mechanics; and the Laguerre polynomials for the radial dependence of atomic orbitals.

  • @BridgeBum
    @BridgeBum 1 year ago

    @@bjornfeuerbacher5514 Thank you, that's a helpful perspective.

  • @stephenhamer8192
    @stephenhamer8192 1 year ago

    Ah, but why is orthogonality so _important_ in Maths? Is it because it hugely simplifies the problem of representing vectors in different bases?

  • @APaleDot
    @APaleDot 1 year ago

    I've been thinking about this for a while and here's the best explanation I can come up with: Orthogonality is important because it represents the _complete_ independence of two things.

    We say that a set of vectors is linearly independent so long as each one adds something "new" to the set. As soon as you add a vector which can be represented as a linear combination of the other vectors, the set is no longer linearly independent. This new vector can be represented in terms of the other vectors, so it isn't adding any "new" dimension to the set and thus is not independent from the set in an important way. What is this "newness" I'm talking about? Well, any time you have two vectors, you can represent one of the vectors as a combination of components, one which is parallel and one which is orthogonal to the other vector. The parallel component is obviously just a scalar multiple of the other vector, so it's really only the orthogonal component which is "different" from the other vector. This is the "newness" that is added to the set; it is a new direction _orthogonal_ to the rest of the set.

    This complete independence is very useful when trying to analyze real world events. Consider rolling two dice. In order for the dice to be fair, the two rolls _must_ be independent from each other. The outcome of the rolls should be orthogonal in a very real sense. This is what makes calculating probabilities of such independent events so easy. Calculating probabilities of events which depend on each other is much more difficult because you first have to account for the effect of one event on the other (i.e., you have to remove the parallel component of the "vectors").

    In math, this independence means that if you have two subspaces which are orthogonal to each other, you can move a point parallel to one of them without affecting its component in the other one at all. This obviously makes calculations much simpler when you can consider only the actions happening in a particular subspace, because they are completely independent from the rest of the space.
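
The parallel/orthogonal split described above is one line of linear algebra; a minimal sketch:

```python
import numpy as np

# Decompose u into a component parallel to v and a remainder
# orthogonal to v (the "new direction" u adds to span{v}).
u = np.array([3.0, 4.0])
v = np.array([1.0, 0.0])
parallel = (u @ v) / (v @ v) * v
orthogonal = u - parallel
print(parallel, orthogonal)  # [3. 0.] [0. 4.]
```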

  • @hybmnzz2658
    @hybmnzz2658 1 year ago

    Spectral theorem

  • @stephenhamer8192
    @stephenhamer8192 1 year ago

    @@APaleDot Very good answer and nice 2-dice example. Only (the part of) an element orthogonal to the set adds new information - how true that is, not just in Maths but in life generally

  • @stephenhamer8192
    @stephenhamer8192 1 year ago

    @@hybmnzz2658 Study of. Adding it to my bucket list

  • @abrahammekonnen
    @abrahammekonnen 1 year ago

    By characteristic 2 I'll assume mod 2.

  • @super_pigeon
    @super_pigeon 1 year ago

    He meant this on the field of coefficients: en.wikipedia.org/wiki/Characteristic_(algebra)

  • @abrahammekonnen
    @abrahammekonnen 1 year ago

    @@super_pigeon oh I see. Thank you

  • @schweinmachtbree1013
    @schweinmachtbree1013 1 year ago

    @@abrahammekonnen the integers mod 2 are the simplest example of a field of characteristic 2 though

  • @playgroundgames3667
    @playgroundgames3667 1 year ago

    It would have been easier to understand if you drew a right angle triangle.

  • @stevenwilson5556
    @stevenwilson5556 1 year ago

    I got B = {a, b, // c , d}, a = -3t - 2s, b = t, c = w (free), d = s.

  • @M.athematech
    @M.athematech 1 year ago

    eRRRmeet not her-mite

  • @CM63_France
    @CM63_France 1 year ago

    Hi, I think the easiest way to define a right angle is to say it's half of a straight angle. 15:59: missing dx. 18:27: what is this Schmidt orthogonalization process? (I am not sure I remember it.) 18:56: [-1,1] instead of [-1,-1].

  • @sillymel
    @sillymel 1 year ago

    (18:27) I would recommend just searching the internet for “Gram-Schmidt orthogonalization process.” It’s a bit too convoluted to properly explain in a comment.

  • @sillymel
    @sillymel 1 year ago

    Since it’s not letting me edit my comment: That’s not to say that the Gram-Schmidt orthogonalization process is particularly difficult, just long.

  • @bjornfeuerbacher5514
    @bjornfeuerbacher5514 1 year ago

    The basic idea of the Gram-Schmidt orthonormalization procedure is that given a set of already orthonormal basis vectors, one takes a new vector and subtracts every part of it which is parallel to one of the given basis vectors. Hence the vector one arrives at will be orthogonal to all the given basis vectors. Finally, normalize it. Then take another new vector and do the process again, etc.

  • @CM63_France
    @CM63_France 1 year ago

    About the GSOP: ok, thank you for your answers, it comes back to me now. About the video: would it be possible to prove the Pythagorean theorem starting from this definition of the right angle: half of a straight angle?

  • @Macisordi
    @Macisordi 1 year ago

    Aren’t you using Pitagora to prove Pitagora in the vector example?

  • @scp3178
    @scp3178 1 year ago

    Pitagora = Pythagoras?

  • @wafikiri_
    @wafikiri_ 1 year ago

    It is not proving Pythagoras's theorem that is sought, but a definition of what is orthogonal, based on the above theorem in Euclidean space.

  • @Macisordi
    @Macisordi 1 year ago

    @@wafikiri_ how do you define distance in a plane without using Pythagoras? P.S. my Italian iPad autocorrects to Pitagora

  • @wafikiri_
    @wafikiri_ 1 year ago

    @@Macisordi I wouldn't. The Cartesian co-ordinate system, with orthogonal co-ordinates, practically requires Pythagoras to define a metric. To renounce Cartesian co-ordinates in a plane would mean using a tougher metric, whose distance formulae would surely be cumbersome.

  • @adamwho9801
    @adamwho9801 1 year ago

    I thought he was going to use the language of quantum mechanics.... he is REALLY close to doing applied mathematics here.

  • @schweinmachtbree1013
    @schweinmachtbree1013 1 year ago

    I disagree - inner product spaces, bilinear forms, function spaces, etc. are all things that are taught in courses on a pure math degree. Some of the examples were close to measure theory and Fourier series, but again these are pure math theories; it is when you apply the theories, e.g. to quantum mechanics, that it becomes applied mathematics

  • @pierreabbat6157
    @pierreabbat6157 1 year ago

    /ɛʁ.mit/, not /həɹmait/, and "Chebyshev" should be yoficated and end in stressed "shov". If the field has √-1 and the vector space has at least 2 dimensions, a line can be perpendicular to itself. This is called a null line.