Reproducing Kernels and Functionals (Theory of Machine Learning)

In this video we give the functional analysis definition of a Reproducing Kernel Hilbert Space, then investigate approximations within this space using moments as data. We draw a comparison with polynomial best approximations over L^2 and obtain comparable results with new basis functions built from kernels.
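As a rough illustration of the pipeline the video describes (a hypothetical Python/NumPy sketch, not the MATLAB code linked below; the Gaussian kernel, bandwidth, quadrature grid, and target function are all illustrative assumptions): build the Riesz representers h_i of the moment functionals L_i(f) = ∫_0^1 t^i f(t) dt, then solve for the best-approximation weights without ever evaluating the RKHS inner product directly.

```python
import numpy as np

mu = 0.2
K = lambda s, t: np.exp(-(s - t) ** 2 / mu)   # Gaussian RBF kernel (illustrative bandwidth)

f = lambda t: np.sin(2 * np.pi * t)           # target function (illustrative)
ts = np.linspace(0, 1, 401)
dt = ts[1] - ts[0]                            # crude Riemann-sum quadrature weight
n = 4                                         # number of moment measurements

# Riesz representers of the moment functionals L_i(g) = ∫_0^1 s^i g(s) ds:
# h_i(t) = L_i(K(., t)), i.e. the functional applied to one slot of the kernel.
H = np.array([[np.sum(ts**i * K(ts, t)) * dt for t in ts] for i in range(n)])

# Gram matrix G_ij = <h_i, h_j>_H = L_i(h_j): applying the functional to a
# representer sidesteps the RKHS inner product entirely.
G = np.array([[np.sum(ts**i * H[j]) * dt for j in range(n)] for i in range(n)])
b = np.array([np.sum(ts**i * f(ts)) * dt for i in range(n)])  # measured moments of f

c = np.linalg.solve(G, b)     # best-approximation weights
fhat = c @ H                  # approximation of f in span{h_0, ..., h_{n-1}}
```

By construction, the moments of the resulting approximation match the measured moments of f.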
//Watch Next
The Real Analysis Survival Guide • The Real Analysis Surv...
The Analyticity of the Laplace transform • Morera's Theorem in Pr...
Introduction to Control Theory • Introduction to Contro...
//Books
Steve Brunton and J. Nathan Kutz - Data Driven Science and Engineering amzn.to/4daHtem
Holger Wendland - Scattered Data Approximation amzn.to/4daHtem
Gregory Fasshauer - Meshfree Approximation Methods with MATLAB amzn.to/3U1KMeM
Gregory Fasshauer - Kernel Based Approximation Methods using MATLAB amzn.to/4d1CwEx
Ingo Steinwart and Andreas Christmann - Support Vector Machines amzn.to/4d5C7km
// Code
BitBucket
bitbucket.org/joelrosenfeld/r...
Try Through MATLAB Online for free (20 hours free use per month)
matlab.mathworks.com/open/git...
//Music Provided by Epidemic Sound
Use this referral link to get a 30 day free trial with Epidemic Sound for your YouTube channel:
www.epidemicsound.com/referra...
//Recording Equipment
Canon SL3: amzn.to/3nZ11KU
Canon T6i: amzn.to/3FUpkQh
Rode VideoMic: amzn.to/3lhldGa
Blue Yeti Microphone: amzn.to/3I1y88N
Yeti Nano Microphone: amzn.to/3I1mriA
SanDisc 256GB SD Card: amzn.to/3E3LOOr
Neewer 5600K USB LED Lights: amzn.to/3xvB9cN
Neewer 18 inch Ring Light: amzn.to/2ZvgCsc
Camera Power Adapter: amzn.to/3D3upUu
This content is partially supported by the National Science Foundation - Award ID 2027976 - and the Air Force Office of Scientific Research - Awards FA9550-20-1-0127 and FA9550-21-1-0134. I am responsible for all opinions and content, and what I say does not necessarily reflect the views of the sponsoring organizations.
DISCLAIMER: The links above in this description may be affiliate links. If you make a purchase with the links provided I may receive a small commission, but with no additional charge to you :) Thank you for supporting my channel so that I can continue to produce mathematics content for you!
0:00 Start
1:11 Reproducing Kernel Hilbert Spaces
5:01 Two Examples
12:01 Customizing Bases for Approximation
14:22 Comparing Best Approximations
21:03 Wrap up and Watch Next

Comments: 18

  • @positivobro8544 · 2 months ago

    Ayo the legend keeps on giving

  • @avigailhandel8897 · 2 months ago

    I love your videos! I'm the person who posted that I will be starting grad school in the fall at the age of 55. I registered for classes at Montclair State University. Combinatorics, numerical analysis, linear algebra. And I'll be a TA. I am looking forward to being a graduate student in mathematics!

  • @JoelRosenfeld · 2 months ago

    That’s awesome! It sounds like you have a fun schedule too. Congrats and let me know how it goes!

  • @ethandills4716 · 2 months ago

    0:44 "if we have the time... and space" lol

  • @idiosinkrazijske.rutine · 2 months ago

    The third book at 0:22 is "Meshfree Approximation Methods with MATLAB" by Gregory Fasshauer. A good resource for Radial Basis Functions and similar topics.

  • @JoelRosenfeld · 2 months ago

    Yeah, it really is. I find Fasshauer does a great job of explaining the topic. Wendland goes into more of the theory, if you want to go deeper. I met Fasshauer at a conference last year. Great guy.

  • @richardgroff3807 · 2 months ago

    I am missing something important at around 13:39 in the video. I understand the line h_i(t) = <h_i, K(·, t)>, which uses the reproducing kernel to evaluate h_i at time t. The inner product must be the inner product for H for this to work. The next line looks like the standard property of the inner product, i.e. the complex conjugate of the inner product with entries swapped. What confuses me is the next line, which seems to expand out the definition of the inner product, but rather than the inner product for H, it looks like the inner product for L^2, and I can't figure out why that is. Was there an adjoint lurking about somewhere (rather than a property of the inner product)? How do I see it?

  • @JoelRosenfeld · 2 months ago

    It’s not an inner product. Each h_i represents some functional through the inner product. That expansion is the functional that h_i represents being applied to the kernel function.
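In symbols, the expansion in question is h_i(t) = L_i(K(·, t)), where L_i is the functional that h_i represents. A tiny numerical sketch (hypothetical NumPy code; the Gaussian kernel and moment functionals L_i(g) = ∫_0^1 s^i g(s) ds are illustrative stand-ins for the video's setup):

```python
import numpy as np

# Gaussian kernel with an illustrative bandwidth; the moment functional
# L_i(g) = ∫_0^1 s^i g(s) ds is approximated by a Riemann sum.
K = lambda s, t: np.exp(-(s - t) ** 2 / 0.2)

s = np.linspace(0, 1, 1001)
ds = s[1] - s[0]

def representer(i, t):
    # h_i(t) = L_i(K(., t)): the functional applied to the kernel's first slot
    return np.sum(s**i * K(s, t)) * ds

# h_0 is the kernel integrated against the constant weight s^0 = 1
vals = np.array([representer(0, t) for t in (0.0, 0.5, 1.0)])
```

No inner product is ever evaluated here: the representer is produced entirely by applying the functional to the kernel.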

  • @richardgroff3807 · 2 months ago

    @@JoelRosenfeld Thanks for your response! It finally sunk in. I didn't understand why you were swapping the entries, but it was to make the entries of the inner product match what was used with the Riesz Representation Theorem a few lines above. In the next part of the video you do a numerical example where you generate a set of basis functions that are representers of the moment functionals, and then project the function f onto those basis functions. By my understanding, the inner product used in your normal equations is the inner product for H, associated with the RBF kernel (defined at 9:00)? Is there an intuitive relationship between the best approximation using the norm associated with that inner product compared to, say, the L^2 norm? (I haven't watched your best approximation video; perhaps that question is answered there?)

  • @JoelRosenfeld · 2 months ago

    The “best” approximation depends on the selection of the inner product and Hilbert space. In the best approximation video for L^2, we started with a basis, polynomials, then selected a space where they reside and computed the weights for the best approximation in that setting. If we change the Hilbert space, then we need to find new weights. However, a difficulty can arise: perhaps it's not so easy to actually compute the inner product for that basis in that particular Hilbert space. This approach avoids that, because we take the measurements and THEN select the basis. So we end up with these h's rather than polynomials. The advantage here is that we never actually have to compute an inner product; we just leverage the Riesz theorem to dodge around it. What I'm setting up here is the Representer Theorem, which we will get to down the line (maybe 5 videos from now?). There it turns out that the functions you obtain from the Riesz theorem are the best basis functions to choose for a regularized regression problem. This was a result of Wahba back in the (80s?)
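For contrast, the polynomial best approximation in L^2 mentioned in this reply does require computing inner products explicitly: with the monomial basis on [0, 1], the Gram matrix <t^i, t^j> = 1/(i+j+1) is the notoriously ill-conditioned Hilbert matrix. A hypothetical NumPy sketch (degree, target function, and quadrature are illustrative choices):

```python
import numpy as np

# Best L^2([0,1]) approximation of f by polynomials of degree < n via the
# normal equations: here the inner products must be computed explicitly.
f = lambda t: np.sin(2 * np.pi * t)
n = 5

# Gram matrix <t^i, t^j>_{L^2} = 1/(i+j+1): the ill-conditioned Hilbert matrix.
G = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])

# Right-hand side b_i = <f, t^i>, approximated by a Riemann sum; these are
# exactly the moment data again.
t = np.linspace(0, 1, 2001)
dt = t[1] - t[0]
b = np.array([np.sum(f(t) * t**i) * dt for i in range(n)])

c = np.linalg.solve(G, b)
poly = sum(c[i] * t**i for i in range(n))           # best-approximation polynomial
l2_err = np.sqrt(np.sum((poly - f(t)) ** 2) * dt)   # L^2 approximation error
```

The kernel-representer approach in the reply above sidesteps the explicit inner-product computations that this classical route depends on.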

  • @robn2067 · 2 months ago

    Very interesting video, but can you talk a little slower? Often it is not clear what words you are pronouncing, in particular for theorems.

  • @samueldeandrade8535 · 2 months ago

    Man, just change the playback settings of the video to watch it slower.

  • @JoelRosenfeld · 2 months ago

    Sorry if I talk too fast. I’ll work on it

  • @samueldeandrade8535 · 2 months ago

    @@JoelRosenfeld you don't. You talk just fine.

  • @HEHEHEIAMASUPAHSTARSAGA · 2 months ago

    @@JoelRosenfeld Putting real subtitles on your videos would solve the issue. It's pretty easy these days: just put a transcript in and YouTube will match up the times for you. It's probably even quicker to start with the automatic transcription and just fix the errors.

  • @JoelRosenfeld · 2 months ago

    @@HEHEHEIAMASUPAHSTARSAGA In the past, the transcripts produced by YouTube were pretty bad. Premiere has a new AI feature that is actually pretty good at catching math terminology. I’ll give it some thought. Just takes more time