Kernels!

Today Yannic Lightspeed Kilcher and I spoke with Alex Stenlake about kernel methods. What is a kernel? Do you remember those weird kernel things which everyone obsessed over before deep learning? What about the representer theorem and reproducing kernel Hilbert spaces? SVMs and kernel ridge regression? Remember them?! Hope you enjoy the conversation!
00:00:00 Tim Intro
00:01:35 Yannic's clever insight from this discussion
00:03:25 Street talk and Alex intro
00:05:06 How kernels are taught
00:09:20 Computational tractability
00:10:32 Maths
00:11:50 What is a kernel?
00:19:39 Kernel latent expansion
00:23:57 Overfitting
00:24:50 Hilbert spaces
00:30:20 Compare to DL
00:31:18 Back to Hilbert spaces
00:45:19 Computational tractability 2
00:52:23 Curse of dimensionality
00:55:01 RBF: infinite Taylor series
00:57:20 Margin/SVM
01:00:07 KRR/dual
01:03:26 Complexity compute kernels vs deep learning
01:05:03 Good for small problems vs deep learning?
01:07:50 What's special about the RBF kernel
01:11:06 Another DL comparison
01:14:01 Representer theorem
01:20:05 Relation to back prop
01:25:10 Connection with NLP/transformers
01:27:31 Where else kernels good
01:33:29 Thoughts on AI
01:34:34 Deep learning vs dual kernel methods
01:34:35 Outro
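For anyone who wants to poke at the ideas in the episode (the RBF kernel, and the dual form of kernel ridge regression discussed around 01:00:07), here is a minimal pure-Python sketch. The toy data, hyperparameters, and helper names are illustrative choices, not from the conversation:

```python
import math

def rbf(x, z, gamma=1.0):
    """RBF kernel: similarity between scalars x and z."""
    return math.exp(-gamma * (x - z) ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

# Toy 1-D training set
X, y = [0.0, 1.0, 2.0], [0.0, 1.0, 0.0]
lam = 1e-6  # ridge regulariser

# Dual solution: alpha = (K + lam*I)^(-1) y
K = [[rbf(xi, xj) + (lam if i == j else 0.0) for j, xj in enumerate(X)]
     for i, xi in enumerate(X)]
alpha = solve(K, y)

def predict(x):
    """f(x) = sum_i alpha_i * k(x, x_i): the representer-theorem form."""
    return sum(a * rbf(x, xi) for a, xi in zip(alpha, X))
```

With a tiny regulariser the fit essentially interpolates the training points; increasing `lam` trades that off for smoothness. Note the prediction is a weighted sum of kernels centred on the training data, which is exactly the form the representer theorem guarantees.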

Comments: 40

  • @thegimel
    3 years ago

    I love how Yannic takes a step back and explains things using his intuition. very helpful!

  • @frankd1156
    3 years ago

    Everything Yannic says is gold... I understand instantly

  • @MachineLearningStreetTalk
    3 years ago

    I know right 😂

  • @rockapedra1130
    3 years ago

    I know! He always asks what I want to know, it’s kinda spooky how good of a communicator he is!

  • @freemind.d2714
    3 years ago

    Without Yannic, I can't understand a word

  • @clarkd1955
    1 year ago

    The contribution of all 3 of you was significantly more than the sum of the parts. Very enjoyable, thanks.

  • @Luck_x_Luck
    3 years ago

    best explanation of kernels I've encountered so far, thanks!

  • 2 years ago

    I love your channel! Though I have to admit that I felt a little lost with all the terminology being thrown around when I first watched this video in particular. I decided to delve deeper into kernels, and after intensive research I have created a six-hour playlist on kernel methods to summarize my current understanding. If anyone wants a crash course on kernels in particular, I'd be delighted to welcome you in my comment section. After these countless hours of self-study, I can now follow the conversation fully, which is such a nice feeling of accomplishment. Thank you for inspiring me to research this topic in depth!

  • @machinelearningdojowithtim2898
    3 years ago

    I loved this conversation with Alex! We already recorded two more casual conversations; we will upload them in the coming days

  • @rockapedra1130
    3 years ago

    I loved this discussion! The combination of Alex knowing everything in full mathematical generality and Yannic trying to bring it down to the "real world" really helped me! I'm new to this subject, and it really helps to walk through a toy problem, such as temperature in a room, using a simple basis and describing the vectors formed, etc., so that it feels less nebulous to begin with. Granted, I'm an engineer, so what's best for me is to first show a simplified version and how it works concretely, THEN abstractify it to death to make it maximally useful. Thanks to all three of you!!! It amazes me that such great content is just "out there" to be found!

  • @AICoffeeBreak
    3 years ago

    Very helpful video, happy it exists! Perhaps the format could have allowed for some slides here and there, since Alex Stenlake had prepared an explanation in advance. That would avoid him gesturing the visualizations, and verbalizing mathematical examples that are easy to understand when written but harder to follow when just spoken out loud. 😊

  • @swarajshinde3950
    3 years ago

    Love your videos.

  • @shivamraisharma1474
    3 years ago

    Top quality content👌👌

  • @abby5493
    3 years ago

    Wow! Such a good and informative video 😃

  • @MachineLearningStreetTalk
    3 years ago

    Thank-you Abby!

  • @dome8116
    3 years ago

    I love this podcast. Really such a cool idea. I just want to give some tips that might make it even better, at least visually. It's kind of annoying to see the low video quality of the people talking. I think it would be much cooler if everyone recorded their own camera and audio and afterwards sent it to Tim, who cuts it together the way you have it now, where every person is visible at all times, just in much better quality. That way there are also many more options to make the design of the podcast cooler. For example, you could put a nice layout over it or something. Also, I feel like it would sometimes come in handy if you brought some pictures on screen, a bit like Tim already did when he opened up the papers. It would look much more professional to the viewer, and I'm sure others would like it too. Anyways, I love the show

  • @minghanzhu6082
    3 years ago

    I really wanted to appreciate the effort, but probably only people who already have a very good understanding of kernels can follow all these verbal discussions full of abstract, repeatedly used words. I see that Yannic tried to make it clearer by asking some clarifying questions, though, which helped a little bit.

  • @j.dietrich
    3 years ago

    Tim's breakfast bar/kitchen island arrangement is impressive, but the tin of Coffee Mate hurts my soul.

  • @machinelearningdojowithtim2898
    3 years ago

    Lol!!! But what are you saying here? 1) You don't like the design on the tin, 2) you don't like the manifold of the tin, or 3) you don't like Coffee Mate 😂

  • @quebono100
    3 years ago

    Your channel has way too few subscribers. Such good content; I'm not even a machine learning engineer, just a programmer who is learning all this stuff at the moment.

  • @Hawkz1600
    3 years ago

    Amazing stuff! It would also be cool if you could talk about dimensionality reduction methods to address the memory inefficiencies of kernel methods on large datasets.

  • @JI77469
    3 years ago

    Hawkz1600, it seems that the biggest breakthrough here for fixing memory issues is the use of "random features" to approximate general kernels by random linear kernels. See the paper "Random Features for Large-Scale Kernel Machines".
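A rough sketch of that random-features trick, assuming a 1-D Gaussian kernel (all constants here are illustrative): frequencies are drawn from the kernel's spectral density, so the ordinary dot product of an explicit cosine feature map approximates the kernel, turning the kernel computation into a linear one.

```python
import math, random

random.seed(0)
D = 4000  # number of random features; error shrinks like 1/sqrt(D)

# Target kernel: k(x, z) = exp(-0.5 * (x - z)**2).
# Its spectral density is a standard normal, so sample w ~ N(0, 1).
w = [random.gauss(0.0, 1.0) for _ in range(D)]
b = [random.uniform(0.0, 2.0 * math.pi) for _ in range(D)]

def features(x):
    """Explicit random feature map z(x), with z(x).z(y) ~ k(x, y)."""
    s = math.sqrt(2.0 / D)
    return [s * math.cos(wi * x + bi) for wi, bi in zip(w, b)]

def approx_kernel(x, z):
    """Plain dot product of the feature maps -- a random *linear* kernel."""
    return sum(fx * fz for fx, fz in zip(features(x), features(z)))

exact = math.exp(-0.5 * (0.3 - 1.1) ** 2)   # true kernel value
approx = approx_kernel(0.3, 1.1)            # random-feature estimate
```

The payoff is memory: instead of an n-by-n kernel matrix you store n feature vectors of length D and run an ordinary linear model on them.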

  • @SergeTheGod
    3 years ago

    Great talk guys! Reminds me why I got into ML in the first place, and makes me want to reevaluate Bishop's book 😅

  • @bradleypliam110
    1 year ago

    Serge, what is the title of this book? I'd like to find myself a copy.

  • @SergeTheGod
    1 year ago

    @bradleypliam110 Pattern Recognition and Machine Learning. Great book!

  • @bradleypliam110
    1 year ago

    @SergeTheGod Thank you for the leg up!!

  • @DavenH
    3 years ago

    "infinite dimensional, or high dimensional, or don't-wanna-compute-able" haha!

  • @DavenH
    3 years ago

    "and that's because least-squares is a horrible, blurry loss function" =)

  • @shivamraisharma1474
    3 years ago

    Just a naive viewpoint/question here: Yannic, in his video about the Linformer, mentioned the JL theorem, which multiplies a high-dimensional data distribution by a fixed Gaussian matrix to project it to lower dimensions while keeping the distances between data points roughly constant. If kernels are also a distance/similarity measure, which also kind of projects data from a lower dimension to a certain higher dimension (rewatching the video, I am at 17 min currently), then pairwise distance measures between data points seem to be a fairly accurate representation for any distribution, and any projection from a higher to a lower dimension (or vice versa) should be focused on preserving the distance measure.
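The JL part of that comment can be checked numerically. A sketch (the dimensions and tolerances are arbitrary choices, not from the video): project a handful of high-dimensional points through a fixed Gaussian matrix and compare pairwise distances before and after.

```python
import math, random

random.seed(1)
d, k = 1000, 300  # original and projected dimensions

def gaussian_projection(points, k):
    """JL-style map: multiply by a fixed Gaussian matrix scaled by 1/sqrt(k)."""
    d = len(points[0])
    R = [[random.gauss(0.0, 1.0) for _ in range(d)] for _ in range(k)]
    s = 1.0 / math.sqrt(k)
    return [[s * sum(r[j] * p[j] for j in range(d)) for r in R] for p in points]

def dist(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

points = [[random.gauss(0.0, 1.0) for _ in range(d)] for _ in range(5)]
proj = gaussian_projection(points, k)

# Ratio of projected to original distance for every pair; JL says these
# concentrate around 1 once k is large enough relative to log(n)/eps^2.
ratios = [dist(proj[i], proj[j]) / dist(points[i], points[j])
          for i in range(5) for j in range(i + 1, 5)]
```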

  • @daryoushmehrtash7601
    3 years ago

    This would have been such a nice presentation if Tim hadn't disrupted the flow of the conversation. Yannic tried a few times to recover the underlying goal of Alex's talk, but failed. I wish this could be redone, with Alex talking through the underlying concept and its application to Yannic's room-temperature model as a specific example.

  • @machinelearningdojowithtim2898
    3 years ago

    Sorry! Feedback taken on board

  • @JscottMays
    10 months ago

    Solid

  • @raszagal1000
    3 years ago

    Around 39 minutes, one bit that is missing is that the inner product of two functions is the integral of the functions multiplied together over the domain of their arguments.

  • @oblomist
    3 years ago

    Thank you, that makes more sense now. But the result, when evaluated, should still be a scalar, right?

  • @raszagal1000
    3 years ago

    @oblomist In this case yes; not sure if that's true in general.
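To make the thread concrete, a small numerical sketch (the interval and test functions are arbitrary choices): approximate the inner product as the integral of the product with a midpoint rule. The result is indeed a single scalar, and it vanishes for orthogonal functions.

```python
import math

def inner_product(f, g, a, b, n=1000):
    """<f, g> = integral of f(x)*g(x) over [a, b], midpoint rule."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) * g(a + (i + 0.5) * h)
               for i in range(n)) * h

# sin and cos are orthogonal on [0, pi]: their inner product is 0
orth = inner_product(math.sin, math.cos, 0.0, math.pi)

# <sin, sin> on [0, pi] is pi/2: the squared "length" of sin
norm_sq = inner_product(math.sin, math.sin, 0.0, math.pi)
```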

  • @wangyifan1468
    8 months ago

    11:50 where the kernel talk started

  • @JI77469
    3 years ago

    I'd love to know anyone's thoughts on the usefulness of 1) random Fourier features (a trick to approximate kernels by certain linear kernels and thus speed computations up) and 2) reproducing kernel Banach spaces (doing kernel methods in a Banach space that promotes sparsity more than the Hilbert-space setting would, kind of like Lasso regression vs Ridge regression).

  • @JRAbduallah1986
    2 years ago

    Why not have a board and write on it? That would make it more interesting. More importantly, having fun examples can give the audience a much better understanding.

  • @AConversationOn
    3 years ago

    Talking about highly advanced mathematics without notational and visual support is rather silly. There is no one who can understand the English but cannot understand visuals, and many who could only understand the visuals.

  • @Macatho
    3 years ago

    Lose the shades.
