Gaussian Processes

The machine learning consultancy: truetheta.io
For machine learning, Gaussian Processes enable flexible models with the richest output you could ask for: an entire predictive distribution (rather than a single number). In this video, I break down what they are, how they work, and how to model with them. My intention is that this will help you join the large group of people successfully applying GPs to real-world problems.
SOCIAL MEDIA
LinkedIn : / dj-rich-90b91753
Twitter : / duanejrich
Enjoy learning this way? Want me to make more videos? Consider supporting me on Patreon: / mutualinformation
SOURCES
Chapter 17 from [2] is the most significant reference for this video. That's where I discovered the generalization from Bayesian linear regression to GPs, the list of valid ways to adjust a kernel, and the Empirical Bayes approach to hyperparameter optimization. It's also where I get most of the notation. (In fact, for all my videos, Kevin Murphy's notation is what I follow most closely.)
[1] is a very thorough practical and theoretical analysis of GPs. When I first modeled with GPs, this book was a frequent reference. It offers a lot of practical advice on designing kernels, optimizing hyperparameters, and interpreting results.
[5] offers a useful tutorial on how to design kernels. I credit this source with my intuitive understanding of how to combine kernels.
Neil Lawrence's talks on GPs ([4]) were also influential. They've helped me develop much of my intuition for how GPs work.
[3] is a beautiful tutorial on GPs. I'd recommend it to anyone learning about GPs for the first time.
---------------------------
[1] C. E. Rasmussen and C. K. I. Williams, Gaussian Processes for Machine Learning. MIT Press, 2006.
[2] K. P. Murphy. Probabilistic Machine Learning (Second Edition), MIT Press, 2021
[3] J. Görtler, et al., "A Visual Exploration of Gaussian Processes", Distill, 2019. distill.pub/2019/visual-explo...
[4] N. Lawrence, Gaussian Processes talks on MLSS Africa, • Neil Lawrence - Gaussi... , • Neil Lawrence Gaussian...
[5] D. K. Duvenaud, The Kernel Cookbook: Advice on Covariance Functions, University of Cambridge, www.cs.toronto.edu/~duvenaud/...
[6] K. Weinberger, "Gaussian Processes", Cornell University, • Machine Learning Lectu... and • Machine Learning Lectu...
RESOURCES
GPyTorch provides an extensive suite of PyTorch-based tools for GP modeling: efficient tensor handling, fast variance calculations, multi-task learning tools, integration with Pyro, and Deep Kernel Learning, among other things. Exploring this toolset is a great way to become a competent GP modeler (a minimal usage sketch follows below). Link : gpytorch.ai/
I'd also recommend source [5] for getting familiar with how to model with GPs. Understanding the kernel-space-to-function-space relationship takes time, but it takes less with this guide. It also links to Duvenaud's PhD thesis, which is a very deep dive on the subject (though don't ask me about it - I didn't read it!).
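
For a sense of what modeling with GPyTorch looks like, here is a minimal exact GP regression sketch, written in the spirit of GPyTorch's "Simple GP Regression" tutorial. It's an illustration only (not the code behind the video's animations); the toy data, RBF kernel choice, and training settings are arbitrary.

    import math
    import torch
    import gpytorch

    # Toy 1D data: noisy sine wave.
    train_x = torch.linspace(0, 1, 50)
    train_y = torch.sin(2 * math.pi * train_x) + 0.1 * torch.randn(50)

    class ExactGPModel(gpytorch.models.ExactGP):
        def __init__(self, train_x, train_y, likelihood):
            super().__init__(train_x, train_y, likelihood)
            self.mean_module = gpytorch.means.ConstantMean()
            self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

        def forward(self, x):
            return gpytorch.distributions.MultivariateNormal(
                self.mean_module(x), self.covar_module(x)
            )

    likelihood = gpytorch.likelihoods.GaussianLikelihood()
    model = ExactGPModel(train_x, train_y, likelihood)

    # Empirical Bayes: fit kernel/noise hyperparameters by maximizing the marginal likelihood.
    model.train(); likelihood.train()
    optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
    mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
    for _ in range(100):
        optimizer.zero_grad()
        loss = -mll(model(train_x), train_y)
        loss.backward()
        optimizer.step()

    # The payoff: a full predictive distribution at new inputs, not just point estimates.
    model.eval(); likelihood.eval()
    test_x = torch.linspace(0, 1, 100)
    with torch.no_grad(), gpytorch.settings.fast_pred_var():
        pred = likelihood(model(test_x))
        mean, (lower, upper) = pred.mean, pred.confidence_region()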
EXTRA
Why is it OK to act as though a sample from a multiplied kernel comes from multiplying the function samples from the two component kernels?
The problem comes from the fact that if x1 is a sample from a multivariate normal with mean zero and covariance matrix S1, and the same is true for x2 and S2, then the element-wise product x1*x2 is not distributed as a multivariate normal. However, whatever distribution x1*x2 has, its covariance is still S1*S2, the element-wise (Hadamard) product of the covariances (I've verified this experimentally). That means it wiggles similarly to a sample from the product kernel.
The background here: I mistakenly believed this was exactly true for quite a while, and it was helpful for modeling. I certainly could never tell it wasn't true. When creating this video, I discovered it isn't in fact true, but merely a useful approximation. A quick numerical check of the covariance claim is sketched below.
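
If you'd like to check that covariance claim yourself, here's a quick NumPy sketch of the experiment (the dimension, sample count, and random covariance matrices are arbitrary choices, not values from the video):

    import numpy as np

    rng = np.random.default_rng(0)
    d, n = 4, 500_000

    def random_cov(d):
        A = rng.normal(size=(d, d))
        return A @ A.T + d * np.eye(d)   # symmetric positive definite

    S1, S2 = random_cov(d), random_cov(d)
    x1 = rng.multivariate_normal(np.zeros(d), S1, size=n)
    x2 = rng.multivariate_normal(np.zeros(d), S2, size=n)

    # Covariance of the element-wise product vs. the element-wise (Hadamard) product S1*S2.
    empirical = np.cov((x1 * x2).T)
    print(np.abs(empirical - S1 * S2).max())   # small, and shrinks as n grows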
Wallpaper: github.com/Duane321/mutual_in...
Timestamps
0:00 Pros of GPs
1:06 Bayesian Linear Regression to GPs
3:52 Controlling the GP
7:31 Modeling by Combining Kernels
8:52 Modeling Example
11:55 The Math behind GPs
18:42 Hyperparameter Selection
21:58 Cons of GPs
22:58 Resources for Learning More

Comments: 226

  • @sisilmehta
    @sisilmehta 1 year ago

    Literally the best explanation on the internet for GP Regression Models. He's not trying to be cool, but genuinely trying to explain the concepts

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Thank you my man. And yes, my risk of being cool is zero lol

  • @markzuckerbread1865
    @markzuckerbread1865 10 months ago

    Agreed, I've been trying to understand GPs for a task at work and this is the easiest to understand explanation I've found, liked and subbed!

  • @user-ci7qh4bp5d
    @user-ci7qh4bp5d 27 days ago

    My postgrad supervisor literally told me to watch this a few times just so I can explain it clearly to Human Sciences people in my research proposal. Thanks for all your effort making it!

  • @wiggwigg2010
    @wiggwigg2010 2 years ago

    Great video. I've seen GPs mentioned a few times in papers and always glossed over it. Thanks for the great explanation!

  • @user-ng4cq5qe6c
    @user-ng4cq5qe6c 2 months ago

    I know how hard it is to explain this topic, so simply and comprehensively. I am extremely thankful for your efforts.

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 2 years ago

    The way you motivate the problem really adds insights for understanding.

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 2 years ago

    I learn each time I rewatch the video. So much better than sitting in lectures where you only listen once.

  • @oldPrince22
    @oldPrince22 1 year ago

    Excellent video on this topic. Brief and elegant explanations!

  • @anttiautere3663
    @anttiautere3663 9 months ago

    A great video! Thanks. I used GPs at work many years ago and enjoyed the framework a lot.

  • @maulberto3
    @maulberto3 1 year ago

    Props for explaining such a complex model in a friendly way

  • @buttforce4208
    @buttforce4208 2 years ago

    Absolutely love your exposition. So good!

  • @Murphyalex
    @Murphyalex 2 years ago

    I just discovered your videos yesterday and now they're popping up on my YT home screen and I feel a bit like a little boy in a toyshop. How have these high quality fantastic tutorials evaded me for so long, when I spend so much time looking at technical content on KZread? Seriously impressive! I'll definitely be one to share your videos when the opportunity arises.

  • @Mutual_Information
    @Mutual_Information 2 years ago

    Much appreciated! I got some really cool stuff coming in May. If you like this stuff, you'll *love* what's coming. Thanks again!

  • @mengliu6720
    @mengliu6720 2 years ago

    Best video for GP I have seen! Thank you so much!

  • @caseyalanjones
    @caseyalanjones 7 months ago

    It looks like you have optimized the hyperparameters of making an awesome video. So concise, but still a sprinkle of humor here and there. Awesome visualizations, so appreciated.

  • @Mutual_Information
    @Mutual_Information 7 months ago

    haha I thought that was gonna be a nitpick, pleasantly surprised - thank you!

  • @TeoChiaburu
    @TeoChiaburu 2 years ago

    Excellent explanations and visualizations. Helped me a lot, thank you!

  • @popa42
    @popa42 9 months ago

    I did understand just a few things, but still I watched this video till the very end - the production value is insane! And maybe I’ll need GPs in the future? :D You definitely deserve much more subscribers, your videos are great!

  • @alexandrebonatto6438
    @alexandrebonatto6438 4 days ago

    By far this is the best video I have seen on this subject! Thank you very much!

  • @cosapocha
    @cosapocha 1 year ago

    The production value of this is insane

  • @majdkahouli8644
    @majdkahouli8644 2 months ago

    What a smooth way to explain such complex math , thank you

  • @LuddeWessen
    @LuddeWessen 2 years ago

    Great balance between technical depth and intuition for me right now. I love how you say that multiplication "is like", but still is not. This gives intuition, but provides a warning for the day when we have come further in our understanding. 🤗

  • @Mutual_Information
    @Mutual_Information 2 years ago

    Ha yea glad these details aren’t unnoticed. It’s a careful game making sure I never say anything *technically* wrong.

  • @Ethan_here230
    @Ethan_here230 2 months ago

    Yes sir, sometimes to make a point you need to recontextualize the matter into something specific to make things easier to understand @@Mutual_Information

  • @Kopakabana001
    @Kopakabana001 2 years ago

    Another great video! Love seeing each one come up

  • @swindler1570
    @swindler1570 1 year ago

    Phenomenal video. I genuinely can't thank you enough for how accessible this was. I'm sure I'll come back and reference it, or your other work, as I continue preparing for my upcoming internship working on physics-informed neural networks.

  • @Mutual_Information
    @Mutual_Information 1 year ago

    That's awesome! Glad it helped

  • @FoldedArt
    @FoldedArt 1 year ago

    Thank you for creating and sharing this great material. All of your videos I watched so far are incredibly informative and well edited.

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Glad you're getting something out of it!

  • @sietseschroder3444
    @sietseschroder3444 1 year ago

    Truly amazing how you turned such a complex topic into an accessible explanation, thanks a lot!

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Glad you liked it :)

  • @Birdsneverfly
    @Birdsneverfly 2 years ago

    The visualizations are the catch. Just excellent 😊

  • @jonathanmarshall2518
    @jonathanmarshall2518 2 years ago

    Love the level you've pitched this video at.

  • @tommclean9208
    @tommclean9208 2 years ago

    the first time I watched this, a month or so ago, I had no idea what was happening. However, I have recently needed to use a GP, and after a lot of reading up on them and coming back to this video, I can appreciate it a lot more with some understanding :)

  • @Mutual_Information
    @Mutual_Information 2 years ago

    Yea my topics require some prerequisite 😅 but with a little getting used to on the notation and basics of probability/stats, I think it should be fairly digestible. Glad you got something out of it.

  • @mCoding
    @mCoding 2 years ago

    Great reference video, I'm sure I will come back to it again and again. The level of detail in all the simulations you do is just incredible. Do you make all your animations in manim?

  • @Mutual_Information
    @Mutual_Information 2 years ago

    Thanks brother! And i don’t use manim actually. I like representing data with Altair, which is like a better version of matplotlib. So I have a small library which turns Altair pics into vids.

  • @andreyshulga12
    @andreyshulga12 1 year ago

    Thank you! I've been researching a paper dedicated to the Gaussian approach to time series prediction (as a task in a lab), and I really struggled with it. But after your video, everything has been sorted out in my head, and I have finally understood it!

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Exactly what I've hoped to do - happy it helped!

  • @manuelstrondl1797
    @manuelstrondl1797 1 year ago

    Great video! Just dived into GPs by learning about their application in system identification techniques. In fact, I'm studying for my exam right now and looked for a video that nicely sums up this topic and gives some intuition. This video matches my needs 100%, thank you very much.

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Excellent - you're the ideal viewer!

  • @user-gz1fg4og5j
    @user-gz1fg4og5j 9 months ago

    Such a clear and intuitive explanation of GPs! Great work! Excellent video on this topic - brief and elegant explanations!

  • @Mutual_Information
    @Mutual_Information 9 months ago

    Thank you - that’s what I’m going for!

  • @ApiolJoe
    @ApiolJoe 2 years ago

    Absolutely perfect! I heard of GPs and was wondering what they were exactly, wanted a bit of intuition of how and why they work, how to use them, just as a quick intro or motivation before learning them later on. This video answered all of this in a duration that is absolutely perfect: not too long so that it can be watched "leisurely", and not too short so that you still give enough information that I don't have the impression that I learnt nothing. Didn't know your channel, will definitely check the rest out!

  • @Mutual_Information
    @Mutual_Information 2 years ago

    Thanks a lot! That's exactly what I'm going for. Relatively short and dense with useful info. Glad it worked for you.

  • @eggrute
    @eggrute 2 years ago

    This video is really nice. Thank you so much for creating this content material.

  • @gabrielbelouze7765
    @gabrielbelouze7765 2 years ago

    What the hell that's a great channel I'm so glad I found you. Production quality is spot on, thank you for taking such care !

  • @Mutual_Information
    @Mutual_Information 2 years ago

    Happy to have you! Welcome!

  • @matveyshishov
    @matveyshishov 1 year ago

    Man, you have some beautiful explanations, and the way you explain the details is somehow very simple to understand, thank you so much!

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Glad you liked it!

  • @Boringpenguin
    @Boringpenguin 1 year ago

    I read the distill article and came back to watch the whole video again for the second time. Now it's crystal clear! Thanks so much!!

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Distill is an epic educational source :)

  • @Boringpenguin
    @Boringpenguin 1 year ago

    @@Mutual_Information It's sad that they're in hiatus since last year :( Hopefully they'll come back some day

  • @minhlong1920
    @minhlong1920 1 year ago

    Such a clear and intuitive explanation of GPs! Great work!

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Thanks Minh :)

  • @heyna88
    @heyna88 1 year ago

    As I wrote you on LinkedIn, this is probably the best video on GPs out there! I know it takes a long time to put together something of such high quality, but I hope I will see more of your videos in the future! 😊

  • @Mutual_Information
    @Mutual_Information 1 year ago

    thanks, means a lot - and it's coming. This one has been taking awhile, but it'll be out soon :)

  • @tirimula
    @tirimula 1 year ago

    Awesome Explanation. Thank you.

  • @nicksiska3231
    @nicksiska3231 1 year ago

    Straight forward and explained well thank you

  • @xbailleau
    @xbailleau 1 year ago

    I will have to watch your video several times to understand everything (if I can), but undoubtedly your video is professional and very, very well done!! Congratulations

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Thanks Xavier!

  • @springnuance7048
    @springnuance7048 1 year ago

    holy sh*t, you have unlocked the secret of GP and Bayesian stuff... I have struggled so hard to understand what is even GP as it is so abstract. Thank you so much for your great work!

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Happy to help my friend ;)

  • @MauroRincon
    @MauroRincon 1 year ago

    Brilliant video! loved the graphics.

  • @taotaotan5671
    @taotaotan5671 2 years ago

    a really really hard-core video... thanks D.J

  • @LaRenard
    @LaRenard 1 year ago

    This is truly a great explanation that helps me to connect all the dots together!! Thanks a lot!!!!

  • @Mutual_Information
    @Mutual_Information 1 year ago

    I'm glad it helped. When I was studying GPs, these are the ideas that floated in my head - happy to share them.

  • @FrederikFalk21
    @FrederikFalk21 9 months ago

    Quite literally Badum tssch

  • @aiart3453
    @aiart3453 2 years ago

    The best explanation of Kernel so far!

  • @LuddeWessen
    @LuddeWessen 2 years ago

    Agreed! 😎

  • @DanieleO.
    @DanieleO. 1 year ago

    Super quality content! Thank you so much: I subscribed and I hope your number of subscribers increases more and more to motivate you to keep going!

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Thank you - I hope so too!

  • @massisenergy
    @massisenergy 1 year ago

    Perfect! - research, delivery, production, duration, pictorial intuitiveness, mathematical rigor, naïve friendly 👏🏽

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Thank you Sourav - I'm trying!

  • @Gggggggggg1545.7
    @Gggggggggg1545.7 2 years ago

    Another great video. Keep up the good work!

  • @sashaaldrick
    @sashaaldrick 2 years ago

    Really good explanation, the animations help so much. Thank you, I really appreciate it.

  • @Mutual_Information
    @Mutual_Information 2 years ago

    You're very welcome - Glad to hear it's landing as intended!

  • @Donmegamuffin
    @Donmegamuffin 2 years ago

    A truly fantastic explanation to them! The visuals were instructive and well presented, thank you for making this!

  • @Mutual_Information
    @Mutual_Information 2 years ago

    and thank you for watching ;)

  • @BoldizsarZopcsak
    @BoldizsarZopcsak 1 year ago

    This is brilliant. Thank you.

  • @goelnikhils
    @goelnikhils 1 year ago

    Amazing video on GP's .

  • @pilurussu20
    @pilurussu20 4 months ago

    You are the best, man! Thank you for your videos, you're helping a lot of students, because your explanations are so clear and intuitive. I hope the best for you.

  • @Mutual_Information
    @Mutual_Information 4 months ago

    Thank you - you improved my Friday

  • @lucasvanderhauwaert419
    @lucasvanderhauwaert419 2 years ago

    Super fricking impressed! Bravo

  • @abubakryagob
    @abubakryagob 1 year ago

    In min 3:00 I saw a smile coming out of my mouth, just how happy I was when I was listening to you! This is a masterpiece work! Really thank you :)

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Thank you very much - glad it's getting some love :)

  • @anatoliizagorodnii2563
    @anatoliizagorodnii2563 2 years ago

    Wooow! Excellent quality video!

  • @Pabloparsil
    @Pabloparsil 2 years ago

    Keep this up! It really helps

  • @andresdanielchaconvarela9405
    @andresdanielchaconvarela9405 2 years ago

    This is amazing, thanks

  • @mightymonke2527
    @mightymonke2527 1 year ago

    Thanks a lot for this vid man it literally saved my life, you're really one hell of a teacher

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Thank you - I'm getting a little better over time, but it's a work in progress. If you love what I'm doing, one thing that would be *huge* for me, is if you tell anyone you think might be interested. This channel is pretty small and it'll be easier to work on it if it gets a little more attention : )

  • @mightymonke2527
    @mightymonke2527 1 year ago

    @Mutual Information best of luck man 🫡

  • @brettbyrnes577
    @brettbyrnes577 10 months ago

    Nice video - love it

  • @ryandaniels3258
    @ryandaniels3258 1 year ago

    What a great video! Very helpful, thanks!

  • @Mutual_Information
    @Mutual_Information 1 year ago

    You're very welcome!

  • @5ty717
    @5ty717 6 months ago

    Excellent

  • @Paulawurn
    @Paulawurn 2 years ago

    What an excellent video! Just thinking about the amount of effort that must have gone into this gives me anxiety

  • @jobiquirobi123
    @jobiquirobi123 2 years ago

    Nice visualizations man. Just discovered your channel.

  • @chamithdilshan3547
    @chamithdilshan3547 1 day ago

    Great video ! ❤

  • @karmpatel6832
    @karmpatel6832 9 months ago

    What an explanation! Became a fan in seconds.

  • @youngzproduction7498
    @youngzproduction7498 1 year ago

    Now you make me love math again. Thanks.

  • @graham8316
    @graham8316 1 year ago

    I would be really helped by putting variable definitions on screen while they're in use! I find myself forgetting what f and f* are for example as I mull it over and watch the explanation. Amazing video! I'm a fan.

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Thanks Graham! It's always a balance thinking about what does/doesn't go on screen. More recently, I'm biasing towards *less* on screen, b/c I've gotten feedback that what's on screen can be overwhelming. But, if you have some question about what may be confusing, ask here and I may be able to help

  • @graham8316
    @graham8316 1 year ago

    @@Mutual_Information I'm thinking what was hard for me is that everything was defined and then used later? The viewer needs to remember what each thing means before they can give it context, and context allows us to combine things and save on short-term memory?

  • @murphp151
    @murphp151 2 years ago

    Great video

  • @waylonbarrett3456
    @waylonbarrett3456 1 year ago

    I built a model years ago that I never realized is perhaps a GP model. I only learned about GP models a few weeks ago. It doesn't use any real-valued data; only binary vectors. The similarity kernel is Hamming distance. Other than that, it's basically what he described here.

  • @ronitganguly3318
    @ronitganguly3318 2 years ago

    Dude has named his channel mutual information so when we look for the concept of mutual information, all his videos will pop up 🤣 genius!

  • @abdjahdoiahdoai
    @abdjahdoiahdoai 2 years ago

    you might want to look into probabilistic numerics. Cheers, great video you made there!

  • @SamuelLiJ
    @SamuelLiJ 9 months ago

    Your observation about the product of two normally distributed variables is true for the following reason: given independent scalar random variables X, Y, we have Var(XY) = Var(X)Var(Y) + Var(X)(E(Y))^2 + Var(Y)(E(X))^2. Given two multivariate random normals U, V with mean zero, we may choose to work in a basis (possibly different for the two distributions) where the covariance matrices are diagonal. Then all components of each vector are independent, and so Cov(U) Cov(V) = Cov(UV) by working element-wise. Since this is true in one basis, it must therefore be true in every basis.

  • @realcirno1750
    @realcirno1750 1 year ago

    sooo helpful

  • @TheRaspberryPiGuy
    @TheRaspberryPiGuy 2 years ago

    Great video - I subbed!

  • @patrickl5290
    @patrickl5290 2 years ago

    Love these vids. Can you do a video about normalizing flows in the future?

  • @Mutual_Information
    @Mutual_Information 2 years ago

    I plan on making one. It’s a very interesting idea. In the meantime, there is already an excellent explanation : kzread.info/dash/bejne/m2uAzKavo6-9c8o.html

  • @thoughtsuponatime847
    @thoughtsuponatime847 7 months ago

    Thank you.

  • @Mutual_Information
    @Mutual_Information 7 months ago

    Thank you back.

  • @user-fg9ht5lm5r
    @user-fg9ht5lm5r 11 months ago

    A really good explanation! Though I wasn't able to understand everything, I would keep coming to this video until I do. ;D

  • @Mutual_Information
    @Mutual_Information 11 months ago

    Thank you. Happy to answer any questions too

  • @ovegedion1790
    @ovegedion1790 1 year ago

    great!!!

  • @christiankentorasmussen7492
    @christiankentorasmussen7492 2 years ago

    Really well made explanation :)

  • @Mutual_Information
    @Mutual_Information 2 years ago

    Thanks, glad the effort is appreciated!

  • @SohailKhan-zb5td
    @SohailKhan-zb5td 1 year ago

    Thanks a lot Amazing video

  • @Mutual_Information
    @Mutual_Information 1 year ago

    You are very welcome

  • @user-ch2zl7li4q
    @user-ch2zl7li4q 1 year ago

    Missed a lot of math, will get back later!

  • @alfrednewman2234
    @alfrednewman2234 10 months ago

    So far beyond my abilities. Like Frankenstein's monster, I am soothed by its music.

  • @user-lr5sd8xq4g
    @user-lr5sd8xq4g 11 months ago

    The best video about GPs I have ever seen! Thank you for sharing. I would like to reproduce the graphs that you created in a script, but unfortunately I cannot see any code for them on your GitHub page. Is it possible to access those scripts, with the examples that you produced?

  • @Mutual_Information
    @Mutual_Information 11 months ago

    Thanks Matteo - I appreciate it! Unfortunately, the code for this one was heavily intertwined with the animation code, so I didn't make it public. But I wasn't doing anything you can't learn from reading the GPyTorch docs

  • @patrickadjei9676
    @patrickadjei9676 2 years ago

    This is cool stuff! There is something I want to understand from the similarity heat map of the linear kernel. If the function samples are dissimilar as they get further apart (according to the lines), should the heat map not be brighter at (0,0) and fade as it approaches (10,10)? I am trying to get the picture in my head.

  • @Mutual_Information
    @Mutual_Information 2 years ago

    Thanks! I think you're thinking about it from a difficult angle. It's not function *samples* that are similar/dissimilar, it's specific *inputs* across samples. So, for the linear kernel, for inputs that are very similar (like input=0, so the heatmap is high, which means similar), the outputs are virtually the same spot. For inputs closer to 10, the inputs are dissimilar and the outputs are far apart. Since all function samples are lines, this will manifest as two lines which intersect at input=0 but are far apart at 10 (see the short sketch after this thread). Make sense?

  • @patrickadjei9676
    @patrickadjei9676 2 years ago

    @@Mutual_Information I do not totally understand. It is not your explanation that is bad. I simply need to get to know GP better. Thank you for trying to explain!
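
A short sketch of the point above, assuming a pure linear kernel k(x, x') = x * x' (the kernel in the video may carry extra offset/scale terms; this is just an illustration):

    import numpy as np

    x = np.linspace(0, 10, 50)
    K = np.outer(x, x) + 1e-8 * np.eye(len(x))   # linear kernel plus jitter for stability
    rng = np.random.default_rng(1)
    samples = rng.multivariate_normal(np.zeros(len(x)), K, size=3)
    # Each sample is (numerically) a line f(x) = w * x: all draws give nearly the same
    # output at inputs near 0 and are maximally spread apart by x = 10.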

  • @Friemelkubus
    @Friemelkubus 1 year ago

    Damn. Just Damn. This is great! Like: really really really great.

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Thank you Ian - more good stuff cooking!

  • @MikeOxmol_
    @MikeOxmol_ 2 years ago

    Hey, that's a good video, I enjoyed it a lot and you earned a sub :) However, wouldn't we get something very similar to GPs if we allowed for different basis functions in the Bayesian regression example? These functions don't have to be straight lines, so if I choose some polynomials, sines or exponents as my basis functions and I perform Bayesian linear regression, wouldn't I get basically the same things that GPs offer?

  • @Mutual_Information
    @Mutual_Information 2 years ago

    Yes you would! Bayesian linear regression with basis functions gives you a GP. But a GP is more general. You can input any similarity via the kernel. Given a kernel, it can be hard to determine the basis functions you’d need to use to recreate it using Bayesian linear regression.
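
A tiny sketch of that equivalence, assuming a toy polynomial basis and a standard normal prior on the weights (illustrative only, not code from the video): Bayesian linear regression over basis functions phi(x) is a GP whose kernel is the inner product of the features.

    import numpy as np

    def phi(x):
        # hypothetical basis: [1, x, x^2]
        return np.stack([np.ones_like(x), x, x**2], axis=-1)

    x = np.linspace(-1, 1, 6)
    K = phi(x) @ phi(x).T   # induced kernel: k(x, x') = phi(x) . phi(x')

    # Drawing weights w ~ N(0, I) and forming f = phi(x) @ w gives f ~ N(0, K),
    # so these Bayesian-linear-regression function samples have exactly the GP prior covariance K.
    rng = np.random.default_rng(0)
    F = phi(x) @ rng.normal(size=(3, 100_000))     # many function samples, one per column
    print(np.allclose(np.cov(F), K, atol=0.1))     # empirical covariance approximately matches K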

  • @Alexander-pk1tu
    @Alexander-pk1tu 1 year ago

    Hey, great video! In my machine learning class we did both GPR and GPC in practice, and I found it very difficult to scale to more than 10k samples. It seems that despite its advantages, it is not useful for a lot of practical problems. Can you maybe make a video on how to invert a matrix with less than O(n^3) complexity, and on which software someone could use for GPR/GPC on larger data?

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Yea, so that's a big component of GP research: getting the cost down. A dominant approach is inducing point methods, where you try to summarize a large dataset with a smaller set of "inducing points". It's a popular approach, but it introduces another source of uncertainty. In my experience, I tend to use GPs with smaller datasets.

  • @daveh1924
    @daveh1924 9 months ago

    Hi, great video for GPs! I am quite new to this topic - is it possible to use a GP to model multiple outputs? E.g. my "input" data is time, and the output data is 2D coordinates (x(t), y(t)). If it is possible, how do I set up the covariance function? Thanks so much

  • @Mutual_Information
    @Mutual_Information 9 months ago

    Thank you. Do you care about uncertainty in your output?

  • @daveh1924
    @daveh1924 9 months ago

    @@Mutual_Information Hi, thanks for the reply 😃 Yes, I need to obtain the uncertainty as well. I think if x(t) and y(t) are "independent", then fitting independent GPRs with uncertainty bounds is applicable. However, if they are dependent (like the coordinates of an ant moving along a circle), then x and y are correlated. What do you think?

  • @sjb27182
    @sjb27182 2 years ago

    If you were using a chi-distribution as a kernel, could you combine kernel-a and kernel-b multiplicatively? If I recall, Gaussian distributions are linear, i.e. their sum is a Gaussian, however their product is not. Chi-distributed variables on the other hand, when you multiply their products, you get an F-distribution, which is tractable. Really cool video! You definitely have an INSANE amount of material to make more videos on! Definitely subscribing!

  • @Mutual_Information
    @Mutual_Information 2 years ago

    Very interesting idea.. maybe there is a very special choice of kernel such that the multiplication kernel-and-sampled-function-distributions holds exactly, just like a sum does. I really don't know! If I had to guess, I'd say there is no such kernel. The problem arises from the multivariate normal, which is always operating in a GP, regardless of the kernel. And that problem is.. if you sample two vectors from a multivariate normal.. and multiply them together element wise.. the distribution of that thing is NOT some other multivariate normal. The kernel can only change the covariance of those two vectors, but that problem doesn't depend on those covariances. And thanks for the subscription!

  • @sjb27182
    @sjb27182 2 years ago

    @@Mutual_Information Ah that makes sense; it's called a *gaussian process* not a *insert-random-pdf* process after all ! Thanks for the reply!

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 2 years ago

    Not sure if others are also interested, but I think a coding example with GPyTorch could be interesting.

  • @Mutual_Information
    @Mutual_Information 2 years ago

    This is something I'm working on! I'd like to make code samples available alongside my videos. They aren't currently available b/c the modeling code is intertwined with the animation code, so it would make for terribly difficult-to-decipher code if released as-is. My plan is.. once my video production workflow is a little more streamlined, I'll pair these videos with code snippets.

  • @RHCPhooligan
    @RHCPhooligan 1 year ago

    Hey, love the videos. What software do you use to create your visuals?

  • @Mutual_Information
    @Mutual_Information 1 year ago

    I use a very dope, though static, python plotting library called Altair. And then I have a personal library that turns many of them into videos.

  • @leonhard4145
    @leonhard4145 1 year ago

    Entertaining and informative video, thank you! Out of curiosity, have you looked into neural network Gaussian processes at all? I feel like you'd dig em

  • @Mutual_Information
    @Mutual_Information 1 year ago

    I haven't looked at them much. All my experience with GPs has been via this handcrafted kernel approach, but it would be nice to get NNs in the mix. GPyTorch makes that pretty easy in fact.. I probably should..

  • @MeshRoun
    @MeshRoun 1 year ago

    @@Mutual_Information Is that what you use (GPyTorch) when modeling Gaussian processes? (Instead of GPflow, PyMC3, etc.)

  • @Mutual_Information
    @Mutual_Information 1 year ago

    @@MeshRoun yep it’s been my go-to so far

  • @ayushsaha5539
    @ayushsaha5539 9 months ago

    Hi, I am trying to implement a GPR myself and am struggling with writing code for the parameter updating process. I am using gradient descent to maximize the log density function, w.r.t. the terms: sigma_n, sigma_f, etc. My output predictions fit decently to the data, but only works for univariate input. Also, my noise prediction is just constant or zero throughout, despite the evident noise present. Please advise me on how to approach this. Might it be possible to host a virtual meeting with someone who can help?

  • @knightofhyrulelink7531
    @knightofhyrulelink7531 4 months ago

    thank you for your understandable video! I still wonder what the point of "similar y for similar x" is - is it to make sure the function is smooth, or is there another use? looking forward to your reply!

  • @Mutual_Information
    @Mutual_Information 4 months ago

    The goal of the problem is to predict y for a given incoming x, and we can learn to do this by observing many pairs of (x_i, y_i)'s. So we make an assumption: "If x1 is similar to x2 (that is, K(x1, x2) is large/positive), then we expect y1 and y2 to be close." With that assumption, we can form a prediction for y when given an x, basically by determining "which y value would work best with our similar-y's-for-similar-x's assumption, given the x's and y's observed?" The GP does all this hard work for you and allows for noise and whatnot.
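
For anyone who wants to see that "hard work" spelled out, here is a bare-bones sketch of standard GP regression with an RBF kernel (an illustration only; the data, kernel, and noise level are made up, not taken from the video):

    import numpy as np

    def rbf(a, b, length=1.0):
        # k(x, x') is large when x and x' are close: the "similar x" measure
        return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)

    x_train = np.array([0.0, 1.0, 2.5, 4.0])
    y_train = np.sin(x_train)
    x_test = np.linspace(0.0, 5.0, 50)
    noise = 1e-2

    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = rbf(x_test, x_train)

    # Posterior mean pulls each prediction toward the y's of similar x's;
    # posterior variance shrinks near observed x's and grows far from them.
    mean = K_star @ np.linalg.solve(K, y_train)
    cov = rbf(x_test, x_test) - K_star @ np.linalg.solve(K, K_star.T)
    std = np.sqrt(np.clip(np.diag(cov), 0.0, None))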

  • @kimyongtan3818
    @kimyongtan3818 8 months ago

    I am already GP expert 😃

  • @user-ih6kd3cw7z
    @user-ih6kd3cw7z 1 month ago

    Sir, I am using the MATLAB Regression Learner toolbox. I understood that kernel functions try to find how similar or far apart 2 input data points are. Can you tell me what basis functions are? There are 3 basis function options: zero, constant and linear.

  • @besugui1969
    @besugui1969 1 year ago

    Awesome video! Only one question, about minute 09:20: a linear kernel does not imply that the realizations of the random process must be linear, does it? Thanks!!

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Thanks Jesus. And regarding your Q, in the broader model, no a linear kernel doesn't imply the realizations need to be linear, since there is a noise component in the overall kernel. That allows points along a sample to be different in a nonlinear way.

  • @polares8187
    @polares8187 2 years ago

    I would also love a coding tutorial about Gaussian processes. Even if I understand what it is, if I can't use it, it does not matter.

  • @Mutual_Information
    @Mutual_Information 2 years ago

    I don't have demo code prepared, but if I did, it would be no better than this one from the GPyTorch docs : docs.gpytorch.ai/en/stable/examples/01_Exact_GPs/Simple_GP_Regression.html They go over how to create a model, tune hypers, and get the predictive distribution. Hope it helps

  • @ali-om4uv
    @ali-om4uv 1 year ago

    Great video. You have a really compelling style! Would you be willing to share your desktop background and the code that generated it via GitHub?

  • @Mutual_Information
    @Mutual_Information 1 year ago

    Sure, here it is: github.com/Duane321/mutual_information/tree/main/computer_background