Bayesian Linear Regression: Data Science Concepts

The crazy link between Bayes Theorem, Linear Regression, LASSO, and Ridge!
LASSO Video: • Lasso Regression
Ridge Video: • Ridge Regression
Intro to Bayesian Stats Video: • What the Heck is Bayes...
My Patreon: www.patreon.com/user?u=49277905

Comments: 169

  • @brycedavis5674
    3 years ago

    As soon as you explained the results from the Bayesian approach, my jaw was wide open for like 3 minutes. This is so interesting.

  • @kunalchakraborty3037
    3 years ago

    Read it in a book. Didn't understand jack shit back then. Your videos are awesome: rich, small, concise. Please make a video on Linear Discriminant Analysis and how it's related to Bayes' theorem. This video will be saved in my data science playlist.

  • @tobias2688
    3 years ago

    This video is a true gem, informative and simple at once. Thank you so much!

  • @ritvikmath

    3 years ago

    Glad it was helpful!

  • @fluidice1656
    1 year ago

    This is my favorite video out of a large set of fantastic videos that you have made. It just brings everything together in such a brilliant way. I keep getting back to it over and over again. Thank you so much!

  • @icybrain8943
    3 years ago

    Regardless of how they were really initially devised, seeing the regularization formulas pop out of the Bayesian linear regression model was eye-opening - thanks for sharing this insight!

  • @dennisleet9394

    2 years ago

    Yes. This really blew my mind. Boom.

  • @rishabhbhatt7373
    1 year ago

    Really good explanation. I really like how you gave context and connected all the topics together, and it makes perfect sense, while maintaining the perfect balance between math and intuition. Great work. Thank you!

  • @jlpicard7
    6 months ago

    I've seen everything in this video many, many times, but no one had done as good a job as this in pulling these ideas together in such an intuitive and understandable way. Well done and thank you!

  • @MoumitaHanra
    2 years ago

    Best of all videos on Bayesian regression; other videos are so boring and long, but this one has quality as well as ease of understanding. Thank you so much!

  • @tj9796
    3 years ago

    Your videos are great. Love the connections you make so that stats is intuitive as opposed to plug and play formulas.

  • @rajanalexander4949
    1 year ago

    This is incredible. Clear, well paced and explained. Thank you!

  • @davidelicalsi5915
    1 year ago

    Brilliant and clear explanation, I was struggling to grasp the main idea for a Machine Learning exam but your video was a blessing. Thank you so much for the amazing work!

  • @mohammadkhalkhali9635
    3 years ago

    Man I'm going to copy-paste your video whenever I want to explain regularization to anyone! I knew the concept but I would never explain it the way you did. You nailed it!

  • @sambacon2141
    3 years ago

    Man! What a great explanation of Bayesian Stats. It's all starting to make sense now. Thank you!!!

  • @chenqu773
    1 year ago

    For me, the coolest thing about statistics is that every time I do a refresh on these topics, I get some new ideas or understandings. It's lucky that I came across this video after a year; it could also explain why we need to "normalize" the X (0-centered, with stdev = 1) before we feed it into an MLP model, if we use regularization terms in the layers.
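
    To make that scaling point concrete, here is a minimal numpy sketch (the feature matrix and its column scales are made up for illustration): with an L1/L2 penalty in the objective, every coefficient is shrunk by the same weight, so the columns of X are usually standardized to mean 0 and standard deviation 1 first.

        import numpy as np

        # Hypothetical feature matrix whose columns live on very different scales.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(100, 3)) * np.array([1.0, 100.0, 0.01])

        # Standardize each column to mean 0 and standard deviation 1, so a single
        # penalty weight shrinks all coefficients comparably instead of punishing
        # whichever feature happens to be measured in small units.
        X_std = (X - X.mean(axis=0)) / X.std(axis=0)
        print(X_std.mean(axis=0).round(6), X_std.std(axis=0))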

  • @fktx3507
    2 years ago

    Thanks, man. A really good and concise explanation of the approach (together with the video on Bayesian statistics).

  • @feelmiranda
    2 years ago

    Your videos are a true gem, and an inspiration even. I hope to be as instructive as you are if I ever become a teacher!

  • @SaiVivek15
    2 years ago

    This video is super informative! It gave me the actual perspective on regularization.

  • @Structuralmechanic
    5 months ago

    Amazing, you kept it simple and showed how the regularization terms in linear regression originate from the Bayesian approach!! Thank you!

  • @mateoruizalvarez1733
    4 months ago

    Crystal clear! Thank you so much, the explanation is very structured and detailed.

  • @chiawen.
    9 months ago

    This is sooo clear. Thank you so much!

  • @dylanwatts4463
    3 years ago

    Amazing video! Really clearly explained! Keep em coming!

  • @ritvikmath

    3 years ago

    Glad you liked it!

  • @qiguosun129
    2 years ago

    Excellent tutorial! I have applied the ridge penalty in the loss function of different models. However, this is the first time I've understood the mathematical meaning of lambda. It is really cool!

  • @sebastianstrumbel4335
    3 years ago

    Awesome explanation! Especially the details on the prior were so helpful!

  • @ritvikmath

    3 years ago

    Glad it was helpful!

  • @antaresd1
    9 months ago

    Thank you for this amazing video, it clarified many things for me!

  • @umutaltun9049
    2 years ago

    It just blew my mind too. I can feel you, brother. Thank you!

  • @FRequena
    3 years ago

    Super informative and clear lesson! Thank you very much!

  • @alim5791
    2 years ago

    Thanks, that was a good one. Keep up the good work!

  • @JohnJones-rp2wz
    3 years ago

    Awesome explanation!

  • @dirknowitzki9468
    2 years ago

    Your videos are a Godsend!

  • @benjtheo414
    11 months ago

    This was awesome, thanks a lot for your time :)

  • @AntonioMac3301
    2 years ago

    This video is amazing!!! Such a helpful and clear explanation.

  • @marcogelsomini7655
    1 year ago

    Very cool, the link you explained between regularization and the prior.

  • @ezragarcia6910
    1 year ago

    My mind exploded with this video. Thanks.

  • @javiergonzalezarmas8250
    1 year ago

    Incredible explanation!

  • @TejasEkawade
    7 months ago

    This was an excellent introduction to Bayesian Regression. Thanks a lot!

  • @julissaybarra4031
    7 months ago

    This was incredible, thank you so much.

  • @dodg3r123
    3 years ago

    Love this content! More examples like this are appreciated

  • @ritvikmath

    3 years ago

    More to come!

  • @chuckleezy
    11 months ago

    You are so good at this; this video is amazing.

  • @ritvikmath

    11 months ago

    Thank you so much!!

  • @sudipanpaul805
    11 months ago

    Love you, bro. I got my joining letter from NASA as a Scientific Officer-1; believe me, your videos always helped me in my research work.

  • @Maciek17PL
    1 year ago

    You are a great teacher thank you for your videos!!

  • @FB0102
    1 year ago

    truly excellent explanation; well done

  • @user-or7ji5hv8y
    2 years ago

    This is truly cool. I had the same thing with the lambda. It’s good to know that it was not some engineering trick.

  • @brandonjones8928
    3 months ago

    This is an awesome explanation

  • @curiousobserver2006
    1 year ago

    This blew my mind. Thanks!

  • @joachimrosenberger2109
    1 year ago

    Thanks a lot! Great! I am reading Elements of Statistical Learning and did not understand what they were talking about. Now I got it.

  • @mohammadmousavi1
    1 year ago

    Unbelievable, you explained linear regression, explained Bayesian statistics in simple terms, and showed the connection, all in under 20 minutes... Perfect.

  • @juliocerono_stone5365
    3 months ago

    At last!!! Now I can see what lambda was doing in the lasso and ridge regression!! Great video!!

  • @ritvikmath

    3 months ago

    Glad you liked it!

  • @shantanuneema
    3 years ago

    You got a subscriber, awesome explanation. I spent hours learning it from other sources, but with no success. You are just great.

  • @nirmalpatil5370
    2 years ago

    This is brilliant, man! Brilliant! Literally solved where the lambda comes from!

  • @caiocfp
    3 years ago

    Thank you for sharing this fantastic content.

  • @ritvikmath

    3 years ago

    Glad you enjoy it!

  • @rmiliming
    1 year ago

    Thanks a lot for this clear explanation!

  • @amirkhoutir2649
    1 year ago

    thank you so much for the great explanation

  • @mahdijavadi2747
    2 years ago

    Thanks a lottttt! I had so much difficulty understanding this.

  • @SamuelMMuli-sy6wk
    2 years ago

    wonderful stuff! thank you

  • @Life_on_wheeel
    3 years ago

    Thanks for the video. It's really helpful. I was trying to understand where the regularization terms come from. Now I get it. Thanks.

  • @dmc-au
    1 year ago

    Wow, killer video. This was a topic where it was especially nice to see everything written on the board in one go. Was cool to see how a larger lambda implies a more pronounced prior belief that the parameters lie close to 0.

  • @ritvikmath

    1 year ago

    I also think it’s pretty cool 😎

  • @narinpratap8790
    3 years ago

    Awesome video. I didn't realize that the L1, L2 regularization had a connection with the Bayesian framework. Thanks for shedding some much needed light on the topic. Could you please also explain the role of MCMC Sampling within Bayesian Regression models? I recently implemented a Bayesian Linear Regression model using PyMC3, and there's definitely a lot of theory involved with regards to MCMC NUTS (No U-Turn) Samplers and the associated hyperparameters (Chains, Draws, Tune, etc.). I think it would be a valuable video for many of us. And of course, keep up the amazing work! :D
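
    For readers curious what that looks like in code, here is a rough sketch of a Bayesian linear regression written against PyMC3's 3.x API and sampled with NUTS; the toy data, prior scales, and sampler settings below are placeholders, not anything from the video.

        import numpy as np
        import pymc3 as pm

        # Toy data standing in for X and y (purely illustrative).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 2))
        y = X @ np.array([1.5, -0.7]) + rng.normal(scale=0.5, size=100)

        with pm.Model():
            beta = pm.Normal("beta", mu=0.0, sigma=1.0, shape=2)   # Gaussian prior on coefficients
            sigma = pm.HalfNormal("sigma", sigma=1.0)              # prior on the noise scale
            mu = pm.math.dot(X, beta)
            pm.Normal("y_obs", mu=mu, sigma=sigma, observed=y)
            trace = pm.sample(1000, tune=1000)                     # NUTS is the default sampler

        print(pm.summary(trace))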

  • @ritvikmath

    3 years ago

    good suggestion!

  • @juliocerono5193
    3 months ago

    At last!! I could find an explanation for the lasso and ridge regression lambdas!!! Thank you!!!

  • @ritvikmath

    3 months ago

    Happy to help!

  • @houyao2147
    3 years ago

    What a wonderful explanation!!

  • @ritvikmath

    3 years ago

    Glad you think so!

  • @samirelamrany5323
    1 year ago

    Perfect explanation, thank you!

  • @kennethnavarro3496
    2 years ago

    Thank you very much. Pretty helpful video!

  • @vipinamar8323
    2 years ago

    Great video with a very clear explanation. Could you also do a video on Bayesian logistic regression?

  • @chenjus
    2 years ago

    This is the best explanation of L1 and L2 I've ever heard

  • @undertaker7523
    1 year ago

    You are the go-to for me when I need to understand topics better. I understand Bayesian parameter estimation thanks to this video! Any chance you can do something on the difference between Maximum Likelihood and Bayesian parameter estimation? I think anyone that watches both of your videos will be able to pick up the details but seeing it explicitly might go a long way for some.

  • @swapnajoysaha6982
    4 months ago

    I used to be afraid of Bayesian Linear Regression until I saw this vid. Thank you sooo much

  • @ritvikmath

    4 months ago

    Awesome! You're welcome!

  • @j29Productions
    5 months ago

    You are THE LEGEND

  • @kaartiki1451
    3 months ago

    Legendary video

  • @manishbhanu2568
    1 year ago

    you are a great teacher!!!🏆🏆🏆

  • @ritvikmath

    1 year ago

    Thank you! 😃

  • @rachelbarnes7469
    3 years ago

    thank you so much for this

  • @alexanderbrandmayr7408
    3 years ago

    Great video!!

  • @godse54
    3 years ago

    Nice, I never thought of that 👍🏼👍🏼

  • @axadify
    2 years ago

    Such a nice explanation. I mean, that's the first time I actually understood it.

  • @louisc2016
    2 years ago

    Fantastic! You are my savior!

  • @matthewkumar7756
    2 years ago

    Mind blown on the connection between regularization and priors in linear regression

  • @datle1339
    1 year ago

    very great, thank you

  • @abdelkaderbousabaa7020
    2 years ago

    Excellent, thank you!

  • @chenqu773
    3 years ago

    Thank you very much

  • @Aviationlads
    8 months ago

    Great video, do you have some sources I can use for my university presentation? You helped me a lot 🙏 thank you!

  • @haeunroh8945
    2 years ago

    Your videos are awesome, so much better than my prof's.

  • @hameddadgour
    1 year ago

    Holy shit! This is amazing. Mind blown :)

  • @millch2k8
    1 year ago

    I'd never considered a Bayesian approach to linear regression, let alone its relation to lasso/ridge regression. Really enlightening to see!

  • @ritvikmath

    1 year ago

    Thanks!

  • @petmackay
    3 years ago

    Most insightful! L1 as a Laplacian prior toward the end was a bit skimpy, though. Maybe I should watch your LASSO clip. Could you do a video on elastic net? Insight on balancing the L1 and L2 norms would be appreciated.
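
    On that point, a short sketch in my own notation (assuming independent zero-mean Laplace priors with scale b, which is how the L1 connection is usually set up):

        % Laplace prior on each coefficient, independent across j = 1..p:
        P(\beta) = \prod_{j=1}^{p} \frac{1}{2b}\exp\!\left(-\frac{|\beta_j|}{b}\right)
        \quad\Rightarrow\quad
        -\ln P(\beta) = p\,\ln(2b) + \frac{1}{b}\,\lVert \beta \rVert_1 .
        % Maximizing the posterior therefore adds an L1 penalty whose strength is set
        % by the prior scale b (smaller b = tighter prior = stronger shrinkage): LASSO.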

  • @danielwiczew

    2 years ago

    Yeah, elastic net and a comparison to ridge/lasso would be very helpful.

  • @yulinliu850
    3 years ago

    Beautiful!

  • @ritvikmath

    3 years ago

    Thank you! Cheers!

  • @souravdey1227
    1 year ago

    Can you please, please do a series on the categorical distribution, multinomial distribution, Dirichlet distribution, Dirichlet process, and finally non-parametric Bayesian tensor factorisation, including clustering of streaming data? I will personally pay you for this. I mean it!! There are a few videos on these things on YouTube; some are good, some are way high-level. But no one can explain the way you do. This simple video has such profound importance!!

  • @jaivratsingh9966
    2 years ago

    Excellent

  • @TK-mv6sq
    1 year ago

    thank you!

  • @julianneuer8131
    3 years ago

    Excellent!

  • @ritvikmath

    3 years ago

    Thank you! Cheers!

  • @shipan5940
    2 years ago

    Max ( P(this is the best vid explaining these regressions | KZread) )

  • @jairjuliocc
    3 years ago

    Thank you, I saw this before but I didn't understand. Please, where can I find the complete derivation? And maybe you can do a complete series on this topic.

  • @JorgeGomez-kt3oq
    10 months ago

    Great video, just a question: where can I get an example of the algebra?

  • @vinceb8041
    3 years ago

    Amazing! But where did Ridge and Lasso start from? Were they invented with Bayesian statistics as a starting point, or is that a duality that came later?

  • @karannchew2534
    1 year ago

    Notes for my future revision. *Prior β* 10:30 The value of the prior β is normally distributed. The by-product of using a normal distribution is regularisation, because the prior values of β won't stray too far above (or below) the mean. Regularisation keeps the values of β small.
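
    A compact version of that note in symbols (assuming, as in the video, a prior β ~ N(0, τ²I)):

        % Gaussian prior on beta; its log-density is a negative squared L2 norm.
        \beta \sim \mathcal{N}(0,\, \tau^2 I)
        \quad\Rightarrow\quad
        \ln P(\beta) = \text{const} - \frac{\lVert \beta \rVert_2^2}{2\tau^2} .
        % The MAP estimate therefore trades data fit against keeping beta near the
        % prior mean 0, which is exactly the ridge (L2) regularization effect.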

  • @imrul66
    1 year ago

    Great video. The relation between the prior and the LASSO penalty was a "wow" moment for me. It would be helpful to see an actual computational example in Python or R. A common problem I see in Bayesian lectures is too much focus on the math rather than showing how, and how much, the resulting parameters actually differ - especially when to consider the Bayesian approach over OLS.
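
    Not from the video, but here is a minimal numpy sketch of the kind of comparison being asked for: under a Gaussian prior the MAP estimate has the closed-form ridge solution, and you can watch it shrink the OLS coefficients toward zero. The data, dimensions, and the assumed known σ and τ below are all made up.

        import numpy as np

        # Synthetic data: only a few coefficients are truly nonzero.
        rng = np.random.default_rng(0)
        n, p = 50, 10
        X = rng.normal(size=(n, p))
        true_beta = np.zeros(p)
        true_beta[:3] = [2.0, -1.0, 0.5]
        sigma, tau = 1.0, 0.5                      # noise std and prior std, assumed known
        y = X @ true_beta + rng.normal(scale=sigma, size=n)

        lam = sigma**2 / tau**2                    # ridge weight implied by the Gaussian prior
        beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
        beta_map = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

        print("OLS:", beta_ols.round(2))
        print("MAP:", beta_map.round(2))           # shrunk toward the prior mean 0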

  • @juanete69
    1 year ago

    Do we need to use cross-validation in Bayesian analysis to control overfitting?

  • @convex9345
    3 years ago

    mind boggling

  • @ThePiotrekpecet
    1 year ago

    There is an error at the beginning of the video: in frequentist approaches, X is treated as non-random covariate data and y is the random part, so the high variance of OLS should be expressed as "small changes to y => big changes to the OLS estimator." Changes to the covariate matrix producing big changes to the OLS estimator is more like non-robustness of OLS w.r.t. outlier contamination. Also, the lambda should be 1/(2τ²), not σ²/τ², since ln P(β) = -p·ln(τ√(2π)) - ||β||₂²/(2τ²). Overall this was very helpful, cheers!
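
    For what it's worth, the two constants describe the same estimator and differ only in how the objective is scaled; a quick sketch with the same Gaussian likelihood and prior:

        % Negative log-posterior (dropping constants), and the same objective rescaled by 2*sigma^2:
        \arg\min_{\beta}\; \frac{\lVert y - X\beta \rVert_2^2}{2\sigma^2} + \frac{\lVert \beta \rVert_2^2}{2\tau^2}
        \;=\;
        \arg\min_{\beta}\; \lVert y - X\beta \rVert_2^2 + \frac{\sigma^2}{\tau^2}\,\lVert \beta \rVert_2^2 .
        % With the first scaling the penalty weight is 1/(2*tau^2); with the second it is sigma^2/tau^2.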

  • @rbpict5282
    2 years ago

    Is there an algorithm/method for choosing the tau or lambda? Or should I run the algorithm for multiple values of tau/lambda?

  • @AnotherBrickinWall
    1 year ago

    Great, thanks! I was feeling the same discomfort about the origin of these...

  • @bibiha3149
    3 years ago

    Thanks from Korea. I love you!

  • @ritvikmath

    3 years ago

    You're welcome!!!

  • @hws9999
    2 years ago

    What about Bayesian logistic regression? Could you post a video about it?