Tutorial 27- Ridge and Lasso Regression Indepth Intuition- Data Science

Please join my channel as a member to get additional benefits like Data Science materials, members-only live streams, and more
kzread.info/dron/NU_lfiiWBdtULKOw6X0Dig.htmljoin
#Regularization
⭐ Kite is a free AI-powered coding assistant that will help you code faster and smarter. The Kite plugin integrates with all the top editors and IDEs to give you smart completions and documentation while you’re typing. I've been using Kite for a few months and I love it! www.kite.com/get-kite/?
Please subscribe to my other channel too
kzread.info/dron/jWY5hREA6FFYrthD0rZNIw.html
Connect with me here:
Twitter: Krishnaik06
Facebook: krishnaik06
Instagram: krishnaik06

Comments: 404

  • @hipraneth
    @hipraneth · 4 years ago

    Lucid explanation, free of cost. Your passion to make the concept crystal clear is very evident in your eyes... Hats off!!!

  • @shubhamkohli2535
    @shubhamkohli2535 · 3 years ago

    The only person providing this level of knowledge free of cost. Really appreciate it.

  • @aelitata9662
    @aelitata9662 · 4 years ago

    I'm struggling to learn this topic and all I know is y=mx+c. I think this is the clearest one I've watched on YouTube. Thank you sooooo much, and I love your enthusiasm when you tried to explain the confusing parts.

  • @tenvillagesahead4192
    @tenvillagesahead4192 · 3 years ago

    Brilliant. I searched all over the net but couldn't find such an easy yet detailed explanation of Regularization. Thank you very much! Very much considering joining the membership.

  • @AnkJyotishAaman
    @AnkJyotishAaman · 4 years ago

    This guy is legit!! Hats off for the explanation!! Loved it sir, thanks.

  • @marijatosic217
    @marijatosic217 · 4 years ago

    Great video! I appreciate how hard he works to help us really understand the material!

  • @iamfavoured9142
    @iamfavoured9142 · 2 years ago

    100 years of blessings for you. You just gained a subscriber!

  • @sincerelysilvia
    @sincerelysilvia · 1 year ago

    This is the clearest and best explanation of this topic on YouTube. I can't express how thankful I am for this video for finally understanding the concept.

  • @yadikishameer9587
    @yadikishameer9587 · 2 years ago

    I never watched your videos, but after watching this one I regret ignoring your channel. You are a worthy teacher and a data scientist.

  • @HammadMalik
    @HammadMalik · 4 years ago

    Thanks Krish for explaining the intuition behind Ridge and Lasso regression. Very helpful.

  • @harshstrum
    @harshstrum · 4 years ago

    Thank you bhaiya. It feels like every morning when I watch your videos my career slope increases. Thank you for this explanation.

  • @ganeshrao405
    @ganeshrao405 · 3 years ago

    Thank you so much Krish. Your videos on linear regression, Ridge, and Lasso cleared my concepts.

  • @vaish6859
    @vaish6859 · 11 months ago

    You are helping many ML enthusiasts free of cost... Thank you.

  • @mumtahinhabib4314
    @mumtahinhabib4314 · 4 years ago

    This is where I found the best explanation of ridge regression after searching through a lot of videos and documentation. Thank you sir.

  • @BoyClassicall
    @BoyClassicall · 4 years ago

    Concept well explained. I've watched a lot of videos on Ridge regression, but this is the best explained; it shows mathematically the effect of lambda on the slope.

  • @aish_waryaaa
    @aish_waryaaa · 2 years ago

    Krish sir, you are literally saving my master's. Up-to-date explanation, and the effort you are putting in to help us understand. Thank you so much sir.😇🥰

  • @143balug
    @143balug · 4 years ago

    Hi Krish, you are building our confidence in data science with these clear explanations.

  • @auroshisray9140
    @auroshisray9140 · 3 years ago

    Hats off... grateful for valuable content at zero cost.

  • @mithunmiranda
    @mithunmiranda · 1 year ago

    I wish I could like his videos multiple times. You are a great teacher, Kind Sir.

  • @indrasenareddyadulla8490
    @indrasenareddyadulla8490 · 4 years ago

    Sir, you mentioned in your lecture that this concept is complicated, but I never felt it was. You explained it excellently.👌

  • @TheOntheskies
    @TheOntheskies · 3 years ago

    Thank you, for the crystal clear explanation. Now I will remember Ridge and Lasso.

  • @TheR4Z0R996
    @TheR4Z0R996 · 4 years ago

    Keep up the good work, blessings from Italy my friend :)

  • @koderr100
    @koderr100 · 2 years ago

    Now I finally got the key L1 and L2 difference. Thanks a lot!
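
    For reference, the two penalty terms being contrasted can be written out directly. A quick numpy sketch; the coefficient vector `w` and the lambda value are made up for illustration:

    ```python
    import numpy as np

    w = np.array([3.0, -0.5, 0.0, 2.0])  # hypothetical coefficient vector
    lam = 0.1                            # hypothetical regularization strength

    l1_penalty = lam * np.sum(np.abs(w))  # Lasso (L1) adds lambda * sum(|w|)
    l2_penalty = lam * np.sum(w ** 2)     # Ridge (L2) adds lambda * sum(w^2)

    print(l1_penalty, l2_penalty)  # 0.55 and 1.325 for this w
    ```

    The absolute-value penalty is what lets Lasso push small coefficients exactly to zero, while the squared penalty only shrinks them.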

  • @prashanths4455
    @prashanths4455 · 4 years ago

    Krish, an excellent explanation. Thank you so much for this wonderful in-depth intuition.

  • @rishu4225
    @rishu4225 · 1 month ago

    Thanks, the enthusiasm with which you teach also carries over to us. 🥰

  • @aravindvasudev7921
    @aravindvasudev7921 · 1 year ago

    Thank you. Now I have a clear idea of both these regression techniques.

  • @abhishekchatterjee9503
    @abhishekchatterjee9503 · 3 years ago

    You did a great job sir... It helped me a lot in understanding this concept. In 20 minutes I understood the basics. Thank you💯💯

  • @ajithsdevadiga1603
    @ajithsdevadiga1603 · 6 months ago

    Thank you so much for this wonderful explanation, truly appreciate your efforts in helping the data science community.

  • @adijambhulkar1742
    @adijambhulkar1742 · 2 years ago

    Hats off... What a way... What a way to explain, man... Cleared all doubts.

  • @dollysiharath4205
    @dollysiharath4205 · 1 year ago

    You're the best trainer!! Thank you!

  • @adinathshelke5827
    @adinathshelke5827 · 5 months ago

    Perfect explanation. Was wandering around for the whole day, and at the end of the day, found this one.

  • @nehasrivastava8927
    @nehasrivastava8927 · 3 years ago

    Best tutorials for machine learning with in-depth intuition... I think there is no tutorial on YouTube like this... Thank you sir.

  • @ChandanBehera-jp2me
    @ChandanBehera-jp2me · 2 years ago

    I found your free videos better than some paid tutorials... thanks for your work.

  • @BipinYadav-wn1pm
    @BipinYadav-wn1pm · 1 year ago

    After going through tons of videos, finally found the best one, thanks!!

  • @Amir-English
    @Amir-English · 3 months ago

    You made it so simple! Thank you.

  • @moe45673
    @moe45673 · 1 year ago

    Thank you! I thought this was a great explanation (as someone who has listened to a bunch of different ones trying to nail my understanding of this)

  • @belllamoisiere8877
    @belllamoisiere8877 · 2 years ago

    Hello from México. Thank you for your tutorials; they are as if one of my classmates were explaining concepts to me in simple words. A suggestion: please include a short tutorial on ablation of Deep Learning models.

  • @datafuturelab_ssb4433
    @datafuturelab_ssb4433 · 1 year ago

    Best explanation of lasso and ridge regression ever on YouTube... Thanks Krish... You nailed it...

  • @veradesyatnikova2931
    @veradesyatnikova2931 · 2 years ago

    Thank you for the clear and intuitive explanation! Will surely come in handy for my exam

  • @parikhgoyal5506
    @parikhgoyal5506 · 4 years ago

    Thank you very much sir. I found very few useful resources for ridge regression, and yours is definitely good.

  • @juozapasjurksa1400
    @juozapasjurksa1400 · 2 years ago

    Your explanations are sooo clear!

  • @rajk58
    @rajk58 · 4 years ago

    You sir, are amazing!!! Hats off to you!!

  • @JoseAntonio-gu2fx
    @JoseAntonio-gu2fx · 4 years ago

    Thank you very much for sharing. The effort to clarify the concepts, which are the starting point for problem solving, is much appreciated. Greetings from Spain!

  • @sridhar7488
    @sridhar7488 · 2 years ago

    Yes, he's a great guy... I also love watching his videos!

  • @subramanyasagarmylavarapu5286
    @subramanyasagarmylavarapu5286 · 4 years ago

    Hi Krish, very well explained. It really helps me to understand. Thank you.

  • @bhuvaraga
    @bhuvaraga · 2 years ago

    Loved your energy sir, and your conviction to explain and make it clear to your students. I know it is hard to look at the camera and talk; you nailed it. This video really helped me understand the overall concept. My two cents: keep the camera focused on the whiteboard. I think it is autofocusing between you and the whiteboard, and maybe that is why you get that change in brightness too.

  • @Zizou_2014
    @Zizou_2014 · 3 years ago

    Brilliantly done! Thanks Krish

  • @sidduhedaginal
    @sidduhedaginal · 4 years ago

    Just an awesome explanation. Concepts are very clearly explained... thanks for your true effort.

  • @yitbarekmirete6098
    @yitbarekmirete6098 · 2 years ago

    You are awesome, better than our professors at explaining such complex topics.

  • @askpioneer
    @askpioneer · 2 years ago

    Well explained Krish. Thank you for creating this. Great work.

  • @gerardogutierrez4911
    @gerardogutierrez4911 · 4 years ago

    If you pause the video and just watch his facial and body movements, he looks like he's trying his best to convince you to stay with him during a breakup. Then you turn on the audio and it's like he's yelling at you to get you to understand something. Clearly, this man is passionate about teaching Ridge regression and knows a lot. I think it's easier to follow when he checks up on you by saying "you need to understand this", repeats words, and uses his voice to emphasize concepts. I wish he could explain other things to me besides data science.

  • @TheMrIndiankid
    @TheMrIndiankid · 4 years ago

    He will explain the meaning of life to you too.

  • @MrBemnet1
    @MrBemnet1 · 3 years ago

    My next project is counting head shakes in a YouTube video.

  • @tanmay2771999
    @tanmay2771999 · 3 years ago

    @@MrBemnet1 Ngl, that actually sounds interesting.

  • @fratcetinkaya8538
    @fratcetinkaya8538 · 2 years ago

    This is where I finally understood that damn issue. I appreciate it so much, thanks my dear friend :)

  • @dianafarhat9479
    @dianafarhat9479 · 5 months ago

    Amazing explanation, thank you!

  • @vishalaaa1
    @vishalaaa1 · 3 years ago

    Naik is excellent. He is solving everyone's problems.

  • @binnypatel7061
    @binnypatel7061 · 4 years ago

    Awesome job... keep up the good work!

  • @cyborg69420
    @cyborg69420 · 1 year ago

    just wanted to say that I absolutely loved the video

  • @MsGeetha123
    @MsGeetha123 · 2 years ago

    Excellent video!!! Thanks for a very good explanation.

  • @GauravSharma-kb9np
    @GauravSharma-kb9np · 4 years ago

    Great Video sir, you explained each and every step very well.

  • @aseemjain007
    @aseemjain007 · 1 month ago

    Brilliantly explained!! Thank you!!

  • @gandhalijoshi9242
    @gandhalijoshi9242 · 2 years ago

    Very nice explanation. I have started watching your videos and your teaching style is very nice. A very nice YouTube channel for understanding data science. Hats off!!

  • @antonyraja9902
    @antonyraja9902 · 4 years ago

    Amazing 👌 Great explanation 👍 Thanks, and keep making videos like this 🔥

  • @thulasirao9139
    @thulasirao9139 · 3 years ago

    You are doing an awesome job. Thank you so much.

  • @mohammedfaisal6714
    @mohammedfaisal6714 · 4 years ago

    Thanks a lot for your support.

  • @abhi9raj776
    @abhi9raj776 · 4 years ago

    Perfect explanation!!! Thank you sir!

  • @anirbandey8999
    @anirbandey8999 · 7 days ago

    Very good video to understand the intuition behind L1 and L2.

  • @316geek
    @316geek · 2 years ago

    You make it look so easy. Kudos to you, Krish!!!

  • @bahaansari7201
    @bahaansari7201 · 3 years ago

    This is great! Thank you!

  • @rahul281981
    @rahul281981 · 3 years ago

    Very nicely explained. Thank God I found your posts on YouTube while searching for this stuff👍

  • @rayennenounou7065
    @rayennenounou7065 · 3 years ago

    I am writing a master's thesis on lasso regression and need more information about it, but in French. Can you help me?

  • @vladimirkirichenko1972
    @vladimirkirichenko1972 · 1 year ago

    This man has a gift.

  • @maheshurkude4007
    @maheshurkude4007 · 3 years ago

    Thanks for explaining, buddy!

  • @robertasampong
    @robertasampong · 1 year ago

    Absolutely excellent explanation!

  • @_cestd9727
    @_cestd9727 · 3 years ago

    Super clear, thanks for the video!

  • @MohsinKhan-rv7jj
    @MohsinKhan-rv7jj · 1 year ago

    This kind of explanation is truly inspirational. I am truly overfitted by knowledge after seeing your video.❤

  • @abhishekkumar465
    @abhishekkumar465 · 1 year ago

    Reduce the rate of learning; this may help you, as per Ridge regression :P

  • @sandipansarkar9211
    @sandipansarkar9211 · 3 years ago

    Great explanation Krish. I think I am understanding a little bit about L1 and L2 regression. Thanks.

  • @smlekhashree3599
    @smlekhashree3599 · 4 years ago

    Thank you sir... It's a clear explanation.

  • @partheshsoni6421
    @partheshsoni6421 · 4 years ago

    Nice explanation. Thanks a lot!

  • @SahanPradeepthaThilakaratne
    @SahanPradeepthaThilakaratne · 2 months ago

    Your explanations are superb!

  • @saurabhtiwari2541
    @saurabhtiwari2541 · 4 years ago

    Awesome tutorial with clear concepts.

  • @ahmedaj2000
    @ahmedaj2000 · 3 years ago

    THANK YOU SO MUCH!!!!!!! Great explanation!

  • @kanhataak1269
    @kanhataak1269 · 4 years ago

    After watching this, the lecture is not complicated... good teaching sir.

  • @ZubairAzamRawalakot
    @ZubairAzamRawalakot · 9 months ago

    Very informative lecture. You explained it in great detail, thanks.

  • @yamika.
    @yamika. · 2 years ago

    Thank you for this! Finally understood the topic.

  • @kanuparthisailikhith
    @kanuparthisailikhith · 4 years ago

    The best tutorial I have seen to date on this topic. Thanks so much for the clarity.

  • @kiran082
    @kiran082 · 4 years ago

    Thank you Krish, very detailed explanation.

  • @MuhammadAhmad-bx2rw
    @MuhammadAhmad-bx2rw · 3 years ago

    Extraordinarily talented, sir.

  • @dineshpramanik2571
    @dineshpramanik2571 · 4 years ago

    Excellent explanation sir... thanks.

  • @muhammednihas2218
    @muhammednihas2218 · 4 months ago

    Thank you! Good explanation.

  • @heplaysguitar1090
    @heplaysguitar1090 · 3 years ago

    Just one word: fantastic.

  • @swaruppanda2842
    @swaruppanda2842 · 4 years ago

    Thanks, this was quite helpful.

  • @thespeeddemon7832
    @thespeeddemon7832 · 4 months ago

    Thank you so much for this explanation ☺

  • @gunjanagrawal8626
    @gunjanagrawal8626 · 2 years ago

    Very well explained!🙌

  • @saitcanbaskol9897
    @saitcanbaskol9897 · 1 year ago

    Amazing explanations.

  • @walete
    @walete · 4 years ago

    Thank you sir, great explanation.

  • @t-ranosaurierruhl9920
    @t-ranosaurierruhl9920 · 4 years ago

    You are great!! Thanks a lot.

  • @loganwalker454
    @loganwalker454 · 2 years ago

    Regularization was a very abstruse and knotty topic. However, after watching this video, it is a piece of cake. Thank you, Krish.

  • @JEEVANKUMAR-hf4ex
    @JEEVANKUMAR-hf4ex · 2 years ago

    Good explanation without touching any complex math derivations.

  • @mohit10singh
    @mohit10singh · 3 years ago

    Very nicely explained. Awesome, sir. Keep up this good work.

  • @therawkei
    @therawkei · 2 years ago

    This is the best, thank you so much.

  • @ibrahimibrahim6735
    @ibrahimibrahim6735 · 3 years ago

    Thanks, Krish. I want to correct one thing here: the motivation behind the penalty is not to change the slope; it is to reduce the model's complexity. For example, consider the following two models:
    f1: x + y + z + 2*x^2 + 5y^2 + z^2 = 10
    f2: 2*x^2 + 5y^2 + z^2 = 15
    f1 is more complicated than f2, and a complicated model has a higher chance of overfitting. By increasing lambda (the complexity factor), we are more likely to end up with a simpler model. Another example:
    f1: x + 2y + 10z + 5h + 30g = 100
    f2: 10z + 30g = 120
    f2 is simpler than f1. If both models have the same performance on the training data, we would prefer f2 as our model, because a simpler model has less chance of overfitting.
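
    The shrinkage described in this comment can be seen directly. A minimal sketch using scikit-learn on made-up data; the alpha value is arbitrary and plays the role of lambda:

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression, Ridge

    # Toy data: y depends mostly on the first feature; the rest are noise.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 5))
    y = 3.0 * X[:, 0] + 0.1 * rng.normal(size=50)

    ols = LinearRegression().fit(X, y)
    ridge = Ridge(alpha=10.0).fit(X, y)  # alpha is lambda in the video's notation

    # Ridge pulls every coefficient toward zero, i.e. toward a simpler model.
    print("OLS   coefficient magnitudes:", np.round(np.abs(ols.coef_), 3))
    print("Ridge coefficient magnitudes:", np.round(np.abs(ridge.coef_), 3))
    ```

    Increasing `alpha` shrinks the coefficients further, trading a little training error for a less complex model.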

  • @ashishveera3431
    @ashishveera3431 · 3 years ago

    Amazing explanation... thank you!

  • @sahilzele2142
    @sahilzele2142 · 4 years ago

    So the basic idea is:
    1) A steeper slope leads to overfitting @8:16 (what he basically means is that the overfitted line we have has a steeper slope, which, on the contrary, does not quite justify his statement).
    2) Adding lambda*(slope)^2 increases the value of the cost function for the overfitted line, which leads to a reduction of the slopes, or 'thetas', or m's (these are all the same thing) @10:03.
    3) Now that the cost function value for the overfitted line is no longer the minimum, another best line is selected by reducing the slopes, which again adds lambda*(slope)^2, just with a smaller slope this time @13:45.
    4) Doing this overcomes overfitting, as the new best-fit line has less variance (more successful on unseen data) and slightly more bias than our previous line @14:10; the bias may be more because it was 0 for the overfitted line.
    5) Lambda can also be called a scaling factor or inflation rate used to control the regularization.
    As for the question of what happens if we have an overfitted line with a less steep slope: I think we'll find the best-fit line with an even less steep slope (maybe close to slope ~0 but != 0) @16:30. And tadaa!!!! We have reduced overfitting successfully!! Please correct me if anything's wrong.

  • @faizanzahid490
    @faizanzahid490 · 4 years ago

    I have the same queries, bro.

  • @supervickeyy1521
    @supervickeyy1521 · 4 years ago

    For the 1st point: what if the test data has the same slope value as the train data? In such a case there won't be overfitting, correct?

  • @angshumansarma2836
    @angshumansarma2836 · 4 years ago

    Just remember the 4th point: the main goal of regularization is to generalize better to the test dataset while accepting some error on the training dataset.

  • @chetankumarnaik9293
    @chetankumarnaik9293 · 3 years ago

    First of all, no linear regression can be built with just two data points. He is not aware of degrees of freedom.

  • @Kmrabhinav569
    @Kmrabhinav569 · 3 years ago

    The basic idea is to use lambda (also known as the regularization parameter) to control the penalty term lambda*(slope). Here "slope" covers the various values of m: if y = m1x1 + m2x2 and so on, we have many values of m(i). We adjust the value of lambda so that the extra m(i) stop mattering, and we are then able to remove them, i.e. remove the extra features from the model. We do this because one of the major causes of overfitting is the addition of extra features; by getting rid of these features, we can curb overfitting. Hope this helps.
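
    The feature-removal behaviour described in this thread is easy to verify. A minimal sketch with scikit-learn on made-up data; the alpha values are arbitrary:

    ```python
    import numpy as np
    from sklearn.linear_model import Ridge, Lasso

    # Toy data: only the first 2 of 10 features actually matter.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(100, 10))
    y = 4.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=100)

    ridge = Ridge(alpha=1.0).fit(X, y)
    lasso = Lasso(alpha=0.5).fit(X, y)

    # Ridge shrinks coefficients but keeps them non-zero; Lasso drives the
    # irrelevant ones exactly to zero, effectively removing those features.
    print("Ridge zero coefficients:", int(np.sum(ridge.coef_ == 0)))
    print("Lasso zero coefficients:", int(np.sum(lasso.coef_ == 0)))
    ```

    This is why Lasso is often described as doing feature selection, while Ridge only dampens the influence of every feature.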
