Tutorial 27- Ridge and Lasso Regression Indepth Intuition- Data Science
Please join my channel as a member to get additional benefits like Data Science materials, members-only live streams, and more
kzread.info/dron/NU_lfiiWBdtULKOw6X0Dig.htmljoin
#Regularization
⭐ Kite is a free AI-powered coding assistant that will help you code faster and smarter. The Kite plugin integrates with all the top editors and IDEs to give you smart completions and documentation while you’re typing. I've been using Kite for a few months and I love it! www.kite.com/get-kite/?
Please subscribe to my other channel too
kzread.info/dron/jWY5hREA6FFYrthD0rZNIw.html
Connect with me here:
Twitter: Krishnaik06
Facebook: krishnaik06
instagram: krishnaik06
Comments: 404
Lucid explanation, free of cost. Your passion for making the concept crystal clear is evident in your eyes... Hats off!!!
The only person providing this level of knowledge free of cost. Really appreciate it.
I was struggling to learn this topic and all I knew was y=mx+c. I think this is the clearest one I've watched on YouTube. Thank you sooooo much, and I love your enthusiasm when you explain the confusing parts
Brilliant. I searched all over the net but couldn't find such an easy yet detailed explanation of Regularization. Thank you very much! Very much considering joining the membership
This guy is legit!! Hats off for the explanation!! Loved it sir, thanks
Great video! I appreciate how hard his effort is to help us really understand the material!
100 years of blessing for you. You just gained a subscriber!
This is the clearest and best explanation of this topic on YouTube. I can't express how thankful I am for this video for finally making me understand the concept
I had never watched your videos, but after this one I regret ignoring your channel. You are a worthy teacher and a data scientist.
Thanks Krish for explaining the intuition behind Ridge and Lasso regression. Very helpful.
Thank you bhaiya. It feels like every morning when I watch your videos, my career slope increases. Thank you for this explanation.
Thank you soo much Krish, Linear regression + Ridge + Lasso cleared my concepts with your videos.
You are helping many of the ML enthusiasts free of cost... Thank you
This is where I have found the best explanation of ridge regression after searching a lot of videos and documentations. thank you sir
Concept well explained. I've watched a lot of videos on Ridge regression, but this is the best explained one, showing mathematically the effect of lambda on the slope.
Krish sir, you are literally saving my master's with your up-to-date explanations and the effort you put in to help us understand. Thank you so much, sir.😇🥰
Hi Krish, you are building our confidence in data science with your clear explanations
Hats off... grateful for valuable content at zero cost
I wish I could like his videos multiple times. You are a great teacher, Kind Sir.
Sir, you mentioned in your lecture that this concept is complicated, but I never felt it was. You explained it excellently. 👌👌👌
Thank you, for the crystal clear explanation. Now I will remember Ridge and Lasso.
Keep up the good work, blessing from italy My friend :)
Now I finally got the key L1 and L2 difference. Thanks a lot!
Krish An excellent explanation. Thank you so much for this wonderful in-depth intuition.
Thanks, the enthusiasm with which you teach also carries over to us. 🥰
Thank you. Now I got a clear idea on both these regression techniques.
You did a great job sir.... It helped me a lot in understanding this concept. In 20 minutes I understood the basics of this concept. Thank you💯💯
Thank you so much for this wonderful explanation, truly appreciate your efforts in helping the data science community.
Hats off... What a way... What a way to explain man... Clear...all doubts
You're the best trainer!! Thank you!
Perfect explanation. Was wandering around for the whole day, and at the end of the day, found this one.
Best tutorials for machine learning with in-depth intuition... I think there is no tutorial on YouTube like this... Thank you sir..
I found your free videos better than some paid tutorials... thanks for your work
After going through tons of videos, finally found the best one, thanks!!
You made it so simple! Thank you.
Thank you! I thought this was a great explanation (as someone who has listened to a bunch of different ones trying to nail my understanding of this)
Hello from México. Thank you for your tutorials; they are as if one of my classmates were explaining concepts to me in simple words. A suggestion: please include a short tutorial on ablation of Deep Learning models.
Best explanation of lasso and ridge regression ever on YouTube... Thanks Krish... You nailed it...
Thank you for the clear and intuitive explanation! Will surely come in handy for my exam
Thank you very much sir, I found very few useful resources for ridge regression and yours is definitely good
Your explanations are sooo clear!
You sir, are amazing!!! Hats off to you!!
Thank you very much for sharing. The effort to clarify the concepts, which are the starting point for solving problems, is much appreciated. Greetings from Spain!
@sridhar7488
2 years ago
Yes, he's a great guy... I also love watching his videos!
Hi Krish, very well explained. It really helps me to understand. Thank you.
Loved your energy, sir, and your conviction to explain and make it clear to your students. I know it is hard to look at the camera and talk - you nailed it. This video really helped me understand the overall concept. My two cents: keep the camera focused on the white board; I think it is autofocusing between you and the white board, and maybe that is why you also get that change in brightness.
Brilliantly done! Thanks Krish
Just an awesome explanation. concepts are very clearly explained ...thanks for your true effort
you are awesome, better than our professors in explaining such complex topics.
well explained krish. thank you for creating . great work
If you pause the video and just watch his facial movements and body movements, he looks like he's trying his best to convince you to stay with him during a break-up. Then you turn on the audio and it's like he's yelling at you to get you to understand something. Clearly, this man is passionate about teaching Ridge regression and knows a lot. I think it's easier to follow when he checks up on you by saying "you need to understand this", repeats words, and uses his voice to emphasize concepts. I wish he could explain other things to me besides data science.
@TheMrIndiankid
4 years ago
he will explain the meaning of life to you too
@MrBemnet1
3 years ago
my next project is counting head shakes in a YouTube video.
@tanmay2771999
3 years ago
@MrBemnet1 Ngl, that actually sounds interesting.
Here is where I finally understood that damn issue. I appreciate it so much, thanks my dear friend :)
Amazing explanation, thank you!
This Naik is excellent. He is solving everyone's problems.
Awesome job.....keep up with the good work!
just wanted to say that I absolutely loved the video
Excellent video!!! Thanks for a very good explanation.
Great Video sir, you explained each and every step very well.
Brilliantly explained !! thankyou !!
Very nice explanation. I have started watching your videos and your teaching style is very nice. A very nice YouTube channel for understanding data science - hats off!!
Amazing 👌 Great explanation 👍 Thanks and keep doing videos like this🔥
You are doing awesome job. Thank you so much
Thanks a lot for your Support
perfect explanation!!! thank you sir !
Very good video to understand the intuition behind L1, L2
you make it look so easy, kudos to you Krish!!!
this is great! thank you!
Very nicely explained, thank God I found your posts on YouTube while searching for this stuff👍
@rayennenounou7065
3 years ago
I have a Master's 2 thesis (mémoire) about lasso regression and I need more information about lasso regression, but in French. Can you help me?
This man has a gift.
thanks for explaining Buddy!
Absolutely excellent explanation!
super clear, thanks for the video!
The kind of explanation is truly inspirational. I am truly overfitted by knowledge after seeing your video.❤
@abhishekkumar465
A year ago
Reduce the rate of learning, this may help you as per Ridge regression :P
Great explanation, Krish. I think I am understanding a little bit about L1 and L2 regression. Thanks
Thank you sir.. It's a clear explanation..
Nice explanation. Thanks a lot!
Your explanations are superbbb!
Awesome tutorial with clear concepts
THANK YOU SO MUCH!!!!!!! great explanation!
After watching this, the lecture is not complicated... good teaching, sir
Very informative lecture dear. You explained with maximum detail. thanks
thank you for this! finally understood the topic
The best tutorial I have seen till date on this topic. Thanks so much for clarity
Thank you Krish very detailed explanation
Extraordinary talented Sir
Excellent explanation sir...thanks
thank you ! good explanation
Just one word, Fantastic.
Thanks this was quite helpful
thank you so much for this explanation ☺
Very well explained!🙌
Amazing explanations.
thank you sir, great explanation
You are great!! Thanks a lot
Regularization was a very abstruse and knotty topic. However, after watching this video, it is a piece of cake. Thank you, Krish
good explanation without touching any complex maths derivations.
Very nicely explained. awesome Sir. keep up this good work.
this is the best , thank you so much
Thanks, Krish. I want to correct one thing here: the motivation behind the penalty is not to change the slope; it is to reduce the model's complexity. For example, consider the following two models:
f1: x + y + z + 2x^2 + 5y^2 + z^2 = 10
f2: 2x^2 + 5y^2 + z^2 = 15
f1 is more complicated than f2. Clearly, a complicated model has a higher chance of overfitting. By increasing lambda (the complexity factor), we are more likely to end up with a simpler model. Another example:
f1: x + 2y + 10z + 5h + 30g = 100
f2: 10z + 30g = 120
f2 is simpler than f1. If both models have the same performance on the training data, we would prefer f2 as our model, because a simpler model has less chance of overfitting.
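The shrinking effect this comment describes can be seen directly in code. A minimal sketch, assuming scikit-learn and NumPy are available; the synthetic data and the two alpha values (scikit-learn's name for the video's lambda) are made up for illustration:

```python
import numpy as np
from sklearn.linear_model import Ridge

# Synthetic data: 50 samples, 5 features, known true coefficients.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = X @ np.array([3.0, -2.0, 1.5, 0.0, 0.5]) + rng.normal(scale=0.1, size=50)

# Fit ridge regression with a small and a large penalty strength.
small = Ridge(alpha=0.01).fit(X, y)
large = Ridge(alpha=100.0).fit(X, y)

# The overall size (L2 norm) of the coefficients shrinks as alpha grows,
# i.e. a larger penalty yields a "simpler" model.
print(np.linalg.norm(small.coef_), np.linalg.norm(large.coef_))
```

With a tiny alpha the fit is essentially ordinary least squares; with a large alpha every coefficient is pulled toward zero, which is exactly the complexity reduction described above.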
amazing explanation.. thank you!
So the basic idea is:
1) A steeper slope leads to overfitting @8:16 (what he basically means is that the overfitted line we have has a steeper slope; the statement on its own doesn't quite justify that).
2) Adding lambda*(slope)^2 increases the value of the cost function for the overfitted line, which leads to a reduction of the slopes, 'thetas', or m's (these are all the same thing) @10:03.
3) Now that the cost of the overfitted line is no longer the minimum, another best-fit line is selected by reducing the slopes, which again adds lambda*(slope)^2, just with a smaller slope this time @13:45.
4) Doing this overcomes overfitting: the new best-fit line has less variance (it generalizes better beyond the training data) and slightly more bias @14:10; the bias was 0 for the overfitted line, so it will be a bit higher for the new line.
5) Lambda can also be thought of as a scaling factor or inflation rate to control the amount of regularization.
As for the question "what happens if we have an overfitted line with a less steep slope?" - I think we'll find a best-fit line with an even less steep slope (maybe close to slope ~ 0 but != 0) @16:30, and tadaa!!!! we have reduced overfitting successfully!!
Please correct me if anything's wrong
@faizanzahid490
4 years ago
I have the same queries, bro.
@supervickeyy1521
4 years ago
For the 1st point: what if the test data has the same slope value as the train data? In that case there won't be overfitting, correct?
@angshumansarma2836
4 years ago
Just remember the 4th point: the main goal of regularization is to generalize better to the test dataset while accepting some error on the training dataset
@chetankumarnaik9293
3 years ago
First of all, no meaningful linear regression can be built with just two data points. He is not aware of degrees of freedom.
@Kmrabhinav569
3 years ago
The basic idea is to use lambda (also known as the regularization parameter) to shrink the penalty term lambda*|slope|. Here "slope" covers the various values of m: if y = m1x1 + m2x2 and so on, we have many values of m(i). We try to adjust lambda so that those extra m(i) stop mattering, and we can then remove them, i.e. drop the extra features from the model. And we do this because one of the major causes of overfitting is the addition of extra features; by getting rid of those features, we can curb the problem of overfitting. Hope this helps.
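The feature-removal behaviour described in this reply is specific to the L1 (lasso) penalty. A minimal sketch, assuming scikit-learn and NumPy; the synthetic data and the alpha value (scikit-learn's name for lambda) are made up for illustration:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Synthetic data: 6 features, but only the first two actually influence y.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))
y = 4.0 * X[:, 0] - 3.0 * X[:, 1] + rng.normal(scale=0.1, size=100)

# With a large enough L1 penalty, the coefficients of the irrelevant
# features are driven to exactly zero - lasso performs feature selection.
model = Lasso(alpha=0.5).fit(X, y)
print(model.coef_)
```

Under these assumptions, only the first two coefficients stay nonzero (shrunk somewhat toward zero), while the four irrelevant ones are removed exactly - which ridge's L2 penalty never does, since it only shrinks coefficients without zeroing them.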