Tutorial 26- Linear Regression Indepth Maths Intuition- Data Science

Please join my channel as a member to get additional benefits like Data Science materials, live streams for members, and more:
/ @krishnaik06

Connect with me here:
Twitter: / krishnaik06
Facebook: / krishnaik06
Instagram: / krishnaik06

Comments: 322

  • @mohitpatel7876 · 4 years ago

    Best explanation of the cost function. We learned it as master's students and the course couldn't explain it as well. Simply brilliant.

  • @nandinibalyapally3388 · 3 years ago

    I never understood what a gradient descent and a cost function are until I watched this video 🙏🙏

  • @navjotsingh8372 · 2 years ago

    I have seen many teachers explain the same concept, but your explanations are next level. Best teacher.

  • @soumikdutta77 · 2 years ago

    Why am I not surprised by such a lucid and amazing explanation of cost function, gradient descent, global minima, learning rate... maybe because watching you make complex things seem easy and normal has become one of my habits. Thank you SIR

  • @RJ-dz6ie · 4 years ago

    How can I not say that you are amazing!! I was struggling to understand the importance of gradient descent and you cleared it up for me in the simplest way possible.. Thank you so much sir :)

  • @tarunsingh-yj9lz · a year ago

    Best video on YouTube to understand the intuition and math (surface level) behind linear regression. Thank you for such great content.

  • @shailesh1981able · 2 years ago

    Awesome!! Cleared all my doubts after seeing this video! Thanks a lot Mr. Krish for creating in-depth content on this subject!

  • @FaizanKhan-fn6ew · 4 years ago

    Thank you so much for all your efforts... Knowledge, rate of speech, and the ability to make things easy are the nicest skills that you hold...

  • @anuragbhatt6178 · 4 years ago

    The best I've come across on gradient descent and the convergence theorem

  • @shubhamkohli2535 · 3 years ago

    Really awesome video, so much better than many famous online portals charging huge amounts of money to teach these things.

  • @moulisiramdasu6753 · 3 years ago

    Really, thank you Krish. You just cleared my doubts on the cost function and gradient descent. First I saw Andrew Ng's class but had a few doubts; after seeing your video, now it's crystal clear. Thank you...

  • @anuragmukherjee1878 · 2 years ago

    For those who are confused: the convergence derivative will be dJ/dm.

  • @tusharikajoshi8410 · a year ago

    What's J in this? Y values? I'm super confused about this d/dm of m, because it would be just 1. And m, I think, is just the total number of values. Shouldn't the slope be d/dx of y?

  • @mdmynuddin1888 · a year ago

    @@tusharikajoshi8410 it will be the cost or loss (J)

  • @mdmynuddin1888 · a year ago

    m(new) = m - d(loss or cost)/dm * alpha (learning rate).
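
    The update rule this thread describes can be sketched in a few lines of Python. This is an illustrative sketch, not the video's code: the function name and data are made up, and an MSE cost with a slope only (no intercept) is assumed for simplicity.

    ```python
    # Gradient descent on the slope m alone, for the MSE cost
    # J(m) = (1/n) * sum((y_i - m*x_i)^2). Each step applies
    # m(new) = m - alpha * dJ/dm, as described in the thread above.

    def gradient_descent_slope(x, y, lr=0.01, steps=1000):
        m = 0.0  # initial slope guess
        n = len(x)
        for _ in range(steps):
            # dJ/dm = -(2/n) * sum(x_i * (y_i - m*x_i))
            grad = -2.0 / n * sum(xi * (yi - m * xi) for xi, yi in zip(x, y))
            m -= lr * grad  # move opposite the gradient
        return m

    # For points lying exactly on y = 2x, the recovered slope approaches 2.
    print(gradient_descent_slope([1, 2, 3, 4], [2, 4, 6, 8]))
    ```

    Note that the derivative is of the cost J with respect to m, never dm/dm, which is the point several commenters raise below.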

  • @suhasiyer7317 · 11 months ago

    Super helpful

  • @threads25 · 9 months ago

    I don't think so, because it's actually Newton's method.

  • @nanditagautam6310 · 2 years ago

    This is the best stuff I ever came across on this topic!

  • @rambaldotra2221 · 3 years ago

    Thank You Sir, you have explained everything about gradient descent in the best possible, easiest way!!

  • @pradeepmallampalli6510 · 3 years ago

    Thank you so much Krish. Nowhere could I find such a detailed explanation. You made my day!

  • @supervickeyy1521 · 4 years ago

    I knew the concept of linear regression but didn't know the logic behind it, i.e. the way the line of regression is chosen. Thanks for this!

  • @manikaransingh3234 · 4 years ago

    I don't see a link in the top right corner for the implementation, as you said at the end.

  • @skviknesh · 3 years ago

    Great! Fantastic! Fantabulous! Tasting the satisfaction of learning completely, only in your videos!!!!!

  • @ayurdubey4818 · 2 years ago

    The video was really great. But I would like to point out that for the derivative you took in the convergence theorem, instead of (dm/dm) it should be the derivative of the cost function with respect to m. Also, a little suggestion: at the end it would have been helpful if you had mentioned what m was, the total number of points or the slope of the best fit line. Apart from this the video helped me a lot; hope you add a text note somewhere in this video to help others.

  • @padduchennamsetti6516 · 2 days ago

    You just made the whole concept clear with this video. You are a great teacher.

  • @nidhimehta9278 · 3 years ago

    Best video on the theory of linear regression! Thank you so much Krish!

  • @V2traveller · 4 years ago

    Every line you speak is so important for understanding this concept... thank you.

  • @animeshkoley6478 · 3 years ago

    Best explanation of linear regression🙏🙏🙏. Simply wow🔥🔥

  • @annapurnaparida7655 · 3 years ago

    So beautifully explained... did not find this kind of clarity anywhere else... keep up the good work...

  • @azizahmad1344 · 2 years ago

    Such a great explanation of gradient descent and the convergence theorem.

  • @arunsundar489 · 4 years ago

    Please add the in-depth math intuition of other algorithms like logistic regression, random forest, support vector machines, and ANN. Many thanks for the clear explanation of linear regression.

  • @9902152322 · 2 years ago

    God bless you too sir, explained very well. The basics help to grow a high-level understanding.

  • @pjanjanam · 2 years ago

    A small comment at 17:35. I guess it is the derivative of J(m) with respect to m; in other words, the rate of change of J(m) over a minute change of m. That gives us the slope at instantaneous points, especially for non-linear curves where the slope is not constant. At each point (m, J(m)), gradient descent travels in the opposite direction of the slope to find the global minimum, with a small learning rate. Please correct me if I am missing something. Thanks for a wonderful video on this concept @Krish; your videos are very helpful for understanding the math intuition behind the concepts. I am a super beneficiary of your videos. Huge respect!!

  • @priyanshusharma2516 · 2 years ago

    Watched this video 3 times back to back. Now it's embedded in my mind forever. Thanks Krish, great explanation!!

  • @pranitaumarji5224 · 4 years ago

    Thank you for this awesome explanation!

  • @dhainik.suthar · 3 years ago

    This math is the same as Coursera's machine learning course. Thank you sir for this great content.

  • @ngarwailau2665 · 2 years ago

    Your explanations are the clearest!!!

  • @python_by_abhishek · 3 years ago

    Before watching this video I was struggling with the concepts exactly like you were struggling in plotting the gradient descent curve. ☺️ Thanks for explaining this beautifully.

  • @deeptigoyal4342 · 3 years ago

    One of the best explanations so far :)

  • @tamellasivasubrahmanyam6683 · 4 years ago

    You are ultimate; I got answers to so many questions. The video is good.

  • @arrooow9019 · 3 years ago

    Oh my gosh, this is the most awesome tutorial I have ever seen. God bless you sir🤩🤩

  • @karthiavenger4577 · 4 years ago

    Yaar, you nailed it man. After watching so many videos I had some idea; by finishing your video now I'm completely clear 😍😍😍😍

  • @jagdishsahu1118 · 4 years ago

    Right

  • @nikhil1303 · 4 years ago

    This is a really good explanation of linear regression, Krish. Looking forward to checking out more of your videos. Thanks and keep going!!

  • @mellowftw · 3 years ago

    Thanks so much sir. You're doing good for the community.

  • @w3r161 · 4 months ago

    Thank you my friend, you are a great teacher!

  • @ajinkyadeshmukh9674 · 2 years ago

    Hi Krish, that was an awesome explanation of the math used for linear regression, especially the cost function. Can you make a video on the 5 assumptions of linear regression and explain them in detail?

  • @magelauditore333 · 4 years ago

    Sir, you are outstanding. Please keep it up.

  • @shrikantlandage7305 · 4 years ago

    My god, that was clear as crystal... thanks Krish.

  • @aayushsuman4592 · 4 months ago

    Thank you so much, Krish!

  • @mayureshgawai5951 · 3 years ago

    No one can find an easy explanation of gradient descent on YouTube. This video is the exception.

  • @PritishMishra · 3 years ago

    I knew that there would be an Indian who could make all this stuff easy!! Thanks Krish.

  • @meetbardoliya6645 · 2 years ago

    The value of this video is just undefinable! Thanks a lot :)

  • @ShiVa-jy5ly · 3 years ago

    Thank you sir... I get to learn so much from you.

  • @shhivram929 · 3 years ago

    Hi Krish, that was an awesome explanation of gradient descent with respect to finding the optimal slope. But in linear regression both the slope and the intercept are tweakable parameters; how do we achieve the optimal intercept value in linear regression?

  • @SanjeevKumar-dr6qj · a year ago

    Great sir. Love this video.

  • @auroshisray9140 · 3 years ago

    Thank you Krish bhaiya!

  • @nikifoxy69 · 3 years ago

    Loved it. Thanks Krish.

  • @koushikkumar4938 · 3 years ago

    Implementation part:
    Multiple linear regression - kzread.info/dash/bejne/Z6aq0M6Th93VqJs.html
    Simple linear regression - kzread.info/dash/bejne/d2Gs0o-Mmsm1g7w.html

  • @vishnuppriya5263 · a year ago

    Really great sir. I thank you very much for this clear explanation.

  • @priyankachoubey4570 · 2 years ago

    As always, Krish, very well explained!!

  • @wellwhatdoyakno6251 · 2 years ago

    Lovely! Love it.

  • @kevinsusan3345 · 4 years ago

    I had so much difficulty understanding gradient descent, but after this video it's perfectly clear.

  • @muralimohan6974 · 3 years ago

    Bro, how do we update the slope?

  • @ahmedbouchou6893 · 4 years ago

    Hi. Can you please do a video about the architecture of machine learning systems in the real world? How does it really work in real life? For example, how Hadoop (Pig, Hive), Spark, Flask, Cassandra, and Tableau are all integrated to create a machine learning architecture, like an end-to-end system.

  • @juozapasjurksa1400 · 2 years ago

    Thank you! This video was so good!

  • @shchiranth6626 · 3 years ago

    Great tutorial sir, got things pretty quickly with this video. Ty!

  • @akrsrivastava · 4 years ago

    Hi Krish, thanks for the video. Some queries/clarifications required:
    1. We do not take the gradient of m w.r.t. m. That will always be 1. We take the gradient of J w.r.t. m.
    2. If we have already calculated the cost function J at multiple values of m, why do we need gradient descent? We would already know the m where J is minimum.
    3. So we start with an m, calculate grad(J) at that point, update m with m' = m - grad(J) * learn_rate, and repeat till we reach some convergence criterion.
    Please let me know if my understanding is correct.

  • @slowhanduchiha · 3 years ago

    Yes this is correct

  • @vamsikrishna4107 · 3 years ago

    I think we have to train the model to reach that minimum-loss point while performing gradient descent in real-life problems.

  • @shreyasbs2861 · 3 years ago

    How to find the best Y intercept?
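
    On the Y-intercept question above: both parameters are updated together, each with its own partial derivative of the cost. A minimal sketch, with illustrative names and data (not the video's exact code):

    ```python
    # Gradient descent on both the slope m and intercept c for the
    # MSE cost J(m, c) = (1/n) * sum((y_i - (m*x_i + c))^2).

    def fit_line(x, y, lr=0.01, steps=5000):
        m, c = 0.0, 0.0
        n = len(x)
        for _ in range(steps):
            pred = [m * xi + c for xi in x]
            # partial derivatives of J with respect to m and c
            dm = -2.0 / n * sum(xi * (yi - pi) for xi, yi, pi in zip(x, y, pred))
            dc = -2.0 / n * sum(yi - pi for yi, pi in zip(y, pred))
            m -= lr * dm  # update slope
            c -= lr * dc  # update intercept
        return m, c

    m, c = fit_line([0, 1, 2, 3], [1, 3, 5, 7])  # points on y = 2x + 1
    print(round(m, 2), round(c, 2))  # slope ≈ 2, intercept ≈ 1
    ```

    The same pattern extends to any number of coefficients: one partial derivative and one update per parameter.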

  • @vishwashah4109 · 3 years ago

    Best explanation. Thank you!

  • @harshdhamecha5503 · 3 years ago

    There's a little correction in the convergence theorem: the derivative of J(m) should be in the numerator in place of the derivative of m.

  • @salmanjaved2816 · 3 years ago

    Correct 👍

  • @jaspreetsingh5334 · 2 years ago

    Thanks Krish, you are helping a lot.

  • @sahilswaroop1996 · 4 years ago

    Excellent video, you are a champion man!

  • @sharikatv1989 · 3 years ago

    This is super helpful!

  • @varungupta2727 · 4 years ago

    Similar to the Andrew Ng course on Coursera; kind of a revision for me 😊😊

  • @Gayathri-jo4ho · 3 years ago

    Can you please suggest how I should begin in order to learn machine learning?

  • @Gayathri-jo4ho · 3 years ago

    @@ArpitDhamija do you have knowledge of machine learning? If so, please advise me. I saw so many courses but wasn't able to decide.

  • @shhivram929 · 3 years ago

    @@Gayathri-jo4ho This playlist itself is a fantastic place to start. Or you can enroll in the course "Machine Learning A-Z" by Kirill Eremenko on Udemy. The course will give you an intuitive understanding of the ML algorithms. Then it's up to you to research and study the math behind each concept. References: KDnuggets, Medium, MachineLearningPlus, and lots more.

  • @Gayathri-jo4ho · 3 years ago

    @@shhivram929 thank you

  • @sarithajaligama9548 · 3 years ago

    Exactly. This is the equivalent of Andrew Ng's description.

  • @RanjithKumar-jo7xf · 2 years ago

    Nice explanation, I like this.

  • @cutecreature_san · 3 years ago

    Your videos are clear and easy to understand.

  • @pearlbabbar7981 · 2 years ago

    Amazing explanation! I have one question: where did you study all of this? From books or the internet?

  • @debrupdey7948 · a year ago

    Great video sir, so lucid.

  • @gopposoppobyNaru · 4 years ago

    You are my inspiration.

  • @jayeshmudaliar9155 · 3 years ago

    Best one sir, thank you so much.

  • @user-ec9he3nz7f · 4 months ago

    Really great explanation sir 😍

  • @lubaidkhan2937 · 3 years ago

    Thanks Krish sir.

  • @avinashgote2770 · a year ago

    Good explanation, it cleared all my queries.

  • @YoutuberEnjoy · a year ago

    Simply great.

  • @Neuraldata · 3 years ago

    We would also recommend your videos to our students!

  • @sagarparigi1884 · 3 years ago

    This video is really helpful.

  • @PavanKumar-xg8ye · 3 years ago

    Excellent!!!!!

  • @cynthialobo1995 · 4 years ago

    Very nice explanation. Thank you.

  • @glamgalmanu · 4 years ago

    Can you do more math intuitions, please? These are very helpful. Thanks!

  • @dhruv1324 · a year ago

    Never found a better explanation.

  • @rezafarrokhi9871 · 3 years ago

    Thanks for all the well-prepared videos. I think you meant (dJ(m)/dm) at 17:45, is that correct?

  • @Dinesh-uh4gw · 3 years ago

    Excellent explanation.

  • @mvcutube · 3 years ago

    Nice tutorial. Thank you.

  • @shaiksuleman3191 · 3 years ago

    Sir, no words to explain, simply superb.

  • @pradnyavk9673 · a year ago

    Very well explained. Thank you.

  • @AVyt28 · 3 years ago

    Great video, I understood the concept.

  • @Raja-tt4ll · 4 years ago

    Very nice video. Thanks Krish.

  • @PRASHANTSHARMA-ev7rr · 4 years ago

    Hi Sir, I am from a cloud & DevOps background. Does it make sense to go and learn ML/AI? What path can I follow to become a DataOps engineer or a DevOps ML/AI engineer?

  • @amitpadaliya6916 · 4 years ago

    Amazing, man!

  • @b.chinrhea · 3 years ago

    It was an amazing video, thanks.

  • @roshankumargupta3711 · 4 years ago

    It was really helpful. What is the difference between stochastic gradient descent and normal gradient descent?

  • @TheBala7123 · 2 years ago

    Excellent explanation sir. I have started following your videos for all the ML-related topics; it's very interesting. One doubt: in gradient descent, when the slope is zero, the m value will be considered the slope of the best fit line. I do not understand this. Can you please explain? Thanks.

  • @kunaltibrewal2881 · 4 years ago

    It would be great if you could suggest some of the best books for Python programming.

  • @shadiyapp5552 · a year ago

    Thank you sir ♥️

  • @mahalerahulm · 3 years ago

    Great insight 👍

  • @varshadevgankar8242 · 3 years ago

    Sir, I can't find the simple regression and multiple regression videos as you said, and some videos are a little jumbled, so it's getting difficult to follow them. Please explain the functionality of each keyword or inbuilt function when you are explaining the code. Of course you explain in a very good way, but I faced a little problem while following the practical implementation of univariate, multivariate, and bivariate analysis (where you used the FacetGrid function). So will you please explain what the exact use of FacetGrid is?

  • @Karthik-wj5rs · a year ago

    Finally I understood gradient descent perfectly.

Next