Machine Learning Tutorial Python - 4: Gradient Descent and Cost Function

In this tutorial we cover a few important machine learning concepts: cost function, gradient descent, learning rate, and mean squared error. We will use a home price prediction use case to understand gradient descent. After going over the math behind these concepts, we will write Python code to implement gradient descent for linear regression. At the end there is an exercise for you to practice gradient descent.
#MachineLearning #PythonMachineLearning #MachineLearningTutorial #Python #PythonTutorial #PythonTraining #MachineLearningCourse #CostFunction #GradientDescent
Code: github.com/codebasics/py/blob...
Exercise csv file: github.com/codebasics/py/blob...
Topics covered in this video:
0:00 - Overview
1:23 - What is a prediction function? How can we calculate it?
4:00 - Mean squared error (ending time)
4:57 - Gradient descent algorithm and how it works
11:00 - What is a derivative?
12:30 - What is a partial derivative?
16:07 - Using Python code to implement gradient descent
27:05 - Exercise: come up with a linear function for the given test results using gradient descent
Topic Highlights:
1) Theory (we will talk about MSE, cost function, global minimum)
2) Coding (plain Python code that finds a linear equation for the given sample data points using gradient descent; a minimal sketch follows below)
3) Exercise (come up with a linear function for the given test results using gradient descent)
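For reference, here is a minimal sketch of the kind of gradient descent loop the coding section builds. The sample data, learning rate and iteration count below are illustrative, not necessarily the exact values used in the video:

    import numpy as np

    def gradient_descent(x, y, learning_rate=0.08, iterations=10000):
        # Fit y = m*x + b by stepping m and b against the gradient of the MSE.
        m, b = 0.0, 0.0
        n = len(x)
        for _ in range(iterations):
            y_predicted = m * x + b
            m_grad = -(2 / n) * sum(x * (y - y_predicted))   # dMSE/dm
            b_grad = -(2 / n) * sum(y - y_predicted)         # dMSE/db
            m -= learning_rate * m_grad
            b -= learning_rate * b_grad
        return m, b

    x = np.array([1, 2, 3, 4, 5])
    y = np.array([5, 7, 9, 11, 13])      # toy data generated from y = 2x + 3
    print(gradient_descent(x, y))        # should approach (2.0, 3.0)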
Do you want to learn technology from me? Check codebasics.io/?... for my affordable video courses.
Next Video:
Machine Learning Tutorial Python - 5: Save Model Using Joblib And Pickle: • Machine Learning Tutor...
Very Simple Explanation Of Neural Network: • Neural Network Simply ...
Popular Playlists:
Data Science Full Course: • Data Science Full Cour...
Data Science Project: • Machine Learning & Dat...
Machine learning tutorials: • Machine Learning Tutor...
Pandas: • Python Pandas Tutorial...
matplotlib: • Matplotlib Tutorial 1 ...
Python: • Why Should You Learn P...
Jupyter Notebook: • What is Jupyter Notebo...
To download the csv files and code for all tutorials: go to github.com/codebasics/py, click on the green button to clone or download the entire repository, and then go to the relevant folder to access that specific file.
🌎 My Website For Video Courses: codebasics.io/?...
Need help building software or data analytics and AI solutions? My company www.atliq.com/ can help. Click on the Contact button on that website.
#️⃣ Social Media #️⃣
🔗 Discord: / discord
📸 Dhaval's Personal Instagram: / dhavalsays
📸 Codebasics Instagram: / codebasicshub
🔊 Facebook: / codebasicshub
📱 Twitter: / codebasicshub
📝 Linkedin (Personal): / dhavalsays
📝 Linkedin (Codebasics): / codebasics
🔗 Patreon: www.patreon.com/codebasics?fa...

Comments: 705

  • @codebasics (2 years ago)

    Check out our premium machine learning course with 2 Industry projects: codebasics.io/courses/machine-learning-for-data-science-beginners-to-advanced

  • @honeymilongton8401 (2 years ago)

    Sir can you please upload the slides also sir

  • @codebasics (4 years ago)

    Stochastic vs Batch vs Mini gradient descent: kzread.info/dash/bejne/e4lpyNeoiaW6cbA.html
    Step by step roadmap to learn data science in 6 months: kzread.info/dash/bejne/emiNxdOOfLyXXbQ.html
    Machine learning tutorials with exercises: kzread.info/dash/bejne/maGq2MOoktCdlbQ.html

  • @mukulbarai1441 (3 years ago)

    It has become so clear that I am gonna teach it to my dog.

  • @codebasics (3 years ago)

    👍🙂

  • @farazaliahmad3257 (3 years ago)

    Just Do it....

  • @Austin-pw2ud (2 years ago)

    Dont do it! He may become a threat to humanity!

  • @eresque7766 (2 years ago)

    @@Austin-pw2ud he may become the cleverest but he'll remain being a good boy

  • @Austin-pw2ud (2 years ago)

    @@eresque7766 ouuuuuchh! Tht touched my ♥

  • @vanlindertpoffertje3032 (5 years ago)

    Thank you so much for the detailed explanation! I have difficulties understanding these theories, but most channels just explain without mentioning the basics. With your explanation, it is now soooo clear! Amazing!!

  • @officesuperhero9611 (6 years ago)

    I’m so excited to see you uploaded a new video on machine learning. I’ve watched your other 3 a couple of times. They’re really top notch. Thank you. Please keep this series going. You’re a great teacher too.

  • @angulimaldaku4877 (4 years ago)

    3Blue1Brown is a great channel, and so is your explanation. Kudos to you! Also, it is quite appreciable how you positively promote and credit others' good work. That kind of genuineness is much needed.

  • @alidi5616 (4 years ago)

    This is the best tutorial i have ever seen. This is truly from scratch. Thank you so much

  • @waytosoakshya1127 (a year ago)

    Finally, found the best ML tutorials. Coding with mathematics combined and explained very clearly. Thank you!

  • @IVIRnathanreilly (a year ago)

    I've been struggling with my online lectures on machine learning. Your videos are so helpful. I can't thank you enough!

  • @codebasics (a year ago)

    👍👍🙏

  • @mamtachaudhary5281 (3 years ago)

    I have gone through so many materials and couldn't understand a thing from them, but this video is amazing. Thanks for putting up all your videos.

  • @codebasics (3 years ago)

    Glad it was helpful!

  • @ayushlabh (5 years ago)

    It's the most helpful video I have seen till now on gradient descent. Great work. Looking forward to more videos on machine learning.

  • @khushidonda7168 (11 months ago)

    Can you help me with how to plot all the values of m and b on a chart?
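    One possible way to do this, assuming you record m and b inside your gradient descent loop as it runs (the history lists and values here are illustrative, not from the video's code):

        import matplotlib.pyplot as plt

        # Inside the gradient descent loop, append the current values each iteration:
        #     m_history.append(m); b_history.append(b)
        m_history = [0.5, 1.2, 1.6, 1.8, 1.9, 1.95]   # example recorded values
        b_history = [0.2, 0.9, 1.5, 2.1, 2.5, 2.8]

        plt.plot(m_history, label='m')
        plt.plot(b_history, label='b')
        plt.xlabel('iteration')
        plt.ylabel('parameter value')
        plt.legend()
        plt.show()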

  • @codebasics (4 years ago)

    How to learn coding for beginners | Learn coding for free: kzread.info/dash/bejne/daSo1M6ydJOyeps.html

  • @mdlwlmdd2dwd30 (3 years ago)

    For people who want to know what's behind the scenes: the reason the partial derivative of the cost function (MSE) with respect to m comes out as -2/n Σ x_i (y_i - (m x_i + b)) is the chain rule from calculus. When we take the derivative with respect to m, the m itself drops out (m^1 becomes m^(1-1) = 1), leaving only x_i. With the chain rule we dissect the function: suppose we have some function F(m) = (am + b)^2. We deal with the outer square first, giving 2*(am + b), then multiply by d/dm of the inner part (am + b), which is a, so we get 2*(am + b) * a. Apply the same chain rule to the MSE above and you get -2/n Σ x_i (y_i - (m x_i + b)). Please don't just accept it as it is, or you will never completely learn why things work; come up with your own solution. The easy way never gets you where you want to go.
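    For readers who want the chain-rule steps above spelled out, here is the full derivation of the video's MSE gradients in LaTeX notation:

        MSE(m, b) = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - (m x_i + b)\bigr)^2

        \frac{\partial MSE}{\partial m}
            = \frac{1}{n}\sum_{i=1}^{n} 2\bigl(y_i - (m x_i + b)\bigr)\cdot(-x_i)
            = -\frac{2}{n}\sum_{i=1}^{n} x_i\bigl(y_i - (m x_i + b)\bigr)

        \frac{\partial MSE}{\partial b}
            = \frac{1}{n}\sum_{i=1}^{n} 2\bigl(y_i - (m x_i + b)\bigr)\cdot(-1)
            = -\frac{2}{n}\sum_{i=1}^{n} \bigl(y_i - (m x_i + b)\bigr)

    The minus sign in front of 2/n comes from the derivative of the inner term y_i - (m x_i + b), which is -x_i with respect to m and -1 with respect to b.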

  • @datalearningsihan (a year ago)

    No one really understands why we have -(2/n) instead of 2/n. If you do the calculations, even with the chain rule, you will get 2/n; you will never get negative values!

  • @awakenwithoutcoffee (a month ago)

    @@datalearningsihan I think it is indeed to prevent any negative values to occur.

  • @saltanatkhalyk3397 (3 years ago)

    Thank you for such an easy explanation. I had read about gradient descent many times, but this is the first time I understood the math behind it.

  • @SudiKrishnakum (3 years ago)

    I followed tonnes of tutorials on gradient descent. Nothing came close to the simplicity of your explanation. Now I have a good grasp of this concept! thanks for this sir!

  • @codebasics (3 years ago)

    👍☺️

  • @ishaanverma9asn523 (2 years ago)

    this is the best ML course I've ever came upon !

  • @moududhassan3026 (5 years ago)

    The best one for Gradient Descent, Thank you,

  • @tsangwingho2508 (3 years ago)

    Hi, I want to know how to plot the learned regression line for each iteration on the same graph to show the change. Thanks.
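    A sketch of one way to do this, assuming you save (m, b) snapshots while the gradient descent loop runs (the data and snapshot values below are made up for illustration):

        import numpy as np
        import matplotlib.pyplot as plt

        x = np.array([1, 2, 3, 4, 5])
        y = np.array([5, 7, 9, 11, 13])

        plt.scatter(x, y, color='red')                 # the data points
        for m, b in [(0.5, 0.5), (1.2, 1.4), (1.7, 2.2), (2.0, 3.0)]:
            # In practice these (m, b) pairs would be snapshots saved during
            # training, e.g. every 100th iteration of the loop.
            plt.plot(x, m * x + b, alpha=0.5)
        plt.xlabel('x')
        plt.ylabel('y')
        plt.show()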

  • @hfe1833 (5 years ago)

    Thank you, I think I found the right channel for machine learning

  • @codebasics (5 years ago)

    Great. Happy learning.

  • @ramsawasthi (5 years ago)

    Great tutorial, explained in very easy language in very less time.

  • @codebasics (5 years ago)

    Glad you liked it ram.

  • @akashmishra5553 (3 years ago)

    Hey, thanks for creating all these playlists. These are so good. I think the viewers should at least like and comment in order to show some love and support.

  • @sharathchandrachowdary6828 (a year ago)

    This video is just enough to describe the excellence of your explanation. Simply mind blowing.

  • @rajanalexander4949 (2 years ago)

    Sharp, to the point, succinct. Great stuff!

  • @jaiprathapgv2273 (3 years ago)

    Sir, I have another doubt: we are importing and using LinearRegression from sklearn. Does gradient descent happen inside the linear regression model and give us the result, or should I use gradient descent separately?
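    For context (not covered in the video): scikit-learn's LinearRegression does not run gradient descent at all; it solves the ordinary least squares problem directly with a least-squares solver, so you do not call gradient descent separately when using it. If you specifically want a gradient-descent-based fit inside scikit-learn, SGDRegressor is the estimator for that. A small comparison sketch with toy data:

        import numpy as np
        from sklearn.linear_model import LinearRegression, SGDRegressor

        x = np.array([[1], [2], [3], [4], [5]])
        y = np.array([5, 7, 9, 11, 13])

        ols = LinearRegression().fit(x, y)               # closed-form least squares
        sgd = SGDRegressor(max_iter=10000).fit(x, y)     # iterative gradient descent

        print(ols.coef_, ols.intercept_)                 # ~[2.0] 3.0
        print(sgd.coef_, sgd.intercept_)                 # should land near the same values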

  • @fernandoriosleon (4 years ago)

    Finally I learned gradient descent, thank you so much 🙏

  • @justforfun7855 (3 years ago)

    GOOD!!!

  • @jhansisetty9429 (5 years ago)

    It was a very useful video. After watching many other videos, I understood the concept in the best way after watching your video. Keep making such tutorials which are simple and easy to understand the complex topics. Thankyou.

  • @yashagarwal3999 (4 years ago)

    so calmly and nicely u have explained a tough topic to beginners

  • @AlonAvramson (3 years ago)

    You provide this complex material in such a nice and easy way. Thank you!

  • @shijilts4139 (4 years ago)

    All your tutorials are amazing!! Thanks a lot.

  • @jenglong7826 (5 years ago)

    This was an excellent explanation! Not too technical and explained in simple terms without losing its key elements. I used this to supplement Andrew Ng's Machine learning course on Coursera (which has gotten technical real quick) and it's been really helpful thanks

  • @codebasics (5 years ago)

    Glad you found it useful Chia Jeng.

  • @GlobalDee_ (3 years ago)

    Waoh, waoh. Codebasics to the world. You are such a great teacher, sir. Thanks for sharing this series.........

  • @amandaahringer7466 (2 years ago)

    Exceptionally done! Great work and thank you!

  • @i.t.878 (2 years ago)

    Such an excellent tutorial, the clearest I have seen on this topic. Kudos. Thank you.

  • @wolfisraging (6 years ago)

    I am glad someone gives perfect explanation

  • @vivekkumargoel2676 (3 years ago)

    Sir, I followed your tutorial but I am getting a runtime overflow warning in Python. How do I correct that?

  • @Opinionman2 (2 years ago)

    Best video on the topic I’ve seen so far! Thanks

  • @vijaydas2962 (5 years ago)

    Perfect explanation. Thanks for your effort

  • @princeekanim1804 (4 years ago)

    This tutorial made me finally understand gradient descent and the cost function... I don't know how you did it, but you did... thanks man, I really appreciate it.

  • @codebasics (4 years ago)

    You're very welcome Prince :) I am glad your concepts are clear now.

  • @princeekanim1804 (4 years ago)

    codebasics no problem keep it up you’re a great teacher

  • @mohamedarif3464 (5 years ago)

    Thanks for teaching in this approach...great!!!

  • @yousufali_28 (5 years ago)

    Thanks for taking step by step approach and making it easy. 👍

  • @prajwal3114 (3 years ago)

    One of the best Tutorial for Gradient Descent.

  • @srishtikumari6664 (3 years ago)

    Insightful! Deep understanding of ML is necessary. You explained it very well

  • @alokpratap2094 (5 years ago)

    Sir, your videos are really awesome. Please try to complete this series as soon as possible and cover all the topics of machine learning like cluster analysis, principal component analysis, etc.

  • @georgesmith3022 (5 years ago)

    so does linear regression use gradient descent to calculate m and b or some other algorithm?

  • @ishitasadhukhan1 (2 years ago)

    The best tutorial on Gradient Descent !

  • @hv3300 (6 years ago)

    Quick question: at 15:27, how did you get x_i in one line and no x_i in the other? Will appreciate your help.
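    One quick way to see where the x_i comes from is to let sympy differentiate a single term of the MSE; the extra factor of x shows up only in the m-derivative (the symbols here are just for this check):

        import sympy as sp

        m, b, x, y = sp.symbols('m b x y')
        term = (y - (m * x + b)) ** 2      # one term of the MSE sum

        print(sp.diff(term, m))            # the chain rule brings out a factor of x
        print(sp.diff(term, b))            # the same bracket, but with no x factor in front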

  • @aayush135 (2 years ago)

    Superb!! Your lectures are very good and make complicated things very easy. May you keep growing in your life.

  • @MondayMotivations (4 years ago)

    I don't know why you are so underrated. Only 73K SUBSCRIBERS. You deserve way more than that, I mean the way you clear the concepts. You're simply awesome man.

  • @codebasics (4 years ago)

    I am happy this was helpful to you

  • @geekyprogrammer4831 (3 years ago)

    now he got 281K and in future I expect it to be more :D

  • @fridayemmanueljames4873 (a year ago)

    Waooo, for a long time I've struggled to really understand the gradient descent algorithm. I feel like a pro

  • @TheSocialDrone (4 years ago)

    This was a difficult topic for me; then I spent the time to watch your video, thank you for making my learning easier! Very nice explanation.

  • @codebasics (4 years ago)

    👍😊

  • @chokoprty (2 months ago)

    Watching this at 2x, like if you are too 😂

  • @clarizalook2396 (4 years ago)

    I'm confused. This is something very new to me even though I studied calculus in my undergrad years. I did not get it fully, but the code worked on my end. Perhaps, as I get into more models, I'll slowly understand this. Thanks for sharing all of this.

  • @codebasics (4 years ago)

    Yup clarie. The tip here is to go slowly without getting overwhelmed. Don't give up and slowly you will start understanding it 😊👍

  • @RAJESHMANDALGAU-C- (6 years ago)

    Here is the video I found. Great to watch!

  • @mayankjain24in (4 years ago)

    awesome explanation. plz keep it up ..... also appreciate how you credit others for their work, that's very rare

  • @francisconiederleytnerenci8542 (3 years ago)

    Very thankful for your video! However, I have a doubt: I have tried same Python program with the data from housing and it does not converge. Why is that?

  • @akash200287 (6 years ago)

    How do I apply the same gradient descent when the training data has multiple columns? Please guide.
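    A sketch of how the same update generalizes to several columns: the single slope m becomes a weight vector with one entry per feature (the toy data, learning rate and iteration count here are illustrative):

        import numpy as np

        def gradient_descent_multi(X, y, learning_rate=0.01, iterations=20000):
            # X has shape (n_samples, n_features); fit y ~ X @ w + b.
            n, n_features = X.shape
            w = np.zeros(n_features)
            b = 0.0
            for _ in range(iterations):
                error = y - (X @ w + b)
                w_grad = -(2 / n) * (X.T @ error)    # one partial derivative per feature
                b_grad = -(2 / n) * error.sum()
                w -= learning_rate * w_grad
                b -= learning_rate * b_grad
            return w, b

        # Toy data generated from y = 2*x1 + 3*x2 + 1
        X = np.array([[1, 2], [2, 1], [3, 3], [4, 2], [5, 4]], dtype=float)
        y = X @ np.array([2.0, 3.0]) + 1.0
        print(gradient_descent_multi(X, y))          # should approach ([2, 3], 1)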

  • @HARSHRAJ-2023 (6 years ago)

    Hope sir you are more regular in uploading the video. It will help us a lot. Eagerly waiting for new upload.

  • @akhileshchauhan7422 (5 years ago)

    upcoming few days I will see your whole channel

  • @ireshaweerasinghe8705 (5 years ago)

    Thanks... i am new to ML and your tutorial is very useful to me :)

  • @valijoneshniyazov (3 years ago)

    When you calculate partial derivatives, don't assume x or y is zero; treat them as constants instead. For example, for f(x,y) = x*y your partial derivatives would come out as 0, but they should be x and y.

  • @rajdipdas1329 (2 years ago)

    No, why would the partial derivative be zero? We have to analyze it as a constant: df(x,y)/dx = x·dy/dx + y, which is the derivative with respect to x, and df(x,y)/dy = x + y·dx/dy.
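    For completeness: holding the other variable constant, f(x, y) = x·y has partial derivatives ∂f/∂x = y and ∂f/∂y = x; the dy/dx and dx/dy terms written above only appear in total derivatives, where y is itself treated as a function of x.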

  • @yangfarhana3660 (3 years ago)

    Clearly broken down concepts, very very good video, thank you for this amazing guide!

  • @codebasics (3 years ago)

    Glad it was helpful!

  • @afsheenmaroof6209 (4 years ago)

    I have the prediction equation as y = w*w1.X. How do I implement this? I mean, I have to make a function but don't know how.

  • @sararamadan1907 (3 years ago)

    I wanted to thank you before even finishing the video, just to tell you that you made my day with this lesson.

  • @codebasics (3 years ago)

    sara i am glad you liked it and thanks for leaving a comment :)

  • @yourlifeonpower (4 months ago)

    Very clear, concise and helpful! Thank you !

  • @shubhamkanwal8977 (4 years ago)

    This is pure gold!

  • @pratikpd5460 (2 years ago)

    Thanks a lot for the great video. Could you please suggest how to do the visual representation part that you showed here: 26:38

  • @MOHITBARTHWAL (5 years ago)

    How do we use SGD when we have high dimensions, say 10 features?
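    A sketch of the stochastic variant with many features: compared with batch gradient descent, the only changes are that each step uses one randomly chosen sample and that the slope becomes a weight vector (the synthetic data and hyperparameters below are made up):

        import numpy as np

        rng = np.random.default_rng(0)
        n_samples, n_features = 200, 10
        X = rng.normal(size=(n_samples, n_features))
        true_w = rng.normal(size=n_features)
        y = X @ true_w + 4.0                        # synthetic target with intercept 4

        w = np.zeros(n_features)
        b = 0.0
        learning_rate = 0.01

        for step in range(20000):
            i = rng.integers(n_samples)             # pick one sample at random
            error = y[i] - (X[i] @ w + b)
            w += learning_rate * 2 * error * X[i]   # minus the per-sample gradient
            b += learning_rate * 2 * error

        print(np.round(w - true_w, 2), round(b, 2)) # weight error ~0, b ~4.0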

  • @kasahunabdisa6022 (a year ago)

    great and simple approach to learning gradient descent . Thank you for your effort

  • @ajaykushwaha-je6mw (2 years ago)

    Best ever video on Gradient Descent.

  • @mahmoudnady4388 (2 years ago)

    Thank you, teacher. Your explanation is clear, interesting and useful ❤👌

  • @daychow4659 (11 months ago)

    OMG!!! This is my first time seeing someone actually calculate how gradient descent works!!!!

  • @aritradutta5679 (5 years ago)

    Can you just say what the changes will be in the case of multivariate regression? I guess a slope value needs to be calculated for every feature variable and the intercept part will remain the same.

  • @premkumarganji1974 (4 years ago)

    Thanks for the tutorial, it's really helpful.And I have a doubt, How to find gradient descent of a multivariate?

  • @jc-co1ck (2 years ago)

    Thanks for your explanation and it is really clear and easy to understand. They are really awesome, thank you.

  • @krystianprogress4521 (2 years ago)

    Thanks to you I finally understood what the gradient descent is

  • @boubacaramaiga4408 (5 years ago)

    Excellent tutorial. Many thanks.

  • @wasirizvi2437 (4 years ago)

    Explained well in easy language ! Thanks bro.

  • @VijaykumarS7 (8 months ago)

    You explained in the simplest way this complex concept. Best teacher in the world 🎉🎉

  • @codebasics (8 months ago)

    Glad you liked it ! 😊

  • @prathameshjoshi9199 (3 years ago)

    Please help me, I have a doubt. While calculating the slope of the cost function, if we don't know the cost function beforehand, how can we calculate its slope? I mean, if I know that my cost function looks like a sigmoid (for example), then I can use the sigmoid derivative to find the slope of the cost function. But if I don't know what my cost function looks like, how can I decide which derivative formula to use to calculate the slope?
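    On the doubt above: in this tutorial the cost function is something we choose ourselves (the MSE), so its derivative is known analytically; it is not something estimated from the data's shape. If you ever have a cost you can only evaluate but not differentiate, a numerical slope is one fallback, for example a central finite difference (illustrative sketch):

        def numeric_slope(cost_fn, m, eps=1e-6):
            # Approximate d(cost)/dm at a point without an analytic derivative.
            return (cost_fn(m + eps) - cost_fn(m - eps)) / (2 * eps)

        print(numeric_slope(lambda m: (m - 3) ** 2, m=1.0))   # ~ -4.0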

  • @halyan2033 (4 years ago)

    thank you so much. You saved my life!

  • @joaovictorf.r.s.1570 (3 years ago)

    Well, I try to code something similar to this, but my code is veeerryy sensitive about the random values of slope and the intercept. For my data, if I choose 0 for both, the result is very bad. Is it normal?

  • @amitpatel5009 (3 years ago)

    Thanks for video. This is so informative for me. Can we use another expression for y_predicted? I have data set and need to find two parameter of that equation by fitting dataset. If we use another expression for y_predicted than will things change for its derivative?

  • @TheMannawar (5 years ago)

    Dear Sir, I already installed converter in my jupyter notebook by using !pip install word2number But now i am stuck how to use this in your exercise to convert strings (names) into numbers? Regards,

  • @rchetia3226 (5 years ago)

    Hi, kindly use the code below:

        # first: pip install word2number
        from word2number import w2n
        from sklearn import linear_model

        # hrm is the dataframe loaded from the exercise csv;
        # convert the experience words ("two", "seven", ...) into numbers
        hrm.experience = hrm.experience.fillna('zero')
        hrm['experience'] = hrm['experience'].apply(w2n.word_to_num)

        reg = linear_model.LinearRegression()
        reg.fit(hrm[['experience', 'test_score', 'interview_score']], hrm.salary)
        reg.coef_
        reg.intercept_
        reg.predict([[2, 9, 6]])
        reg.predict([[3, 7, 10]])

  • @vishnusagubandi8274 (4 years ago)

    I think this is the best gradient descent tutorial, even better than Andrew Ng sir's. I got stuck with Andrew sir's tutorial and later came here. Finally got it... Thanks a lot bro 🙏🙏

  • @aniruddhapal1997 (3 years ago)

    In the exercise, why are the coef_ and intercept_ values different between the gradient_descent() function and the predict_using_sklearn() function? Both should be searching for the best-fit line. Can you please explain?

  • @kiranpoojary493 (3 years ago)

    How did you get the graph of gradient descent in the Jupyter notebook? I think gradient_descent(x,y) in your Jupyter code is not a built-in function.

  • @sarikamishra7051 (2 years ago)

    Sir u r the best teacher I ever got for Machine Learning.

  • @codebasics (2 years ago)

    Glad it was helpful!

  • @NiteshChhabra777 (2 years ago)

    Thanks for the video. I have one question , how can we define or explain the learning rate? Can you describe it in more detail?

  • @hemantsrivastava3745 (3 years ago)

    For plotting graph, in the .ipynb remove inverted commas in linewidth.

  • @AYUSHKUMAR-dm1xg (4 years ago)

    who are the people disliking these videos. These people work hard and make these videos for us. Please if you don't like it, don't watch it but don't dislike it. It is misleading to the people who come to watch these videos. I know many of us have studied some of these concepts before, but he is making videos for everyone and not for a few section of people. I feel that this channel's videos are amazing and doesn't deserve any dislikes.

  • @codebasics (4 years ago)

    Thanks ayush. I am moved by your comment and kind words. I indeed put lot of effort in making these videos. Dislikes are fine but at the same time if these people put a reason on why they disliked, it will help me a lot in terms of feedback and future improvements 😊

  • @sohannikumbh4802 (3 years ago)

    Can anyone tell me how the graph visualisation of the code was plotted in the Jupyter notebook?

  • @imtiyazshaik9950 (5 years ago)

    big bow sir please continue making videos please !!!!!!!

  • @NowandThen77 (11 months ago)

    At 2:07 you said we can find the equation in your Jupyter notebook. Where is the Jupyter notebook, where can I find it?? Anybody please help?

  • @nirmalyamisra (5 years ago)

    this was such a great video..many thanks !

  • @codebasics (5 years ago)

    Nirmalya thanks for leaving a comment :)

  • @ou8xa1vkk64 (3 years ago)

    Little hard for me! I cant do the exercise myself. But 100% sure no one will teach easier than this in the world. Keep doing it love you lot!!!!!!!!!

  • @usmanriaz8396 (2 years ago)

    Best video on gradient descent and cost function. Understood the math pretty well. Excellent. Love from Pakistan.

  • @kalpavrikshika8256 (4 years ago)

    Can anyone explain the math.isclose logic? My code ends with a cost of 31.6045, which still seems high, so why is it breaking and giving me the coefficient and slope?

  • @ritikpratapsingh9128 (4 years ago)

    The minimum cost depends on your values of x and y. Your cost may be optimal at 31.6045 too, because you might have taken points such that the sum of squared errors is minimized at 31.6045.

  • @vinodreddy2303 (3 years ago)

    If the cost values of two consecutive iterations are almost the same (isclose), it means your model has reached the optimum, so no further improvement is possible.
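    For readers with the same question: the check being discussed compares the cost of consecutive iterations using a relative tolerance, so the loop breaks once the change becomes negligible, even if the cost itself is not small (the tolerance and numbers below are illustrative):

        import math

        print(math.isclose(31.6045, 31.6045000001, rel_tol=1e-9))   # True  -> stop iterating
        print(math.isclose(31.6045, 31.5, rel_tol=1e-9))            # False -> keep going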

  • @MrThirupathit (4 years ago)

    Very nicely explained and clear. However, I expected the code for the graphs of cost vs b and cost vs m, and also the code for the graph of the regression line and its outcome. Looking forward to the same.

  • @ShiftKoncepts (3 months ago)

    Thank u! what does scikit regression give as m and b then if we don’t use gradient descent and is it even worth to do it?

  • @sukumarroychowdhury4122 (3 years ago)

    Hey: you are absolutely excellent. I have seen many guys offering machine learning tutorials. None is as simple, as clear and as educative as you are. Best regards, Sukumar Roy Chowdhury - ex Kolkata, Portland, OR, USA

  • @codebasics (3 years ago)

    Sukumar, I am glad this video helped 👍🙏