Multiple Linear Regression using Python and sklearn
Multiple linear regression is the most common form of linear regression analysis. As a predictive analysis, multiple linear regression is used to explain the relationship between one continuous dependent variable and two or more independent variables.
References: Kirill Eremenko's projects on linear regression. This video is dedicated to him.
Please subscribe and share the channel
Simple Linear Regression link: • Simple Linear Regressi...
Github link: github.com/krishnaik06/Multip...
You can buy my book where I have provided a detailed explanation of how we can use Machine Learning, Deep Learning in Finance using python
Packt url : prod.packtpub.com/in/big-data...
Amazon url: www.amazon.com/Hands-Python-F...
Comments: 134
I was struggling with Linear and Multiple regression for over 1 month. Finally, the puzzle is solved. Thanks a million, Krish. You are simply outstanding.
You are a genius I have spent so much for a course that offered me nothing. Thank you again Sir may God bless you
Yep, this was the video I was looking for!
I like this comprehensively explain the details. Thank you for this content. Excellent job!
Very nice pointers. Thank you Krish. Keep up the good work. Looking forward to your Deep Learning videos. Learning a lot from you.
Multiple linear regression made so simple Krish sir.. I am highly indebted to you. Great job!!
Thank you. This is Very Helpful!!
You described multiple regression very well. Thank you. I would appreciate it if you could do a detailed video on R-squared and adjusted R-squared (intuition- and concept-wise).
Sir, don't we have to first check the assumptions of linear regression before fitting the model? And shouldn't adjusted R-squared be a good option in multiple linear regression?
This is a good one. Plain and simple for a beginner. Keep up your work
Sir, your content is brisk and clear, and the explanation is accurate as well. Thanks for the effort.
Nice Explanation. You deserve more viewers.
Whatever I had learnt in theory, today I finally came to know how to implement it. Thank you, Sir.
Thanks Sir for this video.. It's really unique and helpful..
You are doing a great service to the world by teaching, and it will be helpful for so many people. I personally thank you a lot for doing this kind of good thing for us. Salute you, man.
Helpful video, thank you sir.
Thank you for your detailed explanation.
Very useful tutorial. Keep it up!
Extraordinary explanation 👌👌👌
Sir, how do we decide that we should go for linear regression? There may be non-linear relationships between the dependent and independent features.
Please make a video on multiple linear regression using statsmodels, with the forward and backward elimination techniques.
Very nice Explanation. Keep it up Krish.
Good video. Thanks Krish.
This is a really good video, Sir. Thanks
Thank you so much.
When you come to the board it is easy to understand, sir; you are an excellent teacher.
Hi Krish, thanks for adding all these videos, which are very helpful. And please plot a pair-plot graph for this MLR model. Thank you.
@krish Amazing explanation.
You missed the most important part: plotting the best-fit line. A second question: when we have n dimensions (n variables in linear regression), can we apply PCA?
excellent explanation of MLR with python coding
good video! Thank you
Nice Video!
This video really helped me, thank you so much bro.
I like your explanation
Thanks for the video, sir, but could you say why we have not scaled the values using StandardScaler?
Thanks a lot krish
Sir. A big thanks
Sir, this was super useful!!! Hats off! But since we are deleting the California feature, how are we going to find the coefficient for that dropped California independent feature?
Kudos, Krish bro. In LinearRegression from sklearn.linear_model, how do we reduce the error? Is there no gradient descent to reduce the error, or does linear regression itself give you the output?
Hello Krish. Really good work. We do not get this in-depth knowledge even in high-end paid courses.
Hi Krish, don't we have to find the p-value for this model? Or is R-squared just good enough?
Spyder looks cool...I might switch to it😃. Great video man👌
Hey Krish, why didn't you use feature scaling for the independent variables?
Thank you 🙏 sir
1) Sir, take some big datasets and explain them, like daily real-world work.
2) How do we identify the algorithm by looking at the dataset? Please explain that too, sir.
3) Do some analysis on datasets: how to analyse the data just by looking at it.
Hi @Krish Naik, 1. Why didn't you use the Jupyter Notebook IDE for multiple linear regression? 2. Why did you decide to use the Spyder IDE?
Good presentation. Kindly do a video on logistic regression where you use it to make predictions.
If I have to plot a graph to visually understand the difference between y_pred and y_test, is there code to do it? I have checked multiple sites but none has answered my question.
Great video!!! What does test_size=0.2 imply? Is it going to randomly take 20% of the data from the dataframe for testing? Also, is it possible to do multivariate non-linear regression in Python?
@rajdeeproy5264
2 years ago
We are dividing the dataset into an 80:20 ratio of train and test splits respectively, so test_size=0.2 implies 20% test and 80% train.
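The split described in the reply can be sketched like this (synthetic data, not the dataset from the video; scikit-learn and NumPy assumed):

```python
# Sketch of what test_size=0.2 does: 20% of the rows go to the
# test split, 80% to the train split.
import numpy as np
from sklearn.model_selection import train_test_split

X = np.arange(100).reshape(50, 2)   # 50 samples, 2 features
y = np.arange(50)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

print(len(X_train), len(X_test))    # 40 10
```

Fixing random_state makes the (otherwise random) split reproducible across runs.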
Sir, just one small doubt: instead of x = dataset.iloc[:, :-1], can I use dataset.drop("price")?
You are god! thanks a lot for this!!
Hi, when you compared the data of y_test and y_pred, the indexes were different. Is that OK?
What made you use the MLR model on this dataset? Why not another model? I have understood the concept of MLR, but how do we know when to use it?
Hey Krish, good explanation. I have a doubt here: why do you take X_train, X_test, y_train, and y_test? I am confused. Kindly clarify.
@GagandeepSingh-qs2vh
4 years ago
We usually reserve some data for testing purposes. Let's say you trained your model on 80% of the data. Now, it might be possible that your model says it has an accuracy of 90% on the training data, but that doesn't mean it is a good model. So you'll have to test it on the testing data, which is unseen by the model. In simple words, it is going to tell you how your model is going to perform in a real-world scenario (on data it has never seen).
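The train-vs-test idea in the reply above, sketched on synthetic data (not the video's dataset): fit on the 80% train split, then check R-squared on the unseen 20%.

```python
# Sketch: a model's score on held-out data is the honest estimate
# of real-world performance.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
# Ground-truth linear relationship plus a little noise.
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = LinearRegression().fit(X_train, y_train)
print(model.score(X_train, y_train))  # R^2 on data the model has seen
print(model.score(X_test, y_test))    # R^2 on unseen data
```

A large gap between the two scores is the usual symptom of overfitting.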
That was a great explanation, Krish, thank you! Doubt: if the number of states had been 20 (or greater), how would we proceed in such a case?
Nice
Sir, will random_state affect our model if we increase it?
Hi Krish, what is the purpose of converting categorical predictors into indicators like 0, 1, or 2? Does it mean we can only do computations with quantitative values?
@theshishir24
3 years ago
ML algorithms always take numerical values. Hope that helped.
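The encoding step the reply refers to can be sketched with pd.get_dummies (the column names below are illustrative, not necessarily the exact ones from the video):

```python
# Sketch: converting a categorical column into numeric dummy columns.
import pandas as pd

df = pd.DataFrame({'State': ['New York', 'California', 'Florida'],
                   'Profit': [192261.83, 191792.06, 191050.39]})

# drop_first=True removes one dummy column (here the alphabetically
# first category) to avoid the dummy variable trap, i.e. perfectly
# collinear columns.
states = pd.get_dummies(df['State'], drop_first=True)
df = pd.concat([df.drop('State', axis=1), states], axis=1)
print(list(df.columns))   # ['Profit', 'Florida', 'New York']
```

A row with all remaining dummies equal to 0 then represents the dropped category, so no information is lost.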
This was a good start to explain the basics of regression, but it doesn't seem complete, as there could have been a visualization piece as well to explain how the regression worked. Also, what did we do with the train dataset? Is there a follow-up video on this?
Sir, it's a request that you please upload more datasets with code to your GitHub so that we can practice more. Thank you, sir!
Hey, can we use the backward elimination method? And what's its purpose?
Thank you, sir, for this video, and please also provide a graph for this.
Good explanation even if the sound wasn't very good :) But we would like to know how we could do data visualisation with 3 or more explanatory or dependent variables before the regression, and the surface that results once we get our model. Please share if you have any ideas on this. Thank you!
It's a good video, but sir, please ensure good sound quality in the video.
Can we use Ridge and Lasso regression models as well where we use multiple linear regression?
Sir, how to check with a sample and get a prediction? Thank you.
How do we calculate beta-naught, i.e., the intercept?
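A quick sketch answering this: after fitting, scikit-learn exposes the intercept (beta-naught) and the slopes as attributes. Toy data chosen so that y = 2x + 3 exactly:

```python
# Sketch: reading beta_0 and the slope coefficients off a fitted model.
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([5.0, 7.0, 9.0, 11.0])   # y = 2x + 3

model = LinearRegression().fit(X, y)
print(model.intercept_)   # beta_0, here 3.0
print(model.coef_)        # slope(s), here [2.0]
```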
Can the non-numeric feature problem be solved by label encoding?
Sir, after importing the sklearn library a ModuleNotFoundError occurs; what do I need to do, sir?
In this problem, why didn't we scale the features using StandardScaler or MinMaxScaler?
Can you explain visualization of multiple linear regression?
Please also cover regularisation and MCA with multiple linear regression, sir.
If, while doing get_dummies, we had not dropped the California state, would it have had an impact?
please do make some videos on Target encoding.
Thanks for the explanation. Sir, if R-squared comes out near zero, what do we need to do in that case? How do we check which attributes are spoiling the regression line?
@aashishdagar3307
3 years ago
@ravi Teja, there are multiple methods for feature selection (of attributes), like forward selection, backward elimination, etc. You can use whichever you're comfortable with.
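The forward selection mentioned in the reply can be sketched with scikit-learn's SequentialFeatureSelector (available in recent scikit-learn versions; the data below is synthetic, with only the first two of five features actually driving y):

```python
# Sketch: forward feature selection picks features one at a time,
# keeping whichever addition improves cross-validated score most.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=100)

sfs = SequentialFeatureSelector(LinearRegression(),
                                n_features_to_select=2,
                                direction='forward')
sfs.fit(X, y)
print(sfs.get_support())   # boolean mask of the selected features
```

Passing direction='backward' instead gives backward elimination with the same API.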
Sir, you predict on X_test data. If we want to predict on some random data given by the user, how do we predict that? How do we give random data for each variable for a new prediction?
What is the disadvantage of the dummy variable trap? What if I don't drop any dummy variable; what's its impact on the ML model? What is the reason to drop one dummy variable?
Guys, we can remove test_size and put random_state = 10 to get 98% accuracy.
You used the index as b0; is that right?
Hi, can you explain when to go for linear regression? What are the prerequisites to check whether an input and output will fit a linear regression model or not?
@middle_class_Me
4 years ago
yes
@middle_class_Me
4 years ago
Even I have the same doubt.
Hey! I was wondering why you didn't use OneHotEncoder from sklearn.preprocessing? It would have been a nice two-step conversion, as follows (note: transformers takes a list of (name, transformer, columns) tuples):
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder
cl = ColumnTransformer(transformers=[('encoder', OneHotEncoder(), [*column index which needs to be encoded*])], remainder='passthrough')
x = np.array(cl.fit_transform(x))
Bro make a video on Ridge regression
"Found input variables with inconsistent numbers of samples: [100, 50]." Sir, this error is coming; can you help me solve it?
Can we use mapping instead of getting dummies and concatenating them?
How can I get the dataset?
If my R^2 value is not close to 1, then what should I do?
How do we plot it on a graph?
This tutorial helped me, but it's missing a few things: how to find the p-value of each x, how to calculate the cost function, and how to do prediction on a new dataset with the model we made. Do you have that kind of tutorial too?
@ShahidIqbal-sq7bf
4 years ago
I believe scikit-learn automatically calculates the best model and the best value for each variable and makes the prediction, so you don't have to do it manually.
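Predicting on new, user-supplied data (as asked a few comments up) is just model.predict on a row with the same feature layout as the training data. A sketch with illustrative toy data (the new row must have the same number and order of columns, including any dummy columns, as X_train):

```python
# Sketch: fit once, then predict on a fresh observation.
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy training data following y = 2*x0 + 1*x1 exactly.
X_train = np.array([[1.0, 0.0], [2.0, 1.0], [3.0, 0.0], [4.0, 1.0]])
y_train = np.array([2.0, 5.0, 6.0, 9.0])

model = LinearRegression().fit(X_train, y_train)

# A new observation supplied "by the user": shape (1, n_features).
new_row = np.array([[2.5, 1.0]])
print(model.predict(new_row))   # [6.]
```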
I am getting an R-squared value of 0.95, it's so cool.
Can you upload a video on multiple-variable/feature logistic regression?
Would you please visualize it.
Why does the r2_score change each time I run this code?
If more than 5 cities are present in the State column, how do we handle that in Python?
@AK-ws2yw
3 years ago
I think linear regression handles numeric data; categorical input variables first need to be encoded as numbers. If your target variable is categorical, go for logistic regression instead.
Why didn't you plot a graph?
Sir, please also do videos on the decision tree regressor, AdaBoost, and XGBoost.
Why did you remove California?
Where is the playlist? Kindly, someone, comment the entire playlist.
Sir, please explain how to test on new data.
Hey Krish, it's a humble request: please explain the conversion of categorical features practically, as I am getting problems with that. I have watched several tutorials, such as the one by Kirill Eremenko of SuperDataScience, and now even pd.get_dummies isn't working; I am getting errors like IndexError and ValueError.
Sir, can you please provide videos on neural network implementation using Python?