Tutorial 28 - Multicollinearity in Linear Regression - Part 2

In regression, "multicollinearity" refers to predictors that are correlated with other predictors. Multicollinearity occurs when your model includes multiple factors that are correlated not just with your response variable, but also with each other. In other words, it results when you have factors that are a bit redundant.
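The redundancy described above shows up directly in a correlation matrix. A minimal sketch with synthetic data (the column names and numbers are purely illustrative, not from the video):

```python
import numpy as np

# Synthetic data: "experience" is almost a linear function of "age",
# which is exactly the redundancy described above.
rng = np.random.default_rng(0)
age = rng.uniform(22, 60, 200)
experience = age - 22 + rng.normal(0, 1.0, 200)  # nearly redundant with age
tv_spend = rng.uniform(0, 100, 200)              # unrelated predictor

X = np.column_stack([age, experience, tv_spend])
corr = np.corrcoef(X, rowvar=False)

print(corr[0, 1])  # age vs experience: close to 1 -> collinear pair
print(corr[0, 2])  # age vs tv_spend: close to 0 -> no redundancy
```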
github link: github.com/krishnaik06/Multic...
Please join as a member of my channel to get additional benefits like materials on Data Science, live streaming for members, and much more
/ @krishnaik06
github url: github.com/krishnaik06/Regres...
#Regularization
Please do subscribe to my other channel too
/ @krishnaikhindi
Connect with me here:
Twitter: / krishnaik06
Facebook: / krishnaik06
instagram: / krishnaik06

Comments: 93

  • @karangupta6402
    @karangupta6402 · 3 years ago

    Thanks, Krish. One way you suggested is to drop the feature. One more thing we can do is combine the features, which is also called an interaction term (e.g., multiply Age and Years of Experience, then run your OLS model). This also seems to control collinearity to some extent.

  • @prijwalrajavat5811
    @prijwalrajavat5811 · 4 years ago

    I was searching for the theory behind backward elimination... everywhere just the procedure was given, but you explained the why. Thank you, sir.

  • @bonnyphilip8022
    @bonnyphilip8022 · 2 years ago

    You are simply awesome... Nothing else to say.. Thank you for your contribution and service...

  • @ibrahimmondal9104
    @ibrahimmondal9104 · 2 years ago

    Now the concept of Multicollinearity is totally clear....thanks bhaiya🙂👍✌️

  • @gargisingh9279
    @gargisingh9279 · 3 years ago

    @KrishNaik Sir!! What about another technique called VIF (Variance Inflation Factor)? That technique is also widely used for multicollinearity.

  • @StyleTrick
    @StyleTrick · 4 years ago

    Krish, could you explain why AGE has the high p-value and not Years of Experience? If you remove Years of Experience from the table, then Age has a p-value of 0, which makes sense, as in the data Salary increases as Age increases.

  • @unezkazi4349
    @unezkazi4349 · 3 years ago

    So that means we should now have 0 expenditure on newspaper? And after removing the newspaper feature, if we do OLS again, can there be any other feature with a p-value greater than 0.05 in the new OLS model?

  • @VishalPatel-cd1hq
    @VishalPatel-cd1hq · 4 years ago

    Hi Sir, can we directly compute the covariance matrix of the feature data and find its determinant? If the determinant is zero, the covariance matrix is singular, and if it is singular then it has linearly dependent features, so there is multicollinearity in our data.
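The singular-covariance idea in this comment can be checked numerically; note that with real (noisy) data the determinant is rarely exactly zero, so a near-zero threshold is more practical than an equality test. A sketch with made-up data:

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x3 = rng.normal(size=100)

# Exact linear dependence -> covariance matrix is singular, det ~ 0.
x2 = 2 * x1
cov = np.cov(np.column_stack([x1, x2, x3]), rowvar=False)
det_exact = abs(np.linalg.det(cov))
print(det_exact)  # numerically zero

# Near (not exact) dependence -> determinant is small but nonzero,
# so in practice a threshold check beats testing det == 0.
x2_noisy = 2 * x1 + rng.normal(0, 0.01, 100)
cov2 = np.cov(np.column_stack([x1, x2_noisy, x3]), rowvar=False)
det_near = abs(np.linalg.det(cov2))
print(det_near)   # small, but not exactly zero
```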

  • @shashankkhare1023
    @shashankkhare1023 · 4 years ago

    Hey Krish, do we really need to look at the std. error? I think looking at the p-value is enough, as the p-value is calculated from the t-statistic, which is calculated from the coefficient and std. error. Correct me if I am wrong, thanks.

  • @sahilgaikwad4508
    @sahilgaikwad4508 · 4 years ago

    Thank you for such a wonderful explanation, sir.

  • @ashwininaidu2302
    @ashwininaidu2302 · 4 years ago

    Hi Krish, nice explanation of the concept. If you could make a video on the assumptions of linear regression (theory and practical), that would be of great help for us.

  • @sachinborgave8094
    @sachinborgave8094 · 4 years ago

    Thanks Krish. Please upload further Deep Learning videos.

  • @shreyasb.s3819
    @shreyasb.s3819 · 4 years ago

    Well explained. Thanks a lot.

  • @RitikSingh-ub7kc
    @RitikSingh-ub7kc · 4 years ago

    Can we just use principal component analysis to see if all the variance is covered by reducing the number of features here?

  • @nagarjunakanneganti5953
    @nagarjunakanneganti5953 · 3 years ago

    If I remove years of experience instead of age, just based on the correlation matrix, it will work, right? I don't have to check p-values, since constructing an LR model with just age could change the p-value anyway. Any thoughts @channel?

  • @Vignesh0206
    @Vignesh0206 · 4 years ago

    May I know, is there a part one of this video?

  • @kakranChitra
    @kakranChitra · 3 years ago

    Thanks, good elaboration!!

  • @DataInsights2001
    @DataInsights2001 · 3 years ago

    Nice presentation! Dropping a feature is one solution, but sometimes you can't drop it and you need that feature in the analysis. What would you do then? Combine features, or use division or multiplication? Or use factor analysis? Cases like Marketing Mix Modeling run into these issues often. Please try videos on Marketing Mix Modeling, optimization, and forecasting: estimating ROI, IROI, response curves...

  • @akash_a_desai
    @akash_a_desai · 4 years ago

    Thanks sir, your explanation is really good.

  • @akramhossain9576
    @akramhossain9576 · 3 years ago

    I was looking at the multicollinearity problem in interaction terms. How to solve this problem? One suggestion is centering the variable! But I'm still confused! When one of my interaction variables is dichotomous and the coding = 0 is important for me, then if I use centering, the 0 coding will not be there anymore! So how to solve this? Can you please clarify this for me? Thanks.

  • @deepaktripathi892
    @deepaktripathi892 · 3 years ago

    As year and age are related, how do we decide which one to drop?

  • @abhishekchanda4002
    @abhishekchanda4002 · 4 years ago

    Hi Krish, I have a doubt. If I ignore multicollinearity in the second example, how will it affect the prediction?

  • @ganeshdevare7360
    @ganeshdevare7360 · 2 years ago

    Thank you very much, sir, for the explanation at 00:00.

  • @raedalshehri7969
    @raedalshehri7969 · 2 years ago

    Thank you so much

  • @likithliki1160
    @likithliki1160 · 3 years ago

    Amazing explanation.

  • @adhvaithstudio6412
    @adhvaithstudio6412 · 4 years ago

    Can you please explain why the p-value is biased towards the Age variable and not Years of Experience?

  • @shubhamkundu2228
    @shubhamkundu2228 · 3 years ago

    So to sum up: to identify multicollinearity, we use the OLS (ordinary least squares) method to check the model summary for the standard error (values should be low, not high), the R-squared value (should tend towards 1), and the p-value (should be below 0.05).

  • @peeyushyadav4991
    @peeyushyadav4991 · 3 years ago

    Wait, if the newspaper column/feature is not correlated, then how do we come to the conclusion that it can be dropped because the p-value is high? Are we making conclusions based on the t-test values here as well?

  • @K-mk6pc
    @K-mk6pc · 2 years ago

    Can anyone explain why we add a constant 1 to the predictor variables?

  • @bhagwatchate7511
    @bhagwatchate7511 · 3 years ago

    Hello Krish, thanks for the amazing explanation. I have one question: is it necessary to check the summary to conclude anything about multicollinearity, or can we proceed with only the corr() function? Thanks in advance :) #multicollinearity #KrishNaik #Regression

  • @0SIGMA
    @0SIGMA · 3 years ago

    Wah! Beautiful, sir.

  • @alexandremaillot803
    @alexandremaillot803 · 4 years ago

    Please: after making a model with Python (training, testing, and saving it), how can I put new data through it for predictions? Thanks for the content.

  • @amansrivastava3081
    @amansrivastava3081 · 3 years ago

    For example, if this is how you made the regressor: reg = LinearRegression(); reg.fit(x_train, y_train). Then you can predict like reg.predict([[100]]) and you will get the predicted value. If you have trained on multiple features, then use reg.predict([[100, 200]]), according to the dataframe you trained on!

  • @adityakumar-sp4ki
    @adityakumar-sp4ki · 3 years ago

    While importing this library -> import statsmodels.api as sm, I'm getting this error -> module 'pandas' has no attribute 'Panel'

  • @syedhamzajamil4490
    @syedhamzajamil4490 · 4 years ago

    Sir, can I use PCA for handling multicollinearity?

  • @Mohit-im1rp
    @Mohit-im1rp · 3 years ago

    What is the p-value in the above explanation?

  • @saumyagupta2606
    @saumyagupta2606 · 2 years ago

    Is it good practice to check for multicollinearity for other ML models too, apart from linear ones? Is it necessary for other models as well?

  • @muhammadiqbalbazmi9275
    @muhammadiqbalbazmi9275 · 3 years ago

    I think, instead of removing any feature, we can combine both correlated features (in the context you talked about): Age + YearsOfExperience => seniority level. Problem solved. (Domain knowledge is required for feature engineering.)
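The combine-instead-of-drop idea above can be sketched as below; the "seniority" score is purely illustrative, and whether such a combination makes sense depends on domain knowledge:

```python
import numpy as np

rng = np.random.default_rng(3)
age = rng.uniform(22, 60, 200)
experience = age - 22 + rng.normal(0, 1.0, 200)

corr_before = np.corrcoef(age, experience)[0, 1]
print(corr_before)  # close to 1 -> collinear pair

# Replace the correlated pair with one combined feature instead of
# dropping either column, so neither signal is thrown away entirely.
seniority = (age + experience) / 2.0
other = rng.uniform(0, 100, 200)  # some unrelated predictor
corr_after = np.corrcoef(seniority, other)[0, 1]
print(corr_after)   # no strong correlation left among the predictors
```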

  • @SatnamSingh-cm2vt
    @SatnamSingh-cm2vt · 4 years ago

    What does the p-value signify?

  • @nithinmamidala
    @nithinmamidala · 4 years ago

    The correlation part 1 video is missing; please upload it and rearrange the sequence.

  • @abhi9raj776
    @abhi9raj776 · 4 years ago

    Thanks a lot, sir!

  • @maskman9630
    @maskman9630 · 1 year ago

    Can we use the same process for logistic regression, brother...?

  • @Datacrunch777
    @Datacrunch777 · 3 years ago

    Sir, kindly upload a video on ridge regression covering the lambda value and how it can be simulated with respect to the basic estimators or formulas.

  • @abhishekverma549
    @abhishekverma549 · 4 years ago

    Sir, what is P>|t|?

  • @sejalchandra2114
    @sejalchandra2114 · 4 years ago

    Hi sir, I have a doubt: could we use this if we have one-hot encoded features in the dataset?

  • @saidurgakameshkota1246
    @saidurgakameshkota1246 · 3 years ago

    Did you get the answer?

  • @ajaykushwaha4233
    @ajaykushwaha4233 · 2 years ago

    Will this approach work on data with a huge number of features?

  • @dhirajbaruah9888
    @dhirajbaruah9888 · 4 years ago

    What does the p-value mean?

  • @rankerhill2335
    @rankerhill2335 · 4 years ago

    Unless you calculate the Variance Inflation Factor, you cannot be certain that multicollinearity is not present. Ideally you should calculate the partial correlation coefficient, compute 1/(1 - partial correlation coefficient), and see if this is greater than 2.

  • @Katarahul
    @Katarahul · 4 years ago

    Plotting a correlation matrix and analyzing it is a faster option, right?

  • @Roblox5091
    @Roblox5091 · 3 years ago

    Yes

  • @sumeersaifi6354
    @sumeersaifi6354 · 3 years ago

    At the end, you said a one-unit decrease in newspaper expenditure will result in a one-unit increase in sales. Can you please explain it? Because according to me it would result in a change in profit, but not in sales.

  • @amitkhandelwal8030
    @amitkhandelwal8030 · 3 years ago

    Sir, why can you not choose TV as B0?

  • @jyothinkjayan6508
    @jyothinkjayan6508 · 4 years ago

    When will the next deep learning batch begin?

  • @JP-fi1bz
    @JP-fi1bz · 3 years ago

    Isn't the p-value used for the null hypothesis?

  • @abdullahshafi8865
    @abdullahshafi8865 · 4 years ago

    Intuition King is back.

  • @SayantanSenBony
    @SayantanSenBony · 4 years ago

    Hi Krish, I have one question which I face on a daily basis in real time: how to detect heteroscedasticity, and what is the method to rectify it in Python?

  • @simonelgarrad
    @simonelgarrad · 3 years ago

    There are many tests available. You can research Bartlett's test, which assumes the samples come from populations with the same variance. There is another called the Goldfeld-Quandt test. To rectify it in Python: what I believe is that when working with MLR, if the normality assumption is satisfied then you shouldn't get heteroscedasticity. So if we have variables that don't follow a normal distribution, they can be transformed (refer to Krish's video for that too), and then I don't think heteroscedasticity should be present.

  • @galymzhankenesbekov2924
    @galymzhankenesbekov2924 · 4 years ago

    very good

  • @madhavilathamandaleeka5953
    @madhavilathamandaleeka5953 · 3 years ago

    What is the difference between the OLS model and linear regression with MSE? Do both give the same result? Please clear it up 🙏🙏

  • @shubhamchoudhary5461
    @shubhamchoudhary5461 · 3 years ago

    Is it like hypothesis testing?

  • @guneet556
    @guneet556 · 4 years ago

    Hi Krish, kindly please answer this question of mine: for under- and oversampling, the techniques you mentioned are applied to numeric datatypes. For categorical datatypes, do we first have to encode them as 0/1 and then apply SMOTETomek and NearMiss? Is that the proper way to deal with it? And if we use a tree-based approach, will it handle the imbalance and encoding itself? Please reply, Krish, it will be a big help!

  • @krishnaik06
    @krishnaik06 · 4 years ago

    Yes, a tree approach will solve that problem. But understand that tree techniques usually require a lot of time.

  • @guneet556
    @guneet556 · 4 years ago

    @@krishnaik06 Sir, thanks a lot. I am just a beginner and follow each of your videos thoroughly. Sir, can you make a video on how to build your resume for people looking for a transition into DS?

  • @avinashmishra6783
    @avinashmishra6783 · 3 years ago

    Why does multicollinearity arise? What happens that reduces adjusted R-squared?

  • @namratarajput7092
    @namratarajput7092 · 3 years ago

    Hi, I am getting an error after running this line: "import statsmodels.api as sm" - ImportError: cannot import name 'factorial'. I am a beginner, so please help.

  • @tanweerkhan3020
    @tanweerkhan3020 · 3 years ago

    You need to downgrade your scipy or install statsmodels from master. You can check this on Stack Overflow.

  • @slowhanduchiha
    @slowhanduchiha · 3 years ago

    In the 2nd example, wasn't scaling important?

  • @jatin7836
    @jatin7836 · 3 years ago

    An important question, if anyone knows, please answer: why do we give OLS the X value (which includes the constant) and not just the x values (which are our independent features)? If we are not giving the independent values to OLS, then how is it showing the table in the output (with X and y)?

  • @dikshagupta3276
    @dikshagupta3276 · 1 year ago

    Please add the link for collinearity.

  • @user-qz1hd4xp1p
    @user-qz1hd4xp1p · 4 years ago

    Cool video, but what about heteroskedasticity?

  • @ruturajjadhav8905
    @ruturajjadhav8905 · 3 years ago

    1. Linearity 2. Homoscedasticity 3. Multivariate normality 4. Independence of errors 5. Lack of multicollinearity. Please do a video on these; I am not getting these concepts.

  • @harshstrum
    @harshstrum · 4 years ago

    Hi bhaiya, I didn't get why you chose to drop the age feature over years of experience.

  • @krishnaik06
    @krishnaik06 · 4 years ago

    Because age and experience are highly correlated, and we could see that the p-value of age was greater than 0.05.

  • @shivanshsingh5555
    @shivanshsingh5555 · 4 years ago

    Why do you keep saying 0.5 again and again, sir? This is not clear. Please send a reference link, because I am watching this playlist step by step but I'm still not getting this 0.5 criterion.

  • @nidhipandey8004
    @nidhipandey8004 · 3 years ago

    What is the difference between OLS and gradient descent?

  • @simonelgarrad
    @simonelgarrad · 3 years ago

    From what I have read and understood so far, OLS is a good method for simple linear regression, while gradient descent is the better method when working with many independent variables. You can understand more by watching Josh Starmer's gradient descent video.

  • @a_wise_person
    @a_wise_person · 4 years ago

    How is a Dell Inspiron i5 8th gen 2TB HDD for data science and programming work?

  • @srujohn652
    @srujohn652 · 4 years ago

    It's good enough, though you can also use Google Colab for your ML projects.

  • @adhvaithstudio6412
    @adhvaithstudio6412 · 4 years ago

    Why are you ignoring only age? Why can't we ignore years of experience?

  • @sushilchauhan2586
    @sushilchauhan2586 · 4 years ago

    What if our features number in the 1000s? Please reply, anyone who knows can answer.

  • @krishnaik06
    @krishnaik06 · 4 years ago

    For that we will apply dimensionality reduction.

  • @sushilchauhan2586
    @sushilchauhan2586 · 4 years ago

    @@krishnaik06 I never thought that BHAI himself would reply!

  • @sushilchauhan2586
    @sushilchauhan2586 · 4 years ago

    @@krishnaik06 Sorry, we can't apply dimensionality reduction, as it only deals with high variance and has no relation to our class output... we would instead use randomized lasso regression or randomized logistic regression. Thank you, Krish bhai.

  • @jatashukla6891
    @jatashukla6891 · 4 years ago

    Hi Krish, I bought your 300 Rs package on YouTube just to get in touch with you. I need your time to answer some of my doubts regarding switching my career into data science. Please let me know a way to connect with you directly.

  • @chitranshaagarwal4676
    @chitranshaagarwal4676 · 1 year ago

    Please make the topics in order; it's getting confusing.

  • @guneet556
    @guneet556 · 4 years ago

    Sorry, but I have to ask again: can someone correct my previous comment?

  • @raghavchhabra4783
    @raghavchhabra4783 · 4 years ago

    I watched it, but I didn't understand how to do it if there are 150+ columns!!

  • @sahilgaikwad4508
    @sahilgaikwad4508 · 4 years ago

    Use iloc to get the independent variables and store them in X, store the target in y, and then use the OLS model.

  • @adityapathania3618
    @adityapathania3618 · 3 years ago

    Brother, please share the dataset too.

  • @pushkarasharma3746
    @pushkarasharma3746 · 4 years ago

    Everything is fine, Krish sir, but please try not to say "particular" in every sentence... it is very annoying.

  • @ayushasati2110
    @ayushasati2110 · 3 years ago

    Your voice is not very clear.
