Feature Importance using Random Forest and Decision Trees | How is Feature Importance calculated

This video breaks down the process using Random Forest and Decision Trees, making it easy to comprehend. Learn how these techniques help identify the most impactful features in your data.
Code: github.com/campusx-official/1...
============================
Do you want to learn from me?
Check my affordable mentorship program at : learnwith.campusx.in/s/store
============================
📱 Grow with us:
CampusX' LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
E-mail us at support@campusx.in
⌚Time Stamps⌚
00:00 - Intro
00:52 - What is Feature Importance
06:10 - Feature importance in SKLearn
08:40 - Feature Importance Documentation
11:10 - Calculating Importance using Decision trees
23:45 - Calculating Importance using Random Forest
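Below is a minimal sketch (not the exact notebook from the GitHub link above; the iris dataset and parameters are placeholders) of the scikit-learn feature_importances_ attribute that the video walks through for decision trees and random forests:

# Minimal sketch: reading impurity-based (MDI) feature importances
# from scikit-learn tree models. Dataset and settings are placeholders.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True, as_frame=True)

dt = DecisionTreeClassifier(random_state=42).fit(X, y)
rf = RandomForestClassifier(n_estimators=100, random_state=42).fit(X, y)

# feature_importances_ sums each feature's impurity reduction over all the
# splits that use it (a random forest averages this across its trees) and
# normalizes the result to sum to 1.
for name, model in [("Decision Tree", dt), ("Random Forest", rf)]:
    print(name)
    for col, imp in sorted(zip(X.columns, model.feature_importances_),
                           key=lambda t: t[1], reverse=True):
        print(f"  {col}: {imp:.3f}")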

Comments: 30

  • @MohitChaudhary-bc9wb
    @MohitChaudhary-bc9wb · 3 years ago

    Sir, great respect for you. I would rate your course higher than many top Coursera courses. I have watched all your ML videos so far. I am doing my master's at IIT Kanpur, and if I get a job in this domain, a lot of the credit will go to you and your dedication. Hats off. 👒

  • @AjayBharathRathiMCI

    @AjayBharathRathiMCI

    9 months ago

    You're at IIT Kanpur and you're still using the word "if"?

  • @guitarkahero4885
    @guitarkahero4885 · 2 years ago

    Thanks for putting in so much effort. Appreciate your work! Worth watching.

  • @Ankit-hs9nb
    @Ankit-hs9nb · 2 years ago

    So awesome, sir! You explained everything in detail. Thanks!

  • @dpchand
    @dpchand · 2 years ago

    Awesome... really helpful. I was looking for such an easily understandable video.

  • @user-hf5cb8hh8z
    @user-hf5cb8hh8z · 10 months ago

    It's a great video! Thanks for explaining it in detail. It would be very helpful if you did a similar video on how permutation importance is calculated. One more question: does this 'feature importance' help in finding the root cause of a problem?

  • @akashdeepmishra7835
    @akashdeepmishra7835 · 2 years ago

    Thanks for the informative video. I'd just like to kindly point out that in the 2nd example of the tree with 15 data points and 2 features, there was a slight error in the node importance formula for the 2nd and 3rd nodes. As per the formula you mentioned earlier, the impurity weight should have been 3/9 instead of 3/15. That explains the discrepancy in the feature importance numbers between your calculations and the package.
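For anyone who wants to cross-check such hand calculations against the package, here is a minimal sketch that recomputes the impurity-based importances directly from a fitted tree's internals (the iris dataset and max_depth here are placeholders; the accumulation mirrors scikit-learn's mean-decrease-in-impurity computation):

import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
dt = DecisionTreeClassifier(max_depth=3, random_state=42).fit(X, y)  # placeholder model

tree = dt.tree_
left, right = tree.children_left, tree.children_right
importances = np.zeros(dt.n_features_in_)

for node in range(tree.node_count):
    if left[node] == -1:          # leaf node: no split, so no contribution
        continue
    l, r = left[node], right[node]
    # Impurity decrease at this split, weighted by the samples reaching each node.
    decrease = (
        tree.weighted_n_node_samples[node] * tree.impurity[node]
        - tree.weighted_n_node_samples[l] * tree.impurity[l]
        - tree.weighted_n_node_samples[r] * tree.impurity[r]
    )
    importances[tree.feature[node]] += decrease

importances /= tree.weighted_n_node_samples[0]  # divide by total training samples
importances /= importances.sum()                # normalize to sum to 1
print(importances)                              # should match dt.feature_importances_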

  • @samriddhlakhmani284
    @samriddhlakhmani284 · 4 months ago

    @20:32 Based on the formula it should be 3/9 * 0.44.

  • @stoic_sapien1

    @stoic_sapien1

    28 days ago

    😊

  • @123arskas
    @123arskas · 1 year ago

    Man you're a beast. Awesome

  • @datasciencegyan5145
    @datasciencegyan5145 · 2 years ago

    Can we find out how many decision trees are used in a random forest during training, or is that decided randomly based on the rows and columns selected during row and feature sampling?
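On the question above: the number of trees in a random forest is not learned during training; it is fixed up front by the n_estimators parameter, while row and feature sampling only control what each of those trees gets to see. A minimal sketch (dataset and values are placeholders):

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
rf = RandomForestClassifier(
    n_estimators=100,      # number of decision trees, chosen by the user
    bootstrap=True,        # row sampling with replacement for each tree
    max_features="sqrt",   # feature sampling at every split
    random_state=42,
).fit(X, y)

print(len(rf.estimators_))            # 100 fitted DecisionTreeClassifier objects
print(rf.estimators_[0].get_depth())  # each individual tree can be inspected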

  • @rafibasha4145
    @rafibasha4145 · 2 years ago

    Hi Nitish, pls cover feature selection, xgboost and s

  • @dineshjoshi4100
    @dineshjoshi4100 · 1 year ago

    Hello, thanks for the explanation. I have one question: does using only the best features help reduce the amount of training data required? Say I do not have a large dataset, but I can create an independent variable that is highly correlated with the dependent variable; will that help me reduce my training data? Your response will be highly valuable.

  • @adnanwalayat7895
    @adnanwalayat7895 · 12 days ago

    Awesome ❤❤❤

  • @ComicKumar
    @ComicKumar · 1 year ago

    What if there are more than 2 columns? How will the importance be calculated then, and what will the x/(x+y) formula for the nodes look like?

  • @kindaeasy9797
    @kindaeasy9797 · 5 months ago

    Thanks sir, I really enjoyed it.

  • @supriyachaudhary5112
    @supriyachaudhary5112 · 1 year ago

    Can we also plot feature importance for an SVM classifier and kernel SVM?

  • @gowthamsr3545
    @gowthamsr3545 · 1 year ago

    Hi sir, how can we check which features contributed most to each prediction? Suppose we built a model to predict whether a loan should be given or not. If a person asks why their application was rejected, the feature importance will differ from person to person. So how do we check feature importance for each individual prediction?

  • @stevegabrial1106
    @stevegabrial1106 · 3 years ago

    Hello sir, please update this series. Thanks.

  • @ankitasonkar42
    @ankitasonkar42 · 5 months ago

    Can we do landslide prediction with this?

  • @rafibasha4145
    @rafibasha4145 · 2 years ago

    Please cover feature selection, XGBoost, KNN, DBSCAN and CatBoost.

  • @maniteja8561
    @maniteja8561 · 2 years ago

    If column 0 has higher feature importance, then why can't it be the primary (root) node?

  • @yashjain6372
    @yashjain6372 · 1 year ago

    best

  • @studywithamisha9903
    @studywithamisha9903 · 8 months ago

    Please explain golden features?

  • @sanamsharma8886
    @sanamsharma8886 · 2 years ago

    Hi, that was very informative. I have a question regarding the above problem: in the decision tree we saw that the 1st feature was more important, so shouldn't it be the root node? Is there any relation between the order of the nodes and the feature importance?

  • @kindaeasy9797

    @kindaeasy9797

    5 months ago

    There are parameters that decide which feature is used for the root node. In a random forest we have multiple decision trees, so it comes down to feature sampling.
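On this question and reply: the root split is just the single best impurity-reducing split at the top of the tree, while feature_importances_ sums a feature's impurity decrease over all of its splits, so the most important feature does not have to sit at the root. A small sketch illustrating the distinction (the dataset is a placeholder):

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
dt = DecisionTreeClassifier(random_state=0).fit(X, y)

root_feature = X.columns[dt.tree_.feature[0]]                 # feature used at the root split
top_feature = X.columns[np.argmax(dt.feature_importances_)]   # highest total importance
print(root_feature, top_feature)                              # these two can differ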

  • @shivoham5939
    @shivoham5939 · 1 year ago

    20:46 Look behind!

  • @PratapO7O1
    @PratapO7O1 · 2 years ago

    high cardinality as in numbers also or only categorical data?

  • @campusx-official

    @campusx-official

    2 years ago

    Cardinality in categorical data, Binod.

  • @PratapO7O1
    @PratapO7O1 · 2 years ago

    How is this different from Mutual Information?