Feature Importance using Random Forest and Decision Trees | How is Feature Importance calculated
This video breaks down the process using Random Forest and Decision Trees, making it easy to comprehend. Learn how these techniques help identify the most impactful features in your data.
Code: github.com/campusx-official/1...
============================
Do you want to learn from me?
Check my affordable mentorship program at : learnwith.campusx.in/s/store
============================
📱 Grow with us:
CampusX's LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
E-mail us at support@campusx.in
⌚Time Stamps⌚
00:00 - Intro
00:52 - What is Feature Importance
06:10 - Feature importance in SKLearn
08:40 - Feature Importance Documentation
11:10 - Calculating Importance using Decision trees
23:45 - Calculating Importance using Random Forest
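For viewers following along with the 06:10 section: reading the importances in scikit-learn comes down to the fitted model's `feature_importances_` attribute. A minimal sketch (the iris dataset and hyperparameters here are illustrative, not taken from the video):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Both tree-based estimators expose impurity-based (MDI) importances
dt = DecisionTreeClassifier(random_state=42).fit(X, y)
rf = RandomForestClassifier(n_estimators=100, random_state=42).fit(X, y)

print("Decision tree:", dt.feature_importances_)
print("Random forest:", rf.feature_importances_)
# The importances are normalized, so each array sums to 1
```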
Comments: 30
Sir, great respect for you. I would rate your course higher than many top Coursera courses. I have watched all your ML videos so far. I am doing my master's at IIT Kanpur. If I get a job in this domain, much of the credit will go to you and your dedication. Hats 👒 off.
@AjayBharathRathiMCI
9 months ago
Even being at IIT Kanpur, you're still using the word "IF"!
Thanks for putting in so much effort. Appreciate your work! Worth watching.
so awesome sir! you explained everything in detail! thanks!
Awesome... really helpful.... I was searching for such an easily understandable video.
It's a great video! Thanks for explaining it in detail. It would be very helpful if you did a similar video on how permutation importance is calculated. One more question: does this 'feature importance' help in finding the root cause of a problem?
Thanks for the informative video. I'd just like to kindly point out that in the 2nd example of the tree with 15 data points and 2 features, there was a slight error in the node importance formula for the 2nd and 3rd nodes. As per the formula you mentioned earlier, the impurity weight should have been 3/9 instead of 3/15. That explains the discrepancy in the feature importance numbers between your calculations and the package.
@20:32 based on formula it should be 3/9*0.44
@stoic_sapien1
28 days ago
😊
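For anyone wanting to verify the node-importance formula discussed in the comments above: the impurity-based (MDI) importance scikit-learn reports can be reproduced from the fitted tree's internals, where each non-leaf node contributes its weighted impurity decrease to the feature it splits on. A hedged sketch (the synthetic dataset is illustrative only):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=100, n_features=4, random_state=0)
clf = DecisionTreeClassifier(random_state=0).fit(X, y)

t = clf.tree_
imp = np.zeros(X.shape[1])
n = t.weighted_n_node_samples  # n[0] is the total sample weight at the root
for node in range(t.node_count):
    left, right = t.children_left[node], t.children_right[node]
    if left == -1:  # leaf node, no split, no contribution
        continue
    # weighted impurity decrease at this node, relative to the whole dataset
    decrease = (n[node] * t.impurity[node]
                - n[left] * t.impurity[left]
                - n[right] * t.impurity[right]) / n[0]
    imp[t.feature[node]] += decrease

imp /= imp.sum()  # sklearn normalizes so the importances sum to 1
assert np.allclose(imp, clf.feature_importances_)
```

Note how the children's impurities are weighted by their own sample counts (the 3/9-style weights from the comment), then the whole node term is scaled by its share of the dataset.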
Man you're a beast. Awesome
In a random forest, can we find out how many decision trees are used during training, or are they built randomly depending on the rows and columns they pick during row and feature sampling?
Hi Nitish, please cover feature selection, xgboost and s
Hello, thanks for the explanation. I have one question: does using the best features help reduce the size of the training data set? Say I do not have a large dataset, but I can create an independent variable that is highly correlated with the dependent variable; will that help me reduce my training data set? Your response will be highly valuable.
Awesome ❤❤❤
What if there are more than 2 columns? Then how will the importance be calculated, and how will the x/(x+y) formula for the nodes look?
thanks sir, really enjoyed it
Can we also plot feature importance for an SVM classifier and kernel SVM?
Hi sir, how can we check which features contributed most to each prediction? Suppose we built a model to predict whether a loan should be granted or not. Then if a person asks why their application got rejected, the feature importance will differ from person to person. So, how do we check feature importance for each individual prediction?
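On the per-prediction question above: the global `feature_importances_` is a single averaged number, but for one sample you can at least inspect which features its decision path actually used via `decision_path`. A rough sketch (the dataset is illustrative; for rigorous per-prediction attributions, dedicated methods such as SHAP values exist, and this is only a path inspection):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

sample = X[0:1]                    # one row, e.g. one loan applicant
path = clf.decision_path(sample)   # sparse indicator of visited nodes
node_ids = path.indices

tree = clf.tree_
used = [tree.feature[n] for n in node_ids
        if tree.children_left[n] != -1]  # skip leaves, which don't split
print("Features tested on this sample's path:", sorted(set(used)))
```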
Hello Sir, please update this series. Thanks
Can we do landslide prediction with this?
Please cover feature selection, xgboost, knn, dbscan, catboost
If column 0 has more feature importance, then why can't it be the root node?
best
Please explain golden features?
Hi, that was very informative. I have a question regarding the above problem: in the Decision Tree, we saw the 1st feature was more important. Shouldn't the more important feature be the root node? Is there any relation between the order of the nodes and the feature importance?
@kindaeasy9797
5 months ago
There are parameters that decide which feature is used for the root node. In a random forest we have multiple decision trees, so it comes down to feature sampling.
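To add to the reply above: in scikit-learn, a random forest's reported importance is just the average of its individual trees' (already normalized) importances, which is easy to check directly. A small sketch on illustrative synthetic data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
rf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Average the per-tree MDI importances across the forest
avg = np.mean([est.feature_importances_ for est in rf.estimators_], axis=0)
assert np.allclose(avg, rf.feature_importances_)
```

Since each tree sees a different bootstrap sample and different candidate features per split, the per-tree importances differ, and the forest-level number is their mean.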
20:46 look behind!
High cardinality as in numerical data also, or only categorical data?
@campusx-official
2 years ago
cardinality in categorical data Binod
How is this different from Mutual Information?