13.0 Introduction to Feature Selection (L13: Feature Selection)

Science & Technology

This video gives a brief introduction to why we care about dimensionality reduction and introduces feature selection as a subcategory that we will cover in more detail in the upcoming videos.
Slides: sebastianraschka.com/pdf/lect...
-------
This video is part of my Introduction to Machine Learning course.
Next video: • 13.1 The Different Cat...
The complete playlist: • Intro to Machine Learn...
A handy overview page with links to the materials: sebastianraschka.com/blog/202...
-------
If you want to be notified about future videos, please consider subscribing to my channel: / sebastianraschka

Comments: 19

  • @emilwalleser4752
    2 years ago

    You taught my Introduction to Biostats and 451 at UW. You are one of the best professors I have ever encountered. Thank you for providing all of this extra material. It is greatly appreciated.

  • @SebastianRaschka
    2 years ago

    Nice hearing from you, Emil! And thanks so much for these very kind words! You can't imagine how motivating this is to hear :)

  • @pulkitmadan6381
    2 years ago

    Commitment level: 1000. Thanks for adding additional topics and making the lecture videos publicly available :-) Really looking forward to the new edition of your book 👀

  • @SebastianRaschka
    2 years ago

    That's great to hear! It's a busy time, but I am hoping to record a video a day on a regular basis until all the additional topics are covered :). Hah, the new edition is also not too far out (I hope) -- currently editing the last chapter!

  • @zaynnicholas9151
    2 years ago

    Absolutely love your content! Best machine learning lectures I've watched so far on youtube :)

  • @SebastianRaschka
    2 years ago

    Wow, thanks!!!

  • @TrainingDay2001
    2 years ago

    I have to second that! Sebastian, please keep up the great work :)

  • @HannyDart
    6 months ago

    Thank you so much for uploading! I have a dataset where feature selection/dimensionality reduction is vital, but in my class at university we were only scratching the surface of feature selection (and we didn't get those add-on lectures ;))

  • @1UniverseGames
    2 years ago

    Great to find your YT channel; it seems I'm following you on every platform. ❤️

  • @rahulpaul9432
    1 year ago

    I have been following your videos for the past month, and I can say with confidence that your videos are among the best I have seen so far on ML! Thanks a lot for all the effort you have put into making these lectures!!! Are there any lectures/books that you have authored which are dedicated to forecasting (both time-series and using regression models)? It would be of great help if you could guide me to them. Thanks again! :)

  • @SebastianRaschka
    1 year ago

    Thanks for the kind words! Regarding the textbooks, I actually don't do much/any forecasting and have limited experience in that realm. I would check out sktime (github.com/alan-turing-institute/sktime) and StumPy (github.com/TDAmeritrade/stumpy) and see if they have any good literature pointers.

  • @wolfisraging
    2 years ago

    Amazing content ❤️ keep it up, have learnt so much from you mate 🙂

  • @keyushshah7295
    2 years ago

    I have been enjoying your ML lectures; they are very well explained, with a deep understanding of the concepts. Firstly, thank you for putting these up on YouTube. I just had one question: I have heard that dimensionality reduction includes principal component analysis and anomaly detection. So how is principal component analysis different from feature selection?

  • @SebastianRaschka
    2 years ago

    Glad you find them useful! Both feature selection and feature extraction are ways to reduce the dimensionality. In feature selection, you usually select a subset of the existing features; the features you select are still original features. In feature extraction, however, you create "new" features. In principal component analysis, these "new" features are linear transformations of the original features. For a simpler illustration, consider a dataset with 3 features: x1, x2, and x3. Feature selection would be, e.g., selecting feature x1. Feature extraction could be creating a new feature x' = x1 + x2 + 3*x3.
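    The selection-vs-extraction distinction in the reply above can be sketched in a few lines of NumPy. The data values below are made up purely for illustration; the linear combination x' = x1 + x2 + 3*x3 is the one from the reply.

    ```python
    import numpy as np

    # Toy dataset with 3 features x1, x2, x3 (hypothetical values).
    X = np.array([
        [1.0, 2.0, 0.5],
        [2.0, 1.0, 1.5],
        [3.0, 4.0, 2.5],
    ])

    # Feature selection: keep a subset of the ORIGINAL features,
    # e.g., select only x1 (column 0); its values are unchanged.
    X_selected = X[:, [0]]

    # Feature extraction: create a NEW feature as a linear
    # combination of the originals: x' = x1 + x2 + 3*x3.
    X_extracted = (X[:, 0] + X[:, 1] + 3 * X[:, 2]).reshape(-1, 1)

    print(X_selected.ravel())   # original x1 values: [1. 2. 3.]
    print(X_extracted.ravel())  # [ 4.5  7.5 14.5]
    ```

    PCA would choose the weights of such a linear combination to maximize variance instead of fixing them by hand, but the mechanics (new columns built from old ones) are the same.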

  • @wagutijulius4529
    2 years ago

    same same following you here as well, besides tweerrrra

  • @beautyisinmind2163
    2 years ago

    Respected professor, your videos are guiding me very well, and recently I have been learning how to select features. I have one big point of confusion: the ANOVA F-test is often used in the filter method for feature selection. Theory says ANOVA should be used for feature selection when the target is binary, but I have seen in practice that people also use ANOVA when the target is multiclass. So can the ANOVA F-test also be applied when our target is not binary and has multiple classes (say, data like Iris)? Another question: ANOVA assumes the features to be normally distributed, but in practice we mostly encounter data that are not fully normal. In that case, does it matter much for feature selection, or is transforming the data to some distribution compulsory? Please clear up my confusion by answering these two questions.
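    On the first point the question leaves open: the ANOVA F-test compares between-class to within-class variance over any number of groups, so it is not restricted to binary targets (this is, e.g., how scikit-learn's f_classif scores features on multiclass data such as Iris). A minimal sketch with synthetic three-class data; all names and values here are hypothetical, not from the lecture:

    ```python
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(0)

    # Synthetic 3-class data: one informative feature whose mean
    # shifts per class, and one pure-noise feature.
    n = 50
    y = np.repeat([0, 1, 2], n)
    informative = np.concatenate([rng.normal(loc, 1.0, n) for loc in (0.0, 1.5, 3.0)])
    noise = rng.normal(0.0, 1.0, 3 * n)
    X = np.column_stack([informative, noise])

    # ANOVA F-test per feature: split that feature's values by class
    # (three groups here) and compare between- to within-class variance.
    for j in range(X.shape[1]):
        groups = [X[y == c, j] for c in (0, 1, 2)]
        f_stat, p_val = f_oneway(*groups)
        print(f"feature {j}: F = {f_stat:.1f}, p = {p_val:.3g}")
    ```

    The informative feature gets a much larger F-statistic than the noise feature, which is exactly how a filter method would rank them.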

  • @bogdan3209
    2 years ago

    Hello, Are slides from all lectures available somewhere?

  • @SebastianRaschka
    2 years ago

    Yeah, they are under sebastianraschka.com/pdf/lecture-notes/stat451fs20/. Will add links to the video descriptions.

  • @danieleboch3224
    1 year ago

    So based
