8.6 David Thompson (Part 6): Nonlinear Dimensionality Reduction: KPCA

Science & Technology

Comments: 19

  • @swavekbu4959
    9 months ago

    Outstanding speaker and communicator.

  • @dorukhansergin9831
    5 years ago

    Hi, the reference should be corrected to Tenenbaum et al., Science 22, 2000, not 2009. Took me a while to find. Great video and thanks a lot!

  • @godexolrv4906
    3 years ago

    Very well explained and presented, really quite helpful.

  • @khawlaallouche3082
    5 years ago

    Thank you, so helpful!

  • @user-sc4bu7fy1o
    1 year ago

    Thank you very much!

  • @swavekbu4959
    9 months ago

    14:25 summarizes very well what KPCA is up to. It's identical to PCA, except that instead of weighting the values of the variables by the corresponding eigenvector weights, the eigenvector weights are applied to kernel evaluations between data points. Why apply them to kernels instead of the original variables? So that we can benefit from the kernel trick: we compute everything in the original, low-dimensional space while effectively capturing what is "going on" in a higher-dimensional feature space, even though we never work with the explicit mapping into that space. The kernel trick is not exclusive to KPCA; it also appears in support vector machines and other kernel methods. (A minimal code sketch follows below.)

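    A minimal Python sketch of the procedure this comment describes, assuming an RBF kernel; the function names, gamma value, and toy data here are illustrative, not taken from the lecture:

        import numpy as np

        def rbf_kernel(X, gamma=1.0):
            # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel.
            sq = np.sum(X**2, axis=1)
            sq_dists = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
            return np.exp(-gamma * sq_dists)

        def kernel_pca(X, n_components=2, gamma=1.0):
            n = X.shape[0]
            K = rbf_kernel(X, gamma)
            # Center the kernel matrix in feature space; the implicit high-dimensional
            # features are never computed explicitly (this is the kernel trick).
            one_n = np.ones((n, n)) / n
            Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
            # Eigendecomposition of the centered kernel; keep the top components.
            eigvals, eigvecs = np.linalg.eigh(Kc)
            idx = np.argsort(eigvals)[::-1][:n_components]
            eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]
            # Projections: eigenvector weights applied to kernel evaluations between
            # data points, scaled so each feature-space component has unit norm.
            return eigvecs * np.sqrt(np.maximum(eigvals, 0.0))

        X = np.random.RandomState(0).randn(100, 3)   # toy data
        Z = kernel_pca(X, n_components=2, gamma=0.5)
        print(Z.shape)                               # (100, 2)
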
  • @matiassanchezgavier155
    3 years ago

    Excellent, I loved it!

  • @ltbd78
    5 years ago

    Very well presented. Thank you.

  • @gnanadeepch5065
    2 years ago

    awesome!

  • @kishkash8350
    2 years ago

    Great lecture, thank you! I believe there's a small typo at 13:30: in the bottom row, the subscript of the first x in the third addend should be j rather than i.

  • @learnwithash12345
    6 months ago

    Can you share the playlist where the lectures are in sequence?

  • @ElChe-Ko
    4 years ago

    How do you know beforehand that your dataset has a nonlinear structure if you are in a dimension higher than 3?

  • @sau002
    3 years ago

    I have never understood this. All the examples I come across use toy datasets, and that does not help me.

  • @TheSimslash
    3 years ago

    @sau002 It depends on what you want to do. Try a linear approach, and if it fails you might consider nonlinear ones.

  • @CursedByManga
    2 years ago

    If your linear methods result in poor accuracy, you can try nonlinear methods, although you will not know for certain.

  • @scholar7558
    1 year ago

    I don't know if an answer after two years still helps, but basically: apply linear PCA and plot it. If the linear PCA worked perfectly, then your data has a linear structure, and vice versa. (A small sketch of this check follows after this thread.)

  • @ElChe-Ko
    1 year ago

    @scholar7558 OK, thank you very much for the help! :) I really appreciate it.

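    A small Python sketch of the check @scholar7558 describes, using ordinary (linear) PCA via an SVD; the 90% variance threshold and the toy data are arbitrary illustrations:

        import numpy as np

        def linear_pca_check(X, n_components=2, var_threshold=0.90):
            # Center the data; squared singular values are proportional to the
            # variance captured by each principal component.
            Xc = X - X.mean(axis=0)
            _, s, _ = np.linalg.svd(Xc, full_matrices=False)
            explained = (s ** 2) / np.sum(s ** 2)
            captured = explained[:n_components].sum()
            print(f"Variance explained by the first {n_components} PCs: {captured:.2%}")
            # If a few linear components already capture most of the variance (and a
            # 2-D scatter of them looks sensible), linear PCA may be enough; otherwise
            # a nonlinear method such as KPCA is worth trying.
            return captured >= var_threshold

        X = np.random.RandomState(1).randn(200, 10)   # toy data
        print("Linear structure plausible?", linear_pca_check(X))
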
  • @jmtv1474
    5 years ago

    Poor pedagogy...
