Machine Learning Lecture 37 "Neural Networks / Deep Learning" - Cornell CS4780 SP17

Lecture Notes:
www.cs.cornell.edu/courses/cs4...

Comments: 83

  • @kirtanpatel797
    4 years ago

    Just completed all 37 lectures :) This is the only course that forced me to come back and complete the entire series. It's only because of you, Great Sir! Thank you so much for sharing these!

  • @xuanwu8045
    4 years ago

    This is a wonderful machine learning course. I watched several machine learning/deep learning courses on KZread, and this is my favorite one. In my opinion, a good teacher generally has one of two traits: 1. Makes the learning process easier for students by giving illuminating lectures. 2. Wants the students to learn from the heart, and motivates them by displaying his/her own passion for the subject. Professor Kilian has both traits. This makes me really enjoy watching this course. Thank you Kilian!

  • @Biesterable
    5 years ago

    This was wonderful!! It's strange that more people aren't watching this. Thank you so much for sharing!

  • @amuro9616
    5 years ago

    Exactly. One of the most approachable and intuitive lecture series on ML there is.

  • @cricketjanoon
    3 years ago

    I started learning ML in 2017 when I was an undergrad student, and now I am a graduate student. I took many courses and read many books, but these lectures cleared up many tiny details and concepts I had been missing. Spent my COVID-19 summer watching the whole series. Thanks, Kilian!

  • @linxingyao9311
    3 years ago

    All I can say is that this is the Holy Grail of machine learning lectures. Thank you, Professor Kilian.

  • @jachawkvr
    4 years ago

    This class was so amazing, and I learnt so many useful concepts. What I loved most was Dr. Weinberger's engaging and intuitive delivery, which made the complex concepts so easy to grasp. He is also funny as hell, which made the classes a lot of fun. A big thank you from my side to Dr. Weinberger for sharing these wonderful lectures as well as the assignments.

  • @dantemlima
    5 years ago

    Thank you, Professor Kilian! What a great teacher you are! I learned a lot and laughed a lot. Awesome!

  • @zelazo81
    4 years ago

    It took me 4 months, but I've finally completed watching your series of lectures! You made it extremely informative, intuitive, and fun, and you have a great teaching style :) Thank you!

  • @jy9p4
    4 years ago

    This was hands down the best lecture series I have seen in my life. I watched at least one video over the past three weeks, wrote notes along the way, and even tried the homework problems. Wow, what a ride. Thanks, Professor Weinberger!

  • @tubakias1
    4 years ago

    Where can we find the homework assignments? Thanks

  • @jy9p4
    4 years ago

    @tubakias1 Here's the link! www.dropbox.com/s/tbxnjzk5w67u0sp/Homeworks.zip?dl=0

  • @halfmoonliu
    4 years ago

    Dear Prof. Weinberger, it's a privilege to be able to listen to the whole series, from the very beginning to the very end. It really helped me get through some parts that I was not very sure about. Thank you very much!

  • @autrepseudo1980
    4 years ago

    Just finished the series. It was great; I'm kinda sad now! Thanks, Professor Weinberger. I wish I had you as a prof in college!

  • @satviktripathi6601
    3 years ago

    It took me two months to complete this course, and my knowledge level has drastically changed! Thank you so much!

  • @kilianweinberger698
    3 years ago

    Great job!

  • @satviktripathi6601
    3 years ago

    @kilianweinberger698 Sir, I can't believe you replied. I am a high school senior and have applied to Cornell! Really hope to meet you one day!

  • @benxneo
    7 months ago

    @satviktripathi6601 Did you get into Cornell?

  • @yaolinxing1968
    5 years ago

    Very illuminating lectures. This series deserves to be as popular as Andrew Ng's classic one. Thank you, Professor Kilian.

  • @nicksaraev
    2 years ago

    Thank you for the delightful class, Kilian! With ML making significant strides over the last few months, I was looking for a course that thoroughly and sufficiently explained the foundations behind it. This was it. Dutifully recommended you to all of my friends who are interested in the subject.

  • @icewave3031
    7 months ago

    I lost it at the cinnamon roll part. Thanks for posting these! They have been very helpful for studying.

  • @karansawhney2906
    4 years ago

    Hello Dr. Weinberger. Your videos are hands down the best I've ever seen in terms of setting up intuition and explaining the concepts in the easiest way possible. This has helped me immensely in my studies. Thank you so much!!

  • @gowtham6071
    1 year ago

    I just love this course; everything is both intuitive and mathematically deep. Loved the course so much that I finished everything in 21 days.

  • @zeroes_ones
    6 months ago

    Thank you Kilian, your lectures have brought a completely new perception/understanding (which was missing earlier) of how machine learning algorithms work. Your lectures also made me appreciate machine learning even more. "Thank you" is a small word. May you always be blessed with good health and happiness.

  • @Jeirown
    3 years ago

    I came here only to learn about Gaussian processes. I ended up watching ~10 hours, as if this were a TV series. I even watched lectures on things I already knew well, just because I wanted your perspective. Best course, really. Thank you!

  • @saitrinathdubba
    5 years ago

    Thanks a lot for the brilliant lectures, Prof. Kilian. It was awesome fun and extremely insightful!!

  • @chaowang3093
    3 years ago

    Today, I am going to complete all the lectures!!! This is a legendary course that should have a similar number of views as Dr. Gilbert Strang's linear algebra. Thank you so much, Dr. Kilian!!!

  • @kilianweinberger698
    3 years ago

    Well done!!

  • @yogeshdhingra4070
    4 years ago

    I hope you are safe and sound!! Just wanted to say thank you for the amazing lecture series. I have tears in my eyes... Professor Kilian, you're the best!! I hope you add more videos on machine learning and deep learning in the future.

  • @sharique7214
    4 years ago

    This is such a wonderful course. I have come across so many machine learning courses, blogs, and videos, but this was the best of them all. I sort of binge-watched it during quarantine, playing back the lectures to note down the many things you explained. Thanks a lot, Professor Kilian!

  • @davejung8732
    4 years ago

    Just loved the whole lecture series :) It's so hard to find a series of lectures on YouTube that motivates you to go back and go through the whole thing, but with yours I succeeded in watching every one of them and also doing the homework :)) Thank you for the resource, and I love your sense of humor LOLLL

  • @jordankuzmanovik5297
    3 years ago

    I just wanna say thank you very much. You are really the best teacher for this stuff; I can't thank you enough. And please make new courses, even if they are not free. I think a lot of people would be happy to pay for your courses.

  • @manogyakaushik8924
    2 years ago

    Completed all the lectures and absolutely loved them! Professor, you are really inspiring. Thank you so much for sharing these here.

  • @sashwotsedhai2836
    9 months ago

    Thank you, Professor Kilian! Thank you for these amazing lectures. Finally finished the whole series, and I feel like this is just the beginning.

  • @TrentTube
    4 years ago

    I've completed your lecture series! Thank you for your generous contribution to my understanding of machine learning!

  • @StarzzLAB
    3 years ago

    Thank you! I am sure that this course will blow up someday!

  • @andresguzman5665
    3 years ago

    Amazing and inspiring course. Thank you so much, Professor Kilian. Your ML course was the first that I watched in full. All 37 lectures helped me so much, and when I read new ML material I very often remember the content from your course (most frequently Gaussian distributions, bagging, and boosting :)). Thank you so much!

  • @Ankansworld
    3 years ago

    Onto the last one now! But yeah, it feels sad as this course comes to an end. Quite interesting, informative, and highly engaging :) All thanks to our amazing professor! Please share a few more Cornell course lectures! We'd love to level up ourselves...

  • @dnalor8753
    2 years ago

    Your humor made these lectures very enjoyable.

  • @108PRS
    3 years ago

    An outstanding class! Filled with technical rigor and humor.

  • @ugurkap
    4 years ago

    Thanks for sharing this; I believe it is one of the best courses out there.

  • @RHardy25
    3 years ago

    This was an amazing course, thank you Prof. Kilian!

  • @rahulchowdhury9739
    2 years ago

    Thank you so much, Professor, for sharing your perspectives and knowledge with the world.

  • @michaelmellinger2324
    2 years ago

    2:58 Current research on deep learning
    5:10 We lose information when we use a regular fully connected network on images; images are translationally invariant
    9:30 Convolutional layer explanation
    13:30 We are restricting the network to only learn functions that are translation invariant
    16:50 Research on ConvNets - Nvidia presentation
    21:40 Residual networks, skip connections, stochastic depth
    26:55 "Impotent" layers - robustness because no single layer is too important
    28:25 Dense connectivity - DenseNet
    30:30 Image manifold - images lie on a sub-manifold - add/remove beards on faces
    43:25 Dropout is used less these days; BatchNormalization is more common
    44:20 Demo - Machine Learning for Data Science - learning to discover structure in data - manifolds
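For readers skimming these timestamps, here is a minimal PyTorch sketch (written for these notes, not code from the lecture) of the two skip-connection patterns mentioned at 21:40 and 28:25: a residual block adds its input to its output, while a DenseNet block concatenates its input with its output.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x):
        # Identity skip connection: output = relu(x + f(x))
        return torch.relu(x + self.conv(x))

class DenseBlock(nn.Module):
    def __init__(self, in_channels, growth):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, growth, kernel_size=3, padding=1)

    def forward(self, x):
        # Dense connectivity: concatenate input and new features along channels
        return torch.cat([x, torch.relu(self.conv(x))], dim=1)

x = torch.randn(1, 16, 32, 32)             # one 16-channel 32x32 feature map
print(ResidualBlock(16)(x).shape)          # torch.Size([1, 16, 32, 32])
print(DenseBlock(16, growth=12)(x).shape)  # torch.Size([1, 28, 32, 32])
```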

  • @danallford7144
    2 years ago

    Thank you so much for putting these lectures online. I have enjoyed them all massively. I came across them while reading about decision trees, and over the last 2 weeks I have sat in my office every night and made my way through the whole course. I learnt a lot from every one of them, and I now feel I have a much better understanding of ML to ground the rest of my learning (after I go and spend some time making up for my absence to my wife and kid :D). Would be great if you had a link to some site where I could buy you a drink; I feel like I'm in debt :)

  • @HhhHhh-et5yk
    3 years ago

    Adding lectures on unsupervised learning would have taken this lecture series to another level! ☄♥️

  • @divykala169
    3 years ago

    What an amazing journey, thank you professor!

  • @jeet3797
    4 years ago

    Couldn't resist commenting; my first YouTube comment ever. A BIG THANK YOU!

  • @thinkingaloud1833
    3 years ago

    Thank you, Professor Kilian! The lecture is really great.

  • @amarshahchowdry9727
    3 years ago

    I honestly can't thank you enough for this series. Thank you so much, Kilian. Just wanted to confirm: this translational invariance is due to the combination of conv layers with a pooling layer, right? Conv layers by themselves are translationally equivariant. With a pooling layer after them, we can achieve translational invariance within a certain section of the image (if the object is moved to an opposite corner, the final representation fed to the FC layers will be different, right?): slight changes in position lead to a slight change in the output of the conv layer, but maxing or averaging over the region gives us the same output, at least for small shifts. Hence, we won't require a lot of data (faces in every position) to generalize. Am I right here?

  • @arshtangri5210
    3 years ago

    I also used to believe the same, but there is some recent research that says otherwise.

  • @kilianweinberger698
    3 years ago

    If you have many layers, then the receptive field (i.e., the pixels it is influenced by) of each neuron in the last layer is huge, and translation invariance becomes less of an issue. So yes, you are right, but creating many layers really helps in that respect.
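A small numerical sketch of the distinction discussed in this thread (illustrative PyTorch, written for these notes rather than taken from the lecture): a convolution is translation equivariant (shift the input, the feature map shifts), while pooling on top of it buys approximate invariance for small shifts.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
conv = nn.Conv2d(1, 4, kernel_size=3, padding=1, bias=False)
pool = nn.AdaptiveMaxPool2d(1)             # global max pooling

x = torch.zeros(1, 1, 16, 16)
x[0, 0, 4:8, 4:8] = 1.0                    # a small square "object"
x_shift = torch.roll(x, shifts=2, dims=3)  # same object, shifted 2 px right

f, f_shift = conv(x), conv(x_shift)
# Equivariance: shifting the input shifts the feature map (away from borders).
print(torch.allclose(torch.roll(f, shifts=2, dims=3)[..., 2:],
                     f_shift[..., 2:], atol=1e-6))   # True
# Invariance after pooling: the pooled descriptor ignores the small shift.
print(torch.allclose(pool(f), pool(f_shift)))        # True
```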

  • @gregmakov2680
    2 years ago

    Yeah, exactly. A NN learns non-linear relationships naturally, and thus it can learn a manifold easily.

  • @madhurgarg4114
    2 years ago

    Finally completed. Thank you very much, Prof!

  • @kilianweinberger698
    1 year ago

    Great job!

  • @Shkencetari
    5 years ago

    Thank you very much. These lectures were great. Could you please publish the lectures for the other classes as well, like the one you mentioned called "Machine Learning for Data Science"?

  • @ugurkap
    4 years ago

    The other classes were not taught by him. I am not aware of any lecture recordings, but you might find some of the assignments and slides here: www.cs.cornell.edu/courses/cs4786/2019sp/index.htm

  • @user-kf9tp2qv9j
    2 years ago

    @ugurkap Hi Kaplan, do the classes you mentioned above have videos online?

  • @susansun4130
    3 years ago

    Thank you so much for explaining everything so clearly. So exactly how many electrons are there in the universe? XD

  • @kilianweinberger698
    3 years ago

    a lot ...

  • @louis6720
    4 years ago

    You are a god, my man.

  • @alexstar8512
    3 years ago

    Hi! Thank you for the wonderful course! Are past exams available? I would like to test my knowledge now that I have completed the course.

  • @sunilkumarmeena2450
    1 year ago

    Kilian, thank you. ❤️

  • @71sephiroth
    4 years ago

    I am trying to play with this idea, but at 35:29 I don't understand how this image is represented. What is the coordinate system? Is it that the axes represent weights and biases, and for each one you have an entry such as w1*x1, etc.? At 36:46, why is it meaningful to use gradient descent to reconstruct this image? If we have w1*x1, do you take the gradient with respect to x1?
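In case it helps later readers, here is a minimal sketch of what "gradient descent with respect to the input" usually means (a hypothetical PyTorch illustration, not the demo's actual code): the weights stay fixed, and the optimizer updates the input x itself.

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 10))
for p in net.parameters():
    p.requires_grad_(False)               # weights stay fixed

target = torch.randn(10)                  # some desired network output
x = torch.zeros(64, requires_grad=True)   # the "image" we optimize

opt = torch.optim.SGD([x], lr=0.1)
for _ in range(200):
    opt.zero_grad()
    loss = ((net(x) - target) ** 2).mean()
    loss.backward()                       # computes d(loss)/dx, not d(loss)/dw
    opt.step()
print(loss.item())                        # loss shrinks as x is adjusted
```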

  • @user-kf9tp2qv9j
    2 years ago

    The water bucket in PCA is really impressive 🤣

  • @gregmakov2680
    2 years ago

    Yeah, great experiment!!

  • @gregmakov2680
    2 years ago

    Hahha, goood experience :D:D:D We can only unfold it when we know its structure beforehand.

  • @galhulli5081
    3 years ago

    Hi Professor Kilian, once again thank you very much for the great material. I have a quick question regarding NNs in general; I apologize in advance if I missed this part in one of the lectures (or comments). Is feature selection necessary before any NN (or deep learning) algorithm? One would think that since it is built to solve this central problem of representation as well as the weights, it should be handled automatically...

  • @kilianweinberger698
    3 years ago

    If you have enough data (and you normalize your features), the neural net can learn whether some features are irrelevant. However, you can make its life easier (and get away with less training data) if you identify useless features before you do learning. Put it this way: anything that the network doesn't have to learn itself makes its life easier.
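As a concrete illustration of the normalization point above (a plain NumPy sketch written for these notes, not from the course materials): standardize each feature using training-set statistics only, then reuse those statistics on new data.

```python
import numpy as np

rng = np.random.default_rng(0)
# Features on wildly different scales, as raw data often is.
X_train = rng.standard_normal((100, 5)) * [1, 10, 100, 0.1, 1]

mu, sigma = X_train.mean(axis=0), X_train.std(axis=0)
X_train_std = (X_train - mu) / sigma  # each feature: zero mean, unit variance
# New data must reuse the training statistics:
# X_test_std = (X_test - mu) / sigma
```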

  • @galhulli5081
    3 years ago

    Thank you very much for the help! Cheers, Gal

  • @clubmaster2012
    4 years ago

    Is it fair to say that the idea of stochastic depth is similar to the randomization of dimensions we do before each greedy search in a random forest? Great lectures btw!

  • @kilianweinberger698
    4 years ago

    Not entirely. Stochastic depth is more a form of regularization, as it forces the layers in a neural network to be similar.
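For the curious, a minimal PyTorch sketch of stochastic depth (an illustration under stated assumptions, not the original implementation from Huang et al., 2016): during training, each residual block is skipped outright with some probability; at test time it always runs, scaled by its survival probability.

```python
import torch
import torch.nn as nn

class StochasticDepthBlock(nn.Module):
    def __init__(self, channels, p_survive=0.8):
        super().__init__()
        self.f = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        self.p = p_survive

    def forward(self, x):
        if self.training:
            if torch.rand(1).item() < self.p:
                return torch.relu(x + self.f(x))   # block survives this pass
            return x                               # block skipped: identity only
        return torch.relu(x + self.p * self.f(x))  # test: expected contribution
```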

  • @dude8309
    4 years ago

    Is the last layer of a deep network still considered a linear classifier even if it has a non-linear activation function? If not, does that assumption still hold?

  • @kilianweinberger698
    4 years ago

    Yes. Assuming you fix the previous layers and treat them as feature extractors, then the last (linear) layer is essentially very similar to e.g. logistic regression. Note that logistic regression also has a (non-linear) sigmoid as output, s(w'x). The key is that the function s() here acts as a thresholding/scaling function that essentially makes sure we have output probabilities. Because it is strictly monotonic, it preserves the linearity of the decision boundary. If s() were a sin() function instead of a sigmoid, the classifier would not be linear. Hope this helps.
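A tiny numeric check of the monotonicity argument above (a NumPy sketch written for these notes): thresholding the sigmoid output at 0.5 makes exactly the same decisions as thresholding the raw linear score at 0, so the decision boundary remains the hyperplane w'x = 0.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
w = rng.standard_normal(3)
X = rng.standard_normal((1000, 3))

linear_side = X @ w > 0              # classify by the raw linear score
sigmoid_side = sigmoid(X @ w) > 0.5  # classify by the sigmoid output
print(np.all(linear_side == sigmoid_side))  # True: identical decisions
```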

  • @kc1299
    3 years ago

    DenseNet!

  • @gregmakov2680
    2 years ago

    Yeah, the surreal fact about researchers and scientists :D:D

  • @gregmakov2680
    2 years ago

    Hahhah, exactly. PCA is good enough to handle many situations.

  • @shrishtrivedi2652
    2 years ago

    3:30

  • @itachi4alltime
    2 years ago

    Damn, I am sad

  • @user-kf9tp2qv9j
    2 years ago

    I feel a little sad too, at the end.