On the Importance of Deconstruction in Machine Learning Research
This is a talk I gave in December 2020 at the NeurIPS Retrospective Workshop.
I explain why it is so important to carefully analyze your own research contributions, through the stories of three recent publications from my research group at Cornell University. In all three cases, we first invented something far more complicated, only to realize that the gains could be attributed to something far simpler than, and different from, what we had initially believed.
slideslive.com/38938218/the-i...
Comments: 19
I came back here after a long time, with my favourite phrases being 'Put away your laptops, please' and 'raise your hand if you understand'. Super awesome content, professor.
Can we please force everyone who writes research papers, specifically in AI/ML, to watch this video? The world would be a far better place. Loved the insights.
I have learnt a lot from your lectures on Machine Learning !! Thanks for very valuable insights into the brilliant work that your lab has produced. Please upload any new lecture series that you are offering. Thank you professor !!
Pretty tough to be a Ph.D student under Kilian, apparently Step 1 is making every other SOTA research team look incompetent by an order of magnitude, Step 2 is realizing your own solution is still stupidly complex prior to publication and Step 3 is increasing Tesla's valuation by billions of dollars without receiving any compensation in return.
You are developing a community of learners around you and teaching them to think simple. Hopefully someday someone in this community will come up with something simple yet groundbreaking. Thank you Professor Killian
I became a big fan of yours, Prof. Kilian.
Voilà!! After a long time, those words "does it make sense" and "raise your hands if you are with me". Learnt a lot. Eager to see more of your lectures.
Thanks a lot, Professor Kilian Weinberger, for your great lectures. I cannot put into words how much I appreciate them; I have learned a lot. I am watching the last lecture of your course and I do not want it to end. I think I will binge-watch it again in my free time because it is so entertaining (needless to say, informative and thought-provoking), and I would like to wish you a merry Christmas.
great talk and great message for the community. thank you
Hello sir, I have learnt many things from your videos, and I would like to say Merry Christmas and have a great year.
Bring us new videos about deep learning.. Pleaseeeeee
Amazing lecture ❤❤❤❤
Deep learning and natural language processing lectures please!! I have learnt a lot from your lectures . Thank you!
I wish I could take a sabbatical to work with Dr. Weinberger. I guess applying what I’m learning here is a great compromise. 👨🏾💻
There’s still a curious relationship between error correcting codes and nearest neighbors - are you familiar with Hilbert curves? Thank you for a great and sensible talk.
@kilianweinberger698
3 years ago
I am, but what is the relationship that you are alluding to?
@alleycatsphinx
3 years ago
@@kilianweinberger698 I’ve been impressed with how elegantly nearest neighbors can be solved (approximately) very quickly in parallel if you use a “linear octree” type structure, and those are most effective when based on error correcting codes (ie Gray Code.) What inspired your use of error correcting codes?
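The "linear octree" idea mentioned above can be sketched briefly: store points by their Morton (Z-order) code, which interleaves coordinate bits so that points close in space tend to be close in the sorted order, and answer queries by checking a few candidates near the query's position in that order. The snippet below is a minimal, hypothetical 2D illustration (function names like `approx_nn` are my own, not from the talk), including the Gray-code conversion the commenter refers to:

```python
import bisect

def to_gray(n: int) -> int:
    """Binary -> reflected Gray code: adjacent integers differ in one bit."""
    return n ^ (n >> 1)

def interleave2(x: int, y: int, bits: int = 16) -> int:
    """Morton (Z-order) code: interleave the bits of x and y."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)       # x bits go to even positions
        code |= ((y >> i) & 1) << (2 * i + 1)   # y bits go to odd positions
    return code

def approx_nn(points, query):
    """Approximate nearest neighbor via a sorted list of Morton codes
    (a 'linear quadtree'): examine a small window of candidates around
    the query's insertion point in Morton order."""
    keyed = sorted((interleave2(x, y), (x, y)) for x, y in points)
    codes = [k for k, _ in keyed]
    qi = bisect.bisect_left(codes, interleave2(*query))
    window = keyed[max(0, qi - 2):qi + 3]
    return min((p for _, p in window),
               key=lambda p: (p[0] - query[0]) ** 2 + (p[1] - query[1]) ** 2)
```

Because the structure is just a sorted array, queries parallelize trivially; the price is that the answer is only approximate, since Z-order curves occasionally place spatially close points far apart in the ordering.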
Thank you very much for sharing your lecture series, and what a great talk! It worked for me like something as needed as air when you can't breathe. I've caught up on a lot, and you inspired me as well in some sense. If you accept small $ support for your YouTube channel work, I'd love to give it. Thanks again!