A Short Introduction to Entropy, Cross-Entropy and KL-Divergence
Science & Technology
Entropy, Cross-Entropy and KL-Divergence are often used in Machine Learning, in particular for training classifiers. In this short video, you will understand where they come from and why we use them in ML.
Paper:
- "A mathematical theory of communication", Claude E. Shannon, 1948, pubman.mpdl.mpg.de/pubman/item...
Errata:
* At 5:05, the sign is reversed on the second line, it should read: "Entropy = -0.35 log2(0.35) - ... - 0.01 log2(0.01) = 2.23 bits"
* At 8:43, the sum of predicted probabilities should always add up to 100%. Just pretend that I wrote, say, 23% instead of 30% for the Dog probability and everything's fine.
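The errata above concern the entropy formula, Entropy = -Σ p log2(p). As a quick sanity check, here is a minimal Python sketch of the three quantities the video covers (the distributions below are illustrative examples, not the exact ones from the video):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: -sum(p * log2(p)), skipping zero-probability events."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def cross_entropy(p, q):
    """Cross-entropy in bits of predicted distribution q against true distribution p."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """KL(p || q) = cross-entropy(p, q) - entropy(p): the extra bits wasted by using q."""
    return cross_entropy(p, q) - entropy(p)

# A fair 4-way choice carries exactly 2 bits of entropy.
uniform = [0.25, 0.25, 0.25, 0.25]
print(entropy(uniform))  # 2.0

# Encoding a skewed distribution p with a code optimized for q costs extra bits;
# the overhead is exactly the KL divergence.
p = [0.5, 0.25, 0.125, 0.125]
q = [0.25, 0.25, 0.25, 0.25]
print(entropy(p))           # 1.75
print(cross_entropy(p, q))  # 2.0
print(kl_divergence(p, q))  # 0.25
```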
The painting on the first slide is by Annie Clavel, a great French artist currently living in Los Angeles. The painting is reproduced with her kind authorization. Please visit her website: www.annieclavel.com/.
Comments: 460
This feels like a 1.5-hour course conveyed in just 11 minutes. I wonder how much entropy it has :)
@grjesus9979
3 years ago
hahaha
@anuraggorkar5595
3 years ago
Underrated Comment
@klam77
2 years ago
ahhh....too clever. the comment has distracted my entropy from the video. Negative marks for you!
@Darkev77
2 years ago
@@klam77 Could you elaborate on his joke please?
@ashrafg4668
2 years ago
@@Darkev77 The idea here is that most other resources (videos, blogs) take a very long time (and more importantly say a lot of things) to convey the ideas that this video did in a short time (and with just the essential ideas). This video, thus, has low entropy (vs most other resources that have much higher entropy).
Fantastic video, incredibly clear. Definitely going to subscribe! I do have one suggestion. I think some people might struggle a little bit around 2m22s where you introduce the idea that if P(sun)=0.75 and P(rain)=0.25, then a forecast of rain reduces your uncertainty by a factor of 4. I think it's a little hard to see why at first. Sure, initially P(rain)=0.25 while after the forecast P(rain)=1, so it sounds reasonable that that would be a factor of 4. But your viewers might wonder why you can’t equally compute this as, initially P(sun)=0.75 while after the forecast P(sun)=0. That would give a factor of 0! You could talk people through this a little more, e.g. say imagine the day is divided into 4 equally likely outcomes, 3 sunny and 1 rainy. Before, you were uncertain about which of the 4 options would happen but after a forecast of rain you know for sure it is the 1 rainy option - that’s a reduction by a factor of 4. However after a forecast of sun, you only know it is one of the 3 sunny options, so your uncertainty has gone down from 4 options to 3 - that’s a reduction by 4/3.
@AurelienGeron
6 years ago
Thanks Jenny! You're right, I went a bit too fast on this point, and I really like the way you explain it. :)
@god-son-love
5 years ago
Shouldn't one use information gain to check the extent of the reduction? IG = prior entropy minus posterior entropy = (-(3/4)log2(3/4) - (1/4)log2(1/4)) - (-1·log2(1)) ≈ 0.811 bits
@dlisetteb
5 years ago
thank youuuuuuuuuuuuuuuuu
@rameshmaddali6208
5 years ago
Actually I understood the concept better from your comment than from the video itself :) thanks a lot
@maheshwaranumapathy4678
5 years ago
Awesome, great insight. I did struggle to get it at first, checked out the comments, and bam! Thanks :)
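The 4-versus-4/3 arithmetic discussed in this thread is easy to verify numerically. A tiny sketch, using the probabilities from the example at 2m22s (the "uncertainty reduction factor" is 1/p, and the information carried by an outcome is log2(1/p)):

```python
import math

# P(sun) = 0.75 and P(rain) = 0.25, as in the video's weather example.
p_sun, p_rain = 0.75, 0.25

print(1 / p_rain)             # 4.0    -> a rain forecast divides uncertainty by 4
print(1 / p_sun)              # ~1.333 -> a sun forecast divides it by only 4/3
print(math.log2(1 / p_rain))  # 2.0 bits of information
print(math.log2(1 / p_sun))   # ~0.415 bits of information
```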
As a Machine Learning practitioner & YouTube vlogger, I find these videos incredibly valuable! If you want to freshen up on those so-often-needed theoretical concepts, your videos are much more efficient and clear than reading through several blogposts/papers. Thank you very much!!
@AurelienGeron
6 years ago
Thanks! I just checked out your channel and subscribed. :)
@pyeleon5036
5 years ago
I like your video too! Especially the VAE one
@fiddlepants5947
5 years ago
Arxiv, it was actually your video on VAE's that encouraged me to check out this video for KL-Divergence. Keep up the good work, both of you.
@grjesus9979
4 years ago
Thank you, at first I struggled to understand, but after reading your comment I understand it. Thank you! 😊
You are the most talented tutor I've ever seen
I've been googling KL Divergence for some time now without understanding anything... your video conveys that concept effortlessly. Beautiful explanation.
you are a genius in creating clarity
Sir, you have a talent to explain stuff in a crystal clear manner. You just make something that is usually explained by a huge sum of math equations to be something so simple like this. Great job, please continue on making more YouTube videos!
This channel will skyrocket, no doubt. Thank you so much! Clear, visualized and well explained at a perfect pace! Everything is high quality! Keep it up sir!
Wow! This was the perfect mix of motivated examples and math utility. I watched this video twice. The second time I wrote it all out. 3 full pages! It’s amazing that you could present all these examples and the core information in ten minutes without it feeling rushed. You’re a great teacher. I’d love to see you do a series on Taleb’s books - Fat Tails and Anti-Fragility.
I'm so happy that I found your channel and you're making such great videos! As a Computer Science student, truly understanding those concepts is the essence of learning them. Videos like this one help enormously by giving a simple mounting point for understanding using intuition! Great work! Keep it going! P.S. Your book is also great; I can recommend it to everyone really trying to understand ML, not just apply it...
This 11-ish minute presentation so clearly and concisely explained what I had a hard time understanding from a one hour lecture in school. Excellent video!
Thank you, very well explained! I decided to get into machine learning in this hard quarantine period, though I didn't have high expectations. Thanks to your clear and friendly explanations in your book I am learning, improving and, not least, enjoying a lot. So thank you so much!
You make the toughest concepts seem super easy! I love your videos!!!
Haven't seen a better, clearer explanation of entropy and KL-Divergence, ever, and I've studied information theory before, in 2 courses and 3 books. Phenomenal, this should be made the standard intro for these concepts, in all university courses.
Your tutorials are always unbeatable: very explicit, with great examples. Thanks for your work.
Thank you, I have always been confused about these three concepts; you made them really clear for me.
Your channel has become one of my favorite channels. Your explanation of CapsNet and now this is just amazing. I am going to get your book too. Thanks a lot. :)
I came to find Entropy, but I received Entropy, Cross-Entropy and KL-Divergence. You are so generous!
This explanation is absolutely fantastic. Clear, concise and comprehensive. Thank you for the video.
Very recommendable! Finally, I found someone who could explain these concepts of entropy, cross entropy in very intuitive ways
Fantastic series of videos, looking forward to every new one! Thanks for taking the time out to make these
I've seen all your videos now. You've taught me a lot of things, and these were some good moments. Can't wait for more. Thanks so much.
Wow! It's just incredible to convey so much information while still keeping everything simple & well-explained, and within 10 min.
This is the best explanation of entropy and KL I have found. Thanks
This was the best intuitive explanation of entropy and cross entropy I've seen. Thanks!
Fantastic! This short video really explains the concepts of entropy, cross-entropy, and KL-Divergence clearly, even if you knew nothing about them before. Thank you for the clear explanation!
You have no idea how much this video has helped me. Thanks for making such quality content, and keep creating more.
This is by far the best description of those 3 terms; can't be thankful enough.
Phenomenal explanation of a seemingly esoteric concept into one that's simple & easy-to-understand. Great choice of examples too. Very information-dense yet super accessible for most people (I'd imagine).
Finally, someone who understands, and doesn't just regurgitate the Wikipedia page :) Thanks a lot!
Really the best explanation of KL divergence I have seen so far!! Thank you.
Thank you so much! Not only did it help me understand KL-Divergence, it also helps me remember the formula. From now on I will place the signs in the right places. Keep it up!
Your book and your videos are incredible. Thank you !
the best video on cross entropy on youtube so far
Really good explanation, the visuals were also great for understanding! Thanks Aurelien.
I have been using cross-entropy for classification for years and I just understood it. Thanks Aurélien!
Awesome video, you made the concept of entropy so much clearer.
the best explanation I ever had about the topic. It was really insightful.
This is the best explanation of the topics that I have ever seen. Thanks!
It's so good to watch your video! Thank you so much!
This video explains the concepts so well! Thank you!
This is by far the best and most concise explanation of the fundamental concepts of information theory we need for machine learning.
To-the-point and intuitive explanation and examples! Thank you very much! Salute to you!
Very clear and well-structured explanation. Your book is great, too! Thank you very much!
One of the most beautiful videos I've watched and understood a concept :')
I always seem to come back to watch this video every 3-6 months, when I forget what KL Divergence is conceptually. It's a great video.
Fantastic video! It made me understand and get together many "loose" concepts. Thank you very much for this contribution!
So much clear explanation! Need more of them!
I've learned about this before, but this is the best explanation I've come across. And was a helpful review, since it's been a while since I used this. Well done.
Awesome video! Hope you deliver more content here very soon!
Great video to learn interpretations of the concept of cross-entropy.
I rarely comment on videos, but this video is so good. I just couldn't resist. Thank you so much for the video. :)
Wow, this is great. Thank you for the detailed and clear explanation.
I am new to information theory and computer science in general, and this is the best explanation I could find about these topics by far!
I'm amazed by this video, you are a gifted teacher.
Excellent explanation and discussion. Thank you very much!!
Beautiful short video, explaining the concept that is usually a 2 hour explanation in about 10 minutes.
Thank you for such a wonderful and to-the-point video. Now I know: Entropy, Cross Entropy, KL Divergence, and also why cross entropy is such a good choice as a loss function.
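The link to classification losses mentioned in the comment above can be sketched in a few lines. The class names and probabilities below are made up for illustration; also note that ML libraries typically use the natural log, but base 2 only changes the units to bits:

```python
import math

# Cross-entropy loss for a single classification example: with a one-hot
# true label, the sum collapses to -log2 of the probability the model
# assigned to the true class.
predicted = {"dog": 0.7, "cat": 0.2, "bird": 0.1}  # hypothetical classifier output
true_class = "dog"

loss = -math.log2(predicted[true_class])
print(loss)  # ~0.515 bits; the loss grows sharply as the true-class probability drops
```

This is why cross-entropy works well as a loss: a confident wrong prediction (true-class probability near 0) is penalized very heavily, while a confident correct one costs almost nothing.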
Will definitely check out your book... your videos are amazing... please keep them coming!!
Kinda feels like 3Blue1Brown's version of Machine learning Fundamentals. Simply Amazing
@AurelienGeron
5 years ago
Thanks a lot, I'm a huge fan of 3Blue1Brown! 😊
Finally I understood Shannon's theory of information. Thank you Aurélien
This video is so clear and so well explained, just like his book!
Best Entropy and Cross-Entropy explanation I have ever seen
Incredible video, easily one of the top three I've ever stumbled across in terms of concise educational value. Also love the book, great for anyone wanting this level of clarity on a wide range of ML topics. Not sure if this will help anyone else, but I was having trouble understanding why we choose 1/p as the "uncertainty reduction factor," and not, say, 1-p or some other metric. What helped me gain an intuition for this was realizing 1/p is the number of events in a uniform distribution in which every event has probability p. So the information, -log(p), is how many bits that event would be "worth" were it part of a uniform distribution. This uniform distribution is also the maximum-entropy distribution that event could possibly come from given its probability... though you can't reference entropy without first explaining information.
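The 1/p intuition in the comment above can be checked in a couple of lines: -log2(p) is exactly the number of bits needed to index one outcome among 1/p equally likely outcomes.

```python
import math

# For each probability p, 1/p is the count of equally likely outcomes,
# and log2(1/p) == -log2(p) is the bits needed to name one of them.
for p in [0.5, 0.25, 0.125]:
    n_outcomes = 1 / p
    bits = math.log2(n_outcomes)  # same value as -math.log2(p)
    print(p, n_outcomes, bits)    # e.g. 0.125 -> 8 outcomes -> 3.0 bits
```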
Great work in the explanation. I have been pretty confused with this concept and the implication of Information theory with ML. This video does the trick in clarifying the concepts while providing a sync between information theory and ML usage. Thanks much for the video.
I want to like this video 1000 times. To the point, no BS, clear, understandable.
Simply awesome. Thank you for such great explanation!
Really, I definitely cannot come up with an alternative way to explain this concept more concisely.
Thank you very much! Excellent video. I started to read your book. I respect you.
This is fantastic. Thank you so much for this and your book!
Very elegant, showing how knowledgeable the presenter is.
Such a great explanation! Thank you.
Thank you for the useful video, and thanks also for your book. You make expressing very difficult concepts of machine learning seem like a piece of cake.
Super clear... I had never heard this explanation of Entropy and Cross-Entropy before!
I have that book; I didn't realize you wrote it until now.
I really enjoyed the way you are explaining it. It's so inspiring watching and learning difficult concepts from the author of such an incredible book in the ML realm. I wish you could teach via video other concepts as well. Cheers, Roxi
I saw many videos and then I stumbled on yours... so informative and very well articulated. Thank you once more; I will check out your book.
Every concept is very clear... Thanks a lot!!
Absolutely easy to understand. Thank you!
Very clear and explainable, I bought your book! Thank you!
Fantastic video! Now all the dots are connected! I have used loss function for NN machine learning, but not knowing the math behind it! This is so enlightening!
I came here to learn how to correctly pronounce his name :). The content is simply great. Thanks a lot.
I had to find a word for how well you explain. Perspicuous. Thank you.
@AurelienGeron
6 years ago
I just learned a new word, thanks James! :)
This explanation really helps the learner understand such elusive scientific concepts. Thanks for the clear explanation!!
The best explanation I've seen on this topic.
Loved your explanation. Thanks, man!
Thanks for the explanation, very clear and complements your excellent book
Hats off! One of the best teachers ever! This definitely helped me better understand it both mathematically and intuitively just in a single watch. Thanks for reducing my 'learning entropy'. My KL divergence on this topic is near zero now. ;)
Nicely conveyed what is to be learned about the topic. I think I absorbed it all. Best tutorial, keep dropping videos like this.
Beautifully explained. Thank you!!
Thanks a lot! Such clear and understandable descriptions!
Magnificent explanation! 👍
I really enjoyed your book and these videos! Keep them coming! Even though some part of my PhD had to do with Information Theory I enjoyed the way you explain IT and Cross Entropy in a very practical way. Helped understand why it is used in machine learning the way it is. Looking forward for more great videos (and maybe a second book?)!
@AurelienGeron
6 years ago
Thanks Omri, I'm glad you enjoyed the book & videos. :) I recently watched a great series of videos by Grant Sanderson (3Blue1Brown) about the Fourier Transform, and I loved the way he presents the topic: I thought I already knew the topic reasonably well, but it's great to see it from a different angle. Cheers!
@OmriHarShemesh
6 years ago
Yes, the Fourier transform is a fascinating and multifaceted topic ;) In physics we use it very often for very surprising reasons. I'm looking for a book similar to yours which focuses specifically on NLP with Python and is very well written and modern. Do you have any recommendations? Thanks! Omri
I am reading your book, and oh man, what a book!!! At first I wondered how the book and the video had exactly the same examples, until I saw your book in the later part of the video and realized it's you. It's so great to listen to you after reading you!!
Great explanation! Very intuitive examples, I love it!! Keep them coming!!
Beautifully explained thank you.
Phew!! As a newbie to Machine Learning without a background in maths, this video saved me; otherwise I never expected to grasp the Entropy concept.
You are a 3Blue1Brown kind of guy. Nowadays I see a lot of YouTubers making machine learning videos by repeating the words found in research papers and Wikipedia. You are different.
@bhargavasavi
4 years ago
Grant Sanderson is like the Morgan Freeman of visual Mathematics.....I wish his videos existed during my earlier days in college
Finally got the point of what all this stuff actually means. Thanks a lot! My lecturers could learn from you. Just subscribed.
Thanks for this wonderful video explaining the concepts!