What happens *inside* a neural network?

Visuals to demonstrate how a neural network classifies a set of data. Thanks for watching!
Support me on Patreon! / vcubingx
Source Code: github.com/vivek3141/dl-visua...
Here's the course I referred to in the video. I am not affiliated with NYU.
• NYU Deep Learning SP20
Sinusoids as activation functions:
openreview.net/forum?id=Sks3z...
vsitzmann.github.io/siren/
Here's the distill.pub article:
distill.pub/2020/grand-tour/
Special thanks to Alfredo Canziani and Nikhil Maserang for reviewing the video.
And also thanks to Grant Sanderson himself for giving me some manim tips!
I've been active on Twitter, follow me here!
/ vcubingx
Join my discord server!
/ discord
The animations in this video were made using 3blue1brown's library, manim:
github.com/3b1b/manim
Music is from GameChops (Route 113, Azalea Town, Ecruteak City).
Follow me!
Website: vcubingx.com
Twitter: / vcubingx
Github: github.com/vivek3141
Instagram: / vcubingx
Patreon: / vcubingx
What does a Neural Network actually do? Visualizing Deep Learning, Chapter 2
0:00 Intro
0:18 Recap of Part 1
1:57 Introducing the dataset
2:52 Structure of the Neural Network we’ll be using
3:34 What is softmax?
5:52 Input space decision boundaries
6:24 Modifying the Neural Network to visualize what it’s doing
7:36 Out-of-domain boundaries
8:46 sin(x) as an activation function
9:30 Neuron planes
11:57 Softmax surfaces
13:20 MNIST Transformation
13:42 Outro

Comments: 97

  • @lored6811 (2 years ago)

    I'm sorry for you that this video didn't do so well with clicks; don't let this discourage you from making more beautiful explanations :) There will always be people having their aha moments thanks to you.

  • @mihir777 (1 year ago)

    Intellectual videos don't really get many views. The cat and dog videos will always satisfy the greater population, providing easy dopamine hits to the reptilian brains.

  • @NovaWarrior77 (2 years ago)

    The prodigal son returns.

  • @allantouring (8 months ago)

    Where's part 3? I love this series! ❤

  • @DelandaBaudLacanian (2 years ago)

    my mind is blown, this is so simple and elegant, thank you for taking the time to explain neural networks and linear transformations, this is going to be one of those videos I watch over and over until I really grok it!

  • @saidelcielo4916 (1 year ago)

    Once again WOW this is the best visualization of neural networks I've ever seen, and I've learned tremendously from it. Please make more videos!!

  • @elidrissii (1 year ago)

    Thank you for making these videos, absolute gems. People like you make YouTube worth it.

  • @prometheus7387 (2 years ago)

    Grant Junior returns

  • @sortsvane (2 years ago)

    Hands down THE most lucid explanation of NN I've seen 💯 Sharing it with my CompSci group. Also curious to see how you'll visualise back propagation.

  • @KenanSeyidov (1 year ago)

    Excited for part 3!

  • @BlackM3sh (1 year ago)

    I'm happy I managed to find this video again. 😄 I suddenly felt an urge to rewatch it. I really like the clear visuals of the video. It's a shame you have yet to come out with a part 3, though.

  • @IndyRider (1 year ago)

    This video has done such a great job of visually breaking down a complex concept with examples!

  • @m4sterbr0s (2 years ago)

    Awesome, a new video!! Really happy to see you making content again!!

  • @stevenbacon3878 (2 years ago)

    Thank you for making this video, it's awesome. I look forward to seeing more of your work!

  • @aleksszukovskis2074 (2 years ago)

    Bruce! It's been a whole year. You still owe me 16 contents.

  • @imranyaaqub1704 (2 years ago)

    Thank you for this informative video. I was one of many waiting for part 2, but didn't get notified as I was only subscribed, and didn't know to also hit the bell notification to get an update on when part 2 was out. I suspect many people will be coming back at odd points into the future to see if part 2 has come out. Hope they enjoy it as much as I have.

  • @BooleanDisorder (4 months ago)

    I might not understand much of the mathemagic terms but the visualization really helped. I'd love to mess around with a neural network and see how different things change depending on modifications. "What happens with the output if I change the network to use another activation function in this layer" type of fun

  • @JamieSundance (1 year ago)

    This video series is fantastic, these concepts never land for me until I see visual spatial context. Keep up the great work, you are greatly appreciated!

  • @NotRexButCaesar (2 years ago)

    Your linked material about using periodic activation functions was very interesting.

  • @judo-rob5197 (2 years ago)

    Very nice explanations of a complicated topic. The visuals make it more intuitive.

  • @vtrandal (2 years ago)

    This is a rare occasion where I am fortunate to be witnessing excellent progress in technology as it happens. Thank you!

  • @polqb3205 (2 years ago)

    Wow, the video is sooo good, the explanations are wonderful and the animations are so beautiful, I just love it 😍😍

  • @waynedeng9604 (2 years ago)

    this is the best video I’ve ever watched, I’m in tears, you’ve changed my life with your beautiful animations and soothing voice

  • @asemhusein7575 (2 years ago)

    Words can't explain how amazing this video is. Finally, a video that clears everything up. Thank you!

  • @symbolspangaea (1 year ago)

    I saw this video 11 months after it was published, and it came as a gift. Thank you sooooo much!

  • @finkelmann (2 years ago)

    Brilliant stuff. I've watched my share of neural network videos, and this one is truly unique

  • @hiewhongliang (2 years ago)

    This is awesome!!! Keep posting and keep up the great work.

  • @usama57926 (2 years ago)

    What a great explanation. Waiting for part 3

  • @Odisse0 (9 months ago)

    Big up for this outstanding work! As a fellow student of these topics, I want to thank you for the effort put in here. I'm really impressed by both the script and the animations. Much love ❤

  • @arnavvirmani8688 (2 years ago)

    Video makes it easy for non math folks like me to gain some semblance of an understanding of neural networks. Great job!

  • @ChauNguyen-jy3fk (2 years ago)

    I've been waiting for this video for several months!

  • @my_master55 (2 years ago)

    ngl, this is what is called "high-quality content", thank you very much for your efforts 👏😍 🚀

  • @Max-fw3qy (11 months ago)

    Geez man, your video is very good at visualizing what a NN really does! One piece of advice, if I may... after a complicated or very loaded explanation, as you did with the output of the NN, which is very complex to understand if you know nothing about it, try to summarize it with a simple sentence, just as you did at 7:40. That was beautifully explained, bravo!👍🏻👍🏻👍🏻

  • @soumyasarkar4100 (2 years ago)

    this is some extraordinary explanation

  • @airatvaliullin8420 (2 years ago)

    What a wonderful explanation! I need to know this for my project, and each time I watch something about NNs I'm sure I'm getting better at understanding what's under the hood. But never have I seen such an elegant way to introduce the topic. Bravo!

  • @vcubingx (2 years ago)

    Thank you!

  • @hannesstark5024 (2 years ago)

    Awesome job!

  • @williamharr7338 (10 months ago)

    Excellent Video!

  • @mourirsilfaut6769 (2 years ago)

    Thank you for making these videos.

  • @TheBookDoctor (2 years ago)

    Wow. I've watched a lot of "how do neural networks work" videos, and this is the first one that has offered me any truly new insight in a long time. Excellent!

  • @vcubingx (2 years ago)

    Thank you! I appreciate the kind words :)

  • @arturpodsiady7978 (1 year ago)

    Great video, thank you!

  • @laurent-simpliciteetminima5793 (2 years ago)

    Man, this video is a masterpiece! Congrats!

  • @usama57926 (2 years ago)

    Oh man! Finally, the 2nd part is here...

  • @KukaKaz (6 months ago)

    Amazing video ! Keep it up 👍

  • @adriangabriel3219 (2 years ago)

    Really great! Do you have a tutorial on how you created the visualizations of the different layers? Would it be possible to do that in pure Python as well?

  • @saidelcielo4916 (1 year ago)

    Thanks!

  • @jasdeepsinghgrover2470 (2 years ago)

    Amazing explanation!!..

  • @woddenhorse (2 years ago)

    Simply Awesome 🔥🔥🔥🔥

  • @vincent2154 (1 year ago)

    Really great 👍

  • @mohegyux4072 (1 year ago)

    YouTube's algorithm should be ashamed of itself!! How could this video have less than 20k views?! Thanks, I had multiple whoa! moments.

  • @jacobliu760 (2 years ago)

    I enjoyed this video so much.

  • @vcubingx (2 years ago)

    Thank you Jacob.

  • @CesarMaglione (2 years ago)

    Excellent! Take your like! 👍😉

  • @wise_math (1 year ago)

    Nice video. How do you make the white border around the edge of a scene? (like in the Recap of Part 1 scene)

  • @praveenrajab0622 (2 years ago)

    At 10:49, aren't the x and y coordinates of the plot the output values of the second-to-last layer of the NN?

  • @aaronwtr1150 (2 years ago)

    Thank you for this great video

  • @LuddeWessen (2 years ago)

    Really nice video. However, I think you should mention that you use a binarized (one-hot) encoding of argmax and not argmax as it is commonly defined, as viewers (like me) could get confused. Otherwise an excellent video that conveys the intuition really well! 😀

  • @vcubingx (2 years ago)

    Good point, I'll include the terminology next time
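
    A minimal NumPy sketch of the distinction raised in this thread, with made-up logits: argmax as commonly defined returns an index, the video plots its one-hot (binarized) encoding, and softmax is the smooth version of that encoding.

    ```python
    import numpy as np

    logits = np.array([2.0, 1.0, 0.1])   # made-up raw network outputs

    idx = np.argmax(logits)              # argmax as commonly defined: the index 0
    one_hot = np.eye(len(logits))[idx]   # binarized encoding plotted in the video: [1, 0, 0]

    exp = np.exp(logits - logits.max())  # shift by the max for numerical stability
    smooth = exp / exp.sum()             # softmax: a smooth approximation of one_hot

    print(idx, one_hot, smooth)
    ```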

  • @Anujkumar-my1wi (2 years ago)

    I want to ask: since a neural net approximates a function over a particular domain interval, what will happen if it gets an input outside that domain at test time?
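
    One way to probe this question empirically; a minimal PyTorch sketch, where the tiny network, the training interval, and the query points are arbitrary assumptions rather than anything from the video. Inside the training interval the fit is close; outside it the output is whatever the learned weights happen to extrapolate to, with no guarantee of tracking the true function.

    ```python
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
    opt = torch.optim.Adam(net.parameters(), lr=1e-2)

    x = torch.linspace(-3.14, 3.14, 200).unsqueeze(1)   # training domain only
    y = torch.sin(x)
    for _ in range(2000):                                # fit sin(x) on [-pi, pi]
        opt.zero_grad()
        loss = ((net(x) - y) ** 2).mean()
        loss.backward()
        opt.step()

    print(net(torch.tensor([[1.0]])))    # in-domain: close to sin(1) ≈ 0.84
    print(net(torch.tensor([[10.0]])))   # out-of-domain: typically far from sin(10)
    ```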

  • @ali493beigi5 (2 years ago)

    Great! Can you explain to me how you produce these animations? Is there any software you used?

  • @vcubingx (2 years ago)

    I use manim

  • @Hopeful-zx9wk (2 years ago)

    return of the king

  • @vcubingx (2 years ago)

    But when will hopeful69420 return?

  • @MadlipzMarathi (2 years ago)

    Finally

  • @ko-prometheus (11 months ago)

    Can I use your mathematical apparatus to investigate the physical processes of metaphysics? I am looking for a mathematical apparatus capable of working with metaphysical phenomena, i.e. metamathematics!!

  • @RohanDasariMinho (2 years ago)

    Goat cubing x

  • @skifast_takechances (1 year ago)

    banger

  • @pi-meson7677 (2 years ago)

    When you come back after 2¹⁰ years

  • @gdash6925 (2 years ago)

    Where were you at 8:50? In university?

  • @alexcheng2498 (2 years ago)

    I've missed this.

  • @TheRmbomo (2 years ago)

    5:25 When describing that the sum of the array resulting from softmax equals 1, I think the visual is missing that communication too, such as stacking all of the lines on top of each other up to a value of 1 or 100%. Don't rely on words alone. Otherwise a great video, thank you.
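
    The sum-to-1 property mentioned here is easy to check numerically; a minimal NumPy sketch with made-up logits.

    ```python
    import numpy as np

    def softmax(z):
        e = np.exp(z - np.max(z))   # subtract the max for numerical stability
        return e / e.sum()

    logits = np.array([2.0, 1.0, 0.1])   # made-up raw network outputs
    probs = softmax(logits)
    print(probs)        # approximately [0.659 0.242 0.099]
    print(probs.sum())  # 1.0
    ```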

  • @anwarulbashirshuaib5673 (2 years ago)

    holy shit!

  • @OrenLikes (4 months ago)

    w12 reads as the first weight of the second input? This is confusing! It should be w21 => from input x2, we look at w1 (which, obviously, goes to output 1)!

  • @dewibatista5752 (1 month ago)

    PART 3 PART 3 PART 3

  • @nathannguyen2041 (2 years ago)

    How would a neural network handle categorical variables?

  • @vcubingx (2 years ago)

    As inputs? One way is to have each input be a vector of dimension n, where n is the number of categories. Then, for each input, set the entry at the category's index to 1 and the rest to 0. For example, if my input were a 4-category variable of either cat, dog, wolf, or tiger, then the input cat could be {1, 0, 0, 0}. See "one-hot encoding" if you're interested.
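
    A minimal sketch of the scheme described in this reply, using the cat/dog/wolf/tiger example; the helper name is made up for illustration.

    ```python
    import numpy as np

    categories = ["cat", "dog", "wolf", "tiger"]   # the 4-category example above

    def one_hot(label):
        vec = np.zeros(len(categories))            # vector of dimension n
        vec[categories.index(label)] = 1.0         # 1 at the category's index, 0 elsewhere
        return vec

    print(one_hot("cat"))   # [1. 0. 0. 0.]
    ```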

  • @vcubingx (2 years ago)

    There are plenty of other ways. In the case of NLP (which is my domain atm), we want to be able to encode tokens (some sequence of characters) into input vectors. An older method to do this is word2vec, which converts words to vectors based on context. This allows us to assign each word to some input vector, and we can pass each vector along as input to a NN. These days, though, modern neural language models (GPT3, etc.) have sophisticated embeddings, and word2vec has largely fallen out of favor.
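
    For the embedding idea mentioned in this reply, a minimal PyTorch sketch of a learned lookup table; the toy vocabulary and embedding size are arbitrary assumptions.

    ```python
    import torch
    import torch.nn as nn

    vocab = {"the": 0, "cat": 1, "sat": 2}                           # toy vocabulary
    emb = nn.Embedding(num_embeddings=len(vocab), embedding_dim=4)   # learned lookup table

    token_ids = torch.tensor([vocab["the"], vocab["cat"], vocab["sat"]])
    vectors = emb(token_ids)   # one 4-dimensional vector per token, shape (3, 4)
    print(vectors.shape)
    ```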

  • @enisten (2 years ago)

    3:47 Did you mean a range of i̶n̶p̶u̶t̶s̶ outputs?

  • @anshul.infinity (2 years ago)

    I am trying to visualise how the neural network transforms the input space into a linearly separable space, layer by layer, on a new basic dataset.

  • @jamietea1072 (1 year ago)

    Intro part 1 Funny Galaxy part 2 Swastika part 3 Ending of Evangelion

  • @nit235 (2 years ago)

    Very informative video, thank you a lot. Do you have any suggestions for me? I want to learn manim and make videos about how ML algorithms work and their pros and cons. Or, if you have any manim learner classes, I can directly enroll to learn.

  • @abrahamgomez653 (12 days ago)

    Chaos happens

  • @omridrori3286 (1 year ago)

    What about part 3!!!

  • @PapaFlammy69 (2 years ago)

    wb :)

  • @vcubingx (2 years ago)

    Thanks:)

  • @jayantnema9610 (2 years ago)

    Hey, don't you think saying "this is what a NN does under the hood" is an overshoot? I mean, all the popular literature in textbooks and the ML community also claim that it does exactly that, but if this were truly the case, if it were behaving that logically, then adversarial attacks would be impossible. But we all know that one-pixel attacks and noise-based attacks are quite frequently achievable by GANs. The interpretation that layers extract features from the input is true provided the features are not the human-interpretable shapes or patterns; to call them so leads to an error. One-pixel attacks and noise-based attacks do not affect the feature as such: the horse is still a horse if you change twenty-ish pixels out of 1000, but the NN suddenly starts saying it is a dog with 99% confidence. If it were really extracting features, as in patterns as humans understand them, it would never make that error. Humans have 100% accuracy and immunity against some twenty pixels changing out of 1000 because we extract patterns. A NN does not; if it did, it should also be immune, but it is not. This means the popular understanding is still incomplete, and it would be wrong to say anything about how a NN works under the hood, since you can find multiple completely different sets of weights and still get excellent classification accuracy. This means the NN is interpreting the spiral in its own way and not the human-style five zones with a nonlinear boundary, because human-style there is only one interpretation logically possible. That fails to explain how we can get multiple sets of weights, not at all close or alike, that still give solid accuracy.

  • @TimmacTR (2 years ago)

    What the.....

  • @revimfadli4666 (2 months ago)

    But salty redditors say this isn't how the thing works at all (they deleted their comments in shame after I asked for elaboration)

  • @vcubingx (2 months ago)

    Haha, sorry, but what redditors? What post are you talking about? Kinda curious

  • @MrMehrd (11 months ago)

    Hm

  • @jamesjones8487 (1 year ago)

    I finally realize that I am a useless stupid fool.

  • @usama57926 (2 years ago)

    When is the 3rd part coming?

  • @OrenLikes (4 months ago)

    you said "softmax is not a version of argmax" and then you say "softmax is a smoother version of argmax" - make up your mind!

  • @AegeanEge35 (1 month ago)

    Thanks!

  • @meguellatiyounes8659 (2 years ago)

    Finally

  • @sythatsokmontrey8879 (2 years ago)

    Finally