BACKPROPAGATION algorithm. How does a neural network learn? A step-by-step demonstration.
Science and technology
This is my first video in English; I hope it is OK. I will start making more expert videos in English on my KZread channel.
In this first video we detail the backpropagation algorithm, widely used in Deep Learning to train supervised neural networks.
Instagram : / defend.intelligence
Twitter : / dfintelligence
The Blog Post of Matt Mazur : mattmazur.com/2015/03/17/a-st...
Comments: 83
I actually kind of laughed at around 1:38 when he was like "All you need to know is addition, subtraction, multiplication ... and partial derivatives." Lol, you really had all those 3rd graders in the first half, not gonna lie.
For anyone in the future trying to work out how to do backprop easily:

For the first set of weights (hidden to output), we use -(expected - given) * (given * (1 - given)) * output of the previous node. "Expected" and "given", in this video's terms, are the values at o2 or o1 (depending on which weight you're working on), and "output of the previous node" is the output of j2 or j1 (again depending on which weight it's attached to). This is our final gradient, so we multiply it by the learning rate and subtract that from the weight to get our updated weight.

For the rest of the weights in any hidden layer: take the two terms (-(expected - given) * (given * (1 - given))) we just computed in step 1, multiply each by the (original) weight it was used to update, and sum them (so if we're updating w4, we use the two weights connected to j2). Then multiply this by (given * (1 - given)), where "given" is the value after the activation function (so for w4 we'd use the output of j2). Finally, multiply by the input the current weight is applied to (so for w4, we'd use i2). This is our final gradient; again multiply it by the learning rate and subtract that from the weight to get our updated weight.
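The recipe above can be sketched in Python for a 2-2-2 network like the video's. This is a minimal illustration, assuming sigmoid activations and squared error as in the video; the numeric weights and inputs here are made up, not the video's values.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative 2-2-2 network (names follow the video's i/j/o diagram,
# but the numbers below are placeholders, not the video's values).
i = [0.1, 0.5]                        # inputs i1, i2
w_ih = [[0.15, 0.20], [0.25, 0.30]]   # w_ih[j][k]: input k -> hidden j
w_ho = [[0.40, 0.45], [0.50, 0.55]]   # w_ho[m][j]: hidden j -> output m
b1, b2 = 0.35, 0.60                   # biases (kept fixed, as in the video)
target = [0.0, 1.0]
lr = 0.5                              # learning rate

# Forward pass
h = [sigmoid(sum(w_ih[j][k] * i[k] for k in range(2)) + b1) for j in range(2)]
o = [sigmoid(sum(w_ho[m][j] * h[j] for j in range(2)) + b2) for m in range(2)]

# Output-layer deltas: -(expected - given) * given * (1 - given)
delta_o = [-(target[m] - o[m]) * o[m] * (1 - o[m]) for m in range(2)]

# Hidden-layer deltas: propagate back through the ORIGINAL weights,
# then multiply by the hidden unit's own sigmoid derivative.
delta_h = [sum(delta_o[m] * w_ho[m][j] for m in range(2)) * h[j] * (1 - h[j])
           for j in range(2)]

# Gradient = delta * output of the previous layer; update = w - lr * gradient
new_w_ho = [[w_ho[m][j] - lr * delta_o[m] * h[j] for j in range(2)]
            for m in range(2)]
new_w_ih = [[w_ih[j][k] - lr * delta_h[j] * i[k] for k in range(2)]
            for j in range(2)]
```

A forward pass with the updated weights should give a lower total error than the original pass, which is an easy way to sanity-check your own implementation.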
I spent one month reading books trying to understand how it works, as I'm a beginner, and thanks to this video I got the concept in 15 min. Very good job; keep it up, looking forward to learning more and applying it to real-world problems.
After watching your video I was finally able to derive the equation myself. Thank you!
Thank you very much for this great video! I have watched a lot of videos before finally landing on this video. Unfortunately all others have seemed to just shy away from explaining the real math behind back-propagation. They just cover the basic idea or update the weight for the output layer only. This is the first video I have seen that explains the actual math in updating the hidden layers too.
I spotted two mistakes, but please tell me if I am wrong. At 10:51, "dEo1/d(out o1)" should be "0.80 - 0" (which is output_produced - desired_output), which evaluates to 0.80, but you wrote "-0.18" in that place, so please check it and tell me if I am wrong 😊😊
@Coder-0
10 months ago
I believe you are both wrong, though you were on the right track: at 6:42 it says desired output - produced output (t - a).
@Dhanush-zj7mf
10 months ago
@@Coder-0 You forgot to apply the chain rule. You are differentiating with respect to a. You have (t - a) inside, so you should multiply by -1.
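The factor of -1 being discussed is easy to verify numerically. A quick finite-difference check, assuming the squared error E = 0.5 * (t - a)^2 used in the video (the value 0.80 is just the example output):

```python
# Check that dE/da = -(t - a) for E = 0.5 * (t - a)**2:
# differentiating the inner (t - a) with respect to a contributes the -1.
t, a, eps = 0.0, 0.80, 1e-6
E = lambda a_: 0.5 * (t - a_) ** 2
numeric = (E(a + eps) - E(a - eps)) / (2 * eps)   # central difference
analytic = -(t - a)                               # chain rule, note the minus sign
```

Both values come out to 0.80 here, matching the comment above.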
This video really helped me understand the BP algorithm. Thank you!
Thanks for this amazing video!
In the Compute O2 line, the formula should have w7 instead of w5 and w8 instead of w6. Regards, Slawek
Great video! I wrote a Python project to carry out and visualize the calculations from this very video, for learning. I think I noticed a mistake at 7:30 in the video -- I think you mean "O2" in the chain rule on the right, rather than "O1." But easy enough to account for. Again, thank you very much for this video! This is the most straightforward description of how to apply back-propagation that I've found yet.
@LionKimbro
1 year ago
8:55 -- Also, isn't the derivative of in(O2) just w8 itself?
Good explanation; it helped me understand some inner details of a basic neural network.
I don't speak English, but surprisingly, I understood what you were saying!!
@DefendIntelligence
3 years ago
Thanks 😊
Super helpful, thanks 👍
Thanks for a helpful video.
Excellent video
Very nice!
It's a real shame; I followed this channel precisely because it was in French. There's a ton of English content on this topic already.
@DefendIntelligence
4 years ago
Hello! Don't worry, I will continue in French; I just wanted to try the experiment on this specific topic :).
@gabupouet4221
4 years ago
@@DefendIntelligence Thank you very much for that.
@abdel8502
3 years ago
@@DefendIntelligence So, can you redo it in French?
@karimkondua1736
3 years ago
It's good English practice, and the pronunciation is fine, but when you're starting from far behind 😅😅 French really is more practical.
There is at least one mistake, but overall how it works is presented quite well. For the sake of correctness, you should check the numbers once again: in dInO2/dW8 you wrote 0.61, but in dEtotal/dW8 you wrote 0.52. Best regards
@Antagon666
3 years ago
Nobody cares about the results if the formula is correct.
@FPChris
2 years ago
I care about the results. I want to do it all on paper so I can confirm the results when I rewrite it as code.
@darshshah7155
5 months ago
Yes, exactly. It also bugs me when the results I get are different from the video's. @@FPChris
Nice video. More videos in English would be cool :)
At 8:10 I don't follow where the -1 came from. Anyone care to shed some light?
NICE VIDEO BRO
This is great
9:06 There should be 0.61 instead of 0.52
Could you please explain why you did not update the biases, and how the biases are updated in backpropagation?
Hello, and a big thank-you for the video; I was looking for an accessible example and this is perfect. Regarding the biases, do we apply a correction to them too, or do we only deal with the weights?
@nopana_
1 year ago
It's an old question, but as far as I know we also have to apply a correction to the biases, since they significantly influence the result too ^^ (edit: yes, it's very important to apply the change to the biases as well)
Awesome, this is exactly what I was looking for; the videos are super clean and clear, and it's content in French on top of that! Bravo and thank you
@geogeo14000
3 years ago
and you've gained a subscriber, ofc ^^
@DefendIntelligence
3 years ago
Thank you very much!! And welcome to the channel :)
Please can you make a video on support vector machines and ROC AUC
thank you
Thanks!
Why didn't you update the biases?
Your (d out j2)/(d inp j2) value at 11:09 is wrong. See at 3:45: the sigmoid output of j2 is 0.61. If you calculate 0.61 * (1 - 0.61) you get 0.2379, not the 0.16 you got. Please fix that. It bugs me that after calculating for so long, my answer doesn't match the answer in the video.
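For anyone double-checking this by hand, the sigmoid-derivative value in question is a one-line computation (using the video's out_j2 = 0.61, as cited in the comment above):

```python
# The derivative of the sigmoid, expressed in terms of its output a,
# is a * (1 - a). With the video's out_j2 = 0.61:
out_j2 = 0.61
d_out_d_in = out_j2 * (1 - out_j2)   # approx. 0.2379, not 0.16
```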
Quick question about the learning rate, which is 0.8 in the formula: did you pick it by default, or did you compute it earlier?
@DefendIntelligence
4 years ago
Hello Théo, no, I just chose it arbitrarily to demonstrate the point of the exercise. In upcoming videos we will see how to tune all these precious variables in Deep Learning (number of layers, number of neurons, epochs, learning rate, etc.)
In backpropagation you didn't update the bias weights. Do they stay constant throughout the whole training?
@meobliganaponerunnom
1 year ago
No. Biases are also parameters, so they should be updated.
The forward propagation is well explained, but the backpropagation isn't. The example has errors.
Great tutorial! I wonder how you backpropagate for the bias values b1 and b2. Great job!
Hi, I think there's just a small error on your slide when you take the partial derivative of Etotal with respect to W8: OutJ2 is 0.61, but in the calculation it has the value of W8, i.e. 0.52. Maybe a misunderstanding on my part; otherwise, great video, even in English! :)
@DefendIntelligence
4 years ago
Yes, there is a small error :/. Thanks!
Why, when computing the "new weight", do we multiply the derivative by 0.80? Thanks
@noa4953
1 year ago
That's the learning rate, i.e. the "strength" with which we move the weights. The gradient itself is only a direction; we choose this value arbitrarily.
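The update rule in question (new weight = old weight - learning rate x gradient) is a single line of code. The gradient value below is illustrative, not one of the video's numbers:

```python
# The gradient gives the direction; the learning rate scales the step size.
w, grad, lr = 0.40, 0.0821, 0.80   # current weight, dEtotal/dw, learning rate
new_w = w - lr * grad              # 0.40 - 0.80 * 0.0821
```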
At 11:13, could you explain how you got 0.16? Just the values.
@niang46
3 days ago
I think he just used o1 instead of o2 there. 0.8 * 0.2 = 0.16
The numbers don't add up. From the graph (j1 = i1·w1 + i2·w2 + b1), w2 corresponds to 0.13, not 0.25; 0.25 belongs to w3, as shown in the graph. And w5 is 0.67, not 0.84! I'm having a lot of trouble following.
@DefendIntelligence
4 years ago
Yes, there is a mistake here. Use the value in the formula :). Sorry about that.
Thanks for this video, but unfortunately it's unclear how to update the bias values during training.
@FireBurn256
1 year ago
Bias values should not be updated; they are there just to push the results toward reasonable output values. It is the weights on the biases that should be updated, and they are updated the same way as the other links (I think).
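For readers asking about the biases: if you treat a bias like a weight whose input is always 1, the standard update falls out. A minimal sketch, assuming the same sigmoid/squared-error setup as the video; the numbers are made up, not the video's:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

a_prev, w, b = 0.59, 0.40, 0.60    # previous activation, weight, bias (illustrative)
t, lr = 0.0, 0.5                   # target and learning rate

a = sigmoid(w * a_prev + b)
delta = -(t - a) * a * (1 - a)     # same delta used for the weight updates

# z = w * a_prev + b, so dz/dw = a_prev but dz/db = 1:
new_w = w - lr * delta * a_prev
new_b = b - lr * delta * 1.0       # the bias gradient is just delta
```

After this step, the unit's output moves toward the target, the same way it does when only the weights are updated.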
The lack of labeling confuses me; there are only numbers.
J1 = 0.5, but the diagram shows 0.4976.
The easy part of the problem is explained at length and well, but the backward propagation is rushed and poorly done. As if you hadn't understood the problem yourself. You misled me more than anything else...
Did I miss an episode or what?! English!!! 🤔 Should I perhaps subscribe to the geek channels that still broadcast in French?
@DefendIntelligence
3 years ago
It's the only video on the channel in English 😊😊
@yvesdky6826
3 years ago
Ah OK, you had me scared 😅
I don't understand anything; what is e?
Why do it in English??? T.T
In French I could follow, but in English it's no longer possible
But wasn't this supposed to be a French-language channel?
@DefendIntelligence
4 years ago
It's the only video in English :). I wanted to try making a video in English.
@rossloubassou8755
4 years ago
@@DefendIntelligence Please, I've been working on deep learning for a while now thanks to you, but I have a few issues... How can I contact you directly?
a lot of mistakes 👎
Hi, you ruined the vibe with the English
Dude, I didn't understand anything, I swear
There are mistakes in your calculations. Check again.
The math is wrong in the first minute... this is useless
No, just no!! The whole point of this topic is missed... there are almost no videos in French on the subject and there are billions in English!!!! Do it in French, my friend
@DefendIntelligence
3 years ago
It's the only one in English, forgive me 😅
@MegaBaye
3 years ago
@@DefendIntelligence Fine then... in any case you do great work!!!! It's just missing the French!!!
Never make a video in English again, plz
@renatomauro6300
2 years ago
Why not? Because his English is not perfect? No, it isn't. But he makes a valuable video! I loved the video.