What is Automatic Differentiation?


This short tutorial covers the basics of automatic differentiation, a set of techniques that allow us to efficiently compute derivatives of functions implemented as programs. It is based in part on Baydin et al., 2018: Automatic Differentiation in Machine Learning: A Survey (arxiv.org/abs/1502.05767).
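
To make the forward-mode idea concrete, here is a minimal dual-number sketch in Python (my own illustration, not code from the video; the Dual class and dsin helper are invented for this example):

    # Forward-mode AD: each value carries (primal, tangent) and every
    # elementary op propagates both via its local derivative rule.
    import math

    class Dual:
        def __init__(self, val, dot=0.0):
            self.val = val  # primal value
            self.dot = dot  # tangent: derivative w.r.t. the seeded input

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.dot + other.dot)

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # product rule: (uv)' = u'v + uv'
            return Dual(self.val * other.val,
                        self.dot * other.val + self.val * other.dot)

    def dsin(x):
        return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

    # f(x1, x2) = sin(x1) * x2, differentiated w.r.t. x1 at (0.5, 2.0)
    x1, x2 = Dual(0.5, 1.0), Dual(2.0, 0.0)  # seed x1's tangent with 1
    y = dsin(x1) * x2
    print(y.val, y.dot)  # f(0.5, 2.0) and df/dx1 = cos(0.5) * 2.0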
Errata:
At 6:23, in the bottom right, it should be v̇6 = v̇5*v4 + v̇4*v5 (instead of "-").
Additional references:
Griewank & Walther, 2008: Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation (dl.acm.org/doi/book/10.5555/1...)
Adams, 2018: COS 324 - Computing Gradients with Backpropagation (www.cs.princeton.edu/courses/...)
Grosse, 2018: CSC 321 - Lecture 10: Automatic Differentiation (www.cs.toronto.edu/~rgrosse/c...)
Pearlmutter, 1994: Fast exact multiplication by the Hessian (www.bcl.hamilton.ie/~barak/pap...)
Alleviating memory requirements of reverse mode (a short checkpointing sketch follows this list):
Griewank & Walther, 2000: Algorithm 799: revolve: an implementation of checkpointing for the reverse or adjoint mode of computational differentiation (dl.acm.org/doi/10.1145/347837...)
Dauvergne & Hascoët, 2006: The data-flow equations of checkpointing in reverse automatic differentiation (link.springer.com/chapter/10....)
Chen, T et al., 2016: Training Deep Nets with Sublinear Memory Cost (arxiv.org/abs/1604.06174)
Gruslys et al., 2016: Memory-efficient Backpropagation Through Time (arxiv.org/abs/1606.03401)
Siskind & Pearlmutter, 2017: Divide-and-conquer checkpointing for arbitrary programs with no user annotation (arxiv.org/abs/1708.06799)
Oktay et al., 2020: Randomized Automatic Differentiation (arxiv.org/abs/2007.10412)
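
The checkpointing papers above all trade compute for memory: reverse mode normally stores every intermediate value from the forward pass, so instead only a few checkpoints are kept and the rest are recomputed during the backward pass. A minimal sketch of the idea using PyTorch's built-in utility (an illustration of the technique, not code from any of these papers):

    # Activations inside `block` are not cached during the forward pass;
    # they are recomputed from the saved input during backward().
    import torch
    from torch.utils.checkpoint import checkpoint

    block = torch.nn.Sequential(
        torch.nn.Linear(512, 512), torch.nn.ReLU(),
        torch.nn.Linear(512, 512), torch.nn.ReLU(),
    )
    x = torch.randn(64, 512, requires_grad=True)
    y = checkpoint(block, x, use_reentrant=False)  # forward, minus the cache
    y.sum().backward()                             # recompute, then backprop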
Example software libraries using various implementation routes (a usage example follows the list):
Source code transformation:
Tangent - github.com/google/tangent
Zygote - github.com/FluxML/Zygote.jl
Operator overloading:
Autograd - github.com/HIPS/autograd
Jax - github.com/google/jax
PyTorch - pytorch.org/
Graph-based w/ embedded mini-language:
TensorFlow - www.tensorflow.org
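
For a sense of what these libraries expose at the user level, a standard reverse-mode usage example with JAX (the function f is made up for illustration):

    # grad() traces f via operator overloading, then runs reverse mode.
    import jax
    import jax.numpy as jnp

    def f(x):
        return jnp.sin(x[0]) * x[1] + x[0] ** 2

    grad_f = jax.grad(f)                   # df/dx as a new function
    print(grad_f(jnp.array([0.5, 2.0])))   # [cos(0.5)*2 + 1, sin(0.5)]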
Special thanks to Ryan Adams, Alex Beatson, Geoffrey Roeder, Greg Gundersen, and Deniz Oktay for feedback on this video.
Some of the animations in this video were created with 3Blue1Brown's manim library (github.com/3b1b/manim).
Music: Trinkets by Vincent Rubinetti
Links:
YouTube: / ariseffai
Twitter: / ari_seff
Homepage: www.ariseff.com
If you'd like to help support the channel (completely optional), you can donate a cup of coffee via the following:
Venmo: venmo.com/ariseff
PayPal: www.paypal.me/ariseff

Comments: 105

  • @anjelpatel7918 · 2 years ago

    I like how more and more people are adopting 3b1b's style. Makes the content much better and easier to understand. This slowly converts a lot of the more complicated topics into easy-to-digest modules.

  • @Artaxerxes. · 2 years ago

    It literally uses manim

  • @tomerzilbershtein849 · 2 years ago

    3B1B’s creator Grant Sanderson created an animation library for himself to use to make videos. People forked that library (made a copy of it), and now there is a community-supported version of it for creators, while he continues to use his own (as well as the community one). Pretty cool stuff!

  • @atotoole21 · 1 year ago

    @Artaxerxes. Nice! I didn't know about manim or that 3B1B's animation technique was Python-based. I assumed it was done by hand using Illustrator or something.

  • @umbraemilitos · 10 months ago

    Yes, though I don't think 3B1B wants their videos to be a template to copy. I think he's happy to inspire, but doesn't think that his Manim program is the right tool for most cases. He released a video explaining the SOME criteria, and it allows for lots of creative expression in teaching.

  • @andreypopov6166 · 1 month ago

    3b1b or any other style on its own doesn't mean that the content is easier to understand.

  • @raminbohlouli1969 · 9 months ago

    I knew basically 0 about AD and didn't know where to start, since all the articles, websites, books, etc. that I looked into explained everything in a really complicated way. I would like to thank you immensely for this very informative yet simple video! Now I know enough to dive deeper into the concept. This video was all I needed. Keep up the great work! You've got yourself a new follower.

  • @arkasaha4412 · 3 years ago

    Man, this is pure gold. We all use this stuff but hardly have a clear idea about its nitty-gritty. Thanks for the awesome content and presentation, keep it up! :)

  • @stathius · 1 year ago

    Class act, being concise and clear at the same time is no easy feat. Thank you.

  • @andrewbeatty5912 · 3 years ago

    Best summary I've ever seen !

  • @abhishek.shenoy · 3 years ago

    This is so well explained! I love the quality of your videos!

  • @chandank5266 · 3 years ago

    Your way of explanation is outstanding... love from India, sir ♥️

  • @jaf7979 · 1 year ago

    Well done, superbly explained in context of other differentiation methods. Exactly what I needed!

  • @arnold-pdev · 2 years ago

    Went from complete ignorance to understanding in 15 min. Thank you!

  • @koushik7604 · 1 year ago

    This is highly motivated by Andrej Karpathy's lecture, but a very clear explanation. It is indeed a good addition to my resource list.

  • @pandatory1108 · 3 years ago

    Excellent video Ari. Thanks for such a great explanation! Also, your animations were really well done. I suspected you might be using manim based on the style and then I read the description :)

  • @TheLokiGT · 1 year ago

    Very good job. One of the very few good videos I've seen around about autodiff.

  • @esaliya · 3 years ago

    This is a neat summary that's hard to find in a single place!

  • @pulusound · 3 years ago

    Very well explained video with lovely calm background music. I need to brush up on my vector calculus and come back, but this gave me a good intuition. Hope you make more of these!

  • @YorkiePP · 3 years ago

    Fantastic video on autodiff, really cleared up a lot of things I wasn't sure about.

  • @prydt · 3 years ago

    Amazing explanation of Autograd and wonderful visualizations!!! Thank you so much.

  • @jorgeanicama8625 · 1 year ago

    Thank you Ari. I used symbolic computation in the past but this novel way of calculating derivatives is quite interesting. Learnt lots by watching your video. For sure, I will follow up with the recommended literature

  • @BrianAmedee · 3 years ago

    Excellent presentation mate. That was an awesome explanation and a nice trip down memory lane (university days).

  • @jkkang9666 · 3 years ago

    Thanks for the great summary and the nice video.

  • @aldaszarnauskas27 · 1 year ago

    Great video, well presented, clearly explained, nice visualisation... Thank you!

  • @SohailKhan-zb5td · 1 year ago

    Thanks a lot. Videos like this take a lot of hard work to produce. Thanks a lot.

  • @user-kl1xv8in2q · 2 years ago

    Thank you so much. This video really helps me understand a little more what automatic differentiation is.

  • @stansilverman1901 · 3 years ago

    In order to explain this to my wife, I differentiated voter rights: the analog process by which humans decide who should be allowed to vote (someone who looks like me, or everyone?). I think she got it. Brilliant, Ari.

  • @weinansun9321 · 3 years ago

    more videos please, this is amazing!

  • @Roshan-xd5tl · 2 years ago

    Brilliant video, Ari. Thank you!

  • @asdf56790 · 1 year ago

    Exactly what I was looking for! Thank you :)

  • @ccgarciab · 3 years ago

    Looking forward to your future videos

  • @halneufmille · 3 years ago

    Thanks! I never understood this before, but it became obvious in one second.

  • @AJ-et3vf · 2 years ago

    Awesome presentation! I understand autodiff a little bit more. I'll rewatch it several more times until I completely understand it :)

  • @setsunakevin6861 · 3 years ago

    Amazing video! Very well explained.

  • @thivinanandh4430 · 2 years ago

    Awesome Explanation..!!!!! Keep rocking..!!!

  • @BrianBin · 7 months ago

    I like your tutorial video because it is short and good

  • @VHenrik007 · 14 days ago

    Just as a note for anyone wondering, the arxiv link doesn't work because it includes the closing parenthesis. Otherwise great video!

  • @Vaporizer41 · 3 years ago

    Great video! I love your content; hope you keep making many more :)

  • @jishnuak3000 · 1 year ago

    Very intuitive explanation, thanks

  • @nathanielscreativecollecti6392 · 3 years ago

    Bravo! I have a final today and now I get it!

  • @KulvinderSingh-pm7cr · 1 year ago

    This is exceptionally well explained.

  • @KulvinderSingh-pm7cr · 1 year ago

    And thanks a lot for the references too; they're very useful.

  • @datamike7457 · 3 years ago

    Ari, this is great content! I used to call symbolic differentiation 'analytical'. It is obnoxious to track all of the coefficients.

  • @sandropollastrini2707 · 2 years ago

    Beautiful and clear!

  • @tom_verlaine_again · 2 years ago

    Great lesson! Thank you.

  • @vijaymaraviya9443 · 3 years ago

    Awesome summary👌

  • @andersgadlauridsen1533 · 1 year ago

    This is such great content, please keep making more :)

  • @hadik4497 · 3 years ago

    Thanks! This is phenomenal!

  • @kong1397 · 3 years ago

    Wow, that's a great explanation.

  • @manumerous · 2 years ago

    This video is genius! Love it.

  • @juandavidnavarro · 11 months ago

    Excellent video!! Thank you so much. I have a question: is there any AD reverse mode based on dual numbers?

  • @amadlover · 1 year ago

    Timely information about source code manipulation and Google Tangent. It was a kind of confirmation for me that it was indeed possible. I had started to learn metaprogramming hoping to generate code for the differentials based on the function, without actually knowing if it was possible; basically a shot in the dark. Cheers.

  • @SuperDonalByrne · 4 months ago

    Great video!

  • @tom-sz · 1 month ago

    Great video! Where can I learn more about the rounding and truncation errors plot at 2:06? I need to make an analysis of these errors for a project. Thanks :)

  • @amirrezarezayan8121 · 20 days ago

    great great great , Thanks a million 😃

  • @jbl4174 · 2 years ago

    Thanks for putting out such a great video. I'm still a bit confused why forward-mode AD requires a separate forward pass for each input variable. In Baydin et al. it says "Conversely, in the other extreme of f : R^n → R, forward mode AD requires n evaluations to compute the gradient." But I don't see why you couldn't compute the primal table and then the tangent table for each of the n variables, unless "n evaluations" means n evaluations of the tangent table and not n forward passes.
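
    One way to see it (an illustrative aside, my reading of the survey rather than a reply from the author): a single forward sweep seeds one tangent direction and propagates the Jacobian-vector product Jv alongside the primal values, so "n evaluations" means n tangent propagations, one per basis vector e_1, ..., e_n; the primal values can be shared, but each tangent table is a separate sweep. A sketch with JAX's jvp (the function f is made up):

        # Each jax.jvp call performs one forward-mode sweep with one seed.
        import jax
        import jax.numpy as jnp

        def f(x):
            return jnp.sin(x[0]) * x[1]

        x = jnp.array([0.5, 2.0])
        for i in range(2):                      # one sweep per input dim
            seed = jnp.zeros(2).at[i].set(1.0)  # basis vector e_i
            _, tangent = jax.jvp(f, (x,), (seed,))
            print(i, tangent)                   # df/dx_i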

  • @bitahasheminezhad2887 · 3 years ago

    That was awesome, thank you

  • @newbie8051 · 1 year ago

    Beautiful video, but I lost track quite a few times. Are there any prerequisite topics I should know before trying to understand this?

  • @jianwang7433 · 2 years ago

    thanks for sharing

  • @bryanbischof4351 · 3 years ago

    This is quite good. I'm wondering if a part 2, digging deeper into how the implementations take advantage of the concepts you introduce here, would be possible?

  • @ariseffai · 3 years ago

    Thanks Bryan. That's a possibility. It would certainly be interesting to dig deeper into the implementation schemes, which were only briefly described here. In the meantime, check out some of the links for further information on implementations.

  • @softerseltzer · 3 years ago

    Love it!

  • @alfcnz · 2 months ago

    @Ari, this is really great! 🤩🤩🤩

  • @ariseffai · 2 months ago

    Thanks Alfredo!

  • @dullyvampir83 · 5 months ago

    Great video, thank you! Just a question: you said a main problem with symbolic differentiation is that no control flow operations can be part of the function. Is that in any way different for automatic differentiation?

  • @sofa33 · 2 years ago

    Thank you so much!

  • @superagucova · 3 years ago

    Loved this video! Are you using 3b1b's Manim?

  • @ariseffai · 3 years ago

    Yep! Manim is awesome

  • @GordonWade-kw2gj · 1 month ago

    Wonderful video. The detailed example helps tremendously. And I think there's an error: at t=6:24, since $v_6 = v_5 \times v_4$, shouldn't there be a plus sign in $\dot{v}_6$ where you've got a minus sign?

  • @gabrielmccartney7975 · 2 years ago

    Hello! Can we use dual numbers for integration?

  • @UnnamedThe · 3 years ago

    12:26 May I ask where you got that c

  • @ariseffai · 3 years ago

    Baydin (arxiv.org/abs/1502.05767) references this bound in Sec. 3.2. I don't have the exact location for it in Griewank and Walther.

  • @UnnamedThe · 3 years ago

    @ariseffai Thank you a lot! That is already very helpful.

  • @rtcoffee1235 · 3 years ago

    thanks for this!

  • @ktugee · 1 year ago

    Slight typo at 6:29: v6' = v5'v4 + v4'v5 (there should be a + instead of a -).

  • @sirallen2591 · 1 year ago

    Thanks!

  • @user-vm9hl3gl5h · 1 year ago

    Anyway, the point is that it doesn't store everything in closed form and recompute the gradient every time. Each time a computation is performed, the gradient value is computed along with the output value and stored, to be used later during the forward/backward pass.

  • @PahenPWNZ · 3 years ago

    Awesome explanation, thanks! But I still have one question; can someone explain, please: at 12:05, in the right column (Adjoints), I don't understand how we got these values (e.g. v̄5 = v4 * v̄6, etc.). Where did these values come from? If I use the formula on the previous slide, with the sum over children nodes, I get different values.

  • @MarkKrebs · 2 years ago

    Hi, I have the same question. The moment when adjoints are defined is a break for me. v̄5 = v4 * v̄6 seems "backwards." I see it matches the formula given on the prior graph page, but not the intuition for it. "The sum of the output values, weighted by my leverage in creating them" is as close as I can get.

  • @abhaysolanki9284 · 2 years ago

    I know, when he said "children" I automatically thought of v3 and v4. But the only child of v5 is v6, and the children of v4 are v5 and v6. Children are the nodes that the node points to.
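
    (A worked restatement for anyone hitting the same snag; this is the standard reverse-mode recurrence the replies are describing, not a quote from the video: the adjoint of a node sums over its children, i.e. the nodes it feeds into,

        \bar{v}_i = \sum_{j \in \mathrm{children}(i)} \bar{v}_j \, \frac{\partial v_j}{\partial v_i}

    so with v_6 = v_5 v_4, the only child of v_5 is v_6, giving \bar{v}_5 = \bar{v}_6 \, \partial v_6 / \partial v_5 = \bar{v}_6 \, v_4, while v_4 feeds both v_5 and v_6, giving \bar{v}_4 = \bar{v}_5 \, \partial v_5 / \partial v_4 + \bar{v}_6 \, \partial v_6 / \partial v_4 = \bar{v}_5 \, \partial v_5 / \partial v_4 + \bar{v}_6 \, v_5.)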

  • @paulpassek6118 · 3 years ago

    Thanks for the superb video. I think you made a little mistake in the forward mode example at 6:24. Shouldn't it be v̇_6 = v̇_5*v_4 + v̇_4*v_5?

  • @ariseffai · 3 years ago

    Thanks Paul, good catch. Placed this under errata.

  • @germangonzalez3063 · 3 years ago

    Very useful

  • @jorgeanicama8625 · 1 year ago

    One more note, Ari. I think there is a small typo: from 7:36 until 7:46, the derivative of v6 should have a "+" instead of a "-".

  • @proweiqi · 3 years ago

    This is very good, but some of it moves too fast and doesn't explain things like the primal part clearly enough.

  • @deepanshuchoudhary4598 · 3 years ago

    Please reply to my question: where do you learn these things, and how are you able to grasp them completely? I'm a data science student and I badly need to know. Please share insights.

  • @ariseffai · 3 years ago

    I found the survey by Baydin et al. to be particularly helpful. See the description for links!

  • @chnlior · 3 years ago

    Great summary, Ari. Thank you. I think there is a small error at 6:23: v6' = v5'v4 + v4'v5, not "-".

  • @ariseffai · 3 years ago

    Thanks Lior, good catch. Placed this under errata.

  • @9888622400 · 2 years ago

    thanks bro!

  • @rachelellis6655 · 1 year ago

    The derivative at 0:43 would actually be f'(x) = (2x)e^(2x-1) - 3x^2 ... would it not? Great video, I've subscribed! I'm just learning derivatives and the chain rule, so I want to be sure I'm understanding the concepts/rules/procedures correctly. I'm probably wrong though; that's why I'm asking for verification. Thanks!

  • @M3rtyville · 14 hours ago

    Reverse-on-Forward sounds like ACA.

  • @zappist751 · 1 year ago

    THANK YOU LORD THANK YOU JESUS AND THANK YOU SIR

  • @diodin8587 · 2 years ago

    No mention of *dual numbers*?

  • @yavarjn2055 · 1 year ago

    Wooow

  • @bokibogi · 1 year ago

    4:27 automatic differentiation ...

  • @Manishsingh-dl6ho · 3 years ago

    Fking Great!!!

  • @Rems766 · 2 years ago

    chain rule rules

  • @sarvasvarora · 3 years ago

    Reddit gang?

  • @user-rr7uz9hd4m · 2 years ago

    Do you get paid to make videos like this? You definitely should.

  • @MariaFernandez-pv9hn · 3 years ago

    You should point out on the screen what you're talking about when doing the examples.

  • @maxyazhbin826 · 3 years ago

    please no music, fantastic otherwise

  • @ollllj · 6 months ago

    On expression swell: one of my proudest computations (and hardest-to-debug code) is the automatic differentiation of the 3rd derivative of the general quotient rule, within [shadertoy ... /WdGfRw ReTrAdUi39], with identical parts already pre-multiplied out according to how often they repeat. WebGL code:

        // 4 domains: t, dt, dt², dt³. Sure, this could just be a vec4,
        // but I REALLY needed my custom labels for debugging.
        struct d000 { float a; float b; float c; float d; };

        // autodiff up to 3 derivatives for division
        // (up to 3 iterations of the quotient rule within the chain rule)
        d000 di(d000 a, d000 b) {
            return d000(
                a.a/b.a // 0th derivative: simple division
                ,(a.b*b.a-a.a*b.b)/(b.a*b.a) // dx: first derivative
                ,((a.c*b.a+a.b*b.b-a.b*b.b-a.a*b.c)*(b.a*b.a)
                  -2.*(a.b*b.a-a.a*b.b)*(b.a*b.b))/(b.a*b.a*b.a*b.a) // dxdx: second derivative
                ,((((a.d*b.a+a.c*b.b+a.c*b.b+a.b*b.c-a.c*b.b-a.b*b.c-a.b*b.c-a.a*b.d)*(b.a*b.a)
                    +(a.c*b.a+a.b*b.b-a.b*b.b-a.a*b.c)*(b.b*b.a*b.a*b.b))
                   +(-2.*(a.c*b.a+a.b*b.b-a.b*b.b-a.a*b.c)*(b.a*b.b)
                    +(a.b*b.a-a.a*b.b)*(b.b*b.b+b.a*b.c)))*(b.a*b.a*b.a*b.a)
                  -((a.c*b.a+a.b*b.b-a.b*b.b-a.a*b.c)*(b.a*b.a)
                    -2.*(a.b*b.a-a.a*b.b)*(b.a*b.b))
                   *4.*(b.b*b.a*b.a*b.a))/(b.a*b.a*b.a*b.a*b.a*b.a*b.a*b.a) // dxdxdx: the 3rd-derivative quotient rule sure is something
            );
        }

  • @a.osethkin55 · 2 years ago

    Thanks!!!
