Convolution Intuition

In this video, I provide some intuition behind the concept of convolution, and show how the convolution of two functions is really the continuous analog of polynomial multiplication. Enjoy!
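
To make the analogy concrete, here is a minimal sketch in Python (assuming NumPy is installed; the helper name poly_mult is ours, not from the video) of the discrete picture: multiplying two polynomials by collecting coefficients whose indices sum to the same power is exactly a discrete convolution.

    import numpy as np

    def poly_mult(a, b):
        """Multiply two polynomials given as coefficient lists (lowest degree first).

        The coefficient of x^n in the product is sum_k a[k] * b[n - k],
        i.e. the discrete convolution of the coefficient sequences.
        """
        c = [0.0] * (len(a) + len(b) - 1)
        for i, ai in enumerate(a):
            for j, bj in enumerate(b):
                c[i + j] += ai * bj      # indices i and j contribute to the power i + j
        return c

    # (1 + 2x)(3 + 4x + 5x^2) = 3 + 10x + 13x^2 + 10x^3
    a, b = [1, 2], [3, 4, 5]
    print(poly_mult(a, b))               # [3.0, 10.0, 13.0, 10.0]
    print(np.convolve(a, b))             # [ 3 10 13 10], the same numbers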

Comments: 148

  • @Keithfert490 · 4 years ago

    Idk if this helped with my intuition, but it did kinda blow my mind.

  • @influential7693 · 2 years ago

    "The result is not important; what's important is the process." - Dr. Peyam. Sir, you are extremely motivational to me.

  • @dougr.2398 · 4 years ago

    General comment: Convolution can be thought of as a measure of self-similarity. The more self similarity between and within the two functions, the larger the convolution integral’s value. (There is the group theory connection)

  • @drpeyam · 4 years ago

    Interesting!

  • @dougr.2398 · 4 years ago

    Dr Peyam yes! That is why & how it applies to biology and music theory!!

  • @blackpenredpen · 4 years ago

    Wow! I didn't know about that! Very very cool!

  • @chobes1827 · 4 years ago

    This intuition makes a lot of sense because for each x the convolution is essentially just an inner product between one function and another function that has been reflected and shifted.
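
A small sketch of that reading (Python; the helper name conv_at and the example arrays are illustrative, not from the video): for each output index x, the convolution value is a dot product of f with a reflected copy of g shifted by x.

    import numpy as np

    def conv_at(f, g, x):
        """Discrete convolution (f*g)[x] = sum_y f[y] * g[x - y]:
        a dot product of f with a reflected, shifted copy of g."""
        total = 0.0
        for y in range(len(f)):
            if 0 <= x - y < len(g):
                total += f[y] * g[x - y]  # g is sampled at x - y: reflected, then shifted by x
        return total

    f = np.array([1.0, 2.0, 3.0])
    g = np.array([0.5, 1.0])
    print([conv_at(f, g, x) for x in range(len(f) + len(g) - 1)])  # [0.5, 2.0, 3.5, 3.0]
    print(np.convolve(f, g))                                       # matches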

  • @dougr.2398 · 4 years ago

    Chobes 182, so if the function and its shifted values are strongly correlated (or even equal), the convolution integral approaches the integral of the square of the function. The more dissimilar the shifted and non-shifted values are, the further the integral can deviate above or below the integral of the square.

  • @quantphobia2944 · 4 years ago

    OMG, this is the simplest explanation of convolution I've ever come across, thank you so much!!!

  • @blackpenredpen · 4 years ago

    So who is convolution? I still don’t get it.

  • @drpeyam · 4 years ago

    Oh, it’s just the integral of f(y) g(x-y) dy, a neat way of multiplying functions
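
For reference, the two formulas discussed in the video and this thread, side by side (LaTeX; in both, the indices/arguments of the two factors sum to the output index):

    % discrete: coefficient of the x^n term when multiplying polynomials
    c_n = \sum_{k} a_k \, b_{n-k},
    \qquad
    % continuous analog: convolution of two functions
    (f*g)(x) = \int_{-\infty}^{\infty} f(y)\, g(x-y)\, dy .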

  • @blackpenredpen · 4 years ago

    Dr Peyam, lol, I know. Notice I asked "who", since I remember my students asked me who convolution is before. Because it's taught after Laplace.

  • @dougr.2398 · 4 years ago

    Self-similarity..... see my other postings in the comments, please!!

  • @blackpenredpen · 4 years ago

    I just did! Thank you! That is so cool!

  • @bballfancalmd2583 · 4 years ago

    Dear Dr. Peyam, THANK YOU!! In engineering we're taught how to use convolution, but never learn where the hell it comes from. Your explanations are like a brain massage 💆‍♂️. Thank you, thank you! You know an explanation is good when it not only answers a question I hadn't even thought of, but also opens my mind to other ways of thinking about math. So much fun! Danke!!

  • @MrCigarro50 · 4 years ago

    Thanks for this video. For us statisticians, this is a very important result, for it is related to finding the distribution of the sum of two random variables. So, in general, I wish to express our appreciation for your efforts.
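
For readers who want that statistics connection spelled out, a standard worked instance (our addition, not from the video): if X and Y are independent with densities f_X and f_Y, the density of X + Y is their convolution, and for two uniforms on [0,1] this gives the familiar triangular density.

    f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(y)\, f_Y(z - y)\, dy,
    \qquad
    f_X = f_Y = \mathbf{1}_{[0,1]}
    \;\Longrightarrow\;
    f_{X+Y}(z) =
    \begin{cases}
      z,     & 0 \le z \le 1, \\
      2 - z, & 1 \le z \le 2, \\
      0,     & \text{otherwise.}
    \end{cases}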

  • @arteks2001 · 2 years ago

    I loved this interpretation. Thank you, Dr. Peyam.

  • @area51xi · 1 year ago

    This arguably might be the most important video on youtube. I wanted to cry at the end from an epiphany.

  • @drpeyam · 1 year ago

    Thanks so much 🥹🥹

  • @stevenschilizzi4104 · 2 years ago

    Brilliant explanation! Brilliant - makes it look so natural and so simple. Thanks heaps. I had been really curious about where it came from.

  • @gastonsolaril.237 · 4 years ago

    Damn, this is amazing, brother. Though I'll need to watch this video like 2 or 3 more times to connect the dots. Keep up the good work! Really, you are one of the most interesting and useful KZread channels I've been subscribed to.

  • @yhamainjohn4157 · 4 years ago

    One word in my mouth: Great! Bravo!

  • @ibrahinmenriquez3108 · 4 years ago

    I can surely say that I am continuously happy to see you explaining these ideas. Thanks!

  • @erayser · 4 years ago

    Thanks for the explanation! The convolution is quite intuitive to me now

  • @user-ed1tg9rj1e · 4 years ago

    Great video!!! It really helps build the intuition for convolution!

  • @mnada72 · 3 years ago

    That clarified convolution once and for all 💯💯

  • @apoorvvyas52 · 4 years ago

    Great intuition. Please do more such videos.

  • @mattetor6726 · 3 years ago

    Thank you! The students you teach are very lucky :) And we are lucky to be able to watch your videos

  • @sciencewithali4916 · 4 years ago

    Thank you so much for the baby-step explanation! It became completely intuitive thanks to the way you've presented it! We want more of this awesome content.

  • @bipuldas2060 · 3 years ago

    Thank you. Finally understood the intuition behind this pop operation called convolution.

  • @dvixdvi7507 · 3 years ago

    Awesome stuff - thank you for the clear explanation

  • @corydiehl764 · 4 years ago

    Okay, I really am seeing what you did there, but I feel like what makes this really suggestive to me is looking at each power of x as a basis function. Wooooow, this is so much more abstracted and interesting compared to the way I usually look at it as a moving inner product.

  • @LuisBorja1981 · 4 years ago

    Dirty puns aside, really nice analogy. Never thought of it that way. As always, brilliant work.

  • @lambdamax · 4 years ago

    Hey Dr. Peyam. I had this issue in undergrad too! Thank you for the video. Out of curiosity, for convolutional neural networks, whenever they talk about the "window" in convolving images, would the "window" be analogous to getting the coefficient of a particular degree in this example?

  • @klam77 · 4 years ago

    very enjoyable! good stuff!

  • @prettymuchanobody6562 · 1 year ago

    I love your attitude, sir! I'm motivated just hearing you speak, let alone how well you explain the subject.

  • @Debg91 · 4 years ago

    Very neat explanation, thanks! 🤗

  • @camilosuarez9724 · 4 years ago

    Just beautiful! thanks a lot!

  • @visualgebra · 4 years ago

    More interesting dear Professor

  • @monsieur_piyushsingh · 1 year ago

    You are so good!!!

  • @corydiehl764 · 4 years ago

    Now I'm really curious whether this interpretation could be used to give a more intuitive interpretation of Volterra series analysis, which is my favorite analysis technique that I learned in electrical engineering.

  • @polacy_w_strefie_komfortu · 4 years ago

    Very interesting. I wonder if we can draw other intuitions from polynomial functions and transfer them to general analytic functions. In any case, an analytic function can be approximated locally by its Taylor series, but here the analogy seems to work not only locally but over the whole range.

  • @sukursukur3617 · 3 years ago

    :)

  • @sheshankjoshi · 8 months ago

    This is wonderful. It does really make sense.

  • @ShubhayanKabir · 4 years ago

    You had me at "thanks for watching" 😍🤗

  • @kamirtharaj6801 · 4 years ago

    Thanks man... finally understood why we need the convolution theorem.

  • @lauralhardy5450 · 1 month ago

    Thanks Doc, easy to follow. This is a good generalisation of convolution.

  • @vineetkotian5163 · 3 years ago

    I wasn't really understanding convolution...just had a broad idea of it.... this video made my mind click😎🔥.. insane stuff

  • @ronaktiwari7041 · 3 years ago

    Subscribed! It was wonderful!

  • @gf4913 · 4 years ago

    This was very useful, thank you so much

  • @maestro_100 · 2 years ago

    Wow! Thank you very much, sir... this is a very nice point of view!!!

  • @DHAVALPATEL-bp6hv · 4 years ago

    Convolution is, for most mortals, a mathematical nightmare and absolutely non-intuitive. But this explanation makes it more obvious. So thumbs up!!!

  • @Handelsbilanzdefizit · 4 years ago

    When I expand a function f(x) into an infinite series, and also the function g(x), and then convolve these two power series in your discrete way, with sigmas and indices, is the resulting series the same as what I get by expanding the continuous version (f*g)(x) into a series?

  • @corydiehl764 · 4 years ago

    That was my realization from the video too. Now that I think about it, I think that's the result of multiplying two Taylor series, both centered at a point a.
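
For reference, that is the Cauchy product rule for power series (standard material, stated here rather than quoted from the video):

    \left( \sum_{n=0}^{\infty} a_n x^n \right)
    \left( \sum_{n=0}^{\infty} b_n x^n \right)
    = \sum_{n=0}^{\infty} \left( \sum_{k=0}^{n} a_k \, b_{n-k} \right) x^n ,

so the coefficients of the product are the discrete convolution of the two coefficient sequences; Mertens' theorem gives conditions under which the product series converges to the product of the sums.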

  • @chuefroxz9408 · 4 years ago

    Very helpful, sir! Thank you very much!

  • @ranam · 2 years ago

    Brother, I know mechanical engineers can find resonance, but when I thought deeply about it: resonance is a slow accumulation of energy, built up in small installments, when the frequencies match. If you strike a tuning fork at 50 Hz, you get the same frequency of vibration on another tuning fork, so they both vibrate; if you strike it harder, the amplitude changes, so loudness is a human factor while the frequency stays the same. The languages humans speak throughout the world only resonate your eardrum for a few seconds. My question concerns the fact that the harmonic is the fundamental frequency and the overtones are the frequencies that follow it.

    Take a word in any language: according to convolution, the signal is scaled, shifted, and stacked, so convolution can be used to model resonance. When your eardrum vibrates, electrical signals are carried to the brain; like a tuning fork, the eardrum vibrates within the audible spectrum of 20 Hz to 20,000 Hz. So resonance is caused by the words we speak, and within the audible range the eardrum vibrates and we make sense of words. I have seen in one video on KZread that the harmonics in any sound cause resonance, which could be modelled by convolution. Recall that resonance is destructive, because the slow and steady accumulation of sound energy in a mass causes high stress and energy to build up until the system fractures or collapses. So why, when our eardrum is continuously exposed to the sound of human languages, does it not fracture or rupture like a wine glass? I am not talking about loud sound above 80 dB, but about sound in the audible range of 20 Hz to 20,000 Hz under continuous exposure; why is it not damaging? Again, not failure by high energy, but by a low one in synchronisation with the air. I tried it on my students: when I told them to be quiet in class they did not listen, so I took my phone, played a 14,000 Hz tone, and they said it was irritating. The idea of resonance is "small effort but large destruction", just like the Tacoma bridge, where the wind slowly accumulated energy on the bridge and collapsed it. So my question is whether an audible frequency, under continuous exposure, can cause a human ear to bleed; again, "small effort but great destruction".

    Sorry for the long story; if you managed to reach this point you must be as curious as me. One more thing: is the eardrum shaken by the harmonics in the sound of the words we speak, or by the overtones? (Assume the person is in a coma or brain dead, so when the sound irritates them they cannot move away.) So my question is simple: does the human ear normally respond to harmonics or to overtones, which according to convolution could be a disaster with minimal effort? 🙏🙏🙏🙏 I could be wrong here, because harmonics can also be used to construct sound. So which is the destructive one with minimal effort, the harmonics or the overtones, and which one gives the response curve when the two signals are convolved as the eardrum oscillates?

  • @skkudj · 4 years ago

    Thanks for the good video - from Korea.

  • @jaikumar848 · 4 years ago

    Thanks a lot, Dr. Peyam! Convolution is a really confusing topic for me... I would like to ask: is convolution useful for mathematicians? As far as I know, it is part of digital signal processing.

  • @drpeyam · 4 years ago

    So many applications! To get the distribution of the random variable X+Y, to solve Poisson’s equation, etc.

  • @klam77 · 4 years ago

    @@drpeyam Here's "the" classic video on convolution from the engineering-school perspective: kzread.info/dash/bejne/kaqtzcdspqmafs4.html - you will have to forgive the "cool 70s disco" look of the professor; it was indeed the 70s, so he looks the part (but Prof. Oppenheim is/was the guru on signals and systems theory). This is immensely useful math. Immensely.

  • @sandorszabo2470 · 4 years ago

    @@klam77 I agree with you. The "real" intuition of convolution comes from Signals and systems, the discrete case.

  • @klam77 · 4 years ago

    @@sandorszabo2470 Hello. But Prof. Peyam's view is nearly the same: where he talks of convolution in terms of multiplying two polynomials, Prof. Oppenheim talks about "linear time-invariant" systems, which produce the same kind of sums as "outputs" of multiple inputs in the LTI context! Almost the same! But yes, historically the original intuition came from the engineering side.

  • @user-mz6hc5cv8x · 4 years ago

    Thanks for the video, Dr. Peyam! Can you show the Fourier and Laplace transforms of a convolution?

  • @danialmoghaddam8698 · 2 years ago

    Thank you so much, the best one I've found.

  • @DHAVALPATEL-bp6hv · 4 years ago

    Awesome !!!!

  • @dgrandlapinblanc · 4 years ago

    Thank you very much.

  • @burakbey21 · 1 month ago

    Thank you for this video, very helpful

  • @drpeyam · 1 month ago

    You are welcome!

  • @adambostanov4822 · 1 year ago

    So what is the result of the convolution of those two polynomials?

  • @dougr.2398 · 4 years ago

    My profs at The Cooper Union, 1967-1971, liked to say that the variable integrated over is "integrated out"... which I hold is not accurate, as it has only vanished in appearance: the functions evaluated at each point of the "integrated out" variable contribute to the sum, as do the endpoints. As the variable EXPLICITLY vanishes, it "goes away". By the way, Dr. Tabrizian, what is the "f hat" you refer to in the Fourier transform description of the convolution? Please explain.

  • @drpeyam · 4 years ago

    Fourier transform

  • @dougr.2398 · 4 years ago

    Dr Peyam thanks!

  • @Handelsbilanzdefizit · 4 years ago

    2:35 You should handle less coefficients and more coffeeicents ^^

  • @krzysztoflesniak2674 · 2 years ago

    Remark 1: This one is pretty nice: kzread.info/dash/bejne/g6GX0bKShcnIeps.html ["What is convolution? This is the easiest way to understand" by Discretised]. It is in terms of integration of processes with fading intensity, but it is amenable to an economic interpretation as well.

    Remark 2: This multiplication by gathering indices that sum to a constant is crucial for the Cauchy product of two infinite series instead of polynomials (Mertens' theorem).

    Remark 3: This convolution is with respect to time. In image manipulation the convolution is with respect to space (a kind of weighted averaging over pixels). That "spatial convolution" in the continuous case leads to an integral transform. One of the functions under convolution is then called a kernel. Just loose thoughts.

  • @alexdelarge1508 · 1 year ago

    Sir, with your explanation, what was an esoteric formula now has some real shape to it. Thank you very much!

  • @jonasdaverio9369 · 4 years ago

    Is it called the convolution because it is convoluted?

  • @drpeyam · 4 years ago

    Hahaha, probably! But I’m thinking more of “interlacing” values of f and g

  • @dougr.2398 · 4 years ago

    Convolution is a term that really might better be described as “self-similarity”. It even has application to music theory! (THERE is the Group Theory connection!!! And even Biology!!!)

  • @bat_man1138 · 3 years ago

    Nice bro

  • @Mau365PP · 4 years ago

    7:13 What do you mean by *f* and *g* as *"continuous polynomials"*?

  • @drpeyam · 4 years ago

    Think of a polynomial as an expression of the form sum a_n y^n; what I mean is an expression of the form sum a_x y^x, where x ranges over the reals.
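
Written out, the heuristic in this reply might look like the following (our own rendering of the hand-waving, not a rigorous definition): replace the sum over the exponent by an integral, multiply two such "continuous polynomials", and substitute u = s + t.

    \text{``}\textstyle\sum_x a_x\, y^x\text{''} \;\rightsquigarrow\; \int a(s)\, y^s \, ds,
    \qquad
    \left( \int a(s)\, y^s \, ds \right)
    \left( \int b(t)\, y^t \, dt \right)
    = \int \Big( \underbrace{\textstyle\int a(s)\, b(u - s)\, ds}_{(a*b)(u)} \Big)\, y^u \, du ,

so the "coefficient" of y^u in the product is exactly the convolution (a*b)(u).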

  • @cactuslover2548 · 2 years ago

    My mind went boom after this

  • @maxsch.6555 · 4 years ago

    Thanks :)

  • @BootesVoidPointer · 2 years ago

    What is the intuition behind the differential dy appearing as we transition to the continuous case?

  • @krzysztoflesniak2674 · 2 years ago

    It tells you to integrate with respect to y and keep x fixed (the resulting function is a function of x). Integration with respect to y is the continuous analog of summation over the index (also named y at the end of the presentation, to highlight the jump from the discrete to the continuous case).

  • @patryk_49 · 4 years ago

    Wikipedia says the symbol from your thumbnail means something called "cross-correlation", and it's similar to convolution. I hope somewhere in the future you will make a video about that.

  • @linushs · 3 years ago

    Thank you

  • @Aaron-zi1hw · 11 months ago

    love you sir

  • @blurb8397 · 4 years ago

    Hey Dr Peyam, can we perhaps see a more rigorous definition of what you mean by “continuous polynomials”, how functions can be described in terms of them, and how that leads to the convolution? I would also love to see how this connects to the view of convolution in terms of linear functionals, as Physics Videos By Eugene made an extensive video on that which at least I didn’t really understand... Anyhow, thanks a lot for this!

  • @drpeyam · 4 years ago

    There is no rigorous definition of continuous polynomials; they don't exist.

  • @blurb8397 · 4 years ago

    @@drpeyam Couldn't we define them as an integral average? Like the definite integral from zero to n of a(t)·x^t dt, all of that divided by n to "cancel out" the "dt" part, if we look at it from a dimensional-analysis perspective, like in physics.

  • @leonardromano1491 · 4 years ago

    That's cool and gives a quite natural vector product for vectors in R^n: (u*v)_i = Sum(0 ≤ k ≤ i) u_k v_(i-k).

  • @drpeyam · 4 years ago

    Coool!!!

  • @Brono25 · 3 years ago

    I could never find an explanation of why (graphically) you have to reflect one function, multiply both, and integrate. I see: it's to keep the indices always summing to the same value?

  • @secretstormborn · 4 years ago

    amazing

  • @prasadjayanti · 2 years ago

    It made sense to me in some way... I still want to know the advantage of 'reflecting' and 'shifting' a function and then multiplying it with another function. If we do not 'reflect', then what? Shifting I can understand: we have to keep moving the window everywhere...

  • @aneeshsrinivas9088 · 2 years ago

    Do alternative notations for convolution exist? I hate that notation for convolution since I love using * to mean multiplication and do so quite frequently.

  • @drpeyam · 2 years ago

    I love *

  • @amirabbas_mehrdad · 3 years ago

    It was amazing, but at the moment you replaced the coefficients with the function itself, I didn't actually understand how you did this. Is there anyone who can make it clear for me? Thanks.

  • @ventriloquistmagician4735 · 3 years ago

    brilliant

  • @poutineausyropderable7108 · 4 years ago

    Does this mean that if you convolve a function with 1 you get a Taylor series?

  • @poutineausyropderable7108 · 4 years ago

    That means you could get the Taylor series of sin²x, which would be useful in solving differential equations by solving for a Taylor series. You could also continue the values of sin x out at the infinities.

  • @poutineausyropderable7108 · 4 years ago

    Oh, so I finally understood. F and g aren't functions of time; they are the formulas for the terms of the Taylor series. Sin x isn't f; f is i^(k-1) * (1/k!) * (k mod 2).

  • @omerrasimknacstudent5049 · 1 year ago

    I understand that convolution is analogous to the multiplication of two polynomials. The intuition here is to express any signal f in terms of its impulses, just like the coefficients of a polynomial. It makes sense, thanks. But I still do not understand why we convolve a signal when it is filtered. Couldn't we just multiply the signal with the filter point-wise?

  • @allyourcode · 3 years ago

    I feel that this definitely helped me. Not really sure why you began discussing the continuous convolution though. The whole polynomial discussion is perfectly applicable in the context of discrete convolution. Anyway, for whatever reason, motivating with polynomial multiplication somehow did it for me. Thanks! I'm also finding it helpful in higher dimensions to think in terms of multiplying polynomials (the number of variables = the number of dimensions): To find the coefficient for x_1^n_1 * x_2^n_2, you multiply coefficients of the input polynomials where the exponents add up to n_1 and n_2. This kind of explains why you need to flip the "kernel" (in all dimensions) when you think of convolution as a "sliding dot product": when you flip the kernel, the coefficients that you need to multiply "pair up" (such that the exponents add up to n_i). Also, I really like your sanity check: the two arguments MUST sum to x! Sounds gimmicky, but I'm pretty sure that will help me to remember.
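
A minimal 2-D sketch of that pairing-up (Python; conv2d_full is an illustrative helper, and the scipy line is only an optional cross-check): the kernel entry paired with a[i, j] is k[n1 - i, n2 - j], so the indices of the paired entries sum to the output index (n1, n2), exactly as the exponents do in the polynomial picture; a sliding dot product with an unflipped kernel would be cross-correlation instead.

    import numpy as np
    from scipy.signal import convolve2d  # optional cross-check

    def conv2d_full(a, k):
        """Full 2-D convolution: out[n1, n2] = sum_{i, j} a[i, j] * k[n1 - i, n2 - j]."""
        H, W = a.shape
        h, w = k.shape
        out = np.zeros((H + h - 1, W + w - 1))
        for i in range(H):
            for j in range(W):
                # a[i, j] is spread over k's footprint; paired indices sum to the output index
                out[i:i + h, j:j + w] += a[i, j] * k
        return out

    a = np.arange(9.0).reshape(3, 3)
    k = np.array([[0.0, 1.0], [2.0, 3.0]])
    print(np.allclose(conv2d_full(a, k), convolve2d(a, k, mode="full")))  # True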

  • @coolfreaks68 · 2 months ago

    Convolution is the integration of f(τ)·g(t−τ) dτ. Here (τ, τ+dτ) is an infinitesimally small time period over which we assume the values of f(τ) and g(t−τ) remain constant. f(τ) is the evolution of f(t) up to the time instant τ, and g(t−τ) is the version of g(t) which came into existence at the time instant τ.

  • @matthewpilling9494 · 4 years ago

    I like how you say "Fourier"

  • @drpeyam · 4 years ago

    It’s the French pronunciation:)

  • @timothyaugustine7093 · 3 years ago

    Fouyay

  • @mustafaadel8194 · 3 years ago

    Actually, you showed us the similarity between the two formulas; however, I didn't understand convolution itself from that similarity 😥

  • @fedefex1 · 4 years ago

    How can I write a continuous polynomial?

  • @drpeyam · 4 years ago

    With convolution :)

  • @dougr.2398 · 4 years ago

    Dr Peyam what a convoluted reply!!! :)

  • @patryk_49 · 4 years ago

    I think, by analogy with a normal polynomial: P(x) = integral(a(t)·x^t) dt.

  • @dougr.2398 · 4 years ago

    Patryk49, what is a normal polynomial? Is there a correspondence to a normal subgroup?

  • @dougr.2398 · 4 years ago

    Here’s one answer: mathworld.wolfram.com/NormalPolynomial.html

  • @wuxi8773 · 4 years ago

    This is math, simple and everything has to make sense.

  • @SIVAPERUMAL-bl6qv · 4 years ago

    Why is convolution used?

  • @krishnamishra8598 · 3 years ago

    Convolution in one Word ???? Please answer!!!

  • @Muteen.N · 2 years ago

    Wow

  • @yashovardhandubey5252 · 4 years ago

    It's hard to believe that you can take time out of your schedule to answer KZread comments.

  • @drpeyam · 4 years ago

    Thank you! :)

  • @luchisevera1808 · 4 years ago

    My professor 7 years ago showed this by sliding a triangle into a rectangle until everything became convoluted

  • @austinfritzke9305 · 4 years ago

    Was watching this at 1.5x and laughed out loud

  • @f3ynman44 · 3 years ago

    a_k·b_(x−k) looked like a Cauchy product. Is this a coincidence?

  • @vineetkotian5163 · 3 years ago

    Sir, I can't seem to practice this subject the right way... I'm worried the question might get twisted in the exam and my brain will freeze.

  • @luisgarabito8805 · 1 year ago

    Huh? 🤔 interesting.

  • @mrflibble5717 · 2 years ago

    I like your videos, but the whiteboard writing is not clear. It would be worthwhile to fix that, because the content and presentation are good!

  • @dougr.2398 · 4 years ago

    You have a good French accent! (Vous avez un bon accent français!)

  • @drpeyam · 4 years ago

    Merci! (Thank you!)

  • @zhanggu2008 · 4 years ago

    This is good, but it feels like a start, and the goal of a convolution is not explained: why do this, and why use polynomial coefficients?

  • @forgetfulfunctor2986 · 4 years ago

    convolution is just multiplication in the group algebra!

  • @LemoUtan · 4 years ago

    Just what I was thinking! I only recently started reading up on group modules and thus getting my jaw slowly pulled down whilst watching this

  • @dougr.2398 · 4 years ago

    Forgetful functor, please explain, or at least partially illuminate, the group theory connection?

  • @LemoUtan · 4 years ago

    @@dougr.2398 If I may, this may help (straight to the examples in the wikipedia article about group rings): en.wikipedia.org/wiki/Group_ring#Examples
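
For what it's worth, a hedged sketch of the connection being asked about in this sub-thread (standard group-ring material, along the lines of that Wikipedia section, not something stated in the video): multiplying two formal sums over a group G forces the same bookkeeping of which index pairs land on a given x.

    \Big( \sum_{g \in G} a_g \, g \Big) \Big( \sum_{h \in G} b_h \, h \Big)
    = \sum_{x \in G} \Big( \sum_{g h = x} a_g \, b_h \Big) x
    = \sum_{x \in G} \Big( \sum_{g \in G} a_g \, b_{g^{-1} x} \Big) x .

For G = (\mathbb{Z}, +) the inner sum is \sum_g a_g b_{x-g}, the discrete convolution from the video; for G = (\mathbb{R}, +) the formal sum becomes the convolution integral.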

  • @gosuf7d762 · 4 years ago

    If you replace x with e^(iθ), you see the convolution theorem.
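
Spelling that substitution out (a sketch under the usual convergence caveats, not a claim made in the video): with x = e^{iθ} a power series becomes a Fourier series, and the product-of-polynomials rule from the video turns into the convolution theorem for Fourier coefficients.

    F(\theta) = \sum_n a_n e^{i n \theta}, \quad
    G(\theta) = \sum_n b_n e^{i n \theta}
    \;\Longrightarrow\;
    F(\theta)\, G(\theta)
    = \sum_n \Big( \sum_k a_k \, b_{n-k} \Big) e^{i n \theta}
    = \sum_n (a * b)_n \, e^{i n \theta} .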

  • @dougr.2398 · 4 years ago

    You were right to both hesitate and then ignore the possibility that you had misspelled “coefficients”. English is difficult because it is FULL of irregularities.... this is one instance of a violation of the rhyme “I before E (edited 12-12-2023) except after C or when sounding like “Eh” (“long” A) as in Neighbor and Weigh”. Had you bothered to worry about that during the lecture, it would have impeded progress and the continuity (smile) of the discussion.

  • @elmoreglidingclub3030 · 1 year ago

    I do not take drugs. Never have. But now I feel like I’m on drugs. What’s the point of all this??