Understanding Multivariate Gaussian Distribution (Machine Learning Fundamentals)

#gaussiandistribution #machinelearning #statistics
In this video, we will build the intuition and work through the math behind the Multivariate Gaussian (Normal) Distribution, walking through the Stanford CS229 course notes.
⏩ OUTLINE:
0:00 - Introduction and Formula breakdown
2:35 - Relationship between Multivariate Gaussians and Univariate Gaussians
6:00 - Covariance Matrix
8:38 - Diagonal Covariance Matrix
11:10 - Shape of Isocontours
12:35 - Heatmap density view
⏩ Document: cs229.stanford.edu/section/gau...
⏩ Organisation: Stanford University
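As a companion to the "Formula breakdown" segment, here is a minimal sketch of the multivariate Gaussian density discussed in the notes (using NumPy; the function name `mvn_pdf` is just illustrative):

```python
import numpy as np

def mvn_pdf(x, mu, sigma):
    """Density of a multivariate Gaussian N(mu, sigma) evaluated at point x."""
    d = len(mu)
    diff = x - mu
    # Normalising constant: 1 / sqrt((2*pi)^d * det(sigma))
    norm_const = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(sigma))
    # Quadratic form in the exponent: -1/2 * (x-mu)^T sigma^{-1} (x-mu)
    exponent = -0.5 * diff @ np.linalg.inv(sigma) @ diff
    return norm_const * np.exp(exponent)

mu = np.array([0.0, 0.0])
sigma = np.eye(2)  # standard 2D Gaussian
print(mvn_pdf(np.array([0.0, 0.0]), mu, sigma))  # peak density = 1/(2*pi)
```

At the mean, the exponent vanishes and only the normalising constant remains, which for the standard 2D Gaussian is 1/(2π) ≈ 0.159.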
*********************************************
If you want to support me financially (which is totally optional and voluntary) :) ❤️
You can consider buying me chai (because I don't drink coffee :) ) at www.buymeacoffee.com/TechvizC...
*********************************************
⏩ YouTube - / @techvizthedatascienceguy
⏩ Blog - prakhartechviz.blogspot.com
⏩ LinkedIn - / prakhar21
⏩ Medium - / prakhar.mishra
⏩ GitHub - github.com/prakhar21
*********************************************
Tools I use for making videos :)
⏩ iPad - tinyurl.com/y39p6pwc
⏩ Apple Pencil - tinyurl.com/y5rk8txn
⏩ GoodNotes - tinyurl.com/y627cfsa
#techviz #datascienceguy #ml #stats #distribution #multivariategaussian #gaussian

Comments: 20

  • @_jiwi2674 (2 years ago)

    I really like how you are making an effort to explain the intuition behind the concepts by explaining the meanings within the equations. Only a few people do that.

  • @TechVizTheDataScienceGuy (2 years ago)

    Thank you so much for appreciating the effort :)

  • @shivangitomar5557 (11 months ago)

    Love the explanation! Thank you!!

  • @kunalmenavlikar9299 (2 years ago)

    Really a great explanation. I really liked the part where you related the formulation of the Gaussian curve to the iso-contours.

  • @TechVizTheDataScienceGuy (2 years ago)

    Thank you 😊

  • @elleelleelleelle_______ (1 year ago)

    what a great video

  • @arhan- (1 year ago)

    I'm an undergrad at Berkeley currently taking ML and you just saved my ass on this midterm

  • @TechVizTheDataScienceGuy (1 year ago)

    :D

  • @pythongui5199 (3 years ago)

    Nice content

  • @reddit2yt (3 years ago)

    Some input on how the eigenvalues of the covariance matrix relate to the ellipses would give better intuition.

  • @TechVizTheDataScienceGuy (3 years ago)

    That’s a good point to discuss; I missed it in the video, I suppose. Suppose we have a diagonal covariance matrix (i.e., the off-diagonal values are zero). Then its eigenvalues are simply the per-variable variances. If it’s 2x2, we get two eigenvalues and corresponding eigenvectors. The eigenvectors act as the two principal components, pointing along the directions of variance (the major and minor axes), with the spread along each given by its eigenvalue. A diagonal covariance therefore means an axis-aligned ellipse in 2D (an ellipsoid in higher dimensions). For a non-diagonal matrix the same concept holds, but the ellipse is no longer axis-aligned because of the covariance between variables, and the eigenvalues must be computed explicitly rather than read off the diagonal.
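    The reply above can be sketched numerically; a minimal illustration with NumPy (the covariance matrices are made-up examples):

```python
import numpy as np

# Diagonal covariance: the eigenvalues are just the per-variable variances,
# and the eigenvectors are the coordinate axes (axis-aligned ellipse).
diag_cov = np.array([[4.0, 0.0],
                     [0.0, 1.0]])
vals, vecs = np.linalg.eigh(diag_cov)
print(vals)   # [1. 4.] -> the variances themselves (eigh sorts ascending)
print(vecs)   # columns are standard basis vectors -> ellipse axes

# Non-diagonal covariance: same idea, but the eigenvectors are rotated,
# so the iso-contour ellipse is no longer axis-aligned.
full_cov = np.array([[3.0, 1.0],
                     [1.0, 3.0]])
vals, vecs = np.linalg.eigh(full_cov)
print(vals)   # [2. 4.] -> spread along each principal axis
print(vecs)   # rotated directions (here at 45 degrees)
```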

  • @shashankkumar7920 (3 years ago)

    Great video, thank you so much sir.

  • @TechVizTheDataScienceGuy (3 years ago)

    Glad you liked it.

  • @shashankkumar7920 (3 years ago)

    @@TechVizTheDataScienceGuy Sir, I am doing an M.Tech in a machine-learning-related field at IIT Delhi. Your videos are very helpful 🙏

  • @MahegyaneshPandey (1 year ago)

    👌

  • @kuldeepkushwah8622 (2 years ago)

    nice

  • @ratikagarg1494 (3 years ago)

    👍👍

  • @prabirdas8430 (3 years ago)

    nice explanation

  • @TechVizTheDataScienceGuy (3 years ago)

    Thanks Prabir.

  • @Orioricrafts (10 months ago)

    At 11:55, I think the whole term is supposed to be r1 squared, not the square root of r1. Btw, wonderful explanation 👍
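
    For context on that correction: in the CS229 notes' diagonal-covariance case, setting the density p(x) = c and taking logs gives an axis-aligned ellipse, and it is indeed r1 squared that appears in the denominator:

```latex
% Isocontour of a 2D Gaussian with diagonal covariance, at density level c:
\frac{(x_1-\mu_1)^2}{r_1^2} + \frac{(x_2-\mu_2)^2}{r_2^2} = 1,
\qquad
r_i = \sqrt{\,2\sigma_i^2 \,\log\!\left(\frac{1}{2\pi c\,\sigma_1\sigma_2}\right)}, \quad i \in \{1, 2\}.
```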