1 Principal Component Analysis | PCA | Dimensionality Reduction in Machine Learning by Mahesh Huddar
PCA Algorithm: • PCA Algorithm | Princi...
#1. PCA Solved Example: • 1 Principal Component ...
#2. PCA Solved Example: • 2. Principle Component...
The following concepts are discussed:
______________________________
principal component analysis,
pca machine learning,
principal component analysis example,
principal component analysis explained,
Principal Component Analysis Solved example,
Principal Component Analysis Numerical example,
PCA solved example,
pca numerical example,
pca machine learning example,
principal component analysis machine learning,
dimensionality reduction pca,
dimensionality reduction,
dimensionality reduction machine learning
********************************
1. Blog / Website: www.vtupulse.com/
2. Like Facebook Page: / vtupulse
3. Follow us on Instagram: / vtupulse
4. Like, share, subscribe, and don't forget to press the bell icon for regular updates
Comments: 91
I noticed that your channel contains the entirety of Data Mining taught at the Master's level! Thank you very much, subscribing immediately!
@MaheshHuddar
A year ago
Welcome. Do like, share, and subscribe.
Super explanation.. today is my machine learning paper
@MaheshHuddar
6 months ago
Thanks and welcome. Do like, share, and subscribe.
@ankitjha_03
2 months ago
Mine is tomorrow!
@nurulsyuhadah984
12 days ago
How was it?
Amazing step-by-step outline! I love it💌, so I subscribe!
@MaheshHuddar
A year ago
Thank you. Do like, share, and subscribe.
Super explanation.. the best channel on KZread to learn machine learning and ANN topics ❤❤
@MaheshHuddar
7 months ago
Thank you. Do like, share, and subscribe.
Super Bhayya ...
Thanks for the video. Great explanation!
@MaheshHuddar
A year ago
Welcome. Do like, share, and subscribe.
Very clear Explanation Sir.... Thank you so much...
@MaheshHuddar
11 months ago
Welcome. Please do like, share, and subscribe.
Excellent Teaching. Salute to you sir
@MaheshHuddar
8 months ago
Welcome. Do like, share, and subscribe.
Thank you sir. Clear and easy to understand. Thank you.
@MaheshHuddar
A year ago
Welcome. Do like, share, and subscribe.
Thank you very much. Very clear explanation, and it is easy to understand.
@MaheshHuddar
9 months ago
Welcome. Do like, share, and subscribe.
Thank you for uploading videos like this.
@MaheshHuddar
A year ago
Welcome. Do like, share, and subscribe.
Clear and nice explanation. Thanks for the video
@MaheshHuddar
3 months ago
Welcome. Do like, share, and subscribe.
This man has in-depth knowledge of this topic.
@MaheshHuddar
6 months ago
Thank you. Do like, share, and subscribe.
Thanks sir for your explanation 🎉
@MaheshHuddar
A year ago
Welcome. Do like, share, and subscribe.
Super explanation.. very easy to understand without any hang-ups, sir, thanks... Inspr KVV.Prasad
@MaheshHuddar
10 months ago
Thank you. Do like, share, and subscribe.
Thank you so much sir, amazing explanation ♥♥♥
@MaheshHuddar
A year ago
Welcome. Do like, share, and subscribe.
That's the clearest explanation I have seen.
@MaheshHuddar
A year ago
Thank you. Do like, share, and subscribe.
thanks a lot for this wonderful lecture.
@MaheshHuddar
3 months ago
Welcome! Do like, share, and subscribe.
Thank you sir for this wonderful concept.
@MaheshHuddar
6 months ago
Welcome. Do like, share, and subscribe.
Thank you so much, you are a great professor.
@MaheshHuddar
5 months ago
You are very welcome. Do like, share, and subscribe.
thank you sir, you were amazing🤩
@MaheshHuddar
7 months ago
Welcome. Please do like, share, and subscribe.
Thank you very much sir
@MaheshHuddar
8 months ago
Welcome. Do like, share, and subscribe.
Thank you so much, today is my data mining and ML paper.
@MaheshHuddar
2 months ago
Welcome. Do like, share, and subscribe.
Thank you very much, Master Huddar ❤
@MaheshHuddar
4 months ago
Welcome. Do like, share, and subscribe.
Thank you
@MaheshHuddar
5 months ago
Welcome. Do like, share, and subscribe.
thank u so much
@MaheshHuddar
6 months ago
Welcome. Do like, share, and subscribe.
The content and teaching are very good. Please also provide the notes; it would be helpful.
@MaheshHuddar
3 months ago
Thank you. Do like, share, and subscribe.
Thank you, sir.
@MaheshHuddar
5 months ago
Welcome. Do like, share, and subscribe.
Sir, please upload content on ensemble methods: bagging, boosting, and random forest.
@MaheshHuddar
A year ago
Ensemble Learning: kzread.info/dash/bejne/l4Ktt8ipd6WypNY.html Random Forest: kzread.info/dash/bejne/nYSllZRxna20dZM.html
Thanks Sir
@MaheshHuddar
7 months ago
Welcome. Do like, share, and subscribe.
Linear discriminant analysis, please make a video bhayya.
thanks a lot
@MaheshHuddar
A month ago
You are most welcome. Do like, share, and subscribe.
Excellent
@MaheshHuddar
4 months ago
Thank you. Do like, share, and subscribe.
Sir, book name please?
Hello sir, thank you for your explanation. I have a doubt at 08:17: why have you considered only the first equation?
@MaheshHuddar
4 months ago
You will get the same answer with the second equation. You can use either the first or the second; no issues.
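The point in this reply can be checked numerically: for a 2x2 problem, the two rows of (C - λI)e = 0 are linearly dependent, so either equation implies the same eigenvector direction. A minimal sketch with a hypothetical covariance matrix (not the one from the video):

```python
import numpy as np

# Hypothetical 2x2 covariance matrix (not the one from the video)
C = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# The characteristic equation (2 - lam)^2 - 1 = 0 gives eigenvalues 1 and 3;
# take the larger one.
lam = 3.0
M = C - lam * np.eye(2)   # (C - lambda*I); its rows are linearly dependent

# Each row of M is one equation in the eigenvector components (e1, e2):
# M[i,0]*e1 + M[i,1]*e2 = 0, so e2/e1 = -M[i,0]/M[i,1] for either row.
ratio_from_eq1 = -M[0, 0] / M[0, 1]   # e2/e1 implied by the first equation
ratio_from_eq2 = -M[1, 0] / M[1, 1]   # e2/e1 implied by the second equation
print(ratio_from_eq1, ratio_from_eq2)  # 1.0 1.0 -- either equation works
```

Both equations pin down the same direction; only the normalization (unit length) is fixed afterwards.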
Nice!
@MaheshHuddar
7 months ago
Thank you. Do like, share, and subscribe.
Thank you very much sir for your explanation in that video. I am still confused, so I would like to ask how to get the values [-4.3052, 3.7361, 5.6928, -5.1238]. I still don't get it. Thank you sir.
@jvbrothers5454
7 months ago
Yeah, I'm also confused about how he got those; I'm getting different values: 0.3761, 5.6928, -5.128
Why are we not dealing with e2? I mean, why don't we do e2^T · [covariance matrix]?
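For anyone stuck on those numbers: the projected values come from mean-centering the data and taking the dot product with the first unit eigenvector. The sketch below assumes the commonly used worked example with samples (4,11), (8,4), (13,5), (7,14) — an assumption, but one that reproduces the quoted values. If your numbers differ, check that you centered the data and normalized the eigenvector:

```python
import numpy as np

# Assumed worked-example data: 2 features (rows) x 4 samples (columns)
X = np.array([[4.0, 8.0, 13.0, 7.0],
              [11.0, 4.0, 5.0, 14.0]])

Xc = X - X.mean(axis=1, keepdims=True)   # mean-center each feature

C = np.cov(X)                            # 2x2 sample covariance matrix
vals, vecs = np.linalg.eigh(C)           # eigenvalues in ascending order
e1 = vecs[:, -1]                         # unit eigenvector of largest eigenvalue
if e1[0] < 0:                            # eigenvector sign is arbitrary; fix it
    e1 = -e1

pc1 = e1 @ Xc                            # PC1 score for each sample
print(pc1.round(4))                      # [-4.3052  3.7361  5.6928 -5.1238]
```

One value per sample: projecting four 2-D points onto one direction gives the four numbers asked about above.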
@rohanshah8129
9 months ago
Here, we considered 2-dimensional data as the "high-dimensional" data for the example. One of the main use cases of PCA is dimensionality reduction. If you want, you can use e2 and get the second PC, but then think about it: from 2 variables, we again got 2 variables. That's why he has shown only PC1. In reality we generally use 2 PC axes (it mostly depends on your data); if it has a lot of variables, then 3 or 4 can also be good, but we don't generally go beyond that. In that case you would need e2, e3, and e4 as well. So this is how it works.
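To make the reply above concrete, here is a hedged sketch with hypothetical correlated data (plain NumPy, nothing from the video): sort the eigenvalues, look at the variance each PC explains, and keep only the top few eigenvectors:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 4-variable dataset (200 samples): two independent signals,
# each duplicated with a little noise, so only ~2 PCs carry real information.
base = rng.normal(size=(200, 2))
X = np.column_stack([
    base[:, 0], base[:, 0] + 0.1 * rng.normal(size=200),
    base[:, 1], base[:, 1] + 0.1 * rng.normal(size=200),
])

C = np.cov(X, rowvar=False)            # 4x4 covariance matrix
vals, vecs = np.linalg.eigh(C)         # eigenvalues in ascending order
order = np.argsort(vals)[::-1]         # re-sort descending
vals, vecs = vals[order], vecs[:, order]

explained = vals / vals.sum()          # fraction of variance per PC
k = 2                                  # keep only e1 and e2, as in the reply
scores = (X - X.mean(axis=0)) @ vecs[:, :k]   # 200x2 reduced data
print(explained.round(3), scores.shape)
```

The first two entries of `explained` dominate, which is the criterion the reply describes for deciding how many PCs to keep.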
Can you add the concept of hidden Markov model in your machine learning playlist
@MaheshHuddar
2 months ago
Sure, working on it.
@Blackoutfor10days
2 months ago
@@MaheshHuddar okay 👍
@Blackoutfor10days
2 months ago
@@MaheshHuddar my exam is near
Hi Sir, great explanation of PCA. But when I searched, the covariance seems to be defined only between 2 variables. How do we calculate the covariance matrix if a dataset has more than 2 variables? Could you please give an explanation of that?
@fintech1378
A year ago
You need to do it for all pairwise combinations.
@shahmirkhan1502
A year ago
@fintech1378 is right. You need to do pairwise combinations. For example, for 4 variables, your covariance matrix will be 4x4 with the following combinations:
cov(a,a) cov(a,b) cov(a,c) cov(a,d)
cov(b,a) cov(b,b) cov(b,c) cov(b,d)
cov(c,a) cov(c,b) cov(c,c) cov(c,d)
cov(d,a) cov(d,b) cov(d,c) cov(d,d)
@rohanshah8129
9 months ago
If there are n variables, the covariance matrix will be of n×n shape.
@parthibdey6005
5 months ago
@shahmirkhan1502 Is this covariance for reducing 4 variables to 1?
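The pairwise construction described in this thread is exactly what NumPy's `np.cov` computes; a small sketch with four hypothetical variables:

```python
import numpy as np

rng = np.random.default_rng(1)
# Four hypothetical variables a, b, c, d with 50 observations each
data = rng.normal(size=(4, 50))        # rows = variables a, b, c, d

C = np.cov(data)                       # 4x4 covariance matrix
print(C.shape)                         # (4, 4)

# Entry (i, j) is cov(variable_i, variable_j) -- the pairwise layout from
# the reply above: cov(a,a) cov(a,b) ... cov(d,d). Check one entry by hand
# using the sample-covariance formula (divide by n - 1, as np.cov does).
a, b = data[0], data[1]
manual_cov_ab = ((a - a.mean()) * (b - b.mean())).sum() / (len(a) - 1)
print(np.isclose(C[0, 1], manual_cov_ab))   # True: matches np.cov
```

Because cov(x, y) = cov(y, x), the matrix is symmetric, so only the upper triangle actually needs computing.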
Thank you sir, how to calculate the 2nd PC?
@MaheshHuddar
A month ago
Select the second eigenvector and multiply it by the given feature matrix.
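As a sketch of that reply (the 2-feature, 4-sample numbers below are an assumption in the style of the worked example, not taken from the video): take the eigenvector of the second-largest eigenvalue and project the mean-centered data onto it:

```python
import numpy as np

# Assumed worked-example data: 2 features (rows) x 4 samples (columns)
X = np.array([[4.0, 8.0, 13.0, 7.0],
              [11.0, 4.0, 5.0, 14.0]])
Xc = X - X.mean(axis=1, keepdims=True)   # mean-center the features

vals, vecs = np.linalg.eigh(np.cov(X))   # eigenvalues in ascending order
e1 = vecs[:, -1]                         # largest eigenvalue  -> PC1 direction
e2 = vecs[:, -2]                         # second eigenvalue   -> PC2 direction

pc1 = e1 @ Xc                            # first principal component scores
pc2 = e2 @ Xc                            # second principal component scores

# PC1 and PC2 scores are uncorrelated: the eigenvectors are orthogonal
print(pc2.round(4), float(np.dot(pc1, pc2)))
```

With 2 features this recovers all the variance (no reduction), which is why the video stops at PC1; the same projection step gives any later PC from its eigenvector.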
بحبككككككككككككككككككككككككككككككككككككككككك يا سوسو
@MaheshHuddar
A year ago
What does it mean..?
@abishekraju4521
A year ago
@@MaheshHuddar According to Google Translate: _"I love you sooo"_
You are a god, sir (Kannada: "devru sir neevu")
@MaheshHuddar
6 months ago
Do like, share, and subscribe.