Standardization vs Normalization Clearly Explained!
Let's understand feature scaling and the differences between standardization and normalization in great detail.
#machinelearning #datascience #artificialintelligence
For more videos please subscribe -
bit.ly/normalizedNERD
Support me if you can ❤️
www.buymeacoffee.com/normaliz...
Join our discord -
/ discord
Facebook -
/ nerdywits
Instagram -
/ normalizednerd
Twitter -
/ normalized_nerd
Comments: 104
This video should be nominated for the YouTube Oscars/Grammy awards....
Also in Principal Component Analysis, scaled features are very important because we search for the principal axes that have the highest variance. So if we have one feature in [0,1] and the other one in [1, 100], then the latter one has a much higher variance, even though it may not contain much information to be kept by the PCA.
@NormalizedNerd
A year ago
Great point! Feature scaling is very important in PCA too.
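A quick numeric sketch of the PCA point above (hypothetical numbers, my own example, not from the video): with one feature in [0, 1] and another in [1, 100], the second feature's variance dwarfs the first's, so an unscaled PCA's first axis would align almost entirely with it regardless of information content.

```python
import numpy as np

# Hypothetical example: one feature in [0, 1], another in [1, 100].
rng = np.random.default_rng(0)
f1 = rng.uniform(0.0, 1.0, size=1000)
f2 = rng.uniform(1.0, 100.0, size=1000)

# Unscaled variances: f2's (roughly 99**2/12, about 817) dwarfs f1's
# (about 1/12), so PCA's first principal axis would be dominated by f2.
print(f1.var(), f2.var())

# After standardization (z-score), both features have unit variance
# and contribute comparably to the principal axes.
z1 = (f1 - f1.mean()) / f1.std()
z2 = (f2 - f2.mean()) / f2.std()
print(z1.var(), z2.var())
```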
Your clarity is amazing. This helps! Sub earned
This is the first video I watched and man you have crushed it. This intuitive explanation of math was a joy to watch. Please keep them coming.
How many more people would understand math if we had explanations like this. I feel like I have been reading math papers written in French, and you just spoke in English for me. Gosh, THANK-YOU.
I was wondering where you’ve been! Nice to see you back to posting. Well covered topic - it’s easy to overlook standardization and normalization thinking they are simple. They have some important subtleties
@PritishMishra
A year ago
I saw you today on Yannic's channel as well, nice to see you again.
@NormalizedNerd
A year ago
Thanks a lot mate! Really happy to be able to upload again :D❤️
@taotaotan5671
A year ago
Hey DJ, we are waiting for you also!
@Mutual_Information
A year ago
@@taotaotan5671 lol coming soon!!
@stayinthepursuit8427
7 months ago
Standardization doesn't make the original distribution look more normal. It's a linear transform, so it only gives zero mean and 1 stdev while keeping the original shape.
Great, especially good that you explain the misconception about non-linear transformations, which for some reason keep getting passed off in conversation as normalization/standardization.
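To make the distinction above concrete, a small sketch on synthetic data (my own example, not from the video): standardization and min-max scaling are linear maps, so they leave the shape of the distribution, e.g. its skewness, unchanged; a non-linear transform like log genuinely changes the shape.

```python
import numpy as np

# Synthetic right-skewed data.
rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=10_000)

def skewness(a):
    """Sample skewness: the third standardized moment."""
    a = np.asarray(a, dtype=float)
    return ((a - a.mean()) ** 3).mean() / a.std() ** 3

z = (x - x.mean()) / x.std()             # standardization (z-score)
m = (x - x.min()) / (x.max() - x.min())  # min-max normalization

# Linear rescalings leave the shape (skewness) untouched...
print(skewness(x), skewness(z), skewness(m))

# ...whereas a non-linear transform like log actually changes it.
print(skewness(np.log(x)))
```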
You're doing amazing work here, hopefully one day you will get the recognition you deserve
Very nice video! Everything became clear as soon as I watched this
Thanks man, it helped me so much to understand normalization. Very helpful!
So glad to see you back !
@NormalizedNerd
A year ago
So happy to hear that :)
WOWW! Absolutely loved this! Thanks
I just love your channel name so much
High quality content. Thank you!
Good video - your description and explanation are good. However, relating the basic explanations to real-world problems would be helpful for viewers. Also, using a partial distribution to calculate things such as volatility based only on the negative changes is interesting, as is using curve fitting on data to determine parameters for trading and models.
Great lesson! Thank you so much for you video
Very nice explanation and demonstration. Good topic.
Great videos, dude! It's a shame we no longer get this great content
@Adrielgames1
6 months ago
:(
Great to see you back bro ! ✌️
@NormalizedNerd
A year ago
Thanks a lot!! :D
love it. thanks so much for the explanation
Absolutely loved the explanation!
@NormalizedNerd
A year ago
So glad!
Normalisation became the new normal for me, great job dude!!!!
Great explanation boss, helped a lot. Keep going, guru!
An excellent explanation...Thanks a lot for sharing ....
Sir, please make more videos; your sessions are very helpful.
Your explanation was damn neat!
coolest presentation!
Great video!
Jesus, that's so great. I'm totally new to data science and ML, and I'm trying to take it slow to properly understand everything. This video was super great for doing that. I picked up new knowledge that will be helpful when I'm writing my own ML algorithm (probably KNN-based image classification).
Very good explanation.
Good that you are back!😎
@NormalizedNerd
A year ago
Thanks!! 😁
Extremely beautiful viz, and the teaching methodology is amazing too. I also run an analytics channel, but you inspired me even more.
Good explanation... thank you very much 😊😊😊😊😊😊
Great video, to the point with great visuals; subscribed. Btw, how did you make these nice graphics?
@Skandawin78
3 months ago
can you pls respond ?
Excellent visuals!
Thanks, man, for the video; this was without a doubt very helpful. However, I was wondering: how do you make all these animations? Thanks in advance for your kindness.
Excellent! Thanks.
Thanks to you I understood why feature scaling is important. Thanks, legend!
Yayyyyy! Thanks for an amazing video.
@NormalizedNerd
A year ago
😁😃
Good video, content animation are amazing.
Hi. For deep learning, is it best to do min-max normalization (i.e. stretch values to [0, 1]) or max normalization (i.e. only divide by the max to keep values within [0, 1])? I see a problem with the former approach: a single outlying value can significantly skew all the rest of the values, making them not very comparable to the reference values.
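Not an authoritative answer, but a sketch of the trade-off raised in the question above (made-up numbers): with one large outlier, both min-max scaling and divide-by-max cram the inliers into a narrow band near zero, so the usual remedy is a robust scaler (median/IQR) or clipping before scaling.

```python
import numpy as np

# Made-up numbers illustrating the outlier problem.
x = np.array([1.0, 2.0, 3.0, 4.0, 1000.0])

min_max = (x - x.min()) / (x.max() - x.min())  # stretch to [0, 1]
by_max = x / np.abs(x).max()                   # divide by max only

# With one huge outlier, BOTH schemes squash the inliers near zero.
print(min_max)  # inliers end up below ~0.004
print(by_max)   # inliers end up below ~0.004 as well

# A common remedy: robust scaling by the median and IQR, which
# outliers barely move.
q1, q3 = np.percentile(x, [25, 75])
robust = (x - np.median(x)) / (q3 - q1)
print(robust)
```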
Thank you!
excellent visualization, thanks!
AWESOME VIDEO TYSM YOU'RE AWESOME
Great videos! May I ask what software you use to create your equations/animations?
@neha4206
A year ago
I think he uses manim
Hi there, thanks a lot! I have one question on min-max normalization, as I'm using Stata. When I use the formula, shall I take into consideration the actual min and max values of the variable, or should I consider the potential/feasible range of values the variable can assume? E.g. I have one variable that can take values in [-100, +100], yet in my dataset the min is -12 and the max is 34.
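A sketch of the two options in the question above, in Python rather than Stata, using the commenter's hypothetical numbers: the observed range guarantees the sample fills [0, 1] but new data can fall outside it, while the feasible range keeps values comparable across datasets at the cost of the sample occupying only part of [0, 1].

```python
# Hypothetical variable: feasible range [-100, +100], but an observed
# min of -12 and max of 34. The same formula applies in any tool.
def min_max(value, lo, hi):
    """Map value from [lo, hi] onto [0, 1]."""
    return (value - lo) / (hi - lo)

x = 10.0

# Option 1: observed min/max. The sample fills [0, 1] exactly, but
# future data outside [-12, 34] would land outside [0, 1].
obs = min_max(x, -12.0, 34.0)

# Option 2: feasible range. Values stay comparable across datasets,
# but this sample only occupies a sub-interval of [0, 1].
feas = min_max(x, -100.0, 100.0)

print(obs, feas)
```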
Hey! I wanted to know which software/tools you used to make videos like this?
May I ask about the technologies that were used to create this content? I'd really appreciate you sharing them.
superb !
could you please tell me what software you used for these visualizations
This guy explained in 5 minutes something my lecturers failed to explain in years.
Very nice!
Very helpful
thank you
Love the sound effects! lol
what software do you use for animations?
Omggggg ur back!!!
@NormalizedNerd
A year ago
Yeaaah ❤️🥹
thanks bro
How can you be so perfect at explaining?
You are the best!
Excellent!
@NormalizedNerd
A year ago
Many thanks!
Well explained.
@NormalizedNerd
A year ago
Thanks man!
yes ty
Great to get back nerdy notifications...
@NormalizedNerd
A year ago
:D :D
Very good. I have a doubt and would love to hear your comment on it. In recent months, I have been reflecting on the apparent prevalence of certain predatory mega-journals, in particular MDPI's Sustainability, which stands out as the journal with the most publications on various topics according to various tourism bibliometrics. However, this observation has led me to consider the need for further analysis. Specifically, it has caught my attention that when using the percentage of publications on the specific research topic (number of articles on a topic divided by the total number of articles published), the magnitude of the contribution decreases drastically.

To illustrate this point, a hypothetical example: Journal A has published 10 articles on prospect theory in the last five years, but its total output is 600 articles. In comparison, Journal B has published 25 articles on prospect theory in the same period, but its total publication volume exceeds 49,000 articles. Some bibliometrics would say that Journal B is the one that publishes the most; however, it is just winning by quantity.

I gave the journals weights based on their percentages (weight of journal = percentage of journal / highest percentage among journals), then I did min-max normalisation (normalised weight = (weight of journal - min weight) / (max weight - min weight)), then I created a weighted metric with normalisation (multiplying the normalised weight by the weight). Is the use of min-max normalisation correct here? Do you think there is a better approach?
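A sketch of the pipeline described above with made-up counts (Journals A and B follow the comment's hypothetical; C is my addition so the min-max step has a middle value). This only makes the formulas concrete; it is not an endorsement of the metric design.

```python
# Made-up counts: (articles on the topic, total articles published).
journals = {"A": (10, 600), "B": (25, 49_000), "C": (15, 2_000)}

# Step 1: percentage of output devoted to the topic.
pct = {j: topic / total for j, (topic, total) in journals.items()}

# Step 2: weight = journal's percentage / highest percentage.
max_pct = max(pct.values())
weight = {j: p / max_pct for j, p in pct.items()}

# Step 3: min-max normalisation of the weights.
lo, hi = min(weight.values()), max(weight.values())
norm = {j: (w - lo) / (hi - lo) for j, w in weight.items()}

print(pct)   # A devotes ~1.7% of its output to the topic, B only ~0.05%
print(norm)  # A maps to 1.0, B to 0.0, C in between
```

One caveat: min-max over a small set always maps the lowest-percentage journal to exactly 0, which may understate its contribution once the normalised values are multiplied by the weights.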
I'm new to machine learning and there's something I don't quite understand: if you scale the X (input), does it affect the Y (output)? In a real-life scenario where I want to make a prediction with my model, won't the scaling affect the results? If I shrink the input, won't the output also be smaller?
@yamanarslanca8325
11 months ago
Going by what you're saying: no, I don't think so (don't take my word for it though, I'm new at ML). I'd say your weights will be computed accordingly. But I've read that even scaling your outputs (before training) is a thing; there are people who do that.
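A quick empirical check of the question above, using ordinary least squares on synthetic data (my own sketch, not from the video): if the same scaling is applied at fit time and at prediction time, the predictions come out unchanged and Y stays in its original units, because the learned weights absorb the scale.

```python
import numpy as np

# Synthetic regression problem: y = 3x + 5 plus a little noise.
rng = np.random.default_rng(2)
X = rng.uniform(0, 100, size=(50, 1))
y = 3.0 * X[:, 0] + 5.0 + rng.normal(0, 0.1, size=50)

def fit_predict(X_train, y_train, X_new):
    """Ordinary least squares with a bias column, then predict."""
    A = np.column_stack([X_train, np.ones(len(X_train))])
    w, *_ = np.linalg.lstsq(A, y_train, rcond=None)
    return np.column_stack([X_new, np.ones(len(X_new))]) @ w

mu, sigma = X.mean(axis=0), X.std(axis=0)
X_new = np.array([[42.0]])

pred_raw = fit_predict(X, y, X_new)
pred_scaled = fit_predict((X - mu) / sigma, y, (X_new - mu) / sigma)

print(pred_raw, pred_scaled)  # the two predictions agree
```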
I really hope you are fine now. Your videos have helped me a lot on several occasions. You could easily be a teacher if you wanted to. Thanks!
do you use manim?
excellent.
Please add NLP course.
@NormalizedNerd
A year ago
Hey, have you checked this playlist? kzread.info/head/PLM8wYQRetTxCCURc1zaoxo9pTsoov3ipY Feel free to suggest more topics!
2:15
♥️♥️♥️
@NormalizedNerd
A year ago
❤️😍
Amazing explanation! Thank you. The datasets get normalized just like the speaker! (a joke, couldn't help it)
Can someone here help me with my data preprocessing project, or does anyone know where I can find help? I am so stuck and can't get over 70%. I really want to do well but don't really know what else to do in preprocessing.
GG buddy, you opened new horizons for me.
Sorry, I can't understand at 3:10 : Good old [what?] algorithm
@oscarvega1529
A year ago
Gradient Descent Algorithm
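A rough sketch of why gradient descent, the algorithm mentioned at 3:10, benefits from feature scaling (synthetic data, hypothetical learning rates): after standardization a moderate learning rate converges quickly, while on the raw features any rate stable for the large-scale feature is far too small for the small-scale one, so convergence stalls.

```python
import numpy as np

# Two features on very different scales, and a target that depends on both.
rng = np.random.default_rng(3)
X = np.column_stack([rng.uniform(0, 1, 200),     # small-scale feature
                     rng.uniform(0, 100, 200)])  # large-scale feature
y = 2.0 * X[:, 0] + 0.5 * X[:, 1]

def gd_loss(X, y, lr, steps=200):
    """Run plain gradient descent on MSE and return the final loss."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return np.mean((X @ w - y) ** 2)

# Center y and standardize X (bias term omitted for simplicity).
yc = y - y.mean()
Xc = X - X.mean(axis=0)
Xs = Xc / X.std(axis=0)

# Scaled features: a moderate rate reaches near-zero loss in 200 steps.
# Raw features: a rate small enough to stay stable for the large-scale
# feature leaves the small-scale direction barely updated.
print(gd_loss(Xs, yc, lr=0.1))
print(gd_loss(Xc, yc, lr=1e-4))
```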
Ahhhahaaa, I was sad to see that your last video was a year ago. Your visualization is really cool and as good as intuitive ml's, but he stopped making videos 3 years ago.
Bro, how can we decide which technique to use when? And if selecting normalization, which kind, such as min-max etc.? Could you please elaborate on this?
Blud comes back after 1 year and then doesn't return even after another year has gone past.
Khan Academy 2.0?
i wanna be as smart as you
Don't want to see your face. Just slides please. Also avoid background music.