What is Word2Vec? A Simple Explanation | Deep Learning Tutorial 41 (Tensorflow, Keras & Python)

A very simple explanation of word2vec. This video gives an intuitive understanding of how the word2vec algorithm works and how it can generate accurate word embeddings, such that you can do math with words (a famous example is king - man + woman = queen)
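
To try that word math yourself, here is a minimal sketch using the gensim library and its pretrained Google News vectors (the code, model name, and printed result are illustrative assumptions, not taken from the video):

    # Word arithmetic with pretrained word2vec vectors (sketch; assumes gensim is installed)
    import gensim.downloader as api

    model = api.load("word2vec-google-news-300")  # large download on first use
    print(model.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
    # -> [('queen', 0.71...)]  'queen' comes out closest to king - man + woman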
Part 2 (Coding): • Word2Vec Part 2 | Impl...
Deep learning playlist: • Deep Learning With Ten...
Machine learning playlist: kzread.info?list...
Do you want to learn technology from me? Check codebasics.io/?... for my affordable video courses.
🔖Hashtags🔖
#word2vecexplained #word2vec #nlpword2vec #nlpword2vectutorial #word2vecdeeplearning #word2vecpython #wordembeddings #wordembedding #pythonword2vec #deeplearning #deeplearningtensorflow #deeplearningWord2Vec
🌎 Website: codebasics.io/?...
🎥 Codebasics Hindi channel: / @codebasicshindi
#️⃣ Social Media #️⃣
🔗 Discord: / discord
📸 Dhaval's Personal Instagram: / dhavalsays
📸 Instagram: / codebasicshub
🔊 Facebook: / codebasicshub
📱 Twitter: / codebasicshub
📝 Linkedin (Personal): / dhavalsays
📝 Linkedin (Codebasics): / codebasics
❗❗ DISCLAIMER: All opinions expressed in this video are my own and not those of my employer.

Comments: 121

  • @codebasics
    @codebasics • 2 years ago

    Do you want to learn technology from me? Check codebasics.io/ for my affordable video courses.

  • @mehmetbakideniz
    @mehmetbakideniz • 8 months ago

    I started searching for word2vec videos after failing to understand it from Andrew Ng's lessons. This is the single video that actually explains that word embeddings are a 'side effect' of the training process, and that is how it finally clicked for me. Thank you very much!

  • @anubhavthakur2985

    @anubhavthakur2985 • 2 months ago

    Then you didn't search YouTube enough.

  • @sunilgundrai6464
    @sunilgundrai6464 • 2 years ago

    As part of my NLP dissertation, I was looking for some real-world use cases with clear explanations. I found this super useful. Thank you for the great demonstration with so many examples that are easy to understand. You rock with your teaching skills!!

  • @girmayohannis4659

    @girmayohannis4659 • 2 months ago

    Nice to meet you here. Which university are you studying at for your PhD? Thanks.

  • @raom2127
    @raom2127 • 2 years ago

    Dhaval Sir, you present complex material in a simplified way. You are an expert at patient, consistent, simplified, organised presentation of a subject: basics, theory, coding, practice, with great explanation.

  • @gayathrigirishnair7405
    @gayathrigirishnair7405 • 2 years ago

    This is the video that finally helped me grasp this concept. Thank You!

  • @sanjeebkumargouda1471
    @sanjeebkumargouda1471 • 3 years ago

    Great explanation 🙌🙌🙌 After watching many videos on this topic, my understanding is finally crystal clear. You are doing an awesome job, sir.

  • @codebasics

    @codebasics • 3 years ago

    I appreciate you leaving a comment of appreciation

  • @assafbotzer7952
    @assafbotzer7952 • 7 months ago

    So clear, so eloquent, and so concise. Your content is a gift to this world. Thank you for using your intelligence, diligence and teaching skills to make a positive mark.

  • @fidaeharchli4590

    @fidaeharchli4590 • 3 months ago

    I confirm.

  • @ahmedgamberli2250
    @ahmedgamberli2250 • a year ago

    I like how you love your homeland and use it in all examples. Greetings and Love from Azerbaijan.

  • @dhirajkumarsahu999
    @dhirajkumarsahu999 • 2 years ago

    Great Visual way of teaching! Thank you so much Sir ❤️

  • @mimansamaheshwari4664
    @mimansamaheshwari4664 • a year ago

    One of the best videos on word2vec

  • @PremKumar136
    @PremKumar136 • a year ago

    Awesome explanation. Crystal Clear.

  • @Sunilgayakawad
    @Sunilgayakawad • a year ago

    Crystal clear explanation!! Thank you so much, sir

  • @shubhamwaingade4144
    @shubhamwaingade4144 • 2 years ago

    Awesome explanation of the concept!

  • @list10001
    @list10001 • 2 years ago

    Thank you! The explanation was very clear.

  • @yonahcitron226
    @yonahcitron226 • 2 years ago

    Incredible content. This guy is one of the best on YouTube.

  • @codebasics

    @codebasics • 2 years ago

    I appreciate you leaving a comment of appreciation

  • @moni1122331
    @moni1122331 • 2 years ago

    great teacher, great explanation, great presentation, great context

  • @fidaeharchli4590

    @fidaeharchli4590 • 3 months ago

    Thank you.

  • @javierlopezcampoy5951
    @javierlopezcampoy5951 • a year ago

    Great explanation! Thank you very much

  • @prasannan-robots
    @prasannan-robots • 3 years ago

    Thanks for this awesome tutorial. Waiting for the coding part :)

  • @notknown9307
    @notknown9307 • 3 years ago

    Excited 😄

  • @robertcormia7970
    @robertcormia7970 • 5 months ago

    This was a useful introduction. I don't have the math chops to understand it fully, but it was useful to hear some of these definitions.

  • @kimdaeeun6683
    @kimdaeeun6683 • a year ago

    Easy explanation!! Thanks much 👍👍

  • @ledinhanhtan
    @ledinhanhtan • 4 months ago

    Mind-blowing 🤯🤯 Thank you!

  • @kanisrini01
    @kanisrini01 • 25 days ago

    Amazing Video 👏🌟. Thank you so much for the great explanation

  • @minruili4789
    @minruili4789 • 2 years ago

    Fantastic explanation!

  • @namansethi1767
    @namansethi1767 • 3 years ago

    Thank you Sir for this playlist

  • @shivav7379
    @shivav7379 • a year ago

    A very good explanation - really very helpful

  • @manikant1990
    @manikant1990 • 2 years ago

    Superbly explained!!

  • @darshangangurde7855
    @darshangangurde7855 • 3 years ago

    Thanks a lot... truly great. Please complete the playlist ASAP.

  • @BARaaz04
    @BARaaz04 • 2 years ago

    Very good explanation. Thanks.

  • @SoftRelaxingAndCalmMusicNature
    @SoftRelaxingAndCalmMusicNature • 10 months ago

    Well done. This is one of the best courses on word2vec so far. I have a master's degree in AI, and even though I did not work professionally in the field, your course brought back a lot of memories. During my master's 15 years ago, I introduced an archaic method for question answering based on Link Grammar, WordNet, VerbNet, and SemNet. At the end of my syntactic analysis I also discovered that, just by using word context, it was possible to come up with a vector representation of named entities. The innovation here is the use of a neural network to assign a value to the word. This is just brilliant. In my thesis I was already showing that language is just a code representing a subjective version of one's universe, and that humans and animals communicate using their own codes.

  • @vishaldas6346
    @vishaldas6346 • 3 years ago

    Also, what would be your next topic in deep learning? Is it sequence-to-sequence models?

  • @vgreddysaragada
    @vgreddysaragada • 11 months ago

    Super explanation... Thank you so much.

  • @madhu1987ful
    @madhu1987ful • 2 years ago

    Awesome, man... loved it. Can you please upload a code walkthrough of this concept -- some good projects?

  • @PavanTripathi-rj7bd
    @PavanTripathi-rj7bd • 9 months ago

    Great explanation!

  • @tagoreji2143
    @tagoreji2143 • 2 years ago

    Good explanation, Sir. Thank you.

  • @vinaykumardaivajna5260
    @vinaykumardaivajna5260 • a year ago

    Great explanation as always

  • @619vijay
    @619vijay • a day ago

    Very useful and informative

  • @sumit121285
    @sumit121285 • 2 years ago

    You are a real teacher... what should I say to you? Thank you, sir... thank you so much...

  • @pradeept328
    @pradeept328 • 5 months ago

    Great explanation

  • @ashokkonatham8857
    @ashokkonatham8857 • 3 years ago

    Wow, very, very clear. Thank you 🙏

  • @codebasics

    @codebasics • 3 years ago

    Glad it was helpful!

  • @bii710
    @bii710 • 2 years ago

    That was a great explanation, thanks. I have one question in mind: if all the words in the documents are unique, then how will word2vec find vectors for the last 2 words? Considering CBOW.

  • @vishaldas6346
    @vishaldas6346 • 3 years ago

    I think, Dhaval, there is no non-linear activation function between the input layer and the hidden layer. Correct me if I am wrong.

  • @ShahabShokouhi
    @ShahabShokouhi • 5 months ago

    I was watching Andrew Ng's course on sequence models, and his lecture on word2vec is just bullshit. Thank God I found your video. Amazing explanation.

  • @vikaspatildod
    @vikaspatildod • a year ago

    Beautiful video

  • @taufiqulhaque4987
    @taufiqulhaque4987 • 3 years ago

    Would you please create a playlist on NLP?

  • @prasanth123cet
    @prasanth123cet • 2 years ago

    Will we get nearly identical word vectors from the CBOW and skip-gram methods for a particular word, say 'king'?

  • @ashwinivalmiki7636
    @ashwinivalmiki7636 • 3 years ago

    Hello sir, please make a video on GRE and IELTS preparation. This will be very useful and helpful to students like me who are planning to study for a Master's abroad. As your videos are clear, we get motivated. Thank you.

  • @phil97n
    @phil97n • 6 months ago

    Awesome, thank you.

  • @neerajashish7042
    @neerajashish7042 • 3 years ago

    Approximately how many more videos are going to come in this series beyond the existing ones? By the way, thanks a lot, sir; this is the only playlist on YouTube that was this knowledgeable for machine learning and deep learning.

  • @codebasics

    @codebasics • 3 years ago

    There will be at least 5 to 10 videos coming up, and then I will start the project series.

  • @yasaswinigollapally7603
    @yasaswinigollapally7603 • a year ago

    Sir, your video is awesome 🙌. I have one doubt: what is the main difference between the skip-gram and bag-of-words models?

  • @abir95571
    @abir95571 • 2 years ago

    There's a subtle mistake in your CBOW explanation at 8:34. In CBOW the target is always the central word, predicted from its context, i.e., the surrounding words. That means for the substring "Emperor ordered his" and a window size of 3, the target is "ordered" and the features are "Emperor" and "his".
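
    (A tiny illustrative sketch of that pairing rule; the function and variable names are mine, not from the video:)

        # Build CBOW (context, target) pairs: the middle word of each window
        # is the target, and its neighbouring words are the features.
        def cbow_pairs(tokens, window=3):
            half = window // 2
            return [(tokens[i - half:i] + tokens[i + 1:i + half + 1], tokens[i])
                    for i in range(half, len(tokens) - half)]

        print(cbow_pairs("Emperor ordered his men".split()))
        # -> [(['Emperor', 'his'], 'ordered'), (['ordered', 'men'], 'his')]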

  • @ashwinshetgaonkar6329
    @ashwinshetgaonkar6329 • 2 years ago

    So he explained skip-gram.

  • @abir95571

    @abir95571 • 2 years ago

    @@ashwinshetgaonkar6329 yes

  • @thurakyawnyein6113
    @thurakyawnyein6113 • 2 months ago

    superb..

  • @thamizharasim5970
    @thamizharasim5970 • 2 years ago

    Thanks a lot 😌

  • @trendyjewellery1987
    @trendyjewellery1987 • a month ago

    Superb

  • @jongcheulkim7284
    @jongcheulkim7284 • 2 years ago

    Thank you.

  • @debatradas9268
    @debatradas9268 • 2 years ago

    Thank you so much.

  • @anpowersoftpowersoft
    @anpowersoftpowersoft • a month ago

    Amazing

  • @lisali6205
    @lisali6205 • 2 years ago

    You are the best.

  • @josephvanname3377
    @josephvanname3377 • a year ago

    The king - man + woman = queen equation tells me that we are not embedding words into a vector space but into an affine space, which is like a vector space except that we do not have a notion of a zero vector. Perhaps we could obtain a zero vector simply by taking the weighted average over all words, or by doing some regularization during training so that we naturally get a zero vector. What would the zero vector mean, anyway?

  • @lohitsalavadhi6912
    @lohitsalavadhi6912 • 2 years ago

    Great explanation, finally.

  • @codebasics

    @codebasics • 2 years ago

    🙏🙏

  • @notknown9307
    @notknown9307 • 3 years ago

    Thanks, we are learning a lot from you.

  • @codebasics

    @codebasics • 3 years ago

    Glad it was helpful!

  • @notknown9307

    @notknown9307 • 3 years ago

    @@codebasics Waiting for your next upload. You are doing your work very well 👍👍

  • @anonymous-or9pw
    @anonymous-or9pw • 5 months ago

    He played it really well when he marked male = -1

  • @rahulsoni412
    @rahulsoni412 • 3 years ago

    Thanks a lot for explaining this using a neural network diagram :)

  • @codebasics

    @codebasics • 3 years ago

    🙂👍

  • @rahulsoni412

    @rahulsoni412 • 3 years ago

    @@codebasics Can you explain how the number of weights is calculated in word embedding? I mean the total number of weights. I was getting confused while calculating it.
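
    (Not the author's reply, but a common way to count them: a plain word2vec network has two weight matrices, one on each side of the hidden layer, so ignoring biases:)

        vocab_size, embed_dim = 10_000, 300        # illustrative sizes only
        input_to_hidden = vocab_size * embed_dim   # embedding matrix
        hidden_to_output = embed_dim * vocab_size  # output projection
        print(input_to_hidden + hidden_to_output)  # 6,000,000 weights in total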

  • @harshvardhanagrawal
    @harshvardhanagrawal • 2 days ago

    Where do we get the predicted output from? How do we enter it for comparison?

  • @arjunbali2079
    @arjunbali2079 • 2 years ago

    Thanks sir

  • @vitocorleone1991
    @vitocorleone1991 • 2 years ago

    Brilliant

  • @user-ns8rn8fu3z
    @user-ns8rn8fu3z • 11 months ago

    Is there a standard real list for every object given here? For example, for cats, tail = 0.2?

  • @kmnm9463
    @kmnm9463 • 3 years ago

    Hi Dhaval, great video on W2V. Could you share the link for the coding part of implementing Word2Vec in Python, please?

  • @codebasics

    @codebasics • 3 years ago

    Yes that video is coming up soon. I have not yet uploaded it

  • @ibrahemnasser2744
    @ibrahemnasser2744 • 2 years ago

    What would a mathematician do when he/she hears you say "a vector is nothing but a set of numbers"?

  • @sebinsaji9573
    @sebinsaji9573 • 3 years ago

    Can you talk about cybersecurity scope and skills?

  • @ChaitraC9191
    @ChaitraC9191 • 2 years ago

    Hello, I have a doubt about this explanation. Aren't all the weights going to be the same when our neural network is trained? What I mean is, once we train a network, W^T X is what triggers an output node, so how do we have different weights for every output word?

  • @cherupawan3777

    @cherupawan3777 • a year ago

    Did you get an answer to this?

  • @bibhupadhy4155
    @bibhupadhy4155 • a year ago

    Great explanation :) Crisp and to the point. Better than a Hrithik Roshan superhero movie's explanation :P :P

  • @lemoniall6553
    @lemoniall6553 • a year ago

    Does word2vec use dimensionality reduction too?

  • @akshansh_00
    @akshansh_00 • a month ago

    bam! life saver

  • @wp1300
    @wp1300 • 6 months ago

    7:20 The meaning of a word can be inferred from the surrounding words.

  • @uwaisahamedimad556
    @uwaisahamedimad556 • a year ago

    Hi, this is the most wonderful explanation of word2vec I've ever seen. I have a question: I have my own corpus and I have built multiple word2vec models. How do I evaluate these models, and how am I going to choose the best one???

  • @codebasics

    @codebasics • a year ago

    One approach is to take a classification or some other NLP problem in your domain and build an NLP classification model using your embeddings. You can then check the performance of those models to evaluate how effective the embeddings are.
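
    (A minimal sketch of that approach, assuming gensim-trained models and a labeled dataset; tokenized_texts and labels are placeholders you would supply:)

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        def doc_vector(model, tokens):
            # Average the vectors of in-vocabulary tokens; zeros if none are known.
            vecs = [model.wv[t] for t in tokens if t in model.wv]
            return np.mean(vecs, axis=0) if vecs else np.zeros(model.vector_size)

        def score_embeddings(model, tokenized_texts, labels):
            X = np.array([doc_vector(model, toks) for toks in tokenized_texts])
            return cross_val_score(LogisticRegression(max_iter=1000), X, labels, cv=5).mean()

        # Keep whichever candidate model scores best on your own task:
        # best = max(models, key=lambda m: score_embeddings(m, tokenized_texts, labels))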

  • @uwaisahamedimad556

    @uwaisahamedimad556 • a year ago

    @@codebasics Thanks a lot for the reply. Based on your answer, it seems there is no standard, or at least no well-established, method for evaluating the performance of word embeddings.

  • @djelloulbouchiha-cunaamaal7848
    @djelloulbouchiha-cunaamaal7848 • a year ago

    We need a course on NLP transformers.

  • @anandakhan4160
    @anandakhan4160 • a year ago

    Sir, how to unzip the JSON file using Git Bash is not clear to me. Please help. Thanks.

  • @amanbajaj7591
    @amanbajaj7591 • a year ago

    Where is the neural network link?

  • @imanqoly
    @imanqoly • 9 months ago

    The deeper you dig into a thing, the greater this tutor gets.

  • @umerfarooque6373
    @umerfarooque6373 • 3 months ago

    How do you evaluate a word2vec model?

  • @amanagrawal4198
    @amanagrawal4198 • 6 days ago

    I think there's a mistake, because in both CBOW and skip-gram the weights that make up the embeddings are always between the input and hidden layers, and here, for CBOW, you mentioned that the weights between the hidden and output layers are considered.

  • @amanagrawal4198

    @amanagrawal4198 • 6 days ago

    CBOW Model Architecture Review

    The CBOW architecture works by predicting a target word based on the context words around it. Here is a step-by-step explanation of the flow:

    Input Layer: This consists of several one-hot encoded vectors corresponding to the context words.

    Projection Layer (Hidden Layer): Each one-hot vector is used to retrieve a word embedding from the first weight matrix (input-to-hidden weights). Unlike typical neural networks, there is no activation function here; the embeddings are simply summed or averaged (depending on the implementation) to produce a single dense vector. This vector represents the combined semantic content of the context words.

    Output Layer: The averaged embedding vector is then projected to the output layer using a second set of weights (hidden-to-output weights). The output layer is a softmax layer that predicts the probability distribution over the vocabulary for the target word.

    Role of Weights in Embedding Formation

    Input-to-Hidden Weights: This is essentially the embedding matrix. Each row in this matrix corresponds to the embedding of a word in the vocabulary. When context words are fed into the model, their embeddings are retrieved by indexing this matrix with the one-hot vectors. These embeddings are what you typically extract and use as pre-trained embeddings for other tasks.

    Hidden-to-Output Weights: These weights are used to transform the combined embedding from the hidden layer into a prediction for the target word. Each column in this matrix (since it is typically the transpose of the embedding matrix in many implementations) can be seen as a "contextual embedding" of a word when it acts as a target.
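
    (A minimal numpy sketch of that forward pass; all sizes and names are illustrative:)

        import numpy as np

        V, d = 10, 4                    # vocabulary size, embedding dimension
        W_in = np.random.randn(V, d)    # input-to-hidden weights = embedding matrix
        W_out = np.random.randn(d, V)   # hidden-to-output ("contextual") weights

        context_ids = [2, 5, 7]              # context words as vocabulary indices
        h = W_in[context_ids].mean(axis=0)   # projection layer: average, no activation
        logits = h @ W_out                   # scores over the whole vocabulary
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()                 # softmax -> P(target | context)
        print(probs.argmax())                # id of the predicted target word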

  • @wenzhang5879
    @wenzhang5879 • a year ago

    I think you mean 'side products' rather than 'side effect'?

  • @Cooldude5786
    @Cooldude5786 • a year ago

    The statement "King - man + woman = Queen" is well-known in machine learning. However, when we examine the characteristics of a king, they often include being super rich, having authority, and possibly not having a tail. Yet, there is a contradiction: a lion is also referred to as a king, and it does have a tail. How can a computer differentiate between a human king and an animal king? Doesn't this introduce bias since the training corpus typically associates "king" with humans rather than animals? Just because something appears less frequently or is absent from the corpus doesn't mean it lacks value or significance.

  • @houchj0372
    @houchj0372 • 2 years ago

    Doesn't CBOW mean Contextual Bag of Words?

  • @codebasics

    @codebasics • 2 years ago

    Continuous Bag Of Words: analyticsindiamag.com/the-continuous-bag-of-words-cbow-model-in-nlp-hands-on-implementation-with-codes/

  • @houchj0372

    @houchj0372 • 2 years ago

    @@codebasics you are correct, thank you. By the way, this video is excellent.

  • @PavanKumar-bk1sz
    @PavanKumar-bk1sz • 3 years ago

    Can I get admission to a BSc in Data Science after 12th commerce at St Xavier's College Mumbai? I have mathematics as an optional subject. Please tell me; I've been requesting this for 6 months 🙏🙏🙏

  • @humanardaki7911
    @humanardaki7911 • a year ago

    working?

  • @johnnysaikia2439
    @johnnysaikia2439 • 7 months ago

    The king of the jungle has a tail, though.

  • @RePuLseHQKing
    @RePuLseHQKing • 2 years ago

    3:35 paygap lmao

  • @santoshsaklani5019

    @santoshsaklani5019 • 2 years ago

    Kindly make a video on vulnerability prediction using word2vec.

  • @amitmishra5474
    @amitmishra5474 • 10 months ago

    The lion king has a tail 😅

  • @mubashiraqeel9332
    @mubashiraqeel9332 • 4 months ago

    The thing is, all your videos are connected to previous ones, so I can never watch one straight through; you always make me pause and go watch an earlier video. First I was watching the text classification video and you said to watch the BERT video first; in that video you said to watch word2vec; then you said to watch part 1 first; and now in this video you say to watch the neural network video. Do you really want me to finish a whole video? Because I am just opening new tabs repeatedly.

  • @shivangiawasthi9388
    @shivangiawasthi9388 • 8 months ago

    found a better explanation here - kzread.info/dash/bejne/fKGZxMOaoKTJe84.html

  • @mmenjic
    @mmenjic • 2 years ago

    3:56 Why are horse and woman the same gender to start with????? Then king minus man is gender -2; adding a woman or horse to that, you get gender -1, which is man or king!?????

  • @priyeshsrivastava8025

    @priyeshsrivastava8025 • 2 years ago

    No, it's (-1) - (-1) + (+1) = +1, i.e., queen.

  • @pawansinha8442
    @pawansinha8442 • a year ago

    But in the case of the king of the jungle, that is the lion, he has a tail 😃 Just saying...