Knowledge Graphs and Deep Learning 102
In this video, we are going to look into some not-so-exciting developments that connect Deep Learning with Knowledge Graphs and GANs… let's just hope it's more fun than "Machine Learning Memes for Convolutional Teens".
GAN Explained Link : • Explaining math of GAN...
Correction: the droid in the video is R2D2, not OB1 (Obi-Wan) as said in the narration. Fixing the audio was a bit tricky.
Topics Covered in the video
1. Graph Convolutional Networks
2. Semi-supervised Learning
3. Knowledge Graphs and Ontology
4. Embedding in Knowledge Graphs
5. Adversarial Learning in Knowledge Graphs (KBGANs)
Please contribute to the initiative by donating to us via Patreon; we need the money to scale up our efforts and bring creative weirdos and nerdy dreamers together.
Patreon Link: / crazymuse
Even something as small as $1 per creation can make a collective difference. Join us on Slack if you want to contribute to the scripts that we write for the videos.
Slack Link : goo.gl/GFW2My
Contributors for the Video
1. Script Writer : Jaley Dholakiya
2. Reviewers : Arjun Shetty, Sidharth Aiyar, Saikat Paul
3. Animator and Moderator : Jaley Dholakiya
References
1. Blog on Graph Convolutional Network : tkipf.github.io/graph-convolu...
2. Semi-supervised learning in Knowledge Graphs (Gaussian Field): mlg.eng.cam.ac.uk/zoubin/paper...
3. Trans-E embedding (NIPS) : papers.nips.cc/paper/5071-tra...
4. Trans-D embedding : www.aclweb.org/anthology/P15-1067
5. KBGANs : arxiv.org/pdf/1711.04071.pdf
Current Contributors on Patreon ( / crazymuse )
1. Aaron Mathew (main contributor)
2. Parth Parikh
3. Sean Marrett
4. Laher D
5. Abhijith N
AI , Machine Learning , KBGANs, DBPedia, Facebook Graphs
Comments: 44
"too far, no light... too close, lit as f....." 🤣 nice one
Nice video man! You really summarized lots of concepts in a visual and understandable way, thanks!
Nice teamwork, guys! Great work presenting novel stuff so elegantly! Liked your channel punchline!
The cute droid in the video is R2D2. I deeply apologize to Star Wars fans; I hope my life is safe :p. I realized it after the audio recording, but it was too late. The audio patch was looking shabby, so I went ahead with the mistake. The sad part is that my 3 reviewers, like me, haven't watched all the Star Wars movies :( So as penance, I have decided to binge-watch all 5 remaining movies this weekend. Peace ✌️
@miniversity101
5 years ago
Forgive :D !
@kapilkumar2650
4 years ago
was about to write this😂😂
Good engaging video, I will look into the adversarial video. I am struggling to understand the transformer model; it would be great to learn it with this kind of graphics, and plus it has a cool name :)
great video, funky presentation!
This is an underrated video.
nice presentation, thank you, although I didn't fully understand some parts... but still quite informative...
Very good presentation.... Good clarity for a general person
nice work. it was very informative, at least for me
Thank you very informative
This video deserves many views.
OMG this video is amazingly good!
Great job!
What a video! great job!
Not only is your content so informative, your presentation and animation effects are so nice! Can you share a hint of which tool you use to make this video, and how many hours you spent to create great content like this?
@Crazymuse
5 years ago
It's Adobe After Effects :)
Captivating and deep (no pun intended)
How can I learn this material as a beginner student up to a master level, to actually deploy this as a tool for my business? Thank you so much!
so so amazing keep diving deep
I still don't fully understand why negative samples are required. Btw, the video is very awesome... the illustrations are cool and clear :)
@Crazymuse
6 years ago
Thanks :) Negative samples are required in order to learn what is *not* a valid relation; without them, the model fails to learn a good representation. You can read this NIPS paper to know more: papers.nips.cc/paper/5071-translating-embeddings-for-modeling-multi-relational-data.pdf
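The role of negative samples can be sketched as a toy TransE scoring step, following the translating-embeddings idea from the NIPS paper linked above: a true triple should have a small distance ||h + r - t||, and a corrupted (negative) triple is used in a margin ranking loss to push wrong triples further away. The entities, relation, and dimensions below are made up purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Toy vocabulary (hypothetical entities/relations, for illustration only).
entities = ["paris", "france", "berlin", "germany"]
relations = ["capital_of"]

E = {e: rng.normal(size=dim) for e in entities}
R = {r: rng.normal(size=dim) for r in relations}

def score(h, r, t):
    # TransE: for a true triple, h + r should be close to t,
    # so this distance should be small after training.
    return np.linalg.norm(E[h] + R[r] - E[t])

def margin_loss(pos, neg, margin=1.0):
    # Ranking loss: the positive triple's distance must beat the
    # negative (corrupted) triple's distance by at least `margin`,
    # otherwise the loss is positive and gradients update the embeddings.
    return max(0.0, margin + score(*pos) - score(*neg))

pos = ("paris", "capital_of", "france")
# Negative sample: corrupt the tail with a wrong entity.
neg = ("paris", "capital_of", "berlin")

loss = margin_loss(pos, neg)
print(f"positive distance: {score(*pos):.3f}")
print(f"negative distance: {score(*neg):.3f}")
print(f"margin loss: {loss:.3f}")
```

Without the corrupted triple, minimizing the positive distance alone has a trivial solution (collapse all embeddings to the same point), which is exactly the "fails to have a good representation" problem mentioned in the reply.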
Wow!
Where are other videos, especially wavelet Transform 😢kindly upload
Very interesting video. It makes me want to research more about it. However, I don't understand how you generate the weight vector. And also, why does the matrix have 0s and 1s? Is it about the relation between entities? Thank you!
@Crazymuse
5 years ago
The 0s and 1s are part of the adjacency matrix (the relations between nodes) :) The weight vector is learned via backpropagation, just like in simple neural networks.
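A single graph-convolution layer in the spirit of the Kipf blog post from the references makes this concrete: the 0/1 adjacency matrix decides whose features get mixed, and the weight matrix W is the part that backpropagation would learn. Here the graph, features, and W are just random placeholders, not values from the video.

```python
import numpy as np

rng = np.random.default_rng(0)

# 0/1 adjacency matrix of a tiny 4-node graph:
# A[i, j] = 1 means nodes i and j are connected.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)

A_hat = A + np.eye(4)                # add self-loops so a node keeps its own features
D_inv = np.diag(1.0 / A_hat.sum(1))  # normalize by node degree

X = rng.normal(size=(4, 3))          # input node features (4 nodes, 3 features each)
W = rng.normal(size=(3, 2))          # weights; learned via backprop in a real GCN

# One propagation step: average each node's neighbourhood, then project.
H = np.maximum(0, D_inv @ A_hat @ X @ W)  # ReLU activation
print(H.shape)  # (4, 2): a new 2-dim representation per node
```

So the 0s and 1s never change during training; only W (and any later layers' weights) are updated by gradient descent.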
Can I please contact you more about this issue?
Obi-Wan Kenobi is not a droid! It's R2D2. Great video btw :)
@Crazymuse
6 years ago
Yaa I know it's R2D2; the audio cut-and-paste was looking shabby, so I went ahead with the mistake :) thanks :)
brilliant, ...lit as f***
great video
Knowledge is power
1:05 posts and ... what?
Why do you say not so exciting?
Hi
@Crazymuse
1 year ago
Hi, do let me know your doubt.
Unfortunately Google removed the dislike counter, so I have to watch some irrelevant videos!!!
try fewer bad words, or just say them... the beep really hurts earphone users
Maybe it would be more humorous and informative if you stuck to Hindi