Confused about which Transformer architecture to use? BERT, GPT-3, T5, ChatGPT? Encoder-Decoder Explained
This video explains all the major Transformer architectures and differentiates between the various important Transformer models.
Which Transformer architecture to use for a particular problem in Natural Language Understanding (NLU) and Natural Language Generation (NLG) is explained in a simplified manner.
Over the past six years, Transformers, a neural network architecture, have completely transformed state-of-the-art natural language processing and the way we approach problems in NLG and NLU.
Chapters:
0:00 Introduction
1:21 Encoder Branch
1:57 BERT
2:37 DistilBERT
3:19 RoBERTa
3:59 XLM
4:50 XLM-RoBERTa
5:32 ALBERT
6:40 ELECTRA
7:19 DeBERTa
8:13 Decoder Branch
8:50 GPT
9:13 CTRL
9:54 GPT-2
10:31 GPT-3
11:30 GPT-Neo/GPT-J-6B
11:50 Encoder-Decoder Branch
12:00 T5
13:05 BART
13:46 M2M-100
14:22 BigBird
#datascience #neuralnetwork #machinelearning #naturallanguageprocessing
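The encoder / decoder / encoder-decoder taxonomy in the chapter list above can be sketched as a small lookup. This is a minimal illustrative sketch, not code from the video: the function name and task labels are hypothetical, and the model lists are taken from the chapters above.

```python
# Transformer branches covered in the video, with the example
# models listed in the chapters above.
BRANCHES = {
    "encoder": ["BERT", "DistilBERT", "RoBERTa", "XLM", "XLM-RoBERTa",
                "ALBERT", "ELECTRA", "DeBERTa"],
    "decoder": ["GPT", "CTRL", "GPT-2", "GPT-3", "GPT-Neo", "GPT-J-6B"],
    "encoder-decoder": ["T5", "BART", "M2M-100", "BigBird"],
}

# Rule of thumb: NLU tasks -> encoder models, open-ended NLG ->
# decoder models, sequence-to-sequence tasks -> encoder-decoder models.
TASK_TO_BRANCH = {
    "classification": "encoder",
    "named-entity-recognition": "encoder",
    "text-generation": "decoder",
    "translation": "encoder-decoder",
    "summarization": "encoder-decoder",
}

def suggest_models(task: str) -> list[str]:
    """Return example models for a task, following this taxonomy."""
    return BRANCHES[TASK_TO_BRANCH[task]]

print(suggest_models("translation"))  # ['T5', 'BART', 'M2M-100', 'BigBird']
```

The mapping is deliberately coarse: in practice many of these models can be fine-tuned beyond their "home" task family, but the branch is a good first filter.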
Comments: 62
In this video, I tried to explain all the major Transformer architectures, along with the differences and the training objective of each one. If you feel this video adds value, please like, share, and comment on it, and subscribe to this channel. If you have any suggestions or feedback, please drop them in the comment box.
It would have been awesome if all the models had their release year mentioned as well. It helps to get a bird's-eye view of the timeline.
@datafuseanalytics
8 months ago
Hello. Yes, I am making a separate video on similar topic. It will be uploaded soon. Stay tuned my friend.
I just found this video and it's very good. I'm currently trying to understand when to use which type of model. Looking at Hugging Face is just overwhelming. That's where this video jumps in and provides an excellent overview of the major models. I wish there were a similar video explaining the various pretraining objectives.
@datafuseanalytics
6 months ago
Hello. I will definitely make a video on the same. Thanks a lot. 😀
thanks for the excellent, well-explained summary!
@datafuseanalytics
A year ago
Thank you Kevin
Thanks for sharing. It's very informative. Keep up the good work.
@datafuseanalytics
A year ago
Thank you, Santosh, for watching the video.
Very nice and to the point video, thank you !!!
@datafuseanalytics
A year ago
Hey thanks a lot Ajit 😃 🙏
Informative content. Thanks for sharing this.
@datafuseanalytics
A year ago
Glad you liked it!
This is a really nice explanation!
@datafuseanalytics
8 months ago
Thanks a lot Ganesh 😃 🙏
Amazing. Great work👍
@datafuseanalytics
A year ago
Thanks Milind
Great explanation. Thank you very much
@datafuseanalytics
A year ago
Glad it was helpful for you Sagar...
This is good. Keep up the good work. 🙂
@datafuseanalytics
A year ago
Thank you Saket, I will
Informative 👍
@datafuseanalytics
A year ago
Glad it was helpful and informative for you, Aditya. Please do share it with your friends. More interesting videos will be uploaded soon.
Well done!
@datafuseanalytics
A year ago
Thanks David.
Thank you, sir! Fantastic method of explanation.
@datafuseanalytics
5 months ago
Hey buddy. Thanks a lot. 😀
Thanks for sharing
@datafuseanalytics
A year ago
My pleasure
Can you create a tutorial on Longformer and the concepts/code used to adapt an LLM for larger token sizes?
@datafuseanalytics
A year ago
Hello David. I haven't made one yet, but I will definitely make a video on Longformer and similar models, which take a whopping 4,096 tokens as input. Thanks for your feedback.
Excellent
@datafuseanalytics
A year ago
Thanks a lot Suhail.
Kudos🎉
@datafuseanalytics
7 months ago
Thank you 😃
Superb 🎉
@datafuseanalytics
A year ago
Hey thanks William
Great video!
@datafuseanalytics
A month ago
Thanks a lot. Please do share it with your friends 😁
thanks a lot❤
@datafuseanalytics
6 months ago
You are most welcome 😃 Do check out the other AI videos on this channel too.
Nice overview
@datafuseanalytics
8 months ago
Hey Thanks a lot 😃
Excellent video, and I joined as a sub. I like this style of going through the various architectures and their use cases. Maybe you can also update it with GPT-4 since it's newly out.
@datafuseanalytics
A month ago
Thanks a lot for this amazing comment. I have uploaded the latest video using the ChatGPT model - kzread.info/dash/bejne/f398p8OxlNLXqKQ.html Please go through it and feel free to comment.
Great summary - it would be good if you did an update.
@datafuseanalytics
10 months ago
Sure. I will make an updated video covering all the possible model architectures.
Thx
@datafuseanalytics
A year ago
Most welcome 😃 😊
Hello, how do I contact/connect with you regarding a project?
@datafuseanalytics
A year ago
Hello, please contact us via our email. datafuseanalytics@gmail.com
There are some important new ones, like the newer GPT-Neo models, Alpaca, LLaMA, Cereus, Vicuna.
@datafuseanalytics
A year ago
Hello Ian. Yes, at the time of this session these models weren't available. Thank you for your feedback. I will definitely make a part-2 video that covers these models in a simple fashion.
It seems it does not cover BERT in computer vision.
@datafuseanalytics
A year ago
Yes, you are right, Chen Peter.
This sounds like it was copy-pasted from online articles and just read out, without any extra info at all.
@datafuseanalytics
A year ago
Hey Ko-Jap. I referred to multiple books and then wrote the content in my own words. I did not refer to any online blogs or articles; only books were my references. Thank you for your valuable feedback. I will work on the delivery so that it doesn't sound like I am reading. 🙏😀
for the algo
@datafuseanalytics
4 months ago
Thank you