BART: Denoising Sequence-to-Sequence Pre-training for NLG & Translation (Explained)
Science & Technology
BART is a powerful model that can be used for many different text generation tasks, including summarization, machine translation, and abstractive question answering. It can also be used for text classification and token classification. This video explains the architecture of BART and how it leverages 6 different pre-training objectives to achieve excellence.
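The pre-training objectives are all noising schemes: the input text is corrupted and the model learns to reconstruct the original. As a rough illustration, here is a minimal sketch of one such objective, text infilling, where a contiguous span is replaced by a single mask token (whitespace tokenization, the `<mask>` string, and the fixed span length are my own simplifications; the paper draws span lengths from a Poisson distribution and works on subword tokens):

```python
import random

def text_infill(tokens, span_len=3, mask="<mask>", rng=None):
    """Replace one contiguous span of tokens with a single mask token,
    in the spirit of BART's text-infilling objective (simplified: the
    paper samples span lengths from a Poisson distribution)."""
    rng = rng or random.Random(0)  # seeded for reproducibility
    if len(tokens) <= span_len:
        return [mask]
    start = rng.randrange(len(tokens) - span_len + 1)
    return tokens[:start] + [mask] + tokens[start + span_len:]

tokens = "the quick brown fox jumps over the lazy dog".split()
noisy = text_infill(tokens)
# The seq2seq model is then trained to reconstruct the original
# 9-token sequence from the shorter, masked input.
```

Because a span of several tokens collapses into one mask, the model must also predict how many tokens are missing, which is what distinguishes infilling from BERT-style single-token masking.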
BERT explained
• BERT: Pre-training of ...
Transformer Architecture Explained
• Transformer Architectu...
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
arxiv.org/abs/1910.13461
Code (Facebook)
github.com/pytorch/fairseq/tr...
Code (Hugging Face)
huggingface.co/transformers/m...
Connect
Linkedin / xue-yong-fu-955723a6
Twitter
email edwindeeplearning@gmail.com
Comments: 18
Understood very clearly! After this, reading a research paper is much easier for me. Thanks!
@deeplearningexplainer2139
2 years ago
Glad to hear that!
Great Introduction! Really like it!
@deeplearningexplainer2139
A year ago
Thank you! Cheers!
Dang, this is a great video, very clearly explained. Thanks!
@deeplearningexplainer2139
A year ago
Thanks for liking it!
Great video! Thanks!
@deeplearningexplainer2139
A year ago
Glad you liked it!
Very well explained.
@deeplearningexplainer2139
2 years ago
Glad you think so!
Great video! What software do you use for the floating cam?
@deeplearningexplainer2139
2 years ago
Thank you, Filipe! It's Loom!
kudos
@deeplearningexplainer2139
A year ago
Thanks for watching!
How do you calculate the ROUGE score? I am getting recall, precision, and F values when using the rouge package. Is there a formula for calculating the ROUGE score so that I can get the same values as you?
@deeplearningexplainer2139
2 years ago
You're almost there, Shahid. Most papers don't specify which ROUGE variant they report (which they should), and most of those numbers are the recall of ROUGE.
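To make the relationship between the three values concrete, here is a from-scratch sketch of ROUGE-N (the function name and whitespace tokenization are my own simplifications; real implementations such as the rouge package also apply stemming and handle ROUGE-L separately):

```python
from collections import Counter

def rouge_n(candidate, reference, n=1):
    """Compute ROUGE-N from whitespace tokens.
    Recall    = overlapping n-grams / n-grams in the reference
    Precision = overlapping n-grams / n-grams in the candidate
    F1        = harmonic mean of recall and precision."""
    def ngrams(text):
        toks = text.split()
        return Counter(tuple(toks[i:i + n]) for i in range(len(toks) - n + 1))
    cand, ref = ngrams(candidate), ngrams(reference)
    overlap = sum((cand & ref).values())  # clipped n-gram matches
    recall = overlap / max(sum(ref.values()), 1)
    precision = overlap / max(sum(cand.values()), 1)
    f1 = 2 * recall * precision / (recall + precision) if overlap else 0.0
    return {"r": recall, "p": precision, "f": f1}

scores = rouge_n("the cat sat on the mat", "the cat is on the mat")
# Here 5 of 6 unigrams overlap, so r = p = f = 5/6.
```

So the recall, precision, and F values your package returns all come from the same overlap count; a paper that reports a single "ROUGE" number is usually quoting one of these three, most often recall or F1.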
I never understood how BERT and BART work... how these language models work... and it's so scary.
@deeplearningexplainer2139
A year ago
All you need is to spend more time watching this video and reading the papers :)