🔥 Integrate Weights & Biases with PyTorch
In this video, Weights & Biases Deep Learning Educator Charles Frye demonstrates how to integrate W&B into PyTorch code while avoiding interference from the Mirror Universe and a Kraken attack.
Follow along in Colab: wandb.me/pytorch-colab
Check out the Keras version: wandb.me/keras-video
Weights & Biases makes developer tools for machine learning: record and visualize every detail of your research, collaborate easily, and advance the state of the art. We're always free for academics and open source projects.
Join our community of ML practitioners to share interesting projects and meet other people working in Deep Learning. wandb.me/slack
Our gallery, Fully Connected, features curated machine learning reports by researchers exploring deep learning techniques, Kagglers showcasing winning models, and industry leaders sharing best practices. wandb.ai/fc
0:00 - What is W&B? How do I add it to my PyTorch code?
3:25 - Installs, imports, and setup
5:49 - Setting hyperparameters and boilerplate
12:03 - Logging metrics and gradients to W&B
15:48 - Reviewing the W&B Dashboard
18:10 - Metadata, system metrics, and model topology in W&B
23:06 - Outro
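The chapters above walk through adding W&B to a PyTorch script; as a hedged sketch of that pattern (init a run, log metrics per step, finish), here is a toy loop with stand-in model code. The project name and step function are illustrative, and the wandb import is guarded so the sketch runs even where wandb isn't installed:

```python
# Sketch of the W&B-in-PyTorch pattern from the video: init a run,
# log a metric per batch, then finish. The "training" is a toy stand-in.
try:
    import wandb
except ImportError:
    wandb = None  # fall back to no logging so the sketch still runs

def train(config, batches, step_fn):
    """Run a toy training loop, logging one metric per batch when wandb is available."""
    run = wandb.init(project="pytorch-demo", config=config, mode="offline") if wandb else None
    losses = []
    for step, batch in enumerate(batches):
        loss = step_fn(batch)        # stand-in for forward/backward/optimizer.step()
        losses.append(loss)
        if run:
            run.log({"loss": loss}, step=step)
    if run:
        run.finish()                 # flush buffered data and close the run
    return losses

losses = train({"lr": 1e-3}, [1.0, 0.5, 0.25], step_fn=lambda b: b * 0.9)
```

The full Colab linked above shows the real model and data loading; this only captures where the wandb calls sit relative to the loop.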
Comments: 31
Great video and walk-through, I really like how you explain the details and steps, Charles!
Great tutorial, Charles, thanks for sharing!
Awesome work, thanks for sharing!
The "log_freq=10" made my training loop unbearably slow (on a different model than video). Granted, by most DL standards I have a slow computer. Love your stuff! Hope this saves someone a minute.
Amazing! GPU utilization? That is so useful. Now I can increase the batch size much more easily without having issues with nvidia-smi, etc.!
THIS IS AMAZING!
how does one achieve high disk utilization in pytorch? large batch size and num workers?
Now I can track the gradients without a hassle? No additional get-gradients functions... nice!
The gradients are numbered like module x1.x2; what do x1 and x2 refer to?
Great Knight Rider reference: "Evil Charles with a goatee"
My NN is not learning even though I have the optimizer step in my def train(model, config). Does anyone have the same problem?
I have a problem with the connection in wandb: "wandb: Network error (ConnectionError), entering retry loop." Windows 10. How do I resolve this issue?
Great!
Does Wandb support PyTorch Distributed Data Parallel training? I cannot make it work ...
@WeightsBiases
A year ago
yep, here's some docs: docs.wandb.ai/guides/track/advanced/distributed-training
How do things change if I am using DDP? (e.g. distributed training and a bunch of different processes are running? Do I only log with one process? That is what I usually do)
@WeightsBiases
2 years ago
There are two ways to handle it: logging from only one process is simpler, but you sacrifice the ability to see what's happening on all GPUs (good for debugging). Explanatory docs here: docs.wandb.ai/guides/track/advanced/distributed-training
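The simpler option in the reply above (log from only one process) usually comes down to gating the wandb setup on the global rank. A stdlib-only sketch of that gating, with the rank passed in explicitly rather than read from torch.distributed, and a stand-in init function:

```python
def init_logging_for_rank(rank, init_fn):
    """Simple DDP pattern: only global rank 0 creates the (wandb) run."""
    if rank == 0:
        return init_fn()        # e.g. wandb.init(project=...) in real code
    return None                 # all other ranks log nothing

# Simulate four DDP processes; only rank 0 gets a run handle.
runs = [init_logging_for_rank(rank, lambda: "run-handle") for rank in range(4)]
```

In a real DDP script the rank would come from the launcher (e.g. the RANK environment variable or torch.distributed.get_rank()); the linked docs cover the multi-process alternative.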
I don't understand what "log_freq=10" means. Does it mean log the parameters every 10 epochs, batches, or steps?
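For context on the question above: per the wandb.watch documentation, log_freq counts batches (backward passes), not epochs, so log_freq=10 logs gradient/parameter histograms on every 10th batch, and a smaller value logs more often (and costs more time, as another commenter found). A stdlib-only sketch of that throttling logic, with illustrative names:

```python
def make_throttled_logger(log_freq, sink):
    """Forward only every `log_freq`-th call to sink, like wandb.watch's log_freq."""
    state = {"calls": 0}
    def log(payload):
        state["calls"] += 1
        if state["calls"] % log_freq == 0:
            sink(payload)           # expensive work (histograms) happens here
    return log

captured = []
log = make_throttled_logger(10, captured.append)
for batch_idx in range(100):
    log({"grad_norm": batch_idx})   # invoked every batch...
# ...but only every 10th call reaches the sink
```

Raising log_freq is therefore the knob to turn if gradient logging is slowing a training loop down.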
How do you count the number of classes in each image?
What happens if we don't call .join() or .finish()? E.g., if there is a bug in the middle and it crashes, what will wandb do? Will the wandb process be closed on its own?
@WeightsBiases
2 years ago
In the case of a bug or crash somewhere in the user script, the wandb process will be closed on its own, and as part of the cleanup it will sync all information logged up to that point. If that crashes (e.g. because the issue is at the OS level or things are otherwise very on fire), the information won't be synchronized to the cloud service but it will be on disk. You can sync it later with wandb sync. Docs for that command: docs.wandb.ai/ref/cli/wandb-sync If you have more questions like these, check out the Technical FAQ of our docs: docs.wandb.ai/guides/technical-faq
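A user-side way to get the cleanup guarantee described above is to put the finish call in a finally block, so it runs whether or not training raises. A stdlib-only sketch with a stand-in finish function (wandb registers a similar handler itself via atexit, so this is belt-and-suspenders):

```python
def run_experiment(train_fn, finish_fn):
    """Finalize the logger even when training raises mid-run."""
    try:
        return train_fn()
    finally:
        finish_fn()               # stand-in for run.finish() / wandb.finish()

def buggy_train():
    raise RuntimeError("bug in the middle of training")

events = []
try:
    run_experiment(buggy_train, lambda: events.append("finished"))
except RuntimeError:
    events.append("crash observed")
```

The finally clause fires before the exception propagates, so logged data is flushed first; anything that still fails to upload can be pushed later with the wandb sync command mentioned above.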
Are these clips Deep Learning articles?
wand-bee
The fonts are so small
@WeightsBiases
3 years ago
Thanks for the feedback! We're making sure that future tutorials don't have this issue
@FeddFGC
3 years ago
Go 720p or higher; it should do the trick. It's already perfect at 720p.
@amitozazad1584
3 years ago
@FeddFGC I second this; it works at high resolution.
"Evil Charles with false metrics" lmao
Americans are so imprecise in their vocabulary. I understand you're trying to make the explanations more palatable but I personally prefer someone being more calm, collected and precise in their vocabulary and choice of sentences. Many academicians may prefer this. Besides that, thanks for the video.