L18.4: A GAN for Generating Handwritten Digits in PyTorch -- Code Example

Science & Technology

Slides: sebastianraschka.com/pdf/lect...
Code: github.com/rasbt/stat453-deep...
This video discusses 04_01_gan-mnist.ipynb
-------
This video is part of my Introduction to Deep Learning course.
Next video: • L18.5: Tips and Tricks...
The complete playlist: • Intro to Deep Learning...
A handy overview page with links to the materials: sebastianraschka.com/blog/202...
-------
If you want to be notified about future videos, please consider subscribing to my channel: / sebastianraschka

Comments: 7

  • @algorithmo134
@algorithmo134 a month ago

    Is a double for-loop like the one in the original GAN paper by Goodfellow easier to implement in practice? For example, we would freeze the generator while training the discriminator, and vice versa.
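In practice, most implementations replace the paper's explicit k-step inner loop with one discriminator step and one generator step per batch. A minimal sketch of that alternating scheme (layer sizes and hyperparameters here are made up, not taken from the notebook) — the "freezing" is implicit, since each optimizer only holds its own network's parameters:

```python
import torch
import torch.nn as nn

latent_dim = 100
G = nn.Sequential(nn.Linear(latent_dim, 784), nn.Tanh())   # generator sketch
D = nn.Sequential(nn.Linear(784, 1))                       # discriminator, outputs logits
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

real = torch.rand(32, 784)  # stand-in for a flattened real MNIST batch

# --- discriminator step: generator output detached, so no G gradients ---
z = torch.randn(32, latent_dim)
fake = G(z)
d_loss = (loss_fn(D(real), torch.ones(32, 1))
          + loss_fn(D(fake.detach()), torch.zeros(32, 1)))
opt_d.zero_grad()
d_loss.backward()
opt_d.step()

# --- generator step: D's parameters get gradients but opt_g never updates them ---
z = torch.randn(32, latent_dim)
g_loss = loss_fn(D(G(z)), torch.ones(32, 1))  # flipped labels for the generator
opt_g.zero_grad()
g_loss.backward()
opt_g.step()
```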

  • @candylauuuu219
    @candylauuuu219 a year ago

    Hi sir. Can I use this code for a custom image dataset? What file type are the MNIST images that are actually fed into the dataloader and the whole GAN training process?
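By the time images reach the training loop they are no longer files at all — `transforms.ToTensor()` turns each MNIST image into a float tensor of shape 1×28×28, so any custom dataset works as long as it yields tensors of the shape the networks expect (e.g. `torchvision.datasets.ImageFolder` with a Grayscale + Resize + ToTensor transform would produce the same format). A small sketch with a fake stand-in dataset, not the notebook's code:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Stand-in for a custom dataset: 64 grayscale 28x28 "images" as tensors.
images = torch.rand(64, 1, 28, 28)
labels = torch.zeros(64, dtype=torch.long)  # unsupervised GAN training ignores these
loader = DataLoader(TensorDataset(images, labels), batch_size=16)

batch, _ = next(iter(loader))
print(batch.shape)  # torch.Size([16, 1, 28, 28])
```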

  • @lewforth7147
    @lewforth7147 6 months ago

    Hello professor, thanks for the video. But I am confused about the z in the generator_forward part at around 5:58. You said you created a vector z (C * H * W) first, then flattened it with start_dim=1. But in self.generator, the first input size is latent_dim = 100. Why is it not 1*28*28 (C * H * W)? Thanks.
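The point of confusion is that z is the generator's *input*, not an image: z lives in the 100-dimensional latent space, while 1*28*28 = 784 is the generator's *output* size, which gets reshaped back to C × H × W. A minimal sketch (the two-layer architecture here is illustrative, not the notebook's exact model):

```python
import torch
import torch.nn as nn

latent_dim = 100  # input size: dimension of the noise vector, not an image shape
generator = nn.Sequential(
    nn.Linear(latent_dim, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1 * 28 * 28), nn.Tanh(),  # output size matches C*H*W
)

z = torch.randn(4, latent_dim)            # 4 latent vectors, each 100-dim
img = generator(z).view(-1, 1, 28, 28)    # reshape flat output into images
print(z.shape, img.shape)  # torch.Size([4, 100]) torch.Size([4, 1, 28, 28])
```

Flattening with start_dim=1 applies to the 1×28×28 *real* images before they enter the fully connected discriminator, not to z.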

  • @Facts-The-universe
    @Facts-The-universe 4 months ago

    Sir, I am also working on handwritten character generation with a GAN, for Marathi characters. Could you please advise whether I can reuse your code, and where I can get it? Please reply.

  • @kafaayari
    @kafaayari 2 years ago

    Hello Mr. Raschka. Thank you very much for the great lecture. I have a question, though, regarding the necessity of detach. At 15:58, you say that it will influence the generator. But when setting up the optimizers, we made two separate optimizers for the generator and discriminator, and selected only the relevant NN parameters for each. Why is the detach operation still needed?

  • @SebastianRaschka
    @SebastianRaschka 2 years ago

    Yeah, that's a good point. It wouldn't update the generator params, because those are not part of the discriminator optimizer. However, I would definitely still use .detach(): (1) for efficiency reasons — without it, the backward pass traverses the computation graph through the generator, which is wasteful; (2) the computation graph is only freed when you call backward(), so you would probably also get weird results, because a computation graph involving the generator parameters would already have been constructed before the generator's own update.

  • @kafaayari
    @kafaayari 2 years ago

    @@SebastianRaschka Ah, now I see. Thank you very much, professor!
