DeepONet Tutorial in JAX

Neural operators are deep learning architectures that approximate nonlinear operators, for instance to learn the solution operator of a parametric PDE. The DeepONet is one such architecture, whose output can be queried at arbitrary points. Here is the code: github.com/Ceyron/machine-lea...
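As a rough illustration of the idea (a minimal sketch, not the video's actual code — all names and network sizes here are made up for the example): an unstacked DeepONet has a branch net that encodes the input function sampled at m fixed sensor points, and a trunk net that encodes a query coordinate y; the prediction at y is the dot product of the two latent vectors, which is why the output field can be evaluated at arbitrary points via vmap.

```python
import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    # Simple dense-net initialization: one (weight, bias) pair per layer
    params = []
    keys = jax.random.split(key, len(sizes) - 1)
    for k, (m, n) in zip(keys, zip(sizes[:-1], sizes[1:])):
        w_key, _ = jax.random.split(k)
        params.append((jax.random.normal(w_key, (m, n)) / jnp.sqrt(m),
                       jnp.zeros(n)))
    return params

def mlp(params, x):
    # tanh hidden layers, linear output layer
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return x @ w + b

def deeponet(branch_params, trunk_params, u_sensors, y):
    # u_sensors: (m,) samples of the input function at fixed sensor points
    # y: scalar query coordinate
    b = mlp(branch_params, u_sensors)         # (p,) latent coefficients
    t = mlp(trunk_params, jnp.atleast_1d(y))  # (p,) latent basis values
    return jnp.dot(b, t)                      # scalar prediction G(u)(y)

key = jax.random.PRNGKey(0)
k_branch, k_trunk = jax.random.split(key)
m, p = 100, 32  # number of sensors, latent dimension (arbitrary choices)
branch = init_mlp(k_branch, [m, 64, p])
trunk = init_mlp(k_trunk, [1, 64, p])

u = jnp.sin(jnp.linspace(0.0, 1.0, m))  # one example input function
ys = jnp.linspace(0.0, 1.0, 50)         # arbitrary query points
preds = jax.vmap(lambda y: deeponet(branch, trunk, u, y))(ys)
print(preds.shape)  # (50,)
```

The vmap over query points is the JAX-specific trick the video discusses: the core network only ever sees a single query coordinate, and batching over points (and over input functions) is layered on from the outside.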
-------
👉 This educational series is supported by the world leaders in integrating machine learning and artificial intelligence with simulation and scientific computing, Pasteur Labs and the Institute for Simulation Intelligence. Check out simulation.science/ for more on their pursuit of 'Nobel-Turing' technologies (arxiv.org/abs/2112.03235), and for partnership or career opportunities.
-------
📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): github.com/Ceyron/machine-lea...
📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: / felix-koehler and / felix_m_koehler
💸 : If you want to support my work on the channel, you can become a patron on Patreon here: / mlsim
🪙: Or you can make a one-time donation via PayPal: www.paypal.com/paypalme/Felix...
-------
⚙️ My Gear:
(Below are affiliate links to Amazon. If you decide to purchase the product or something else on Amazon through this link, I earn a small commission.)
- 🎙️ Microphone: Blue Yeti: amzn.to/3NU7OAs
- ⌨️ Logitech TKL Mechanical Keyboard: amzn.to/3JhEtwp
- 🎨 Gaomon Drawing Tablet (similar to a WACOM Tablet, but cheaper, works flawlessly under Linux): amzn.to/37katmf
- 🔌 Laptop Charger: amzn.to/3ja0imP
- 💻 My Laptop (generally I like the Dell XPS series): amzn.to/38xrABL
- 📱 My Phone: Fairphone 4 (I love the sustainability and repairability aspect of it): amzn.to/3Jr4ZmV
If I had to purchase these items again, I would probably change the following:
- 🎙️ Rode NT: amzn.to/3NUIGtw
- 💻 Framework Laptop (I do not get a commission here, but I love the vision of Framework. It will definitely be my next Ultrabook): frame.work
As an Amazon Associate I earn from qualifying purchases.
-------
Timestamps:
00:00 Intro
01:03 What are Neural Operators?
01:58 DeepONet does not return full output field
02:31 About learning 1d antiderivative operator
04:33 Unstacked DeepONets
07:15 JAX-related notes (using vmap)
11:33 Imports
12:35 Download and inspect the dataset
21:30 Implementing DeepONet architecture
32:54 Training loop
41:20 Qualitative evaluation
45:35 Test error metric
50:40 Outro

Comments: 10

  • @MachineLearningSimulation · 3 months ago

    Sorry for the few short segments of lower audio quality (4 × ~10 seconds). I changed the recording setup for this video; it seems to require some further tuning ;)

  • @jesusmtz29 · 1 month ago

    Great stuff

  • @MachineLearningSimulation · 1 month ago

    Thanks 🙏

  • @sabaokangan · 3 months ago

    Thank you so much for sharing this with us ❤ from Seoul 🇰🇷

  • @user-kn4wt · 3 months ago

    Great vid! What are your thoughts on Flax vs. Equinox? It seems like Flax has a few more things implemented natively, but Equinox seems slightly nicer for building your own custom model (FNO etc.) from scratch. Thanks for all the great content!

  • 3 months ago

    Nice!

  • @lksmac1595 · 3 months ago

    Amazing

  • @particularlypythonic · 3 months ago

    Is there a reason you used the function form of eqx.filter_value_and_grad instead of the decorator form on the loss function?

  • @digambarkilledar003 · 3 months ago

    Can you write the code for the 1D Burgers equation using a DeepONet, just like you did with the FNO? Thanks!! What would the branch and trunk inputs be for the Burgers equation?