Physics-Informed Neural Networks (PINNs) - An Introduction - Ben Moseley | Jousef Murad

🌎 Website: jousefmurad.com
Physics-informed neural networks (PINNs) offer a new and versatile approach for solving scientific problems by combining deep learning with known physical laws. Such networks are able to simulate physical systems, invert for their underlying parameters, and even discover the underlying physical laws themselves. In this introductory workshop and live coding session we will cover the basic definition of a PINN, its pros and cons compared to traditional scientific techniques, and some of the state-of-the-art research in the field.
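As a rough illustration of how such a network combines physics and data (a minimal sketch with assumed architecture, coefficients and loss weighting, not the workshop notebook itself): the network u(t) is trained so that both its initial conditions and the residual of a damped harmonic oscillator ODE, obtained by automatic differentiation, are driven to zero.

import torch

# Minimal PINN sketch for the damped oscillator u'' + mu*u' + k*u = 0.
# Network sizes, coefficients and the loss weighting are illustrative assumptions.
class FCN(torch.nn.Module):
    def __init__(self, n_in=1, n_out=1, n_hidden=32, n_layers=3):
        super().__init__()
        layers = [torch.nn.Linear(n_in, n_hidden), torch.nn.Tanh()]
        for _ in range(n_layers - 1):
            layers += [torch.nn.Linear(n_hidden, n_hidden), torch.nn.Tanh()]
        layers += [torch.nn.Linear(n_hidden, n_out)]
        self.net = torch.nn.Sequential(*layers)

    def forward(self, t):
        return self.net(t)

pinn = FCN()
mu, k = 4.0, 400.0                                              # assumed damping and stiffness
t0 = torch.zeros(1, 1, requires_grad=True)                      # initial-condition point t = 0
tp = torch.linspace(0, 1, 50).view(-1, 1).requires_grad_(True)  # collocation points for the physics loss
opt = torch.optim.Adam(pinn.parameters(), lr=1e-3)

for step in range(1000):
    opt.zero_grad()
    # initial-condition loss: u(0) = 1 and u'(0) = 0
    u0 = pinn(t0)
    du0 = torch.autograd.grad(u0, t0, torch.ones_like(u0), create_graph=True)[0]
    loss_ic = (u0 - 1.0).pow(2).mean() + du0.pow(2).mean()
    # physics loss: ODE residual at the collocation points, via autograd
    u = pinn(tp)
    du = torch.autograd.grad(u, tp, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, tp, torch.ones_like(du), create_graph=True)[0]
    loss_phys = (d2u + mu * du + k * u).pow(2).mean()
    (loss_ic + 1e-4 * loss_phys).backward()  # relative weighting is an assumption
    opt.step()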
👉 My main channel: @Jousef Murad
ONLINE PRESENCE
================
🌍 My website - jousefmurad.com/
💌 My weekly science newsletter - jousef.substack.com/
📸 Instagram - / jousefmrd
🐦 Twitter - / jousefm2
#physics
#engineering
#neuralnetwork

Comments: 30

  • @JousefLITE (1 year ago)

    🧠More material & talks here: community.sci-circle.com/checkout/community-member 🌎 Science Courses: courses.jousefmurad.com/

  • @meetplace (6 months ago)

    +1 for Oxford PhD saying "timesing" instead of multiplying... respect! :D

  • @hreedishkakoty6771 (1 month ago)

    At 14:30, it seems like the external force will not operate on u_NN directly; the external force will just be a constant term in the physics loss function.
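    A hedged illustration of this point (illustrative names and values, not code from the talk): for a forced oscillator the external force appears as an extra term in the physics residual, so a constant force becomes a constant in the physics loss rather than something applied to the network output.

    import torch

    # Hedged sketch: for a forced oscillator u'' + mu*u' + k*u = F, a constant
    # external force F enters the physics residual as an extra constant term.
    mu, k, F = 4.0, 400.0, 10.0  # illustrative values

    net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
    t = torch.linspace(0, 1, 50).view(-1, 1).requires_grad_(True)

    u = net(t)
    dudt = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    d2udt2 = torch.autograd.grad(dudt, t, torch.ones_like(dudt), create_graph=True)[0]

    # physics loss with the forcing term included in the residual
    loss_physics = torch.mean((d2udt2 + mu * dudt + k * u - F) ** 2)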

  • @abdulwaris8 (5 months ago)

    Thanks for sharing this recording from the workshop. Thanks, Ben!

  • @carriefu458 (1 month ago)

    I love all of the questions!! 🤓 Ben is a great teacher!

  • @ajaytaneja111 (10 months ago)

    We are talking about a relatively simple oscillator problem. What about complex geometries, for which FEM methods are best suited today? I have been reading about physics-informed graph nets for complex geometries. Do you have any references for complex domains? Let's say I have a complex-shaped mechanical component subjected to pressure, for which I would normally use FEM.

  • 7 months ago

    Nice lesson and clear presentation. Thank you!

  • @vegetablebake (7 months ago)

    A great introduction and massive thanks for sharing the knowledge!

  • @raju-bitter (7 months ago)

    Fantastic introduction, much appreciated!

  • @vitezslavstembera854 (9 months ago)

    Very nice and clear presentation.

  • @jyothish75 (6 months ago)

    Could you please provide the example code for the PINN? The link in the comments is not working.

  • @suleymanemirakin (3 months ago)

    Great work!

  • @WeiZhang-sj9sl (8 months ago)

    great work

  • @fkeyvan (5 months ago)

    nice tutorial. thank you.

  • @canxkoz (1 year ago)

    Great video on this fascinating field. Thanks for sharing.

  • @JousefMuradAPEX (10 months ago)

    Sure :)

  • @muhammadsohaib681 (1 year ago)

    Thank you for such an informative lecture on PINN.

  • @JousefMuradAPEX (10 months ago)

    Thanks for watching! :)

  • @mklu0611 (7 months ago)

    OMG, very cool video!!! The training performance is highly dependent on the "lambda" value; do you have any ideas about how to choose its value? Many thanks.
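    One common heuristic for setting such weights (an assumption here, not necessarily what the speaker recommends) is to scale lambda so the weighted loss terms start out at comparable magnitudes, then fine-tune around that value. A minimal sketch with placeholder values:

    import torch

    # Hedged sketch of a simple balancing heuristic: pick lambda so the weighted
    # physics loss starts at roughly the same magnitude as the boundary loss.
    def balance_lambda(loss_boundary, loss_physics, eps=1e-12):
        # detach so the weight is treated as a constant, not differentiated through
        return (loss_boundary.detach() / (loss_physics.detach() + eps)).item()

    loss_boundary = torch.tensor(1.0)    # placeholder magnitudes for illustration
    loss_physics = torch.tensor(1.0e4)
    lam = balance_lambda(loss_boundary, loss_physics)
    total_loss = loss_boundary + lam * loss_physics  # both terms now contribute comparably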

  • @shankyxyz (8 months ago)

    Similar question as some others: when we are solving even standard physics (electrostatics, heat transfer, etc.), forgetting the time domain, so only elliptic equations on complex CAD geometry, I am wondering what applications PINNs can be used for, as opposed to FEM. Maybe shape-optimization-type problems? Or inverse problems?

  • @AdrienLegendre (2 months ago)

    A possibly useful method would be to have the neural network identify the invariants of a Lie group for a differential equation. Another approach: compute all scalar quantities and have the neural network find the right combination of scalar quantities that forms a Lagrangian for the physical system.

  • @cunningham.s_law (6 months ago)

    I wonder if this gives better results with PDEs for option pricing.

  • @tanuavi98 (3 months ago)

    Where can I get the code link?

  • @user-lt4zd9zj2h (6 months ago)

    Well done. The trend information is also very important, and it can be captured by a partial differential equation. I think the parameters of the partial differential equation could also be made parameters of the PINN.
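    That is essentially the inverse-problem use of PINNs mentioned in the video description: an unknown PDE coefficient can be declared as an extra learnable parameter and optimised jointly with the network weights against observed data plus the physics residual. A hedged sketch, with illustrative names and values:

    import torch

    # Hedged sketch of the inverse-problem idea: the unknown damping coefficient mu
    # is a learnable parameter optimised alongside the network weights.
    net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
    mu = torch.nn.Parameter(torch.tensor(1.0))   # unknown PDE parameter, initial guess
    k = 400.0                                    # assumed known stiffness

    opt = torch.optim.Adam(list(net.parameters()) + [mu], lr=1e-3)

    t_obs = torch.linspace(0, 1, 20).view(-1, 1)               # observation times
    u_obs = torch.cos(20 * t_obs) * torch.exp(-2 * t_obs)      # placeholder measurement data
    t_phys = torch.linspace(0, 1, 50).view(-1, 1).requires_grad_(True)

    for step in range(1000):
        opt.zero_grad()
        # data loss: fit the observations
        loss_data = (net(t_obs) - u_obs).pow(2).mean()
        # physics loss: residual of u'' + mu*u' + k*u = 0 with the learnable mu
        u = net(t_phys)
        du = torch.autograd.grad(u, t_phys, torch.ones_like(u), create_graph=True)[0]
        d2u = torch.autograd.grad(du, t_phys, torch.ones_like(du), create_graph=True)[0]
        loss_phys = (d2u + mu * du + k * u).pow(2).mean()
        (loss_data + 1e-4 * loss_phys).backward()
        opt.step()
    # after training, mu.item() is the recovered estimate of the damping coefficient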

  • @rupeshvinaykya4202 (8 months ago)

    Thanks for the PINN talk, is the code available?

  • @aakashs1806 (1 month ago)

    I think MIT developed something related to this; not sure whether it is open source.

  • @sadeghmirzaei9330 (10 months ago)

    Great 👍

  • @JousefMuradAPEX (10 months ago)

    Sure :)

  • @ihmejakki2731 (4 months ago)

    Very nice lesson! I'm stuck on Task 3 though: I can't get the network to converge for w0=80. Here's the code, if anyone can spot what I'm missing:

    torch.manual_seed(123)

    # define a neural network to train
    pinn = FCN(1,1,32,3)

    # define additional a,b learnable parameters in the ansatz
    a = torch.nn.Parameter(torch.zeros(1, requires_grad=True))
    b = torch.nn.Parameter(torch.zeros(1, requires_grad=True))

    # define boundary points, for the boundary loss
    t_boundary = torch.tensor(0.).view(-1,1).requires_grad_(True)

    # define training points over the entire domain, for the physics loss
    t_physics = torch.linspace(0,1,60).view(-1,1).requires_grad_(True)

    # train the PINN
    d, w0 = 2, 80  # note w0 is higher!
    mu, k = 2*d, w0**2
    t_test = torch.linspace(0,1,300).view(-1,1)
    u_exact = exact_solution(d, w0, t_test)

    # add a,b to the optimiser
    optimiser = torch.optim.Adam(list(pinn.parameters())+[a]+[b], lr=1e-3)

    for i in range(15001):
        optimiser.zero_grad()

        # loss weights
        lambda1, lambda2 = 1e-1, 1e-4

        # compute boundary loss (ansatz formulation)
        u = pinn(t_boundary)*torch.sin(a*t_boundary+b)
        loss1 = (torch.squeeze(u) - 1)**2
        dudt = torch.autograd.grad(u, t_boundary, torch.ones_like(u), create_graph=True)[0]
        loss2 = (torch.squeeze(dudt) - 0)**2

        # compute physics loss (ansatz formulation)
        u = pinn(t_physics)*torch.sin(a*t_physics+b)
        dudt = torch.autograd.grad(u, t_physics, torch.ones_like(u), create_graph=True)[0]
        d2udt2 = torch.autograd.grad(dudt, t_physics, torch.ones_like(dudt), create_graph=True)[0]
        loss3 = torch.mean((d2udt2 + mu*dudt + k*u)**2)

        # backpropagate joint loss, take optimiser step
        loss = loss1 + lambda1*loss2 + lambda2*loss3
        loss.backward()
        optimiser.step()

        # plot the result as training progresses
        if i % 5000 == 0:
            u = (pinn(t_test)*torch.sin(a*t_test+b)).detach()
            plt.figure(figsize=(6,2.5))
            plt.scatter(t_physics.detach()[:,0], torch.zeros_like(t_physics)[:,0], s=20, lw=0, color="tab:green", alpha=0.6)
            plt.scatter(t_boundary.detach()[:,0], torch.zeros_like(t_boundary)[:,0], s=20, lw=0, color="tab:red", alpha=0.6)
            plt.plot(t_test[:,0], u_exact[:,0], label="Exact solution", color="tab:grey", alpha=0.6)
            plt.plot(t_test[:,0], u[:,0], label="PINN solution", color="tab:green")
            plt.title(f"Training step {i}")
            plt.legend()
            plt.show()

  • @TerragonCFD (8 months ago)

    I've been a beginner in PyTorch and OpenFOAM for the last few years, but today I learned that my "dream" is called a "PINN" 🙂