Inventing liquid neural networks

Science & Technology

Paper: www.science.org/doi/10.1126/s...
Publication/Event: Science Robotics
Key authors: Makram Chahine, Ramin Hasani, Mathias Lechner, Alexander Amini, and Daniela Rus
MIT News article: news.mit.edu/2023/drones-navi...
Video Director: Rachel Gordon
Videographer: Mike Grimmett

Comments: 79

  • @notallm
    @notallm a year ago

    Going from thousands of neurons in conventional models to 19 liquid neurons performing this well is really commendable! I can't wait to see more developments!

  • @musicMan11537
    @musicMan11537 6 months ago

    The “liquid neural network” idea in this work is a nice (re-)popularization of neural ODEs (which have been around for some time), with the modification that the integration time constant “tau” is a function of the input rather than a constant.

    The use of “liquid” in the model name is not the best choice of word, as historically (as early as 2004 and before) there is a class of neural models called “liquid state machines” (LSMs): en.m.wikipedia.org/wiki/Liquid_state_machine (In fact, an LSM is a kind of spiking neural model, and is actually more brain-like than the neural ODEs in the work of this video.)

    It’s important to be clear that these authors’ work is a nice little innovation on neural ODEs, but it is a far cry from biological neurons: in the actual paper they even use backpropagation through time, which is clearly biologically implausible (the brain does not unroll itself backwards through time). It’s also important to know the historical context, as works like this one are being a bit disingenuous by not making it clearer that they are popularizing good classic ideas (e.g., from ODEs and neural ODEs).
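The input-dependent time constant this comment describes can be sketched in a few lines. Below is a minimal, illustrative single-neuron example; the scalar weights and the particular parameterization of tau are made up for demonstration and are not taken from the paper. It performs an explicit-Euler step of a neural-ODE state whose time constant varies with the input:

```python
import math

def ltc_step(x, u, dt, w_tau=0.5, b_tau=1.0, w_in=1.0, b_in=0.0, A=1.0):
    """One explicit-Euler step of a single liquid-time-constant neuron."""
    # Input-dependent nonlinearity (scalar weights, purely illustrative).
    f = math.tanh(w_in * u + b_in)
    # The "liquid" part: the effective time constant depends on the input.
    tau = b_tau / (1.0 + w_tau * abs(f))
    # Neural-ODE dynamics: dx/dt = -x / tau + f * A
    return x + dt * (-x / tau + f * A)

# Drive the neuron with a constant input and let the state relax.
x = 0.0
for _ in range(100):
    x = ltc_step(x, u=1.0, dt=0.05)
```

With a fixed tau this is an ordinary leaky integrator; making tau a function of the input is the small but consequential change the comment points at.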

  • @musicMan11537
    @musicMan11537 6 months ago

    ODE = ordinary differential equation

  • @Alpha_GameDev-wq5cc
    @Alpha_GameDev-wq5cc 15 days ago

    Could you please elaborate on the backpropagation critique? It’s just a computation, right? Why can’t the brain just do it? I get that in computer implementations it’s an external, classical, instruction-based algorithm operating on the architecture, but maybe the brain does it within the neural networks? Is that not plausible?

  • @ckpioo
    @ckpioo 4 days ago

    @@Alpha_GameDev-wq5cc It's not that it can't, but it doesn't, and we are very sure of it. Simply search for why the brain doesn't use backpropagation and you will find a lot of evidence.

  • @SureshKumar-qi7cy
    @SureshKumar-qi7cy a year ago

    Inspirational conversation

  • @-www.chapters.video-
    @-www.chapters.video- 11 months ago

    00:01 Exciting times and the start of the project
    01:00 Implementing smaller neural networks for driving
    02:05 Revolutionary results in different environments
    03:03 Creating abstract models for learning systems
    04:00 Properties and applications of liquid neural networks
    05:17 Challenges in implementing the models
    06:32 Testing and pushing the models to their limits
    07:12 Expanding to drone navigation and tasks
    08:05 Extracting tasks and achieving reasoning
    09:11 Surprising and powerful properties of liquid networks
    10:06 Zero-shot learning and adaptability to different environments
    11:27 Extraordinary performance in urban and dynamic environments
    12:31 Resiliency maps provide visual and concise answers to model decision-making
    13:14 Interpretable and explainable machine learning for safety-critical applications
    14:23 Liquid networks as a counter to the scaling law of generative AI
    15:42 Under-parametrized neural networks like liquid networks for future generative models
    17:10 Exploring multiple agents and composing solutions for different applications
    18:02 Extending liquid networks to perform well on static data and new types of data sources
    18:41 Embedding intelligence into embodied agents and society

  • @berbudy
    @berbudy 11 months ago

    Thank you worm for leading us to this

  • @alexjbriiones
    @alexjbriiones 11 months ago

    I was just reading that the main bottleneck for AI is the need for huge, specialized data sets. This next-level genius invention is going to be revolutionary.

  • @ajithboralugoda8906
    @ajithboralugoda8906 2 months ago

    WOW! Are liquid neural networks the signal in the LLM sea of noise? So exciting! Great job, folks; waiting to hear more breakthroughs from this great research!

  • @bvdlio
    @bvdlio 11 months ago

    Great, exciting work!

  • @benealbrook1544
    @benealbrook1544 11 months ago

    Amazing job; this is a fundamental shift and the way forward. I am interested in seeing the memory footprint and CPU demands. Looking forward to applications in other fields, and perhaps the replacement of traditional state-of-the-art models.

  • @erikdong
    @erikdong 11 months ago

    Bravo! 👏🏼

  • @palfers1
    @palfers1 11 months ago

    Thank you Mr. or Mrs. Worm!

  • @atakante
    @atakante 11 months ago

    Very important algorithmic advance, kudos to the team!

  • @lpmlearning2964
    @lpmlearning2964 9 days ago

    This is not an algorithmic advance lol

  • @Michallote
    @Michallote 3 months ago

    Why has there not been any impact of this tech? I have seen it twice already in blogs and videos, but nothing reflects that in the number of articles published or the references from other authors, and most of all... no replicable code or pretrained models to actually test those claims. Has anyone here been able to corroborate their results?

  • @moormanjean5636
    @moormanjean5636 2 days ago

    It's very new tech, and yes, it works; there's a GitHub repo hosted by Mathias with a lot of activity.

  • @LoanwordEggcorn
    @LoanwordEggcorn 11 months ago

    Thanks for the talk. Would it be accurate to say the approach models nonlinearities of natural (biological) neural networks?

  • @SynaTek240
    @SynaTek240 10 months ago

    Kinda, but it doesn't use spikes to communicate like biological networks do; rather, it uses magnitudes of values just like normal ANNs. However, the way it mimics biological networks is in being temporally free rather than clocked, if that says anything to you :D

  • @LoanwordEggcorn
    @LoanwordEggcorn 10 months ago

    @@SynaTek240 Thanks, and that seems like an important part of how it works. Biological networks are not clocked. The details of how the values propagate may be less relevant, though spikes can have higher-order effects from interference or other interactions that magnitudes don't. Sound right?

  • @koustubhavachat
    @koustubhavachat 11 months ago

    Do we have a PyTorch implementation available for liquid neural networks? How can one start with this?

  • @SapienSpace
    @SapienSpace a year ago

    Outstanding work, and it is very fascinating to look at the worm's neurons and figure out how to practically apply them to autonomous navigation systems with even fewer neuron-like modules!

    The "liquid" aspect I'm attempting to understand more, though it is fascinating to me, as it might be a natural characteristic of electromagnetic, inductive coupling between nearby synapses (the same often-perceived problem in human-created wiring as interference "noise"/fuzziness, applied here as a benefit, providing an opportunity for de-fuzzification between synapses or overlapping/liquid states).

    It almost seems intriguingly similar to overlapping membership (probability/statistical distribution) functions of fuzzy states (such as those used in fuzzy logic, e.g. a room is "hot", "warm", "cool", "cold"), using a type of K-means clustering, or similar, to focus attention on the most frequently used regions of state-space classification. One might perceive the "liquid time constant", just as in fuzzy logic, as a method of merging knowledge (abstract, qualitative, fuzzy, noisy symbolism) and mathematics (through interpolation or extrapolation), but it seems the self-incriminating nomenclature of "fuzzy" has been lost in machine intelligence (maybe via the Dunning-Kruger effect, the paradox of humility).

    Merging the "liquid time constant" (or fuzzy logic) with reinforcement learning can help naturally generate the simpler inference rules of a vast state space and allow the machine to efficiently learn on its own, without an expert human creating the inference rules. I have seen this done in a neuro-fuzzy reinforcement-learning experiment with inverted-pendulum control. Lately, I have been reading a book titled "The Hedonistic Neuron", written in 1982, to try to understand more about how these RL systems work; they seem quite profoundly incredible. Thank you for sharing your incredible work!

  • @keep-ukraine-free
    @keep-ukraine-free 11 months ago

    Your idea that the "liquidity" (or the "liquid" nature of information flow) "might be a natural characteristic of electro-magnetic, inductive coupling" is incorrect. It can't be so, since information between real neurons (between any two synapses) passes not electromagnetically, but using molecules (commonly called neurotransmitters) -- which act as a key to a lock. To help you understand, the described NN model uses differential equations -- which give the model its "liquid" nature.

  • @SapienSpace
    @SapienSpace 11 months ago

    @@keep-ukraine-free The molecules are going to have electron orbitals, and the orbitals will interact inductively with each other via Maxwell's equations, whether you like it or not; this is natural, it is physics.

  • @Theodorus5
    @Theodorus5 11 months ago

    I think you mean 'ephaptic coupling' between neurons

  • @Niamato_inc
    @Niamato_inc 11 months ago

    What a time to be alive.

  • @prolamer7
    @prolamer7 11 months ago

    mental virus is in you

  • @jimj2683
    @jimj2683 a month ago

    The average neuron in the human brain has 10,000 synapses (connections with other neurons). The nodes/neurons in the neural network should be able to make connections with other nodes/neurons to mimic the human brain.

  • @computerconcepts3352
    @computerconcepts3352 a year ago

    Interesting 🤔

  • @77sanskrit
    @77sanskrit 11 months ago

    9:01 One environment that would be interesting to train it in would be wind tunnels, like the ones used to test the aerodynamics of plane parts. Get it trained on turbulence and loop-the-loops; that would be awesome!!! Just a thought 🤔👍 You guys are amazing!!!! Absolutely genius!!!! 🙏🙏🫀🧠🤖

  • @adamgm84
    @adamgm84 2 months ago

    My wig always melts when we get into composing algorithms.

  • @bobtivnan
    @bobtivnan 11 months ago

    I wonder how much of this work makes other progress in this field obsolete? I hope Lex Fridman, who also works on autonomous vehicles at MIT, invites them to his podcast.

  • @sb_dunk
    @sb_dunk 11 months ago

    Does he work on autonomous vehicles? I get the impression he's not as much of an AI expert as many would lead you to believe.

  • @bobtivnan
    @bobtivnan 11 months ago

    @@sb_dunk He has mentioned many times on his podcast that he has worked in the autonomous vehicles field. I found this lecture at MIT: kzread.info/dash/bejne/Y4Bktq2Tgca7pKQ.html

  • @sb_dunk
    @sb_dunk 11 months ago

    @@bobtivnan I wouldn't say that lecturing at MIT is equivalent to working on autonomous vehicles at MIT, the latter implies you're at the forefront of the research. The papers that I can see that he's published don't appear to be massively cutting edge either, nor even directly related to autonomous driving - the closest are about traffic systems and how humans interact with autonomous vehicles. My point is that we should take the supposed expertise of these people with a pinch of salt.

  • @bobtivnan
    @bobtivnan 11 months ago

    @@sb_dunk let it go dude

  • @sb_dunk
    @sb_dunk 11 months ago

    @@bobtivnan Oh I'm sorry, I didn't realize I wasn't allowed to question someone's credentials or expertise.

  • @arowindahouse
    @arowindahouse 11 months ago

    How can liquid neural networks have the necessary inductive biases for computer vision? If I'm not wrong, you need to add classical convolutional neural networks before the liquid layers for the model to be usable.

  • @HitAndMissLab
    @HitAndMissLab 11 months ago

    How are liquid neural networks performing in language models, where differential equations are of very little use?

  • @magi-1
    @magi-1 11 months ago

    A transformer is a fully connected graph neural network, and each layer is essentially a cross-section of a continuous process. In the same way that an RNN is a discrete autoregressive model, you can reformulate LLMs as a continuous process and sample words via a fixed integration step using Euler integration.
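The commenter's point about treating a recurrent hidden state as a continuous process sampled with a fixed Euler step can be sketched as follows. This is a toy scalar example with made-up dynamics, not an actual LLM reformulation; each input "token" is held constant over one integration window:

```python
import math

def hidden_ode(h, x, w=0.8, u=1.0):
    """Illustrative continuous-time recurrent dynamics: dh/dt = -h + tanh(w*h + u*x)."""
    return -h + math.tanh(w * h + u * x)

def rollout(inputs, dt=0.1, steps_per_token=10):
    """Advance the hidden state with fixed-step Euler integration,
    holding each input 'token' for one integration window."""
    h = 0.0
    states = []
    for x in inputs:
        for _ in range(steps_per_token):
            h += dt * hidden_ode(h, x)  # explicit Euler: h <- h + dt * dh/dt
        states.append(h)
    return states

states = rollout([1.0, 1.0, -1.0])
```

Shrinking `dt` approaches the continuous-time limit; a clocked RNN corresponds to taking one coarse step per token.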

  • @realjx313
    @realjx313 2 months ago

    The attention focus, isn't that about labels?

  • @nikhilshingadiya7798
    @nikhilshingadiya7798 11 months ago

    Now the LLM model competition is rising 😂😂😂 Love you guys for your great efforts 🎉🎉

  • @DarkWizardGG
    @DarkWizardGG 11 months ago

    In the near future one of those LLM models will secretly integrate into that liquid neural network. Guys let us all welcome the "T1000" model....TAAADDDDAAAA. It's hunting time. lol😁😉😄😅😂😂😂🤦‍♂️🤦‍♂️🤦‍♂️🤖🤖🤖🤖🤖🤖🤖🤖🤖

  • @shivakumarv301
    @shivakumarv301 11 months ago

    Would it not be wise to do a SWOT analysis of the new technology and understand its consequences before jumping into it?

  • @qhansen123
    @qhansen123 11 months ago

    Why does this say “Inventing liquid neural networks” when there have been papers on liquid neural networks, using the concept/terminology, from before 2004?

  • @hansadler6716
    @hansadler6716 11 months ago

    I would like to hear a better explanation of how an image could be input to such a small network.

  • @antoruby
    @antoruby 11 months ago

    The first layers are still regular convolutional neural nets. The decision-making layers (the last ones), which are traditionally fully connected layers, were replaced by the 19 liquid neurons.
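That architecture split can be sketched roughly as below. Everything here is illustrative (the feature size, the random weights standing in for trained parameters, and the tau parameterization are all made up; only the 19-neuron head count comes from the discussion): a feature vector standing in for the conv-backbone output feeds a small recurrent "liquid" head that emits a steering value.

```python
import math
import random

random.seed(0)

N_FEATURES = 32   # illustrative size of the conv-backbone output
N_LIQUID = 19     # the 19 liquid neurons mentioned above

# Random illustrative weights standing in for trained parameters.
W_in = [[random.gauss(0, 0.1) for _ in range(N_FEATURES)] for _ in range(N_LIQUID)]
W_rec = [[random.gauss(0, 0.1) for _ in range(N_LIQUID)] for _ in range(N_LIQUID)]
W_out = [random.gauss(0, 0.1) for _ in range(N_LIQUID)]

def liquid_head(features, state, dt=0.1):
    """One Euler step of the 19-neuron recurrent head; returns (new_state, steering)."""
    new_state = []
    for i in range(N_LIQUID):
        # Drive from the backbone features plus recurrent input from the head itself.
        pre = sum(W_in[i][j] * features[j] for j in range(N_FEATURES))
        pre += sum(W_rec[i][j] * state[j] for j in range(N_LIQUID))
        f = math.tanh(pre)
        tau = 1.0 / (1.0 + abs(f))          # input-dependent time constant
        new_state.append(state[i] + dt * (-state[i] / tau + f))
    # Linear readout of the head state as a single control output.
    steering = sum(W_out[i] * new_state[i] for i in range(N_LIQUID))
    return new_state, steering

# One frame: pretend the conv backbone produced this feature vector.
features = [random.gauss(0, 1) for _ in range(N_FEATURES)]
state = [0.0] * N_LIQUID
state, steering = liquid_head(features, state)
```

In the real system the backbone would be a trained CNN processing camera frames, and the head would be called once per frame, carrying its state forward in time.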

  • @lesmathsparexemplesatoukou3454
    @lesmathsparexemplesatoukou3454 a year ago

    CSAIL, I'm coming to you

  • @RoyMustang.
    @RoyMustang. 11 months ago

    ❤❤❤

  • @tachoblade2071
    @tachoblade2071 11 months ago

    If the network can understand the reason behind the tasks, or their causality... could it "understand" language rather than just being an auto-completer like GPT?

  • @kesav1985
    @kesav1985 6 months ago

    (Re-)inventing time-integration schemes would have been dubbed crappy if they were "invented" by some unknown academic at a non-elite university. But hey, ML hype and the MIT brand work wonders in selling ordinary stuff!

  • @joaopedrorocha4790
    @joaopedrorocha4790 11 months ago

    This is exciting... Would this kind of network be able to forget data that no longer fits its needs? For example... I give it time-series data in the first training, it learns some pattern from it and acts according to it, but then things change in the world and the pattern it learned is no longer that useful... Could this kind of network just forget that pattern gradually and adapt based on its new input?

  • @wiliamscoronadoescobar8113
    @wiliamscoronadoescobar8113 a year ago

    About this... Tell the people and the university around ...

  • @spockfan2000
    @spockfan2000 7 months ago

    Is this tech available to the public? Where can I learn how to implement it? Thanks.

  • @User_1795
    @User_1795 5 months ago

    Be careful

  • @DanielSanchez-jl2vf
    @DanielSanchez-jl2vf 11 months ago

    Guys, let's show this to Demis Hassabis, Yann LeCun, Ilya Sutskever, and Yoshua Bengio.

  • @vallab19
    @vallab19 11 months ago

    Liquid Neural Network is another level in the AI revolution.

  • @DarkWizardGG
    @DarkWizardGG 11 months ago

    Yeah, T1000 in the making. Lol😁😉😄🤖🤖🤖🤖

  • @and_I_am_Life_the_fixer_of_all
    @and_I_am_Life_the_fixer_of_all a year ago

    wow, I'm the 4th comment! What a time to be alive! Welcome to history key authors :D

  • @ldgarcia27
    @ldgarcia27 a year ago

    Hold on to your papers!

  • @hamamathivha6055
    @hamamathivha6055 11 months ago

    Dear fellow scholars!

  • @Lolleka
    @Lolleka 11 months ago

    And here I was, thinking that the video was about liquid-phase computing. Silly me.

  • @arowindahouse
    @arowindahouse 11 months ago

    I thought Liquid State Machines had already been invented in the 90s by Maass

  • @francisdelacruz6439
    @francisdelacruz6439 11 months ago

    Once you have collision detection, which can be a separate system, you have a certifiable self-driving ecosystem. Maybe it's time to go start-up and have the resources to get this to an actual product; raising USD 100mn would be easy and manageable equity exposure. The drone example is a game changer for the new type of war in Ukraine: you could use a Raspberry Pi-equivalent board in drones, and the implications will change how wars are fought, likely making it harder to invade other countries with this tech add-on.

  • @BradKittelTTH
    @BradKittelTTH 11 months ago

    This means that the mature neurons we humans can produce starting in our late 40s and after (if in good health), which operate at 10-100 times the speed of juvenile neurons, suggest that elders could far out-think younger people, growing new ideas and abilities, with potential just being unleashed after our 60s, when we have accumulated a host of higher operating systems, synapses, and neural networks superior to quantum computers. Given there is evidence that we can produce 700 neurons a night, what is our potential into our 80s to get smarter too? This is the potential of humans once we understand it, if we master the vessel "wii", all the "I"s that understand the bio-computers wii communicate through, the avatars that form the "mii", with eyes watching you.

    A fabulous new discovery, and it is amazing you have been able to tap these incredible liquid neural networks, which also suggest that all beings have the potential to understand and navigate more of reality than humans ever imagined before these discoveries. If a worm can do so well with 302 neurons, what is the limitation, if any, of a billion-neuron network comprised of millions of tinier networks that intermingle at optical-cable speeds? Thank you for this interview.

  • @DarkWizardGG
    @DarkWizardGG 11 months ago

    This is T1000 in the making. Liquid shapeshifter AI. lol😁😉😄🤖🤖🤖🤖🤖

  • @randomsitisee7113
    @randomsitisee7113 11 months ago

    Sounds like a bunch of mumbo jumbo trying to cash in on the AI train

  • @Stopinvadingmyhardware
    @Stopinvadingmyhardware 11 months ago

    Grokked.

  • @amarnathmutyala1335
    @amarnathmutyala1335 11 months ago

    So worms can mentally drive cars ?

  • @MuscleTeamOfficial
    @MuscleTeamOfficial 11 months ago

    Elongated muskrat is comin🎉

  • @ethanwei5060
    @ethanwei5060 11 months ago

    Another application for this solution is banning bad social media posts and keeping harmful and unwanted content away from children. Current human moderators require counselling after a day of moderating, and if liquid neural networks can focus on the task instead of the context like traditional AI, it could be game-changing.

  • @ibrremote
    @ibrremote 11 months ago

    Task-centric, not context-centric…

  • @wiliamscoronadoescobar8113
    @wiliamscoronadoescobar8113 a year ago

    Here... And for the goodness of education I want to introduce a field of investigation about the information in the world, the contamination in it, and surely... touching topics like the photons that my investigations can reach. On the web. For references, see the data discussed live at a conference by my friend Andres Manuel Lopez Obrador, President of Mexico.

  • @Jediluvs2kill
    @Jediluvs2kill 11 months ago

    This Ramin guy is a waste of time

  • @alexjbriiones
    @alexjbriiones 11 months ago

    I am sure Elon Musk is paying attention to this group and would probably try to hire them to complement Tesla's autonomous driving. Even more ominous is that China and Russia are probably setting their engineers to duplicate this invention.

  • @bobtivnan
    @bobtivnan 11 months ago

    I also thought that Musk would pursue this, for Tesla and for more general application with his new company xAI.
