Intel Advances in AI: Brain-Like Computing and Spiking Neural Networks Explained

In this video I discuss Neuromorphic Computing and the Future of AI
#AI
Support me on Patreon: / anastasiintech

Comments: 343

  • @AnastasiInTech
    @AnastasiInTech A year ago

    Let me know what you think!

  • @danielricardolizicastro5332

    @danielricardolizicastro5332 A year ago

    Thanks for the always great content! Happy holidays and an amazing 2023 for you! Cheers!

  • @masternobody1896

    @masternobody1896 A year ago

    Bad news: I became homeless because of inflation. Please help.

  • @Knapweed

    @Knapweed A year ago

    OK, I "think" you're awesome.

  • @sulimanibra5332

    @sulimanibra5332 A year ago

    Kindly, let me contact you in private! I have a serious scientific subject to discuss...

  • @BabbittdaWabbitt

    @BabbittdaWabbitt A year ago

    I thought holography was going to fit into all of this AI in a major way, no?

  • @kipling1957
    @kipling1957 A year ago

    We need to get this guy on the Lex Fridman podcast.

  • @martin-fc4kk

    @martin-fc4kk A year ago

    Would be great!

  • @armartin0003
    @armartin0003 A year ago

    This comment is for the algorithm. As an AI enthusiast, I rarely see someone this close to the bleeding edge of this technology. Great content. I had been curious how we were going to get faster when chips are already at the molecular level - and BAM a heroine comes to the rescue. Hooray! I'd subscribe to anyone for this, but it makes me especially happy to see a young lady delivering the information. Pretty and smart CAN go hand in hand! I hope Anastasi becomes far more famous than the Kardashians of the world. Girls need more role models such as this.

  • @mistycloud4455

    @mistycloud4455 A year ago

    Combining AI and biology is important.

  • @treborobotacon

    @treborobotacon A year ago

    My like of your comment is also for the algorithm. Well put.

  • @panpiper

    @panpiper A year ago

    Couldn't agree more.

  • @runeoveras3966

    @runeoveras3966 A year ago

    Well put! Support this 100%! 😊

  • @tedmoss

    @tedmoss A year ago

    Who are the Kardashians?

  • @therealb888
    @therealb888 A year ago

    This is awesome. A neuromorphic approach to AI is something that I imagined even before joining engineering because it felt so intuitive. Here I am today with more questions after watching this video. This is the kind of research that excites me and makes me light up like a neuron lol.

  • @1q2w3e4r5t6zism

    @1q2w3e4r5t6zism A year ago

    A neuromorphic approach was started in 1996 at the Institute of Neuroinformatics in Zürich (Rodney Douglas).

  • @noldor__
    @noldor__ A year ago

    As a long time software developer who has not yet worked with AI, this type of revolutionary news can be a little scary for me =) but it is also exciting and motivational to finally start learning AI so that I won't be obsolete in the near future.

  • @JamesAwokeKnowing
    @JamesAwokeKnowing A year ago

    The format of this video is FANTASTIC. Way better than just the raw interview and better than just the pure "report". If your channel were a stock, I'd be buying. Thank you for investing so much effort in this video. You are bringing the AI, journalism, and media world in a beautiful direction. I hope the future is full of videos like this.

  • @AnastasiInTech

    @AnastasiInTech A year ago

    Thank you :))

  • @Slavigrad
    @Slavigrad A year ago

    Explain neuromorphic chip. ChatGPT Response:

    A neuromorphic chip is a type of computer chip that is designed to mimic the structure and function of the human brain. It is inspired by the way neurons and synapses in the brain work and is intended to enable more efficient and flexible computation. Neuromorphic chips are typically made up of a large number of artificial neurons and synapses that are connected in a way that allows them to process information in a similar way to the human brain. They can be used to perform a wide range of tasks, such as image recognition, natural language processing, and decision-making.

    One of the key advantages of neuromorphic chips is that they are highly energy efficient, as they are able to perform complex tasks using much less power than traditional computer chips. They are also able to operate in real time and can learn and adapt to new information without the need for explicit programming.

    There are a number of different approaches to designing neuromorphic chips, including using traditional semiconductor materials, such as silicon, and more novel materials, such as memristors. Research in this area is ongoing, and it is hoped that neuromorphic chips will eventually be able to perform tasks that are currently beyond the capabilities of traditional computer chips.
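
The "artificial neurons and synapses" idea in the pasted response can be illustrated in a few lines of Python. This is a toy sketch only: the layer sizes, weights, and threshold below are invented for illustration and are not taken from any real neuromorphic chip.

```python
import numpy as np

# Invented synaptic weights: 3 input neurons -> 2 output neurons.
weights = np.array([[0.6, 0.1],
                    [0.2, 0.9],
                    [0.7, 0.3]])
threshold = 1.0
potential = np.zeros(2)   # membrane potential of the 2 output neurons

def step(input_spikes):
    """Advance one timestep, integrating only the inputs that spiked."""
    global potential
    active = input_spikes.astype(bool)
    # Event-driven: silent inputs contribute nothing and can be skipped --
    # this sparsity is where the claimed energy efficiency comes from.
    potential += weights[active].sum(axis=0)
    fired = potential >= threshold
    potential[fired] = 0.0    # reset any neuron that fired
    return fired

out = step(np.array([1, 0, 1]))   # inputs 0 and 2 spike together
```

Here the first output neuron crosses threshold (0.6 + 0.7 = 1.3) while the second keeps accumulating (0.1 + 0.3 = 0.4) and stays silent until later input pushes it over.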

  • @accesser
    @accesser A year ago

    Happy holidays, that's very nice. Thank you for your videos. Hoping for lots more next year.

  • @janpancake
    @janpancake A year ago

    This is an interesting take on this topic from the chip and architecture level that I was not expecting when I clicked on the video!

  • @416dl
    @416dl A year ago

    Another interesting exploration into this subject...looking forward to the coming year as the field and your channel grows. Buon Natale

  • @Crunch_dGH
    @Crunch_dGH A year ago

    The only channel I default playback to .75x, sometimes .5x & still have to repeat multiple times. That’s value in my book!

  • @prabakark6791

    @prabakark6791 7 months ago

    😀

  • @brandonvinson5255
    @brandonvinson5255 A year ago

    Best channel for deep insight.

  • @JonS
    @JonS A year ago

    There are quite a few other companies working on spiking neural network chips (BTW, the idea dates back to 1952!). Prophesee in France is using their SNN for event-based vision. Sony even released a couple of image sensors with Prophesee's technology built in. It's going to be interesting to watch this over the next few years. Thanks for the video.

  • @narrativeless404

    @narrativeless404 A year ago

    Lots of discoveries were made a long time ago and then forgotten for quite a while.

  • @moosethompson
    @moosethompson A year ago

    Two points you made that I found very interesting. First is the low power requirements. The second is how neuromorphic chips are targeting the distributed market as opposed to big machine-learning data centers. It seems to me that between these two factors this could make a huge impact through a wide variety of mobile devices. Maybe even stop people looking at their phone from stepping into traffic. Now wouldn't that be a wonder. :)

  • @danko5866
    @danko5866 A year ago

    Thanks for making this, I wanted to see exactly this type of content about Neuromorphic computers

  • @dchdch8290
    @dchdch8290 A year ago

    Simply great video! Thanks for sharing these insights!

  • @godsgod1677
    @godsgod1677 A year ago

    Anastasi is like an ASMR AI to me. I sometimes don't know what she's saying as I'm literally just listening.

  • @russadams3008
    @russadams3008 A year ago

    Another very interesting video. Thanks for your content. 👍👍👍

  • @Ty_Mathieson
    @Ty_Mathieson A year ago

    Informative as always, thanks.

  • @sergefrancois7405
    @sergefrancois7405 A year ago

    Thank you Anastasi for this interesting information.

  • @yonaguska2050
    @yonaguska2050 A year ago

    This is one of your most intriguing videos to date. I’ve been following neural network architectures for a long time. My first digital circuit designs were asynchronous, to gain the speed advantage of propagation over clocking. This new architecture seems to have a natural advantage over conventional deep learning systems. I look forward to hearing more about its future.

  • @WickedGamerCollector
    @WickedGamerCollector A year ago

    Whenever you start talking about AI... instant Terminator/Skynet flashback 😆 Wondering, could something like this happen? 🤔

  • @qjtvaddict

    @qjtvaddict A year ago

    Yes, only a matter of when, not if 😊

  • @jesselistarseed
    @jesselistarseed A year ago

    This was an absolutely fascinating video Anastasi!!!! The future of the chip...the NN

  • @AlexGGener
    @AlexGGener A year ago

    Pretty nice video!! Well organized and clear. From someone researching SNNs and spiking architectures, some extra info: the Loihi chip is not asynchronous per se. All the logic is clock-based, since it is digital; it is asynchronous at the message-protocol level between neurons. Although it gives amazing improvements in power consumption, there is still room for SNNs.

    The Akida chip is more focused on feedforward SNNs converted directly from CNNs for simple tasks such as classification. Also, only LIF neurons are supported. In the end, events are packed into frames and the temporal information of the spike is lost. However, they consume less than GPUs.

    I'm glad that you mentioned memristors and RRAM architectures; not a lot of people know about them. Just to mention, there are already some teams researching photonic architectures; check Ipronics and the UPV in Spain. Glad to see someone talking about neuromorphics on KZread! Keep going!!
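
For readers unfamiliar with the LIF (leaky integrate-and-fire) neurons mentioned above, here is a minimal discrete-time sketch. It is illustrative only, not Loihi's or Akida's actual neuron model; the time constant, threshold, and input drive are arbitrary made-up values.

```python
def lif_spikes(input_current, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Return the timesteps at which a single LIF neuron spikes."""
    v = 0.0
    spikes = []
    for t, i_in in enumerate(input_current):
        v += dt * (-v / tau + i_in)   # leak toward rest, integrate input
        if v >= v_thresh:             # threshold crossing -> spike event
            spikes.append(t)
            v = v_reset               # reset the membrane potential
    return spikes

times = lif_spikes([0.08] * 100)      # constant drive -> regular spiking
```

With this constant drive the neuron fires every 20 steps; between spikes it emits nothing, which is exactly the event sparsity that spiking hardware exploits.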

  • @kral2498
    @kral2498 A year ago

    Thanks for always bringing quality information, I enjoy your videos and I feel your breakdowns are always easy to follow :)

  • @akanhakan
    @akanhakan A year ago

    You should have your own podcast.

  • @markmalonson7531
    @markmalonson7531 A year ago

    Excellent presentation.

  • @wplaat
    @wplaat A year ago

    Thanks. Really interesting topic.

  • @JamesAwokeKnowing
    @JamesAwokeKnowing A year ago

    I've been following neuromorphic computing since IBM's project over a decade ago. I feel a bit sad they were 'ahead of their time', but they did validate the concept in an at-scale "commercial" way (not a lab with wires all over) and inspired the industry. It's wonderful to see this tech now on the verge of coming of age, now that way more AI engineers are out there and commercial demand for AI in edge devices is high. Excellent video.

  • @narrativeless404

    @narrativeless404 A year ago

    Well, most tech arises ahead of its time and fades away, out of either no use or being flat-out discarded. And then it gets brought up again when it becomes actually practical and finds a demand.

  • @sirus312

    @sirus312 A year ago

    @@narrativeless404 TLDR yolo IBM/INTEL for the next decade?

  • @narrativeless404

    @narrativeless404 A year ago

    @@sirus312 Uh, what?

  • @YogeshBaisyet
    @YogeshBaisyet A year ago

    Didn't Intel just announce the Brainchip Akida IP foundry? What does that mean?

  • @AnastasiInTech

    @AnastasiInTech A year ago

    Intel will manufacture Brainchip's chips

  • @Karla-Flr
    @Karla-Flr A year ago

    Brilliant presentation! Your videos are magnificent 🎊 🎉

  • @MrCtfx
    @MrCtfx A year ago

    Merry Christmas to you and the cat

  • @BruceNewhouse
    @BruceNewhouse A year ago

    Another amazing episode, thank you. Question: We are steadily moving ahead with advances in computer technology in every aspect, but it appears to be only a matter of time before the newest technologies come together and achieve improvements significantly beyond expectations. Based on your knowledge, awareness, education, intuition, gut instincts, educated guesses, etcetera, when do you predict the next major advance in home and business computers will hit the mainstream market? Any thoughts on what the top-of-the-line home and business computers will be like in five years? I hope you have a Very Merry Christmas and Happy Holy Days.

  • @towerparanormalinvestigations
    @towerparanormalinvestigations A year ago

    It's definitely one of my new favorite channels!

  • @stevefreeman6646
    @stevefreeman6646 A year ago

    Every now and then I stop by. I think it's because there are so few content creators covering the deeper topics in engineering and hardware. The associations made with the chemistry field are interesting too. On spiking systems, I've noticed the similarity with how humans make snap decisions, a situation I'm familiar with. Maybe it was luck, but mine were usually the right ones. My point is, using these systems with others (e.g., GPT), along with expanded hardware speed and integration, could solve more complex problems like this, ones that appear unpredictable. QC? Anyway, thanks a bunch and have a great holiday.

  • @neotower420
    @neotower420 A year ago

    Whoa, this is fascinating. Can't wait to see what other unique ideas come from these developments.

  • @Nilmoy
    @Nilmoy A year ago

    Amazing, I see there is an ingenious revolution going on!

  • @MrFoxRobert
    @MrFoxRobert A year ago

    Thank you!

  • @FredPlanatia
    @FredPlanatia A year ago

    Another excellent video on a fascinating topic. I think you combine the excerpts from interviews well with your own commentary. Mr. Davies is eloquent in his explanations, but I think I would get overloaded if there were long uninterrupted monologues from him. This way the information is quite palatable for laypersons.

  • @chad0x
    @chad0x A year ago

    Anastasi's voice is so nice. I could listen to her all day talking about anything :)

  • @MozartificeR
    @MozartificeR A year ago

    You have your spaceman in the background. Very nice :)

  • @dreamphoenix
    @dreamphoenix A year ago

    Thank you.

  • @LorenzoVDH1996
    @LorenzoVDH1996 A year ago

    This is so exciting, what a time to be alive.

  • @BanXxX69
    @BanXxX69 A year ago

    First only comment🤓🤘🏽🥺❤️🤗 This is insanely interesting :D

  • @jerimiahsmedley3722
    @jerimiahsmedley3722 A year ago

    I really love your videos! 🥰🥰🥰 Please keep up the hard work. I wish I could at least do coding, but I fail or am unable to problem-solve my way through it. My education is limited and my background did not set me up for success. I wish I were as capable as you. I truly enjoy your work, your field of research, and your energy for this world of tech/engineering!

  • @DanielWolf555
    @DanielWolf555 A year ago

    Interesting interview with the Intel guy.

  • @benh7475
    @benh7475 A year ago

    Thanks for everything you do.

  • @emilythorne8181
    @emilythorne8181 A year ago

    Wonderful. What will happen in the future? 💙

  • @shadowarez1337
    @shadowarez1337 A year ago

    Thanks for these videos. I'm starting to think this will be the history-defining video series: once AI takes over, this series will show future civilizations how the AI takeover started 🤣 Great work as always; thank you for always giving us a glimpse into the future.

  • @AnastasiInTech

    @AnastasiInTech A year ago

    Thank you :)

  • @sirousmohseni4
    @sirousmohseni4 A year ago

    Thanks for making this video.

  • @martin-fc4kk
    @martin-fc4kk A year ago

    Awesome content!

  • @PythonAndy
    @PythonAndy A year ago

    Good video again :)

  • @doug3691
    @doug3691 A year ago

    Merry Christmas!

  • @centuriomacro9787
    @centuriomacro9787 A year ago

    Awesome video. I almost skipped it; it wasn't clear to me that it's about hardware.

  • @waynemorellini2110
    @waynemorellini2110 A year ago

    This guy has it right. This follows what I came up with when I was around 10 years old, about how to emulate intelligence. Lots of progress over the decades, but my model had greatly expanded, and he hasn't talked about any of the additional stuff needed for intelligence! This is still complicated Lego blocks!

  • @augustinelopez1508
    @augustinelopez1508 A year ago

    Merry Christmas to all! Real-time, energy-efficient, high-adaptation, and optimization solutions ring like jingle bells to me. And it sounds great. Keep up the excellence. 😎

  • @easye8354
    @easye8354 A year ago

    Thank you and Merry Christmas 🎅. As a shareholder of Brainchip, what is the significance of the recent partnership with Intel Foundries?

  • @micwin2
    @micwin2 A year ago

    This text was written in German and translated into English using chatGPT.

    Anastasi, thanks again for such a factual, competent and (can I say?) charming contribution. I feel validated by science. I am a software developer, administrator and AI enthusiast, and now I finally have a name for where AI needs to go. You are right: classical approaches (and chatGPT probably also belongs to this category) follow a von Neumann architecture, while the neuromorphic approach is more promising; I am a fan of decentralized structures and peer-to-peer anyway. I hope the name catches on, because it hits the mark.

    If the project leaves the lab, Intel will probably have to make concessions to the classical needs of chips and boards, simply to lower costs for the end customer through third-party compatibility - if it ever gets to the end customer or prosumer! I would definitely be in if someone gave me a corresponding PCI-based card to play with :-) or even better, a USB or LAN box that I could connect to my Raspberry cluster... my goodness, imagine if every household had an AI box - what that would change in the world...

    Anyway, thanks again for the contribution, I always enjoy watching you 🙂

  • @narrativeless404

    @narrativeless404 A year ago

    Yeah, classical computers aren't doing well for AI because they weren't designed for it to begin with. They were just machines meant to execute commands sequentially and consistently, in sync with the entire system.

  • @KazenoniKakuremi
    @KazenoniKakuremi A year ago

    We have to solve for inference and this sounds like an interesting approach

  • @josephmcmanus6385
    @josephmcmanus6385 A year ago

    Thanks!

  • @AnastasiInTech

    @AnastasiInTech A year ago

    Thank you !

  • @WXSTANG
    @WXSTANG A year ago

    I wonder... About 20 years ago I reprogrammed an assembly line that had some problems with data sets and was slow. I set up a comparator to check the data coming in against the data retained, in 8-bit chunks, and if nothing changed I would skip re-sending the information until there was data that needed to update. It was massively faster and more efficient. It almost sounds like I set up a primitive neural net without even knowing it. BTW... it worked fantastically.
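
The comparator trick described above can be sketched as a delta-send loop: transmit a value only when it differs from the last one sent, which is loosely the same "stay silent while nothing changes" principle behind spiking hardware. A hypothetical toy, not the original assembly-line code:

```python
def delta_send(samples):
    """Send a sample only when it differs from the value last sent."""
    sent = []
    last = None
    for t, s in enumerate(samples):
        if s != last:             # comparator: data in vs. data retained
            sent.append((t, s))   # transmit only the change event
            last = s
    return sent

events = delta_send([3, 3, 3, 5, 5, 3, 3])   # -> [(0, 3), (3, 5), (5, 3)]
```

Seven samples collapse to three transmissions; the bandwidth saving grows with how static the data stream is.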

  • @michaelmindman5897
    @michaelmindman5897 A year ago

    Good video!

  • @johnmanderson2060
    @johnmanderson2060 A year ago

    Great video 😊 A directional mic or tie mic would be great, as it would suppress the room ambience. Can you give it a test? Merry Christmas 🎄

  • @visiter127
    @visiter127 A year ago

    We're not far away from a time when you intro someone on your channel, as the head of a company or an engineer, who is actually just something an AI has dreamed up and deepfaked a video of, and we won't know the difference until you tell us lol. Happy Christmas, and hopefully 2023 will be an amazing year for this channel.

  • @Aboriginal_Alien
    @Aboriginal_Alien A year ago

    You have become the number one source for the highest-quality info and current reporting on developments in one of the most important sectors in the world. Microchip technology is the new nuclear weapon. It is behind drones, which will be the future of warfare; but more importantly, we will see that a country's sovereignty is tied to its economic health, and ultimately using chips for AI, blockchain technologies, transport, drones, etc. will prove to be what determines whether a country eats other countries or is eaten.

  • @darrellrobinson707
    @darrellrobinson707 A year ago

    Happy holidays

  • @kipling1957
    @kipling1957 A year ago

    Such temporal filters could be a significant step towards relevance realization while avoiding combinatorial explosion in novel complex environments.

  • @wynegs.rhuntar8859
    @wynegs.rhuntar8859 A year ago

    Hi, Anastasi! Are there studies talking about brain clockspeed? What do you know about this? Take care and happy holidays!

  • @light7126
    @light7126 A year ago

    Anastasi is like an AI whisper in my ear.

  • @mnomadvfx
    @mnomadvfx A year ago

    Spiking neural nets in silicon form are not new at all. IBM was at this years ago: in 2015, on the 28 nm process, with their TrueNorth chip. It had 4,096 cores, each with 256 'neurons' that each connected to 256 inputs, for 65,536 synapses per core. So 1,048,576 neurons and 268,435,456 synapses in a single chip, all pulling just 70 milliwatts. Also, it was using SRAM, which isn't the most area-efficient memory, so there was huge room for improvement in a design predicated on an MRAM- or ReRAM-based compute-in-memory architecture.
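
The TrueNorth figures in the comment are internally consistent if each core is a 256 x 256 crossbar (every one of its 256 neurons wired to 256 inputs), which is a quick check to run:

```python
cores = 4096
neurons_per_core = 256
synapses_per_core = 256 * 256        # 256 neurons x 256 inputs per core

total_neurons = cores * neurons_per_core     # 1,048,576
total_synapses = cores * synapses_per_core   # 268,435,456
```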

  • @sullenstudent48
    @sullenstudent48 A year ago

    Neuromorphic computing is a type of computing that is inspired by the way the human brain works. It involves the use of hardware and software systems that mimic the structure and function of neurons and synapses in the brain. The goal of neuromorphic computing is to create more efficient and powerful computing systems that can perform complex tasks, such as pattern recognition, decision making, and learning, in a more energy-efficient and robust way than traditional computers.

    One of the key features of neuromorphic computing systems is that they use parallel processing, which allows them to perform many tasks at the same time. This is in contrast to traditional computers, which typically use sequential processing and can only perform one task at a time.

    Neuromorphic computing systems also tend to be more energy efficient than traditional computers because they use less power and are able to perform tasks using fewer transistors. This makes them well suited for use in devices that need to be small and portable, such as smartphones and wearable devices.

    Overall, neuromorphic computing has the potential to revolutionize the way we think about computing and could lead to the development of new technologies and applications that were previously unimaginable.

  • @gama3181
    @gama3181 A year ago

    One of the main topics of interest for me is organic computing. I expect that in the future we can use DNA to encrypt and store digital information, proteins to build logic gates and neural networks, and whole cells as computing nodes. There are a couple of very interesting scientific papers about this, but they are incipient and expensive.

  • @malindrome9055

    @malindrome9055 A year ago

    That can also help us live for thousands of years! I want to live to see fusion fully take place and traveling outside the solar system! I'm 15, so I hope that happens in my lifetime!

  • @Sven_Dongle

    @Sven_Dongle A year ago

    @@malindrome9055 The way things are going, you would get to see the world decline into tribal ghet-toe stupidity for thousands of years, trapped on this mudball drowning in human feces. Sounds dandy.

  • @malindrome9055

    @malindrome9055 A year ago

    @@Sven_Dongle Not if AI takes over.

  • @gama3181

    @gama3181 A year ago

    @@malindrome9055 I hope so! I'm 26, and I also want to see the new arrival to the Moon and Mars... But certainly these are bad times, especially for certain countries/peoples. I highly recommend this short film about the changing world :/ kzread.info/dash/bejne/qpupw89phK6zp5s.html

  • @Sven_Dongle

    @Sven_Dongle A year ago

    @@malindrome9055 Takes over what? The ghet-toe?

  • @JazevoAudiosurf
    @JazevoAudiosurf 7 months ago

    The main difference with SNNs is that they propagate chaotically in all directions, not just layer by layer unidirectionally. It still seems mathematically simple to describe, still a linear regression. And then you apply a sort of reward to certain connections (at least the brain does), and those connections strengthen through dopamine. Essentially RL.
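
The "reward strengthens recently active connections" idea above can be sketched with an eligibility trace that reward then gates. This is a crude illustrative stand-in for dopamine-modulated plasticity, not a faithful STDP model; the weight matrix, learning rate, and decay are arbitrary made-up values.

```python
import numpy as np

weights = np.full((2, 2), 0.5)         # pre x post synaptic weights
eligibility = np.zeros_like(weights)   # trace of recent co-activity

def step(pre, post, reward, lr=0.1, decay=0.9):
    """One update: co-active synapses become eligible; reward commits them."""
    global weights, eligibility
    eligibility = decay * eligibility + np.outer(pre, post)
    weights = np.clip(weights + lr * reward * eligibility, 0.0, 1.0)
    return weights

w = step(np.array([1, 0]), np.array([0, 1]), reward=1.0)
```

Only the synapse from the pre-neuron that fired to the post-neuron that fired moves (0.5 toward 0.6); with reward set to 0, nothing would change no matter which neurons were co-active.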

  • @MozartificeR
    @MozartificeR A year ago

    Timing controls the context in which the data is interpreted by the brain, by the sounds of it. That makes sense as well, because we interpret things from multiple I/O devices connected to the central and peripheral nervous systems, e.g. touch, sight, sound, etc. Could you imagine if we did not do the same consciously when interacting with the world?

  • @luizmt2
    @luizmt2 A year ago

    I just like this voice!

  • @kingmaker007.
    @kingmaker007. A year ago

    Wish you all the best, nicely reported.

  • @springwoodcottage4248
    @springwoodcottage4248 A year ago

    Super interesting & well presented, but until we see some practical applications it is impossible to know whether these architecture ideas are practical & powerful or just very early stage technology developments that greatly lag conventional architecture in practical performance. Ideally we would need to see direct comparison on tackling several edge computational tasks between the architectures. Maybe such data exists, but as of now I haven’t seen it & that makes me suspicious about the practicality of these approaches. Thank you for sharing!

  • @danielhull9079
    @danielhull9079 A year ago

    Yay, an unpredictable clock!

  • @beautifulsmall
    @beautifulsmall A year ago

    Asynchronous, capacitance, AC-coupled: a far more direct way than event cameras. Mythic is back; very happy to see it. The research will soon be keeping an eye on us.

  • @Ryan256
    @Ryan256 A year ago

    Because timing, tempo, and cadence are inherent to neuromorphic AI, I imagine it will be better at composing original music than traditional AI, which basically just repackages the music it was trained on that was created by human composers. Neuromorphic AI will have an intrinsic feel for music, as opposed to traditional AI that just shuffles around preexisting musical phrases.

  • @Waynesification
    @Waynesification A year ago

    You mean Chuck Moore's asynchronous patent, which a lot of companies, including Intel, were signed onto. Chuck used it to design his low-energy processors and processing arrays. You could fully simulate this type of AI network on that.

  • @nesseihtgnay9419
    @nesseihtgnay9419 A year ago

    awesome

  • @bogoljubivic9517
    @bogoljubivic9517 A year ago

    Nice video!! It seems to me that it is better to have 10 normal chips connected to each other via one motherboard than mixing all in one.

  • @TonyDiCroce
    @TonyDiCroce A year ago

    It's interesting... when I learned NNs, I asked my teacher about the temporal issue and he had no idea what I was talking about.

  • @MozartificeR
    @MozartificeR A year ago

    Would you say neuromorphic chips have processing cycles, or pulses? What is the correct way to think of it, considering they are analogue?

  • @solosailorsv8065
    @solosailorsv8065 A year ago

    SUBBED: My senior project was a digitally controlled network of op-amps. 32 op-amps with 3 digital potentiometers each was a big board (in the '80s I used resistor ladders). It could be programmed to perform nearly instantaneous analog calculations/processing, and could feed back to 'weight' each node on the fly. Do you know of anyone pursuing this approach? The new chip stacking can make these nodes as dense as FPGAs (op-amps and digital are different fab processes). (PS, I've had so many patents stolen I don't care anymore and just went SAILING!, LoL)

  • @solosailorsv8065

    @solosailorsv8065 A year ago

    @@marmosetman That's an interesting (worthy) application: a digitally controlled analog array might be used to create the training data across a Monte Carlo of variable analog inputs for AI training extremely quickly.

  • @silvomuller595
    @silvomuller595 A year ago

    What a time to be alive

  • @2pist

    @2pist A year ago

    We get to witness the creation of our own prison.

  • @redberries8039
    @redberries8039 A year ago

    Anyone have opinions on which framework to adopt for spiking networks? Lava seems fair to me, but I've only taken a quick look.

  • @tedviens1
    @tedviens1 A year ago

    Our brain is primarily a massively parallel pattern-matching system. Our physical body sensors provide the input to our brain's big data set. Most data fields are sensory images; other fields are subsets containing thoughts, feelings, and inference-relationship processing. Our brain has a secondary purpose of being a physical body monitoring system with feedback control of body functions. All of this processing by our brain is output at varying levels of awareness to our mind. Our mind is not located within our brain. Our mind is the envelope that wraps around our awareness, our sense of self. It is from our awareness that we relate to other beings and the physical world we share with them. It is within our awareness that we hold an imperfect knowledge of our self and our stumbling relationship to everything outside ourselves. The functioning of our mind is not indicated on any EEG, fMRI, PET scan, or other brain-activity tool.

  • @imconsequetau5275

    @imconsequetau5275 A year ago

    MRI scans clearly show that decisions are made subconsciously first, and then we consciously become aware of the decision as we act. A great deal of practice is required to turn decision making into a methodical, slow, conscious act of listing the merits and relative weights of each input. This is not conducive to survival when snap decisions are required, but it is imperative for engineering.

  • @tedviens1

    @tedviens1 A year ago

    @@imconsequetau5275 You are simply giving a great example of how lies are built in science. MRI, fMRI, PET scans and other brain-activity sensing systems merely reveal brain activity localized to areas of the brain determined to be specific to certain thought activity of the subject. These regions were first determined by invasive wire stimulation and expanded by thousands of hours of brain-activity monitoring with external stimulation by photos, sounds, external spoken remarks, or requested subject thought phrases. It is pure vanity for any researcher to declare that the measured activity is related to the brain's formation of a specific thought. Just as it is vanity for me to assert that all the measured activity, cognitive and pre-cognitive, is limited to pattern-matching activity around the topic generated or about to be generated in the subject's mind, which is external to the brain.

  • @YasinNabi
    @YasinNabi A year ago

    The easiest way to succeed in any field is to work on it consistently, and consistency is the key to success. I experienced this myself. By the way, thanks for the uploads, from a fellow creator.

  • @danielaraos8483
    @danielaraos8483 A year ago

    Hi Anastasia, I am a little confused. If the hardware will be spiking, asynchronous, and analog, does this mean that quantum computer architecture will no longer be useful?

  • @pacobrezel
    @pacobrezel A year ago

    I'm curious about which of all the technologies will prevail in the future.

  • @jasonkocher3513
    @jasonkocher3513 A year ago

    Could you implement a smaller SNN "neuromorphic fabric" on an existing FPGA or CPLD? Would be fun to start playing with these concepts on a cheap dev board.

  • @Rising_Pho3nix_23
    @Rising_Pho3nix_23 11 months ago

    I swear as soon as I get a handle on what a neural network neuron is, they come up with something totally new haha

  • @qkktech
    @qkktech A year ago

    Neuromorphic analysis is OK, but it can be better when the neuromorphic node can somehow explain to a digital computer what it did. So there must be some top layer built that can feed and retrieve info from such a computer. Twenty years ago I built one very simple node on a Xilinx part and used numbers sorted into Gray code to address the similarity of things. The tree built on top of Gray code is backed by the term known in psychology as priming, and it also explains why such a computer must sleep to reorganize and optimize its coding. So when you look at that structure, it classifies problems, and when there is no place where you can put a thing, it is stored to short-term memory and gets reorganized during sleep. And what is amazing, all such computers have different ideas, since Gray code can have many variations and the coding is also different. It is like raising a child: when the child understands the first thing, it starts to build a similar tree, which is unique and does not change as much as the computer's tree of ideas.
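
The Gray-code property the comment relies on (numerically close items get similar codes) is easy to demonstrate. The binary-reflected form below is the standard construction; the original Xilinx design is, of course, not reproduced here.

```python
def gray(n: int) -> int:
    """Binary-reflected Gray code: adjacent integers differ in one bit."""
    return n ^ (n >> 1)

codes = [gray(i) for i in range(8)]            # [0, 1, 3, 2, 6, 7, 5, 4]
one_bit_apart = all(bin(a ^ b).count("1") == 1
                    for a, b in zip(codes, codes[1:]))
```

Because stepping from one integer to the next flips exactly one bit, neighboring values land at Hamming distance 1, which is the "similarity" structure a tree built on these codes inherits.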

  • @KarimMarbouh
    @KarimMarbouh A year ago

    Any idea when they will be ready for public testing?

  • @axi92
    @axi92 A year ago

    Hey, I don't understand how the async spikes are generated. Normally you have the sync clock that keeps all chips in sync, but what triggers the async spikes?

  • @imconsequetau5275

    @imconsequetau5275 A year ago

    The spikes can be created by new input or novel input.

  • @ypey1
    @ypey1 A year ago

    I have a question! Do human brains and neurons make use of quantum properties and quantum computing, or do they work classically in a clever, decentralized way?

  • @nitinhshah
    @nitinhshah A year ago

    So much advancement! Which companies will bring advanced chips to market first? Nvidia seems to be ahead, but IBM, Intel, Apple, Adobe, and AMD all seem to be forging ahead...

  • @towerparanormalinvestigations
    @towerparanormalinvestigations A year ago

    Since the memory is not saved in RAM bits within the neural network: if the network is shut down, does it lose all the data and "thought"? Would restarting it make it start all over from the beginning?

  • @imconsequetau5275

    @imconsequetau5275 A year ago

    The neural networks are preloaded with weights and initial conditions before starting, and resulting states can be stored afterwards. These services are done by general purpose CPUs.
