Future Computers Will Be Radically Different (Analog Computing)

Visit brilliant.org/Veritasium/ to get started learning STEM for free, and the first 200 people will get 20% off their annual premium subscription. Digital computers have served us well for decades, but the rise of artificial intelligence demands a totally new kind of computer: analog.
Thanks to Mike Henry and everyone at Mythic for the analog computing tour! www.mythic-ai.com/
Thanks to Dr. Bernd Ulmann, who created The Analog Thing and taught us how to use it. the-analog-thing.org
Moore’s Law was filmed at the Computer History Museum in Mountain View, CA.
Welch Labs’ ALVINN video: • Self Driving Cars [S1E...
▀▀▀
References:
Crevier, D. (1993). AI: The Tumultuous History Of The Search For Artificial Intelligence. Basic Books. - ve42.co/Crevier1993
Valiant, L. (2013). Probably Approximately Correct. HarperCollins. - ve42.co/Valiant2013
Rosenblatt, F. (1958). The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain. Psychological Review, 65(6), 386-408. - ve42.co/Rosenblatt1958
NEW NAVY DEVICE LEARNS BY DOING; Psychologist Shows Embryo of Computer Designed to Read and Grow Wiser (1958). The New York Times, p. 25. - ve42.co/NYT1958
Mason, H., Stewart, D., and Gill, B. (1958). Rival. The New Yorker, p. 45. - ve42.co/Mason1958
Alvinn driving NavLab footage - ve42.co/NavLab
Pomerleau, D. (1989). ALVINN: An Autonomous Land Vehicle In a Neural Network. NeurIPS, (2)1, 305-313. - ve42.co/Pomerleau1989
ImageNet website - ve42.co/ImageNet
Russakovsky, O., Deng, J. et al. (2015). ImageNet Large Scale Visual Recognition Challenge. - ve42.co/ImageNetChallenge
AlexNet Paper: Krizhevsky, A., Sutskever, I., Hinton, G. (2012). ImageNet Classification with Deep Convolutional Neural Networks. NeurIPS, (25)1, 1097-1105. - ve42.co/AlexNet
Karpathy, A. (2014). Blog post: What I learned from competing against a ConvNet on ImageNet. - ve42.co/Karpathy2014
Fick, D. (2018). Blog post: Mythic @ Hot Chips 2018. - ve42.co/MythicBlog
Jin, Y. & Lee, B. (2019). 2.2 Basic operations of flash memory. Advances in Computers, 114, 1-69. - ve42.co/Jin2019
Demler, M. (2018). Mythic Multiplies in a Flash. The Microprocessor Report. - ve42.co/Demler2018
Aspinity (2021). Blog post: 5 Myths About AnalogML. - ve42.co/Aspinity
Wright, L. et al. (2022). Deep physical neural networks trained with backpropagation. Nature, 601, 549-555. - ve42.co/Wright2022
Waldrop, M. M. (2016). The chips are down for Moore’s law. Nature, 530, 144-147. - ve42.co/Waldrop2016
▀▀▀
Special thanks to Patreon supporters: Kelly Snook, TTST, Ross McCawley, Balkrishna Heroor, 65square.com, Chris LaClair, Avi Yashchin, John H. Austin, Jr., OnlineBookClub.org, Dmitry Kuzmichev, Matthew Gonzalez, Eric Sexton, john kiehl, Anton Ragin, Benedikt Heinen, Diffbot, Micah Mangione, MJP, Gnare, Dave Kircher, Burt Humburg, Blake Byers, Dumky, Evgeny Skvortsov, Meekay, Bill Linder, Paul Peijzel, Josh Hibschman, Mac Malkawi, Michael Schneider, jim buckmaster, Juan Benet, Ruslan Khroma, Robert Blum, Richard Sundvall, Lee Redden, Vincent, Stephen Wilcox, Marinus Kuivenhoven, Clayton Greenwell, Michael Krugman, Cy 'kkm' K'Nelson, Sam Lutfi, Ron Neal
▀▀▀
Written by Derek Muller, Stephen Welch, and Emily Zhang
Filmed by Derek Muller, Petr Lebedev, and Emily Zhang
Animation by Ivy Tello, Mike Radjabov, and Stephen Welch
Edited by Derek Muller
Additional video/photos supplied by Getty Images and Pond5
Music from Epidemic Sound
Produced by Derek Muller, Petr Lebedev, and Emily Zhang

Comments: 13,000

  • @anishsaxena1226 · 2 years ago

    As a young PhD student in computer science, I found your explanation of how neural networks came to be and evolved, and the math behind them, the cleanest and most accessible I have come across. Since I focus on computer architecture, I came to this video without much expectation of learning anything new, but I am glad I was wrong. Keep up the great work!

  • @deepblue3682 · 2 years ago

    From USA?

  • @alex.g7317 · 2 years ago

    There’s a reason he has 11, 000, 000, 000 subscribers after all 😏

  • @unstable-horse · 2 years ago

    @@alex.g7317 Wow, that's more than the population of Earth. Where does he find all those subscribers??

  • @exoops · 2 years ago

    @@unstable-horse Mars

  • @alex.g7317 · 2 years ago

    @@unstable-horse omg, lol 😂. That was a typo! I meant 11, 000, 000!

  • @KraftyB · 2 years ago

    Fun fact: that graphics card he's holding at 18:00 is a Titan Xp with an MSRP of $1200. He says it draws 100 W, but it actually draws about 250 W, so that tiny chip that only draws 3 W is even more impressive.

  • @Matthew-rl3zf · 2 years ago

    Let's hope these new analog chips can solve our GPU shortage problem 😂

  • @justuskarlsson7548 · 2 years ago

    In general, when doing machine learning you are only using the CUDA cores of a graphics card, so the wattage never gets close to its maximum. A lot of the processing units are simply not being used, for example the shader and 3D processing units. On my GTX 1080 I sit between 60-90 W out of 200 W when doing PyTorch machine learning, so 100 W out of a 250 W maximum seems reasonable.

  • @chrisoman87 · 2 years ago

    You can underclock GPUs; that's what they do in crypto mining to improve profit margins. Depending on the chip, they can operate efficiently at a fraction of their nominal power.

  • @AC3handle · 2 years ago

    Man, I'm old enough to remember when a $1200 card was considered EXPENSIVE, and not... 'going price'.

  • @chrisoman87 · 2 years ago

    @@AC3handle $1200 won't buy you enough power for a decent DL rig either. An RTX 3090 goes for ~$3000 USD.

  • @avinashkrishnamurthy6251 · 1 year ago

    Analogue was never meant to die; the technology of the time was the limiting factor, IMO. It appears an analogue-digital hybrid system can do wonders in computing.

  • @DigitalJedi · 2 months ago

    I know this is an old comment, but I figured I'd add that as far as physical packaging goes, nothing stops us from putting one of these next to a conventional CPU. Cooling it would be the hard part as the temperature would swing the outputs by introducing noise. Might be better as an M.2 PCIE device.

  • @goldenhate6649 · 2 months ago

    @@DigitalJedi It's incredibly unlikely this will ever expand into the home. These would likely be built entirely differently from traditional computers.

  • @DigitalJedi · 2 months ago

    @goldenhate6649 As we've seen, they can be built on NAND processes already, which are widely adopted in consumer electronics. The low-power wake-word and condition-detection use case seems like a great application if they can find the right product in the consumer space.

  • @williamtell1477 · 1 year ago

    AI researcher here: you did a great job on this. For anyone interested, the book Perceptrons by Minsky and Papert is a classic, with many proofs of limitations and explorations of the limits of the paradigm. It still holds up today, and it's fascinating to read what scientists were thinking about neural networks during the year of the moon landing!
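The XOR limitation at the heart of Perceptrons can be seen in a few lines. Below is a minimal, illustrative sketch (my own toy code, not from the video or the book) of Rosenblatt's perceptron learning rule: it learns AND, which is linearly separable, but no choice of weights lets a single-layer perceptron represent XOR, so training never succeeds there.

```python
# Toy Rosenblatt perceptron: one layer, step activation, error-driven updates.
# AND is linearly separable and is learned; XOR is not and cannot be.

def train(samples, epochs=20, lr=0.1):
    w0, w1, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x0, x1), target in samples:
            out = 1 if w0 * x0 + w1 * x1 + b > 0 else 0
            err = target - out               # 0 when the prediction is right
            w0, w1, b = w0 + lr * err * x0, w1 + lr * err * x1, b + lr * err
    return lambda x0, x1: 1 if w0 * x0 + w1 * x1 + b > 0 else 0

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
AND = [(x, x[0] & x[1]) for x in inputs]
XOR = [(x, x[0] ^ x[1]) for x in inputs]

and_fn = train(AND)
xor_fn = train(XOR)
print("AND learned:", all(and_fn(*x) == t for x, t in AND))   # True
print("XOR learned:", all(xor_fn(*x) == t for x, t in XOR))   # False
```

The failure on XOR is not a matter of more epochs: no line through the unit square separates {(0,1), (1,0)} from {(0,0), (1,1)}.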

  • @musbiq · 6 months ago

    Great recommendation. Thanks.

  • @christophertown7136 · 4 months ago

    A Logical Calculus of the Ideas Immanent in Nervous Activity

  • @5MadMovieMakers · 2 years ago

    Hyped for the future of computing. Analog and digital could work together to make some cool stuff

  • @teru797 · 2 years ago

    True AI is going to be the end of us. Why would you want that?

  • @kalindibang9578 · 2 years ago

    @@teru797 True AI won't be possible for the next 200 years, and by then, if humanity keeps living the way it does, we ain't gonna survive anyways.

  • @michaelschiller8143 · 2 years ago

    @@teru797 It would still take quantum computers to have the memory necessary to run it.

  • @jpthepug3126 · 2 years ago

    @@teru797 cool

  • @jonathanthomasjohn8348 · 2 years ago

    @@teru797 we are already the end of us

  • @belsizebiz · 2 years ago

    For amusement only: My first day at work was in 1966, as a 16-year-old trainee technician, in a lab dominated by a thermionic-valve analogue computer (or two). These kept us very warm through the winter months and cooked us during the summer. The task was calculating miss distances of a now-obsolete missile system. One day I was struggling to set up a resistor/diode network to model the required transfer function, but the output kept drifting. I found the resistors were warming up and changing value. Such are the trials and tribulations of analogue computing...
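Drift like this is easy to put numbers on. A hedged sketch below (the component values and temperature coefficients are assumptions for illustration, not from the comment) models an ideal inverting op-amp stage whose gain is the ratio of two resistors, and shows what happens when resistors with mismatched tempcos warm up together:

```python
# Illustrative sketch: resistor self-heating drifting an analog computation.
# An inverting op-amp stage computes gain = -Rf / Rin; each resistor follows
# a linear tempco model R(T) = R0 * (1 + alpha * dT).

def resistance(r0_ohm, alpha_per_c, delta_t_c):
    """Resistance after warming by delta_t_c degrees C (linear tempco)."""
    return r0_ohm * (1.0 + alpha_per_c * delta_t_c)

def opamp_gain(rf, rin):
    """Ideal inverting-amplifier gain."""
    return -rf / rin

# Assumed tempcos: carbon film roughly -500 ppm/C, metal film ~+50 ppm/C.
carbon, metal = -500e-6, 50e-6

cold = opamp_gain(resistance(10e3, carbon, 0), resistance(1e3, metal, 0))
warm = opamp_gain(resistance(10e3, carbon, 30), resistance(1e3, metal, 30))

drift_percent = 100 * (warm - cold) / cold
print(f"gain cold: {cold:.4f}, warm: {warm:.4f}, drift: {drift_percent:.2f}%")
```

A 30 °C warm-up shifts the computed gain by over a percent, which is exactly the kind of error that forced constant recalibration of these machines.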

  • @_a_x_s_ · 2 years ago

    Thus the temperature coefficient is very important for modern precision devices. And a high-accuracy, low-ppm resistor is expensive, which is one of the reasons high-end electronic instruments cost so much.

  • @mikefochtman7164 · 2 years ago

    I was going to comment that one disadvantage of analog computers is keeping them calibrated. If you want a precise voltage or movement to represent a real-world value, you have to keep it calibrated. Older mechanical ones had wear and tear; electronic ones have issues as well.

  • @stefangriffin2688 · 2 years ago

    Ah!? But what if the resistors were warming up, digitally?

  • @Cat-ir8cy · 2 years ago

    @@stefangriffin2688 you can't have a digital resistor

  • @aravindpallippara1577 · 2 years ago

    @@stefangriffin2688 Yeah, digital signals work with gates: on or off.

  • @NoahSpurrier · 1 year ago

    I remember seeing an analog differential calculator in high school in my physics and electronics teacher’s lab. It was more of a museum piece. It was never used. RIP Mr. Stark

  • @elliott8596 · 1 year ago

    To be fair, many of the tools we use are analog. We just don't call them "analog computers"... even though, they kind of are.

  • @rogerphelps9939 · 1 year ago

    Exactly. Museums are where analog computers belong.

  • @certainlynotmalo1.0.06 · 2 months ago

    @@rogerphelps9939 The words of someone who knows nothing but his own little world. And he is content with it. Honestly, I'm jealous. For real, stay that way or life will get an awful lot harder. I would give everything I have to acquire such luxury.

  • @YolandaEzeagwu · 1 year ago

    I had a business analysis course that tried to explain the perceptron, and I didn't understand anything; I don't have a strong maths background. This video is pure genius. The way you explain ideas is amazing and easy to understand. Thank you so much. This is my favourite channel.

  • @jeffc5974 · 2 years ago

    One of the first things I learned in Electrical Engineering is that transistors are analog. We force them to behave digitally by the way we arrange them to interact with each other. I'm glad there are some people out there that remember that lesson and are bringing back the analog nature of the transistor in the name of efficiency and speed.

  • @jasonbarron6164 · 2 years ago

    At the expense of accuracy?

  • @JKPhotoNZ · 2 years ago

    Well, semi-analogue. Don't forget the bias (voltage drop) before you get current amplification. Also, the claim that analogue computers are more power efficient than digital is pretty hard to back up. A $2 microcontroller can run on a few mA for the desired task, then sleep on µA. You'll need at least 5 mA for an analogue computer to start with, and you can't make it sleep.

  • @danimayb · 2 years ago

    @@JKPhotoNZ Great point. And with current nanoscale transistor technology, that efficiency (along with raw power) is going far beyond what a true analogue system could produce.

  • @rahulseth7485 · 2 years ago

    Yeah, but then you'll never know which zone it's in, because amplification happens differently for different input parameters. And not all transistors from the same batch will perform the same, i.e. it will lack repeatability (as Derek mentioned).

  • @mycosys · 2 years ago

    The insoluble (even in theory) problems of analog are noise and signal integrity, which is why he didn't even mention them. This channel has gone to poop, honestly.

  • @Septimius · 2 years ago

    I see Derek is getting into modular synthesizers! Also, funny to see how the swing in musical instruments to go from analog to digital and back is being mirrored in computing generally.

  • @paradox9551 · 2 years ago

    My first thought when he pulled out the analog computer was "Hey that looks like a modular synth!"

  • @toddmarshall7573 · 2 years ago

    Witness Audio Modeling (search for it on KZread).

  • @p1CM · 2 years ago

    Music has always been an AI task

  • @theisgunvald4219 · 2 years ago

    As a semi-professional music producer with almost half a decade of working with professional musicians, I would agree, mainly because people feel a lack of "soul" in music: those small human errors that we've spent decades trying to get rid of with Autotune, drum machines, sequencers, digital synthesizers and digital samplers (the last two CAN create sounds that come out the same way every time as long as the input stays the same, though there are exceptions). This is probably what the people I know in the music industry refer to as "the generation rule": in brief, the music of today is a result of what our parents and grandparents heard, combined with new technologies and pop culture. If you're interested in music and maybe want to stay ahead of the game, look it up. Some refer to it as the "30-year rule" as well.

  • @PetraKann · 2 years ago

    @@p1CM AI has no tasks

  • @lc5945 · 1 year ago

    I remember the first time I heard the term "deep networks". It was back in 2009 when I was starting my MSc (using mainly SVMs); a guy at the same institute was finishing his PhD and introduced me to the concept and the struggles (the Nehalem 8-core era). The leaps in performance made in NNs since then, thanks to GPGPU, are enormous.

  • @asg32000 · 6 months ago

    I've watched a lot of your stuff for years now, but this is the best one by far. Great job of explaining something so complex, difficult, and relevant!

  • @PersonaRandomNumbers · 2 years ago

    My professor always said that the future of computing lies in accelerators -- that is, more efficient chips that do specific tasks like this. He even mentioned analog chips meant to quickly evaluate machine learning models in one of his lectures, back in 2016! It's nice seeing that there's been some real progress.

  • @xveluna7681 · 2 years ago

    That's pretty much where things have always been: using basic building blocks that do specific functions. A linear voltage regulator has the job of maintaining a constant output voltage over a range of currents and input voltages. You can buy an op-amp and some resistors to make a Schmitt trigger, or you might just buy a Schmitt trigger from Texas Instruments and use less board space, or a Schmitt trigger might come embedded for free in certain other ICs (integrated circuits). The major computing engines I have seen so far have effectively been GPUs, CPUs, and FPGAs. Xilinx and Altera (now Intel) have specialized in making FPGAs. An FPGA's basic internal components are logic elements with flip-flops (with reset and async reset inputs), a 4-input look-up table, etc.; cascade these to make larger units like a multiplexer or a floating-point arithmetic unit. It's programmable, so you can effectively emulate a worse-performing specialized CPU, though a CPU is still more efficient at doing CPU-type functions. A GPU does specific stuff as well. The idea of doing analog computations honestly just sounds like another building block to add to a complex system; there simply hasn't been a large enough demand to justify generating specialized hardware like what was described in this video. That one start-up sounds like it's developing a chip that will do a series of very specific functions and will need to be integrated into a larger system to accomplish a specific task.

  • @matsv201 · 2 years ago

    Well, that has sort of always been the case. I don't know what the first accelerator was, but one of the fairly early ones was the FPU; we now just take it for granted. Sprite accelerators were also fairly early. Then graphics accelerators, then video decoders/encoders, then MMU accelerators, then 3D accelerators, then SIMD accelerators, then T&L accelerators, then physics accelerators, then ray-tracing accelerators, then deep learning accelerators.

  • @LundBrandon · 2 years ago

    ASIC devices have existed for decades...

  • @calculator4482 · 2 years ago

    @@LundBrandon They will soon become obsolete, though, due to reconfigurable computing devices like FPGAs.

  • @LundBrandon · 2 years ago

    @@calculator4482 FPGAs have also been around for decades, plus they draw more power. I'm a computer engineering student, currently designing a CPU to be synthesized onto an FPGA. I'm not dumb.

  • @funktorial · 2 years ago

    I started watching this channel when I started high school, and now that I'm about to get a PhD in mathematical logic, I've grown an even deeper appreciation for the way this channel covers advanced topics: not dumbed down, just clear and accessible. Great stuff! (And this totally nerd-sniped me, because I've been browsing a few papers on the theory of analog computation.)

  • @gautambidari · 2 years ago

    Absolutely. Love the way he covers the concept for everyone. Those who don't know it in depth can still come away with a basic understanding, and those who do understand it in depth will enjoy discovering new areas of invention to explore further. Looking forward to reading some papers on using analog computing in neural network applications.

  • @victorymorningstar · 2 years ago

    I'm smart too.

  • @mentaltfladdrig · 2 years ago

    Same here. But I didn't go to high school, my life became a total mess, and I haven't graduated whatsoever :)

  • @SteveAcomb · 2 years ago

    “nerd-sniped” lmao I feel exactly the same and here I was thinking I was way ahead of the curve on alternate computing 😂 jokes on me

  • @HrLBolle · 8 months ago

    Mythic's approach reminds me of the copper-wire memory planes with ferromagnetic rings representing the bits, used as memory for the AGC (Apollo Guidance Computer). The video this memory featured in was released by Destin, aka Smarter Every Day, and accompanied his and Linus Sebastian's meeting with Luke Talley, a former IBM employee and, at the time of the Apollo missions, a member of the data analysis teams responsible for the analysis, evaluation and processing of the telemetry data received from the Apollo instrument unit.

  • @bishalpaudel5747 · 10 months ago

    This is a very well explained video on analog computing. Never could I have thought the topic of analog computing could be put into a 20-minute video with such phenomenal animation and explanation. Respect for your work and effort to make science available to all for free. Respect 🙏

  • @SandorFule · 1 year ago

    I am a process control engineer, born in '63. In the 80s we used analog computers to calculate natural gas flow for the oil and gas company. A simple flow computer was around 10 kilos, full of op-amps and trimmer pots. It was a nightmare to calibrate. :)

  • @deang5622 · 1 year ago

    Op amps with their offset voltages and input bias currents leading to inaccuracy. Sounds like a nightmare. Constant recalibration required?

  • @rogerphelps9939 · 1 year ago

    Absolutely.

  • @victorblaer · 1 year ago

    @@rogerphelps9939 just calculating the uncertainty at each step sounds like a nightmare.

  • @percutseituan · 1 year ago

    But you can mix in digital control for adjustment and decision-making.

  • @benoitroehr4100 · 1 year ago

    I think NASA (or was it still NACA at the time?) was able to simulate flight characteristics with analog circuits too. I'm thrilled to see this tech coming back!

  • @TerryMurrayTalks · 2 years ago

    As a 70-year-old boomer my technical education involved building and testing very basic analog devices. Thanks for this video, it helped me to a better understanding of neural networks.

  • @magma5267 · 2 years ago

    You must be really healthy, because you don't even look close to 70! :D

  • @TerryMurrayTalks · 2 years ago

    @@magma5267 Thanks for the heads up. I've got a good woman, 6 children, 8 grandchildren and a recently placed stent in my heart that keeps me going :)

  • @vedkorla300 · 2 years ago

    @@TerryMurrayTalks Good for you my man! I am still 20 and don't know what to do in life :(

  • @vource2670 · 2 years ago

    Yep, you're 70, mate

  • @TerryMurrayTalks · 2 years ago

    @@vedkorla300 You have plenty of road left to travel, follow what you love, enjoy the journey, don't let the bumps in the road stop you and if you can get a soul mate to share it with you it will all be good.

  • @kasparsiricenko2240 · 1 year ago

    When I was at the institute back in 2016, as an undergraduate, I was thinking of these specific "gates" as well. I knew someone was already implementing it, but I still miss the time when I could have been part of the innovation. What a genius way of re-implementing circuits for neural networks. Maybe that's where the FPGA's future is: neural networks.

  • @Psrj-ad · 1 year ago

    This makes me want Derek to talk about neural networks and AI-related topics a lot more. It's not just extremely interesting but also constantly developing.

  • @suivzmoi · 2 years ago

    As a NAND flash engineer, that bit about using floating-gate transistors as analog computers is interesting, particularly because in flash memory there is a thing known as "read disturb": even the low voltages applied to the gate to query its state (as during a flash read) can eventually cause the state itself to change. You would think it's a binary effect, where if the voltage is low enough it would just be a harmless read, but no... eventually there is electron build-up in the gate (reading it many times at low voltage has a similar effect to programming it once at a high voltage). In this particular application the weight would increase over time the more you query it, even though you didn't program it to be that high in the beginning. It's interesting because it's sort of analogous to false memories in our brains, where the more we recall a particular memory, the more inaccurate it can become.

  • @donkisiko · 2 years ago

    Underrated comment!

  • @Xavar1us · 2 years ago

    Absolutely love this comment! This has been on my mind for at least an hour now, the point you make is intriguing and a bit haunting, thanks for that!

  • @JeyeNooks · 2 years ago

    Fkin right on!!

  • @Lassana_sari · 2 years ago

    Very interesting.

  • @sampathsris · 2 years ago

    Underrated comment. Then in Eternals style we will have to reprogram the memories of our servants every now and then.

  • @DomDomPop · 1 year ago

    It’s funny, for those of us who are into electronic music production, analog never left! There are lots of great analog synthesizers out there that can produce all kinds of complex waveforms, and some of us have been known to tack an oscilloscope on to modular gear to view and use those waveforms. Even some relatively simple gear can produce complex, “3D” structures with the correct cable patches. A lot of what you described at the beginning is the backbone of synthesis for music, and the same principles obviously apply to mathematical operations.

  • @rogerphelps9939 · 1 year ago

    You can do everything digitally that an analog system can do and more. An example is resampling in order to change the frequency scale of a recording. This can be done in real time using digital methods, not so much for analog methods.

  • @DomDomPop · 1 year ago

    @@rogerphelps9939 Depends on what you’re doing and what’s important to you. Analog synths are great for experimenting with the knobs and patch bay (if available) and learning what exactly each change has on the overall waveforms. They’re really great for learning what exactly you’re doing and what you’re getting as a result. Yeah, there are software synths meant to emulate hardware knobs and a patch bay, but I haven’t found clicking through all that as valuable as plugging and experimenting yourself. That stuff really depends on the person, though. What doesn’t depend on the person, and is arguably more important, is the fact that aliasing can end up being a problem on digital synths. When you start doing some crazy cross modulation between sources and/or you’re dealing with lots of harmonics, if the processor can’t keep up, your sound will suffer. Same with super high frequencies. Depends on the synth, of course, but analog synths can tend to have a warmer, purer sound to them as well, because you don’t have to emulate all those harmonics. It really comes down to the same arguments being made here regarding analog computers: there’s no processor overhead needed to create some very complex shapes, and to do so perfectly accurately, on analog. I use both types of synths, as lots of people do, and I would never say that one somehow makes the other unnecessary. Hell, there are hybrid synths that give a mostly analog signal path while allowing for, say, a digital sample and hold circuit and the ability to save certain parameters. People make those kinds of things for a reason, you know?

  • @victorpereira8000 · 1 year ago

    Pythagoras discovered math with music I think right? Really like your comment

  • @RAndrewNeal · 1 year ago

    @@rogerphelps9939 The difference is that you need billions to trillions of transistors to do digitally what can be done with tens to hundreds of transistors in analog.

  • @rogerphelps9939 · 1 year ago

    @@RAndrewNeal Wrong. The errors arising from component tolerances, noise and temperature dependent offsets make anything complicated pretty much impossible in analog. Transistors in digital processors are extremely cheap. Provided you have good DACs and ADCs you can do anything to whatever precision you need in digital.

  • @di380 · 3 months ago

    I agree. One point I was going to mention regarding analog computers is that they are susceptible to voltage fluctuations and environmental noise, and the accuracy of your results depends directly on the accuracy of the equipment reading the output voltages. There is that, but it makes sense for specific applications like this one 👌

  • @zebaerum7 · 9 days ago

    I used this content and visuals for my Electronics Engineering final year technical seminar. I loved the content, and the way it's put together. Thanks for choosing the most interesting stuff to put in my feed.

  • @dust7962 · 2 years ago

    The problem with this system of computing is that interference is a huge factor. When you only test whether there is voltage or not, you don't need to worry much about interference. But when you build systems that use varying voltages, say 0 V, 0.5 V, 1 V, then you need to worry about interference, and the more levels you add, the bigger an issue this becomes. Interference can come in the form of microwaves, radiation, mechanical vibration (think fans cooling a server rack), and the list drags on, as almost anything can cause interference. That oscilloscope used in the example is an expensive piece of equipment that minimizes interference, but the cost is far higher than with a binary number system.
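The cost of packing more levels onto one wire can be quantified with simple arithmetic. A sketch, assuming an idealized 0-1 V signal range and evenly spaced levels (my numbers, not from the video): the worst-case tolerable noise is half the spacing between adjacent levels.

```python
# Back-of-the-envelope noise margins: one 0-1 V wire encoding 2 levels
# (binary) vs 8 levels. More levels per wire means proportionally less
# headroom before noise flips a reading to the wrong level.

def noise_margin(v_range, levels):
    """Half the spacing between adjacent levels: worst-case tolerable noise."""
    return (v_range / (levels - 1)) / 2

binary = noise_margin(1.0, 2)    # 0.5 V of headroom per decision
octal = noise_margin(1.0, 8)     # ~0.071 V: 7x more noise-sensitive

print(f"binary margin: {binary:.3f} V, 8-level margin: {octal:.3f} V")
```

This is the trade the comment describes: each extra level carries more information per wire but shrinks the margin that shields the reading from vibration, thermal noise, and EMI.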

  • @introprospector · 2 years ago

    Binary computers have to deal with interference too, that's handled by error correction. Error correction is already baked into the infrastructure of every digital component, to the point where we don't realize it's there. They suggested one method of error correction in the video, and they're probably not even scratching the surface of what's possible.

  • @fltfathin · 2 years ago

    I think the crux is the medium: the AI models the brain, which is so good at rebuilding itself, and the brain uses electrons and chemicals to convey information. Our transistors are too limited to mimic that interaction. For example, the new 3 W chip needs to be custom-made for each model, if I got that right.

  • @dust7962 · 2 years ago

    @@introprospector Yes, but with binary, error correcting is simpler, as interference isn't as much of a burden on the architecture. Checking whether there is or isn't voltage is a lot less complex than checking 8 different voltage thresholds.

  • @dust7962 · 2 years ago

    @@fltfathin This is called an ASIC (application-specific integrated circuit); the computer is pretty much sent to the landfill after it has outlived its usefulness instead of being repurposed. Which is another concern about where computing in general is heading: as PCBs use fewer semi-precious and precious metals, there is less incentive to recycle.

  • @JayJay-dp8ky · 2 years ago

    @@dust7962 Yeah but I put my mobo in the case first and then the radiator wouldn't fit, so I had to take it out and install the radiator first. It was really annoying. I didn't watch this video, but I'm assuming this is what he was talking about.

  • @Ave117 · 2 years ago

    This actually helped me a lot to understand how neural networks work in general. To me it was kind of like black magic before. It still is to an extent, but knowing that modern neural networks are essentially more complex, multi-layered perceptrons helped a lot.
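The "multi-layered perceptron" idea can be made concrete with a tiny hand-wired example (weights chosen by hand for illustration, not trained): adding one hidden layer lets the network compute XOR, which a single perceptron cannot represent.

```python
# A two-layer perceptron for XOR, with hand-picked weights and thresholds.
# Each step() unit is one perceptron; stacking them is the whole trick.

def step(x):
    return 1 if x > 0 else 0

def xor_mlp(x0, x1):
    # Hidden layer: h0 fires on "x0 OR x1", h1 fires on "x0 AND x1".
    h0 = step(x0 + x1 - 0.5)
    h1 = step(x0 + x1 - 1.5)
    # Output: OR minus AND, i.e. "exactly one input is on".
    return step(h0 - h1 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_mlp(a, b))   # 0 0->0, 0 1->1, 1 0->1, 1 1->0
```

Modern networks replace the hard step with smooth activations and learn the weights by backpropagation, but the layered structure is the same.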

  • @chibicitiberiu · 2 years ago

    Yes, indeed. Something I find fascinating is recurrent networks, where some neurons feed back into the network, allowing some information to be carried from one image to the next. This lets the AI process things that change in time, like music and video. For example, if you're tracking a subject with a camera and they turn around, a recurrent AI would be able to continue tracking them.

  • @connorjohnson4402

    2 years ago

    Yeah, in the end some of it is kind of voodoo black magic though; I mean, they call them black boxes for a reason.

  • @Blox117

    2 years ago

    speak in english pls

  • @aladdin8623

    2 years ago

    The video seems to contain some quite biased info, though. The top-5 error rate of 5.1% for humans is of course not accurate; if human beings were that bad, we would accordingly have much, much higher car accident rates. Those kinds of inaccurate percentages come from statistics based on captchas, and several conditions distort the results there:

    - Human users often don't bring the concentration and attention to captchas that they actually could. In fact they are angered by them and often just click quickly through. In traffic, while driving a car, human beings are much more alert and make tremendously fewer mistakes. Here human beings still beat autonomous-driving computers by several orders of magnitude, measured in car accidents per million driving hours.
    - The captchas often do not match actual human perception. The captcha images are often unclean, with low resolution and distortions. In the real world humans perceive much higher quality from their surroundings than some crippled captchas, and a clearer image increases recognition dramatically.

    It is really crucial in educational videos from whom and from where you take your numbers. Science is not always as objective as we are told, especially when corporations with their own interests are funding it. Other than that, the video is quite interesting. I also wish, though, that many common misconceptions had been cleared up. For example, many people still believe that computers work like human brains. This is plain nonsense, mostly spread by science fiction. The brain still poses big mysteries to us, especially the "hard problem of consciousness".

  • @anteshell

    2 years ago

    It is still kind of voodoo or black magic. While the overall working mechanisms are well known and the output can be estimated based on the input, how exactly the neural network reaches the answer is nearly impossible to inspect because of the sheer number of variables. In essence, you feed a black box with something and you can expect it to give you a particular answer with some confidence, but no one has any damn idea what exactly happens inside the black box.

  • @davidchristensen4643
    4 months ago

    It's interesting how circular technology is. Back in the 1970s, my first job out of uni was with a national research association focused on all things to do with ships in the UK. Whilst the primary focus of my work was providing QA services to the various research teams, including maintaining and enhancing language systems like RATFOR and system management of the ICL, IBM, Perq, CV and DEC systems, I was also involved in developing two specific analogue/digital hybrid projects. One was focused on managing and monitoring loading balances for bulk cargo ships, and the other on simulating ship navigation into ports in real time. Both of these projects involved interfacing the analogue data from real-time sensors to digital monitoring and mapping algorithms. Unfortunately, at that time, analogue was seen as a historical burden and both were eventually canned. Now, almost 50 years later, it's great to see that our ideas of the '70s are coming back into fashion.

  • @fierybones
    4 months ago

    I happened to watch this just after playing with a modular (audio) synthesizer. In these, each module is controlled by a voltage, originating from an oscillator, a keyboard, or a "sequencer". The concept that makes a modular synth interesting is, the voltage pattern (waves) output from a module can either be used as an audio signal (if it's in the audio spectrum), or to control another module. In the simplest case, output from a voltage controlled oscillator (VCO) can be routed to a speaker to produce a sound. But it can also be routed to a module that filters a signal in some way, based on the output voltage of its predecessor. Maybe the thing that makes "ambient" music's slowly-shifting textures interesting is that they mimic the neural networks of our brains.

  • @certainlynotmalo1.0.06

    2 months ago

    A lot of them actually do! You can even (kind of) help your brain waves synchronise with the oscillations; it's not by brute force (you have to play along or it doesn't work very well), but it can greatly help with sleeping, learning and related stuff. Reminds me of old hardware synths, where you had to connect each of the synth parts with cables. But that gave you an amazing amount of flexibility! BUT no one cared to write the configurations down... That was the funniest and most awful part at the same time...

  • @IronAsclepius
    1 year ago

    My undergraduate work was actually with a professor who did research on the brain as an analog computer, using neural networks and analog computing in an attempt to achieve super-Turing computation. A researcher whose name is worth looking into in all this is Hava Siegelmann. At the time I understood much less about the problem; my task was essentially to try to prove that analog computation could be modeled with a neural network on a digital computer. Not sure if my comment will be buried or not, but it's an area worth looking into if you're more deeply interested in this problem.

  • @AayushPatel-gc3fw

    1 year ago

    I have never done much/extremely deep research on a topic, but this seems very interesting.

  • @raystir98

    1 year ago

    I'd like your comment to not be buried.

  • @noblenessdee6151

    1 year ago

    0s and 1s, highs and lows, voltage and no voltage (digital representations of numbers) have absolutely nothing to do with brain neurons. It's complete BS. For all we truly know about the brain, there could be a near-endless amount of information in every firing of a neuron. We have no idea what format consciousness information is in, and likely never will as "in time" humans.

  • @AayushPatel-gc3fw

    1 year ago

    @@noblenessdee6151 Engineers: "Well, I will approximate everything a neuron is saying to just two numbers." 🙂...

  • @beeswaxlover

    1 year ago

    @@AayushPatel-gc3fw words are the limitations, not numbers, all words can be expressed in code, not all humanity can be expressed in words.

  • @melanezoe
    2 years ago

    Freaked me out to see that opening analog plug board. That's how I learned programming in my first data processing class at Fresno State University, in 1964. Eerie to have that memory rise.

  • @Ozhull

    2 years ago

    Damn you're old! Glad you're still kicking around

  • @jimmysyar889

    2 years ago

    @@Ozhull he’s only like 75 chill

  • @amaan06

    2 years ago

    Lol

  • @beesharp9503

    2 years ago

    Woo fresno!

  • @PaulJosephdeWerk

    2 years ago

    I graduated Fresno State in 1993 with a BS in CS (after a stint in the military). I even took an artificial intelligence class. I still have my perceptron book.

  • @activision4170
    4 months ago

    Great video. Never knew this was a thing. Very useful. It might one day just be an extra part on the motherboard, designed for fast approximation calculations.

  • @javierperea8954
    1 year ago

    That's so beautiful. Using a photocell as an analog to digital interface, with the advantages of both systems applied effectively in a system.

  • @koborkutya7338
    1 year ago

    I recall our control systems teacher at university in the '90s said Space Shuttle flight controls contained analogue computing because they had to process several thousand sensors' inputs to produce outputs, and digital was just too slow for the job.

  • @rogerphelps9939

    1 year ago

    He lied.

  • @TARS-CASE

    10 months ago

    @@rogerphelps9939 The Space Shuttle did indeed use analog computing for some of its flight control systems. The Shuttle used a hybrid digital/analog system for flight controls: most of the high-level control logic was handled by digital computers, but critical low-level control functions were performed using analog circuits. The analog components were able to process sensor inputs and produce control outputs much faster, on the order of microseconds, compared to even the fastest digital computers of the era, which took milliseconds. This speed was essential for stability during flight.

  • @user-tg5sv5ps2i

    2 months ago

    I can imagine it to be just more fault-tolerant, too. Discrete = hard, continuous = easy. An overflow in digital can literally crash a whole system; in analog there is more room for error.

  • @quietcanadian5132
    2 years ago

    I’ve been an engineer for 44 years. Great video. I actually worked on analog computers in the 70s when digital processing was still new. Never to this level though. Great job!

  • @apollochaoz

    2 years ago

    🇨🇦🏳‍🌈

  • @raijuko

    2 years ago

    It's amazing how fast all of this is evolving. Looking at this and comparing it to the facial recognition software in simple phone apps we have now really shows how much all of this has influenced what kids and teens easily use today.

  • @johndoh5182

    2 years ago

    I didn't catch the point where he quit talking about analog systems, though: when he went to the logic systems being used for matrix operations, that was digital. There may have been analog inputs into the system, but there's an A-to-D conversion, and everything he showed at the end was strictly digital, so it's a bit misleading there. Current systems for AI are digital.

  • @GeovanniCastro666

    1 year ago

    @@raijuko Yes, but I still believe in God. And I am a fan of science, experimenting and inventing.

  • @jasonturner1045
    1 year ago

    How have I not found this channel before now?? Fascinating topics.

  • @gregseljestad2793
    11 months ago

    I just found out that the SR-71 engines had a hydraulic computer that ran the system. That would be amazing to see. I worked at Caterpillar, and a friend of mine was tasked with converting a scraper transmission module from a hydraulic base to electronic. It was a very old design and all the original engineers had passed on. They had a team of engineers that had to replicate all the hydraulic functions in an electrical equivalent. It was fascinating to me. One of the functions they had to replicate was going up a steep hill with a full load and being able to shift without rolling backwards: holding the load, sharing the load between two clutches, and increasing one clutch while reducing the other to make it a seamless shift. So I enjoyed this topic. Thanks!

  • @SaanMigwell

    4 months ago

    Most nuclear power plants are pneumatic computers. Well, the old subs and breeder reactors anyway.

  • @robertb6889
    2 years ago

    As a guy who helps manufacture flash memory I find this really intriguing: especially because flash memory is continuing to scale via 3-D layering, so there’s a lot of potential, especially if you can build that hardware for multiplication into the chip architecture.

  • @ravener96

    2 years ago

    You are still struggling with interconnects from one side to the other.

  • @Zeuskabob1

    2 years ago

    @@ravener96 With many ML algorithms you can split problems into multiple sub-problems for different networks to handle. I wonder if developing that area of ML would be helpful to make effective analog systems? For an example, in image processing a pixel at the top left of the image has little interaction with a pixel in the bottom right of the image compared to nearby pixels. If you wait to compare them until multiple layers later, it speeds up processing the image and allows for algorithms to become more adept at finding sub-patterns in the image.

  • @martiddy

    2 years ago

    @@Zeuskabob1 It depends on what kind of image processing the neural network is doing. If the computer wants to identify a face, maybe it doesn't need to process all pixels once it has processed the pixels near the face. But in some cases distant pixels can indeed be correlated, like images from a camera in an autonomous car identifying the white lines of a street, where it could be 99% sure it is a straight line but the corner pixels clearly indicate it is a curved line.

  • @robertb6889

    2 years ago

    @@ravener96 Yeah, but interconnects can be designed around with clever architecture to an extent. It's still quite interesting.

  • @seldompopup7442

    2 years ago

    Flash cells are micron-scale, while the AI accelerators doing integer operations are built with the latest 4 nm technology. And floating gates have a really limited life compared to pure logic circuits.

  • @FragEightyfive
    2 years ago

    Using an analog computer to demonstrate differential equations is a perfect teaching tool. I really wish tools like this were used in colleges more often.
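
    The classic classroom demonstration is wiring integrators in a loop to solve x'' = -x (a mass on a spring). A digital imitation of that integrator patch (an illustrative sketch, not THE ANALOG THING's actual wiring):

```python
# Analog computers solve x'' = -x by feeding a signal through two
# integrators and looping the negated output back to the input.
# Here each integrator block becomes a small accumulation step.
dt = 0.001
x, v = 1.0, 0.0          # initial displacement and velocity
for _ in range(int(3.14159 / dt)):   # integrate for about pi seconds
    a = -x               # "summer" block: acceleration = -x
    v += a * dt          # first integrator: velocity
    x += v * dt          # second integrator: position
print(round(x, 2))       # after half a period, x should be near -1
```

    On a real analog machine the two integrators are op-amp circuits and the "dt" is continuous physical time.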

  • @Elrog3

    2 years ago

    They already use crap like this far too often. This isn't something to use in a differential equations course. Maybe it would be OK for a circuits course, or even a computer science course focused solely on analog computers. In math, just give us the numbers and the logic... Don't waste our time with this stuff.

  • @Elrog3

    2 years ago

    @@JackFalltrades I am an engineering student.

  • @Elrog3

    2 years ago

    @@JackFalltrades I'm not calling letting students know of use-cases for things crap. I'm calling taking up class time that is meant for teaching students the logic of how to solve differential equations (because that is the class the original poster said it would be good for) and instead using that class time to teach something that only a tiny fraction of the class would ever use.

  • @quotidian8720

    2 years ago

    It is used in control systems.

  • @Noootch

    2 years ago

    @@Elrog3 He never said it should be used in a differential equations course. You just sound like the type of students that go to university and ask which courses they need in order to get a high salary position in industry.

  • @spoidermon2515
    1 year ago

    Damn Man!! You explained it pretty well!! All that history and theory wrapped in 22 mins! Incredible!

  • @Paul-rs4gd
    1 year ago

    I can see this analog technology being used in special purpose AI processors attached to normal digital computers. It makes sense - they could provide very large scale, cheap and energy efficient Neural Net acceleration. Since it appears that 'scale' is the most important thing for AI, it is really important to bring down the cost and energy consumption, so we can all run GPT3 on our laptops :)

  • @The1wsx10
    2 years ago

    Wow, that analog chip sounds extremely competitive. I'm surprised they already have something that good. Mad props to the guy who figured out the hack with the flash storage.

  • @dorusie5

    2 years ago

    I wonder how temperature sensitive it is.

  • @hughJ

    2 years ago

    @@dorusie5 I'm mostly curious about the write-cycles and lifespan of the flash cells. Is the network going to get Alzheimer's after a few days?

  • @SharienGaming

    2 years ago

    Integrated circuits like that have always been really efficient; the downside is that they are extremely specialized. As the guy said, it's not a general computation chip: it can literally only do matrix multiplication. But it can do that really damn efficiently (though slightly imprecisely, which is likely still good enough for neural network purposes, since they aren't interested in the exact value of the result). So, sure, it's competitive for that one purpose, but useless for anything that isn't that purpose. But if the type of calculation these chips can do is in high demand, they can likely sell a lot of specialized hardware, either for specific devices or as plug-in cards that supply fast matrix multiplication operations.
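
    The reason an analog matrix-multiply chip can be so efficient is that the physics does the arithmetic: weights are stored as conductances, inputs applied as voltages, and each output current is a dot product by Ohm's and Kirchhoff's laws. A toy numerical simulation of that idea (the values and noise level are made up for illustration):

```python
import random

random.seed(42)

# Weights become conductances G; inputs become voltages V.
# Each output wire's current is sum_j G[i][j] * V[j]: a dot product.
G = [[0.2, 0.5, 0.1],
     [0.4, 0.3, 0.7]]
V = [1.0, 0.5, 0.25]

def crossbar(G, V, noise=0.01):
    out = []
    for row in G:
        i = sum(g * v for g, v in zip(row, V))
        # Analog means a little noise on every read:
        out.append(i + random.gauss(0.0, noise))
    return out

exact = [sum(g * v for g, v in zip(row, V)) for row in G]
approx = crossbar(G, V)
print(exact, approx)  # close, but not bit-exact; good enough for neural nets
```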

  • @wouterhenderickx6293

    2 years ago

    I've been wondering about analogue usage of SSDs for a long time. It's an oversimplification, but each cell holds a voltage which can also be interpreted as an analog signal. If we take music as an example, you could basically write the value of one sample point to a cell, writing 16 bits' worth of information to one NAND cell. This of course makes it impossible to compress the music, but it would allow storing music 'losslessly' at the same cell usage as a compressed 256 kb/s file on TLC storage. Of course, NAND reproduction isn't perfect (and as such, the music reproduction wouldn't actually be lossless), but I wonder how close this would come to the compressed digital file. I think this could be potentially useful for offline caches and downloads from Spotify, for example, as the data can be checked and corrected when a high-speed network connection is actually available.
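
    The trade-off described here can be sketched numerically. A rough model (the 0.1% read-noise figure is a hypothetical, not a real NAND spec): map each 16-bit sample to a cell voltage, read it back with noise, and measure how many LSBs of error that costs.

```python
import random

def store_analog(sample16, read_noise=0.001):
    # Map a 16-bit sample to a 0..1 "cell voltage", then read it back
    # with a small random error, as a real floating gate would.
    v = sample16 / 65535.0
    v_read = min(1.0, max(0.0, v + random.gauss(0.0, read_noise)))
    return round(v_read * 65535.0)

random.seed(0)
errors = [abs(store_analog(s) - s) for s in range(0, 65536, 1000)]
print(max(errors))  # even 0.1% read noise costs several of the 16 bits
```

    So the cell stores all 16 bits, but only the upper bits survive the read: exactly the "not actually lossless" caveat in the comment.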

  • @JustNow42

    2 years ago

    Already? We did this before 1960.

  • @tenou213
    2 years ago

    I'm a little disappointed by the title but impressed by the content. It's less "we're building computers wrong" and more "old method is relevant in a niche application". There's also the eventual plans for fully commercial quantum supercomputing clusters and ever faster internet connections which might further limit the applicability of these chips going forward. However, building processing-specialized chips instead of relying on graphics cards seems really promising in the short term so long as the market stabilizes.

  • @johnbotris8187

    2 years ago

    Derek actually made a video a few years ago explaining why Veritasium would start using clickbait titles (to appease the YouTube algorithm).

  • @internettoughguy

    2 years ago

    It got you to click, didn't it?

  • @dinglesworld

    2 years ago

    It's for the click, bro. And for good reason. If any channel deserves to clickbait, it's this one.

  • @alwinsebastian7499

    2 years ago

    @@dinglesworld agreed 100%

  • @Blaketarded

    2 years ago

    It's not really niche when AI and algorithms are used everywhere.

  • @emmateedub9672
    1 year ago

    An interesting video covering some of the beginnings of AI, how computers work, and also environmental considerations. I would like to find out more about Rosenblatt; however, I was expecting something along the lines of mechanical computers. Good information, good video, thanks!

  • @mbharatm
    1 year ago

    Amazing, thought-provoking two-part video on analog computing. Veritasium never disappoints!

  • @harrybarrow6222
    2 years ago

    Rosenblatt's Perceptron was essentially a one-neuron network, although he could perform logical operations on the binary data inputs before passing results, which gave it more power. Minsky and Papert at MIT were concerned that Rosenblatt was making extravagant claims for his Perceptron and scooping up a lot of the available funding. In their book, "Perceptrons", Minsky and Papert proved that one-neuron networks were limited in the tasks they could perform. You could build networks with multiple perceptrons, but since perceptrons had binary outputs, nobody could think of a way to train such networks. That killed funding for neural networks for decades.

    In the late 1980s, interest was rekindled when John Hopfield, a physicist, came up with a training technique that resembled the cooling of a physical spin-glass system. But the big breakthrough came when the error back-propagation technique was developed by Rumelhart, Hinton and Williams. In this, neurons were modified to have a continuous non-linear function for their outputs instead of a thresholded binary output. Consequently, the outputs of the network were continuous functions of the inputs and weights, and a hill-climbing optimisation process could then be used to adjust the weights and hence minimise network errors. The rest is history.
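
    The change described in the last step, replacing the hard threshold with a differentiable sigmoid so errors can be pushed backwards through the layers, can be shown on XOR, the classic function a single-layer perceptron cannot learn. A bare-bones sketch with made-up hyperparameters (it demonstrates the training error decreasing, not a production training loop):

```python
import math, random

random.seed(1)
sig = lambda z: 1.0 / (1.0 + math.exp(-z))

# 2 inputs -> 2 hidden sigmoid units -> 1 sigmoid output (last weight = bias).
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w2 = [random.uniform(-1, 1) for _ in range(3)]
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # XOR

def forward(x):
    h = [sig(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
    y = sig(w2[0] * h[0] + w2[1] * h[1] + w2[2])
    return h, y

def loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

before = loss()
lr = 2.0
for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        dy = (y - t) * y * (1 - y)                 # sigmoid slope at output
        for i in range(2):
            dh = dy * w2[i] * h[i] * (1 - h[i])    # error pushed back a layer
            w1[i][0] -= lr * dh * x[0]
            w1[i][1] -= lr * dh * x[1]
            w1[i][2] -= lr * dh
        w2[0] -= lr * dy * h[0]
        w2[1] -= lr * dy * h[1]
        w2[2] -= lr * dy
print(before, "->", loss())  # the error drops as the weights adjust
```

    With hard 0/1 thresholds the derivative terms above would be zero almost everywhere, which is exactly why nobody could train multi-layer perceptrons.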

  • @3nertia

    2 years ago

    And now, we're "evolution" but with awareness and intent heh

  • @slatervictoroff3268

    2 years ago

    Critically wrong. Not one-neuron - that doesn't even make sense. One *layer*.

  • @brunsky277

    2 years ago

    @@slatervictoroff3268 I have to disagree. The perceptron is one neuron (one neuron that receives multiple inputs and produces one output). This makes it also a one-layer network, I would say.

  • @meateaw

    2 years ago

    @@brunsky277 Thinking about it, though, the inputs all had their own weights, and those weights correspond to a neuron. A modern AI model has inputs, and the weights exist on the layers. Therefore the perceptron had 400 inputs, 400 weights, and 1 output signal. That implies to me 400 neurons in a single layer, leading to a single output value.

  • @WilisL

    2 years ago

    @@slatervictoroff3268 No, one layer can be multiple perceptrons; it's technically one neuron (which is technically one layer, though).

  • @TimeBucks
    2 years ago

    Amazing video!

  • @sheikhsumibegum2108

    2 years ago

    Wow nice video

  • @kishungamer4036

    2 years ago

    Nice video

  • @michaelperry9180
    4 months ago

    Funnily enough, this video series helped me understand a bit better how analog music production works. "Modular setups" look a lot like the computer you used to model the Lorenz System.

  • @santiagojimenezpinedo3473
    11 months ago

    This is really cool, and there is another startup that has a different approach using analog, but instead of voltages and currents they use light, so it is really interesting how analog is coming back. I would really appreciate it if you would make a video about this. The startup is Lightelligence. As always, thanks for these videos.

  • @frightenedsoul

    3 months ago

    Terrible name, though lol. Lightelligence. I get the idea behind it but it just doesn’t work as a satisfying portmanteau

  • @lonewulf0328
    2 years ago

    This was one of the best layman's explanations of neural net training models that I have ever seen. Awesome content!

  • @duongchuc1834

    2 years ago

    ok

  • @patakk8145

    2 years ago

    But it isn't; he literally said he's going to skip backpropagation (which is how models are trained nowadays).

  • @PaulAVelceaVSC

    2 years ago

    I am a layman and I did not understand a bit of it, pun intended.

  • @Deveyus
    2 years ago

    A couple of missed points: things like Google's Coral are also pushing incredibly high performance, and to my knowledge are doing it digitally as an ASIC. Large models are expensive to train; there's no contention here from Mythic, you, or the wider AI community. But several advancements have been made in the last couple of years that are letting models be compressed and refined to less than 1% of their original size. This makes them incredibly small and efficient, even on traditional CPUs.
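
    Quantization is one of those compression techniques: store weights as 8-bit integers plus a scale factor instead of 32-bit floats, an immediate 4x size cut with usually modest accuracy loss. A toy version of symmetric int8 quantization (illustrative, not any specific library's scheme):

```python
# Symmetric int8 quantization: w is approximated by scale * q,
# with q an integer in [-127, 127].
def quantize(weights):
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [scale * qi for qi in q]

weights = [0.313, -1.27, 0.052, 0.884, -0.421]
q, scale = quantize(weights)
restored = dequantize(q, scale)
worst = max(abs(a - b) for a, b in zip(weights, restored))
print(q, round(worst, 4))  # error stays within about half a quantization step
```

    Pruning and distillation, the other tricks behind those sub-1% model sizes, remove or re-learn weights outright rather than just shrinking their representation.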

  • @Zeuskabob1

    2 years ago

    I'd love to read about that! I've been dipping my toes in ML algorithms and many of the really interesting networks require an immense amount of memory to function, on the order of tens of gigabytes. I'm curious why those models require such an immense amount of memory, and what can be done to improve that.

  • @siddharthagrawal8300

    2 years ago

    @@Zeuskabob1 You don't really need tens of gigabytes to get a good model that can perform well on a task (usually). Most people still use models smaller than 5 GB or so.

  • @vigilantcosmicpenguin8721

    2 years ago

    +

  • @flightrisk7566

    2 years ago

    thanks for pointing this out 🙄 seems like it was deliberately ignored for the sake of promoting this dumb startup

  • @moonasha

    2 years ago

    just another case of Veritasium making a bait video to make experts respond

  • @joaoluizpestanamarcondes6219
    1 year ago

    Bro, this channel is crazy top-shelf stuff. I'm amazed, thank you for that.

  • @grabdoel
    4 months ago

    I learned more through your video than I did in engineering class :(. Thanks a lot; it opens a great perspective on new innovations where analog is combined with digital. I will dive into it.

  • @NotWhatYouThink
    2 years ago

    Great episode. Hadn’t considered the mix of digital and analog computers in a complementary fashion. I guess it’s not what I thought!

  • @WeponizedAutism

    2 years ago

    True, but the actual impact of this is not what you think.

  • @mushin111

    2 years ago

    Jesus, could you astroturf a bit harder please?

  • @LeoStaley

    2 years ago

    Until the 90s, US warships used mechanical calculators to aim the guns, something that would be perfect for your channel.

  • @deusexaethera

    2 years ago

    I see what you did there.

  • @dieSpinnt

    2 years ago

    BS! Fourier ... ROTFL

  • @aetre1988
    2 years ago

    My dad's "back when I was your age" stories on computing were about how he had to learn on an analog computer, which, according to him, you "had to get up and running, whirring at just the right sound--you had to listen for it--before it would give you a correct calculation. Otherwise, you'd input 100+100 and get, say, 202 for an answer." He hasn't been able to remember what make/model that computer was, but I'm curious: any old-school computer geeks out there know what he may have been talking about? The era would have been the late '60s or early '70s.

  • @GDScriptDude

    2 years ago

    It sounds like your dad's computer was before the invention of the transistor. There was an analog computer in the electronics lab at the University of Hull, UK (when I was a student there in the '80s) that had moving parts. I remember when it became unstable and the professor sprinted across the lab to shut it down before it self-destructed. Something spinning suggests a sine wave generator, for example.

  • @sapinva

    2 years ago

    Yeah, just like analog synthesizers. You have to let them warm up to a stable temperature first or they would constantly drift out of tune while playing. This was later solved with digital controllers.

  • @murmamirrmohaimen2271

    2 years ago

    Maybe the older mechanical calculators. Linus Tech Tips did a video on those. Super interesting stuff.

  • @urlkrueger

    2 years ago

    I can't address your question directly, but in the latter half of the 1960s I worked on a helicopter simulator, used to train military pilots, in which all computations simulating flight were performed by analog circuits made up of transistorized (no ICs) operational amplifiers and servo motors with feedback. This whole machine was housed in a 40-foot-long semi trailer. In the rear of the trailer was a cockpit from a CH-46 helicopter, including all the controls and instruments, but the windows were frosted over so you were always flying IFR in a fog, i.e. no visuals. Next, moving forward, was an operator's station where you could control parameters such as air pressure and temperature, and activate failures such as engine fire or hydraulic failure. The remainder of the trailer contained a row of electronics racks on each side housing the amplifiers, servos and other circuits that performed all the calculations.

    We can look at main rotor speed as an example of how it worked. Rotor speed was represented by the position of a servo motor from 0 to 120 degrees. The position of the motor was determined by the output of an amplifier whose inputs were derived from many variables such as engine power (there were two engines), collective control position and altitude. Attached to the servo motor was a potentiometer whose output drove a cockpit instrument but was also fed back to the amplifiers/servos used to calculate engine power and such. There were many such subsystems with feedback loops interconnecting them, so failures were very difficult to diagnose. Often the only way to resolve a problem was to take a guess at which part might have failed and replace it. Routine maintenance was also very labor-intensive, as the many potentiometers would wear and need to be cleaned and then realigned, which might take an hour for each one.

    As a young man I was totally amazed and fascinated by this technology. As an old man I can't believe that it really worked at all.
    But it did, at least some of the time.

  • @dick7540

    2 years ago

    Back in the day, circa 1957, I was an Electrical Engineering student at the City College of New York. In one of the labs we constructed an analog computer using physical components like motors, gears, etc. There was absolutely nothing binary/digital involved except whether you passed or failed the course. A couple of years later I worked with a Bendix G15 computer with an optional DDA (Digital Differential Analyzer). The DDA was an analog computer; input and output were analog. You can look it up on Google; search for "Bendix G15 computer with DDA".

  • @user-ou8qw2sg3d
    1 month ago

    This blows my mind. Thank you. It's so cool to learn this way about algorithms.

  • @SIBUK
    2 years ago

    The most interesting thing I found in this was when he was saying that in the chip they had to make it alternate between analog and digital signals to maintain coherence. It's interesting because the brain does something similar where it alternates between electric pulses and chemical signals.

  • @chrisfuller1268

    2 years ago

    The problem is that machine learning is still not capable of being used commercially in general environments (e.g. security cameras) because it can't handle unpredictable situations. The brute-force method of AI is still the only solution for general environments (e.g. self-driving cars).

  • @riskyraccoon

    2 years ago

    @@chrisfuller1268 people also suffer from the brute force nature of processing information, aka confirmation bias. Thankfully we can take steps to correct this, but many people lack the tools and mindset to make these self corrections.

  • @chrisfuller1268

    2 years ago

    @@riskyraccoon Yes, humans are flawed, but we are capable of recognizing objects no matter what else is in our field of view. This is a task machine learning will never be able to solve in 100% of all possible (infinite) environments. Brute force AI requires more development effort, but is capable of also identifying objects in many environments. This is why machine learning is a step backwards in technology and why it should never be used in life critical applications.

  • @chrisfuller1268 · 2 years ago

    @Adam H Amen, I never thought of the beast as an AI! The beast will be cast into the lake of fire so I believe he will be flesh and blood human with a soul, but the 'image of the beast'!

  • @chrisfuller1268 · 2 years ago

    @Adam H yes, that is a very interesting way of looking at it! I think we're a very long way from an AI being able to reason, but we've been using AI to kill people for decades.

  • @carterbentley9030 · a year ago

    Back in the mid-1960s my uncle, Joseph Grandine, designed a combination analog/digital computer that could optimally combine the two modes to solve complex problems in signal processing and data analysis. He called his computer the Ambilog 200. At that time, digital computing won the day. Now it sounds like he was a few generations ahead of his time.

  • @dinoschachten · a year ago

    Amazing. Just found two articles about it in the Internet Archive.

  • @IAreBean · a year ago

    That is awesome

  • @LeKhang98 · a year ago

    That's amazing. We should show him this video and ask him what he thinks about it.

  • @stwessboi · a year ago

    cap

  • @hamzahbalogun4220 · a year ago

    I would love to know him

  • @snerttt · 7 months ago

    I'd be interested to see a digital computer adopt an analogue component, possibly utilized for physics simulation, much like how a GPU renders graphics independently of the CPU.

  • @timobakenecker7314 · 11 months ago

    This video really added new dimensions to my overall understanding of AI. Thanks for that!

  • @TheWhatnever · 2 years ago

    This is missing any mention of the other big alternative: photonics. Startups like Lightmatter have shown that this is another very potent alternative. And I believe its benefits are astonishing: it isn't limited by electronic bandwidth/losses, and one circuit can run the same calculation multiple times simultaneously by using multiple colors/wavelengths. It was also left out that a big problem of these systems is the bottleneck in the conversion from general compute to these analog domains.

  • @Xenko007 · 2 years ago

    Hopefully he covers this topic in the future

  • @perc-ai · 2 years ago

    how are u so smart

  • @KWifler · 2 years ago

    Probably because it is also an emerging system, and because photonics uses photons in place of electrons as the carrier (a new actor), while the video is explaining two fundamentally different ways of computing.

  • @ChristopherCricketWallace · 2 years ago

    I was waiting for him to get to photonics, too. It's a HUGE opportunity for crazy amounts of parallel processing. And then there's the white whale of quantum computing, too...

  • @blueredbrick · 2 years ago

    I want my positronic brain patch

  • @BrianBoniMakes · 2 years ago

    I used to calibrate analog computers that ran experiments and test equipment. They were often odd mixtures of analog and digital technologies. Near the end I had to keep a few machines alive as they aged out of tolerance; there was always a way to tweak out some more performance by shifting the calibration away from ranges you didn't need, in a much more forgiving way than any new digital machine could.

  • @nenmaster5218 · 2 years ago

    Anyone know a good science channel for me to check out?

  • @yash1152 · 2 years ago

    thanks a lot Brian Boni for your valuable input (keys: computers: mix of analog and digital)

  • @yuro5833 · 2 years ago

    @@nenmaster5218 Nile red and Nile blue

  • @nenmaster5218 · 2 years ago

    @@yuro5833 Thx! Know Hbomberguy?

  • @yuro5833 · 2 years ago

    @@nenmaster5218 I actually thought I didn’t then realized I had seen several of his videos and forgot about him so thank you as well

  • @NR-bt7yz · 6 months ago

    I've recently started learning ML and this video helps so much. You just made me a Patreon supporter. Thanks Derek!

  • @photorealm · 8 months ago

    When I started thinking about artificial neural nets, I just assumed they would really only happen on specialized analog computers in the future. Then google and others along with more powerful digital computers made it work pretty darn great. I love being in this time of history, watching so much science fiction slowly become reality.

  • @nicholasjayaputra5754 · 2 years ago

    I thought the first part didn't have a sequel. Thank you for the satisfaction you have given me through the knowledge I got from this video.

  • @zaksmith1035 · 2 years ago

    Can't wait to watch this with my kids. I forgot it was coming, we were waiting so long for it.....

  • @AxxLAfriku · 2 years ago

    NO! NO! NO! Many people say I am sick in the head. NOOOO!!!! I don't believe them. But there are so many people commenting this stuff on my videos, that I have 1% doubt. So I have to ask you right now: Do you think I am sick in the head? Thanks for helping, my dear nico

  • @byronvries3826 · 2 years ago

    @@AxxLAfriku 0

  • @nicholasjayaputra5754 · 2 years ago

    @@zaksmith1035 That's awesome mate

  • @nicholasjayaputra5754 · 2 years ago

    @@AxxLAfriku I'm honoured to have your bot-like reply in my comment. As for the answer, well, I don't know, but have a good day mate!

  • @adamkallaev3573 · a year ago

    If it makes my graphics card cheaper, I'm all for it

  • @hridayawesomeheart9477 · a year ago

    Finally, a fellow PCMR member

  • @cdreid9999 · a year ago

    you dreamer you

  • @jerycaryy4342 · a year ago

    @@hridayawesomeheart9477 finally, an average redditor

  • @BlueDrew10 · a year ago

    It sounds like it could make GPUs more power efficient. GPUs are starting to use AI to make certain computations more accurate, so maybe an analog chip on our GPUs could handle that instead.

  • @notisike3553 · a year ago

    @@BlueDrew10 I agree, but the first major bottleneck is, like he said in the video, the massive power requirement to train each AI: each needs about three households' combined annual energy usage, so mass production seems inefficient.

  • @ChrisWalker-fq7kf · 4 months ago

    That analog neural network was really interesting. But to me it's still essentially digital, i.e. discrete. In a normal digital solution you might have 16 possible values for the weights, which would be encoded as 4 bits and would need to undergo addition/multiplication. But in the "analog" solution you encode the weights by setting one of 16 distinct voltage levels. The available voltage levels are quantised, not continuous, so it's still a discrete system. It's great that you can do addition by just summing currents and multiplication by changing resistance. But you can even do this with binary: AND gates are multipliers and OR gates are adders if you only have 1 bit of data (1 OR 1 gives an overflow condition, but the "analog" design will also need enough voltage levels to avoid overflow, e.g. 7 + 13 would give an answer of 16 if that were the highest voltage level). I'd say it's still digital but it's not binary. It's multi-level logic.
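The 16-level encoding described in this comment can be sketched numerically. This is a hypothetical illustration (not Mythic's actual design): each weight is snapped to one of 16 evenly spaced "voltage levels", and the dot product is formed by plain summation, standing in for summing currents.

```python
def quantize(w, levels=16, w_max=1.0):
    """Snap a weight to the nearest of `levels` evenly spaced values in
    [-w_max, w_max], like programming one of 16 discrete voltage levels."""
    w = max(-w_max, min(w_max, w))
    code = round((w + w_max) / (2 * w_max) * (levels - 1))  # integer code 0..levels-1
    return -w_max + code * (2 * w_max) / (levels - 1)

def multilevel_dot(xs, ws, levels=16):
    """Dot product with quantized weights; the addition is 'free',
    like currents summing on a wire."""
    return sum(x * quantize(w, levels) for x, w in zip(xs, ws))

xs = [0.5, -1.0, 0.25]
ws = [0.8, 0.33, -0.6]
exact = sum(x * w for x, w in zip(xs, ws))
approx = multilevel_dot(xs, ws)  # close to `exact`, but only 16-level precise
```

With only 16 levels the result is discrete, which is exactly the commenter's point: multi-level, but still quantised.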

  • @dominikhauk4638 · a year ago

    This has to be the most insightful and entertaining channel on youtube

  • @steveipsen6293 · 2 years ago

    One of my first "computer" classes in engineering school was learning to wire up an analog computer and solve differential equations. Because I had to "assemble" the hardware for the process, it felt much more hands-on than when I took a punch deck to the little window, and waited for up to 20 minutes for the compiler to tell me I had no idea how Fortran worked. At the time, I really appreciated that parameters on the analog could be changed quickly in order to see how different currents, voltages, resistance, etc. affected the outcome. Of course, now with the speed of digital processors, the efficiency of Python libraries, and the Interwebs, I have largely gotten to appreciate the digital world. Now, Derek has got me jazzed to buy a portable analog. $200 on Ebay?
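The patch-panel workflow this comment describes can be mimicked in a few lines: an analog computer wires an integrator into a feedback loop to solve a differential equation, and "changing a parameter" is just turning a knob. Here is a minimal digital stand-in (Euler integration, purely illustrative) for dy/dt = -k·y:

```python
import math

def integrator_loop(k=1.0, y0=1.0, t_end=1.0, dt=1e-4):
    """Simulate the feedback loop an analog computer patches up for
    dy/dt = -k*y: the integrator accumulates -k*y (Euler steps here)."""
    y, t = y0, 0.0
    while t < t_end:
        y += -k * y * dt  # the op-amp integrator summing its input over time
        t += dt
    return y

# "Turning the knob": re-running with a different k is instant re-parameterization,
# much like re-dialing a potentiometer on the patch panel.
results = {k: integrator_loop(k=k) for k in (0.5, 1.0, 2.0)}
# each result approximates the exact solution y0 * exp(-k * t_end)
```

Larger k decays faster, which you would see immediately on the analog machine's output.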

  • @neeneko · 2 years ago

    Yeah, my computer classes in engineering school had a similar thing, though with us it was opamps. It was not a full class, but we did it around the same time as learning FPGAs and having to implement complex programmable digital logic, so it was a good reminder of 'digital logic with an ADC/DAC pair is not always the best or simplest solution'

  • @swapode · 2 years ago

    While it's absolutely not the same thing, I encourage newish programmers to write a 6502 emulator. It's about as close as one can realistically get to building your own CPU hands on, which IMHO gives a worthwhile different perspective to the field than the now common approach to never leave the comfort of interpreters and virtual machines.

  • @joesterling4299 · 2 years ago

    The biggest issue is distortion. Inexact calculations due to imperfect components, degradation of the data when transmitted (wired or wireless), external EM interference, all conspire to make the use of analog a special challenge. Mixing digital and analog to play to the strengths of each along the way intrigues me. I'm old enough to have experienced the full evolution of digital computing. My mindset is therefore quite biased toward it. What you propose would be quite the eye opener for me, if it actually can be made to work as prolifically as current digital technology.

  • @WilcoVerhoef · 2 years ago

    I assume there's a lot to be discovered on the topic of self-correcting algorithms, or even error-correcting analog circuits that compensate partially for the inaccuracies. Like what Hamming codes are for digitally transmitted data.

  • @slippio · 2 years ago

    nature exists in chaos, technology is more and more approaching the chaos orchestra.

  • @StevenSiew2 · 2 years ago

    Distortion, really? I am under the impression that the biggest problem with analog computers is NOISE. You can never get rid of noise in an electrical system. Even if the hardware has no distortion, the inherent thermal noise in the system will cause some small calculation error.

  • @leftaroundabout · 2 years ago

    @@StevenSiew2 that's true, but noise is something that AI needs to deal with anyway, because the inputs will always be noisy to begin with. It can actually be useful to _add artificial noise_ while training a digital NN, to avoid overfitting issues. (Stochastic gradient descent can also be seen as a way of making the training "noisy".) As long as the perturbations are small and random, training won't be affected negatively. Distortions, however, are hard to deal with. You may be able to train a model on a particular chip that has such-and-such distortion; but because the distortion properties don't fluctuate and are constant-but-unknown biases, the weights will ruthlessly overfit to this particular chip, and then it probably won't work at all on another copy.
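The noise-injection idea in this reply can be shown with a toy example. This is a minimal sketch (plain SGD on a single weight, with y = 2x as a hypothetical ground truth), not a recipe for a real network: small zero-mean Gaussian noise is added to each input during training, and the learned weight still lands near the true value.

```python
import random

random.seed(0)

# Clean data for the toy problem y = 2x.
data = [(x / 10, 2 * (x / 10)) for x in range(-10, 11)]

def train(noise_std, epochs=200, lr=0.1):
    """SGD on a single weight, injecting Gaussian noise into the inputs."""
    w = 0.0
    for _ in range(epochs):
        for x, y in data:
            x_noisy = x + random.gauss(0.0, noise_std)  # the artificial noise
            err = w * x_noisy - y
            w -= lr * err * x_noisy  # gradient step on squared error
    return w

w_clean = train(noise_std=0.0)   # converges to 2 essentially exactly
w_noisy = train(noise_std=0.05)  # still lands close to 2
```

Small random perturbations barely move the solution, matching the comment's point that random noise is tolerable while systematic distortion is what ruins transfer between chips.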

  • @Opsse · 2 years ago

    As a PhD student in this field, I can answer some of your questions. Yes, we usually talk more about noise than distortion. And thermal noise is not the only issue; there is read and write variability, resistance drift over time, the resistance of interconnections, ... However, it is true that neural networks can sometimes take advantage of noise to avoid overfitting, but only a reasonable amount of noise and only in some cases. Self-correcting algorithms and error correction are options, but it's not that easy. Usually this kind of method sacrifices performance or requires more energy (which is the opposite of what we want). About mixing digital and analog: they presented it nicely in the video, but the digital/analog converters require a lot of energy (sometimes more than the vector-matrix multiplication itself), so we don't want to do it too often.

  • @metimulugeta8062 · a year ago

    I was thinking that hardware isn't keeping up with the software advances taking place, and you just gave the right amount of knowledge to see clearly through the fog.

  • @gmeast · 11 months ago

    25 years ago, I designed and built an analog computer using a handful of summing and differencing amps, resistor arrays, log and anti-log amps, and more. These components were interconnected by a whole bunch of addressable cross-point/cross-bar switches and buffers. An array of inverting and non-inverting buffers served as analog inputs and variables. A digital word was shifted onto the switches from a PC. You could "build" any math equation. It was eerie seeing a real-time answer emerge as variables and data were being input. Because op amps were a major part of the architecture, speed was limited by the slew rates of the amps.
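The log/anti-log amps mentioned in this comment are what let such a machine multiply: a log amp, a summing amp, and an anti-log amp together compute a·b as antilog(log a + log b). A tiny numeric model of that signal chain (idealized, positive inputs only, as with real log amps):

```python
import math

def log_amp(v):
    """Log amp: output proportional to the log of its (positive) input."""
    return math.log(v)

def antilog_amp(v):
    """Anti-log (exponential) amp: inverse of the log amp."""
    return math.exp(v)

def analog_multiply(a, b):
    """Multiplication via the log domain: a summing amp adds the two
    log-domain voltages, and the anti-log amp converts back."""
    return antilog_amp(log_amp(a) + log_amp(b))

product = analog_multiply(3.0, 4.0)  # close to 12, up to component (here float) error
```

The same trick gives division by subtracting log-domain signals, which is why log/anti-log amps were such versatile building blocks.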

  • @ElectroBOOM · 2 years ago

    Awesome information!

  • @Mani_Umakant23 · 2 years ago

    I gave you your first like 😁

  • @N____er · 2 years ago

    @@Mani_Umakant23 Why would you like such an unoriginal comment that provides so little value or thought?

  • @Mani_Umakant23 · 2 years ago

    @@N____er No reason, it just looked sexy.

  • @40.vedantdubey8c6 · 2 years ago

    @@N____er Don't say anything bad about ElectroBOOM he is such a wonderful creator

  • @40.vedantdubey8c6 · 2 years ago

    Hi sir, I am a big fan of yours

  • @StratEdgyProductions · 2 years ago

    This was a banger of an episode. I was enraptured the entire time. Tight story telling with a great hook and title. You're a pro, man.

  • @Strawberry_ZA · 2 years ago

    Fancy seeing you here ❤️

  • @oDxrk · 2 years ago

    hm

  • @trec_log · 2 years ago

    hook, line and thinker

  • @memyselfandi6364 · 2 years ago

    Damn Canadians keep blowing my mind. TELL THEM TO STOP IT!


  • @rule1dontgosplat · a month ago

    Holy crap… I remember seeing the ALVINN van somewhere in the 1980s. Not sure if it was on PBS or something like that. That’s hilarious.

  • @dt-wq7ql · 7 months ago

    Excellent presentation. My brain never got much past my spirograph set. It was functional at some stage . 😮

  • @siemensmolders4131 · 2 years ago

    Interesting video, but it felt a little too hyped up for me ^^ The discussed challenge appears to be a highly specific application: matrix multiplication. The solution shown here was an analog ASIC (application-specific integrated circuit), which is a type of chip we've been making for over half a century. Once a task becomes both computationally expensive and very specific, the fastest method has always been to make a specific chip for it. Nor is analog multiplication anything new; I remember being taught the little analog multiplier circuit with the Gilbert cell over a decade ago.

  • @matteod2567 · 2 years ago

    most of his videos are like this lol

  • @aceman0000099 · 2 years ago

    I believe Derek found a little niche to focus on since he did the video on the ancient Greek analogue computer, which had an almost identical conclusion

  • @ejpmooB · 2 years ago

    I feel he is on to something here ... maybe the real benefit is that you don't have to make all these specific chips, because in principle one fairly big analog one could do everything you threw at it. But it feels a bit scary to me too, because you are getting closer to biological systems.

  • @danielraymond3045 · 2 years ago

    Yeah, the reduction in power consumption I'd imagine is mostly due to it being an ASIC, not being analog. There are quite a few digital AI inference ASICs coming onto the market as well - I'm curious to see which ones will reign supreme


  • @dekev7503 · 2 years ago

    This just goes to show that no knowledge is useless. When I was in my final year of my undergraduate degree ( Electrical Engineering) I took a course on analog computers and the general consensus was that this field was obsolete. That year was the last year that the course was taught as it was phased out in the new curriculum.

  • @yourright4510 · 11 months ago

    While it may be true that we are reaching a limit, we're not quite certain what computational power new neural networks will need for future applications. That hints at analog computation coming to the forefront.

  • @granitfog · 27 days ago

    A small point: referring to the sum of inputs needed to stimulate a neuron, you called it "bias", but "threshold" is a better descriptor of the phenomenon. In fact the official term is "threshold potential" (potential referring to the charge needed to do work, the work being depolarization of the membrane and propagation of an impulse).
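In artificial networks, the two names in this comment describe the same mechanism up to a sign convention, which may be why the video says "bias". A quick sketch of the equivalence (values are illustrative):

```python
# A unit that fires when the weighted sum reaches a threshold...
def fires_threshold(xs, ws, threshold):
    return sum(w * x for w, x in zip(ws, xs)) >= threshold

# ...is identical to one that adds a bias and compares against zero,
# with bias = -threshold.
def fires_bias(xs, ws, bias):
    return sum(w * x for w, x in zip(ws, xs)) + bias >= 0

xs = [1.0, 0.5]
ws = [0.6, -0.2]
theta = 0.4
assert fires_threshold(xs, ws, theta) == fires_bias(xs, ws, -theta)
```

So "bias" in the machine-learning sense is just the negated threshold folded into the sum.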

  • @masterbulgokov · 2 years ago

    "Better suited" is the key. Quantum computing will fall under the same clause: there are some things quantum computing is "better suited" for.

  • @BreaksFast · 2 years ago

    Quantum computers (ones that use physical qubits) are only hypothetical, but people talk as if they already exist in reality. They don't; there is not a single fully functional quantum computer on the planet, and there might never be.

  • @ninjafruitchilled · 2 years ago

    @@BreaksFast Sure they exist, they just don't have very many q-bits.

  • @RyanGrissett · 2 years ago

    @@BreaksFast The computers do exist, but there is a lack of understanding in programming them to do classical computing problems.

  • @scyfrix · 2 years ago

    @@BreaksFast They can and do exist, albeit with very limited qubit counts. The first experimental demonstration of one was in 1998. D-Wave Systems are selling computers with 2048+ qubits right now.

  • @jamesx9881 · 2 years ago

    @@BreaksFast Tell that to IBM?

  • @aidanl.9946 · 2 years ago

    I've always mused about this to myself; I always thought, "why not use analog to calculate certain things?" There's lots of stuff in physics that's extremely hard to calculate but just "happens" in the real world efficiently: the surface of a bubble, for instance, minimises surface area very rapidly in a way that takes no effort on the bubble's part, but is incredibly hard for a digital PC to calculate. The tricky part (and the reason people doing this are smart scientists/engineers and I'm not) is figuring out how to wrangle "the bubble" into a portable and responsive piece of hardware, and it's super cool to see efforts made in this direction having success.

  • @jimmysyar889 · 2 years ago

    Same thought. I used this technique to figure out a way to solve mazes super efficiently with flowing water. I think that’s what’s happening with quantum computers also.

  • @yoshienverde · 2 years ago

    It always comes back to the drawbacks Derek mentions at the beginning of the video: analog processing is single-purpose, error-prone, and hard to repeat. As such, for your physics example, it would invalidate A LOT of the data you get back, since you cannot guarantee a certain level of falsifiability, auditability, and error margins. You CAN get there, but you start requiring A LOT of boilerplate circuitry around the actual problem-solving hardware.

    As a silly and basic example that is almost trivial nowadays, but still there: you have to add a lot of surge protection and current stabilization to a circuit to ensure that the natural unsteadiness of the power grid won't skew your results. And that's only taking into account discrete and "simple" issues. Imagine processing data for some chaos-related physics theory and getting pure rubbish at the end, because even the slightest microvolt-level disturbance distorts everything. How about external interference? Or electromagnetic interference between the actual wires in the circuit?

    As I said, not impossible to tackle, but you suddenly have an overhead of 90% boilerplate just to make the results useful for anything practical. I can't even imagine all the engineering that must have gone into those Mythic semi-analog chips for AI, just to keep everything tidy. The fact that a Realtek-sized chip can give you one third the performance of some Nvidia Quadro (or similar) card, for a fourth of the power consumption of a cheap entry-level mobile Core i3, is just astounding!

  • @yoshienverde · 2 years ago

    To be clear, these Mythic chips point towards a future resurgence of analog processors not dissimilar to what digital ones brought in with their unparalleled versatility. Outside of very bespoke chips for very high amounts of money, probably in the realm of high-end research and science, I can see a general idea of modularity at a functional level.

    Say you manufacture analog chips that do some very important but expensive math operations common to a specific branch of science (a lot of transformations, or integration, maybe some Lorentzians, and so on). Then research groups, institutes, and universities do the same as electronic engineers do with good old breadboards: DIY some complex formulae on the fly, test their hypotheses, and iterate over the formulae as needed.

    Imagine those astrophysicists with their 2k-term polynomials being able to duct-tape a dozen chips together, the same way electronic engineers use logic gates as basic digital units, and getting the results out in a couple of hours. Instead of having to write a piece of software that takes a couple of days to run and a week to write, where any mistake or failed result requires another week of debugging just to make sure it failed because you were wrong, and not because you typed a 5 where a 6 should have gone when entering all 1500 terms of one of the formulae.

  • @squeakybunny2776 · 2 years ago

    Yes, I've always thought this too. Aside from the negatives mentioned in the vid and the comment above: "if you can't calculate it, let nature do it." I've used the term 'calculate' here, but I think it applies in a broader sense: if something is too hard to manufacture/produce precisely, maybe nature can do it better.

  • @DrVonJay · 2 years ago

    @@yoshienverde wish I understood what you were saying but great rebuttal

  • @Tony770jr · a year ago

    I worked with machine learning applications 6 years ago on resource-constrained microcontrollers. After understanding how neural networks actually worked, I came to the realization that an analog equivalent would operate much faster. I mentioned this to my engineering manager at the time and he laughed at the idea. But I knew I was right!

  • @sushaanpatel1337 · a year ago

    He has changed the title and thumbnail of this video for the third time, and I watch it every time with the same curiosity.

  • @KarthiSrinivasan
    2 years ago

    There's an entire field of research called neuromorphic computing/engineering looking into this very problem. It was pioneered by Carver Mead in the 90s and has seen a lot of interest lately.

  • @lxschwalb

    2 years ago

    I was waiting for him to either mention the words "neuromorphic" or "memristors"

  • @jecelassumpcaojr890

    2 years ago

    I remember reading about Mead's analog stuff in the 1980s, something related to hearing. Perhaps my memory is wrong.

  • @Anomynous
    2 years ago

    "Simple tasks like telling apart cats and dogs." You can find more difficult tasks, but this is already an incredibly complex one, especially when working from images.

  • @henrypetchfood

    2 years ago

    This is exactly the point though. Trivial for a human to do, hard for a computer.

  • @Cyrribrae

    2 years ago

    @@henrypetchfood I literally just had a friend tell me a story about their mother misidentifying a pomeranian as a cat haha. Maybe not always trivial.

  • @anders5611

    2 years ago

    @@henrypetchfood It's trivial for a human because evolution produced neural circuits capable of solving this very hard problem. Our own minds are the least aware of what they do best.

  • @duckseverywhere8119

    2 years ago

    True, but Derek's point is that in the grand scale of what we'd hope to achieve with analogue computers (in the future), telling apart cats and dogs is a simple expectation - yet it's still hard to do with current technology.

  • @pbinnj3250
    6 months ago

    I cannot express all of my appreciation for this video. I understood it and I gained an enormous amount from it. If I sound unduly excited, it’s because I thought this stuff was beyond me. Thank you.

  • @ericpham7773
    1 year ago

    Lense of light and tolerance threshold can make it no difference than digital but resolution now goes so small as a 10^-65 m size by remove the zero so no exact target possible so the executioner never feel guilty because zero was to used as a curse or label or target in semetic design

  • @keithsmith3118
    2 years ago

    When I was in the Navy I worked on the Fresnel Lens Optical Landing System. There was no 1% error; it was a .005 VDC tolerance over a minimum 5 VDC base. The computers had a lot of math to solve to target the hook touchdown point for each aircraft. It was completely analog and op-amp driven, and it has been around for over half a century. I've witnessed many, many old analog machines in manufacturing since then. Analog technology isn't new technology; it's forgotten technology, pushed aside by the digital technologies. I'm happy to see it hasn't completely died.

  • @pavanagrawal6397
    2 years ago

    Fantastic video, and I learned a lot as a biologist. Small correction: neurons (the real ones) are indeed analog in the sense that they can tweak their output and fire, fire more, or fire less, just like an analog computer. This happens through a combination of changes in neurotransmitters, their release at the synapses, and neuropeptides that can change the 'gain' of the neural networks.

  • @michaelmeichtry316

    2 years ago

    Exactly! The analog behavior of neurons is closely modeled by the analog current/voltage exhibited by the tweaked transistor cells, as so well demonstrated and visualized in the video.

  • @kalliste23

    2 years ago

    Neurons have a lot going on inside, and things are happening outside, that affect what they do and when they do it. It amazes me that computer neural networks work at all, let alone as well as they do.

  • @vyor8837

    2 years ago

    Yeah, so take what he's wrong about in the field you know and apply it to the field I know (comp sci), and suddenly the entire video is a load of rubbish.

  • @grumpystiltskin

    2 years ago

    @@kalliste23 Don't get me started about the neurons in a squid vs a human... they have fewer, bigger and more complex neurons.

  • @blucat4

    2 years ago

    @@vyor8837 Not a load of rubbish, just amazingly primitive compared to what it's trying to mimic. And also use-specific. And not really capable of learning new kinds of tasks. ;-)

  • @gg-qj3gc
    1 year ago

    For anyone considering buying the "The Analog Thing" Computer. The site says "offering our low, not-for-profit unit price". Well, they increased the price from 299€ to 499€ in late 2022.

  • @canfloph
    4 months ago

    This reminds me of the cores from Portal 2, that line at the start of the game: "All Aperture Science personality constructs will remain functional in apocalyptic, low power environments of as few as 1.1 volts."

  • @2011littleguy
    2 years ago

    Fascinating! I was one of the first engineers to 'train' a computer to recognize handwritten numbers. It was used for reading ZIP codes for post office sorting. It worked quite well, and the method I dreamed up is what is used today. Namely, getting many samples (I sent pages around the office asking people to fill in boxes with the numbers zero to nine). The variability in human handwriting was amazing. Then I separated each box into nine areas, and a program determined whether an area had a mark or not. By playing with the various combinations, and tweaking it for often-confused numbers like 5 and 6, we got a very low error rate. I'm happy to see I was on the right track sixty years ago.
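The nine-area scheme described above is what pattern-recognition texts call "zoning": split the digit's box into a grid and record which cells contain ink. The sketch below is a hypothetical reconstruction of that idea, not the commenter's original program; the grid size and threshold are assumptions:

```python
import numpy as np

def zone_features(img, grid=3, threshold=0):
    """Split a binary digit image into grid x grid zones and report,
    for each zone, whether it contains any marked (nonzero) pixels."""
    h, w = img.shape
    rows = np.array_split(np.arange(h), grid)
    cols = np.array_split(np.arange(w), grid)
    return np.array([[img[np.ix_(r, c)].sum() > threshold
                      for c in cols] for r in rows], dtype=int)

# A crude 9x9 "1": a vertical stroke down the middle column.
one = np.zeros((9, 9), dtype=int)
one[:, 4] = 1
print(zone_features(one))
# Only the middle column of zones is marked:
# [[0 1 0]
#  [0 1 0]
#  [0 1 0]]
```

Different digits light up different zone patterns (a "1" marks only the middle column, an "8" marks nearly all nine), and comparing those patterns, with special-case tweaks for confusable pairs like 5 and 6, gives a workable classifier.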

  • @emmanueloluga9770

    2 years ago

    Great job, optimization is wonderful.

  • @pratheekec

    2 years ago

    Bruh... you're a legend.

  • @anothercomment3451

    2 years ago

    Indeed! Great brain!! Now, learn about and teach folks why we have ZIP codes, and precisely why we have them. You'd be surprised.

  • @abetb.

    2 years ago

    Wow! Just wow!!!

  • @macdeep8523

    2 years ago

    Great Work Ser

  • @hepasb
    2 years ago

    Some years ago, as I was finishing my computer science education, I had to do a project for my finals (this was not university, btw, just a secondary kind of route available in my country). I had always been fascinated by artificial intelligence and neural networks, so I chose to do something in that realm. I had never worked with AI before, so my knowledge of how artificial neural networks actually work was quite shallow. I had been programming for quite some time at that point, so I had my basic tools in place, so to say, but I really didn't know where to start with the task I had picked: driving a little simulated 2D car along a randomly generated road without any human input. Unfortunately, the language I was most proficient in at the time was Java, so I tried to implement it in Java. When I first started the project, I read a lot about neural networks, even including some scientific papers, but without any kind of scientific background I really struggled to understand the deeper concepts they presented. That eventually led me to abandon the reading and just approach it with the general question of "How could/would it work?" Looking back, I don't think I have ever been more challenged by a task I set out to complete, and never before had I thought that much about how our brains actually process information, which is the key idea behind it all. Yet, slowly but surely, I ended up with a concept pretty similar to what most neural networks do, even though probably far less efficient than using something already available like TensorFlow. Where am I going with this story?
Well, after a while it actually worked: the little network I created "learned" how to drive the car along the road after countless iterations of doing it wrong, and to this day it just absolutely blows my mind on so many levels that this was possible. Even more so considering that all of these complex processes take place in our brains every microsecond we exist, with much more proficiency. I absolutely loved that project from beginning to end, and along with the rest of that educational path and years of aimlessly programming for fun, it made me realize that I truly love the entire concept of using programming to recreate something in nature. Kind of a pointless anecdote, sorry.
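The try, fail, adjust loop described above is, at its heart, Rosenblatt's perceptron rule: predict, compare with the target, and nudge the weights by the error. A toy version in Python (learning the AND function instead of steering a car; the learning rate and epoch count are arbitrary):

```python
import numpy as np

# Perceptron learning: nudge weights toward correct answers over many passes.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([0, 0, 0, 1], dtype=float)                      # AND targets

w = np.zeros(2)
bias = 0.0
lr = 0.1  # learning rate

for _ in range(25):  # "countless iterations of doing it wrong", scaled down
    for xi, target in zip(X, y):
        pred = 1.0 if xi @ w + bias > 0 else 0.0
        err = target - pred        # how wrong was the guess?
        w += lr * err * xi         # nudge weights toward the right answer
        bias += lr * err

print([1.0 if xi @ w + bias > 0 else 0.0 for xi in X])  # [0.0, 0.0, 0.0, 1.0]
```

The same predict-and-nudge structure scales up to the multi-layer networks that drove ALVINN, just with gradients in place of this simple error signal.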

  • @Velocitist

    2 years ago

    It’s not pointless, that was a good story.

  • @hepasb

    2 years ago

    @@Velocitist Thank you. After watching, I just felt like reminiscing and sharing the experience, which was actually quite enlightening for me.

  • @DAzZuLK

    2 years ago

    Not pointless at all! Many of us are developers, and that means working and studying with pet projects. From my point of view, the way to learn is developing from scratch, and in parallel learning already-established libraries or dedicated software for it (I'm playing with a bunch of neurons made from scratch, and OpenCV). Cheers!

  • @obviouslymatt6452

    2 years ago

    that’s sick

  • @wattafakka4186
    2 months ago

    Great video, I always wondered about neural networks. Now I get it!! 👍👍

  • @coleballenger4595
    2 months ago

    12:48 They did my man Alex so wrong there lol! Great video as usual.

  • @mikey92362
    1 year ago

    I was a finalist in the state science fair competition back in the 4th grade, so around 1977. My project was a board about two feet by three feet, full of two- and three-position switches and colored lights. It was a logic board that could solve various types of equations. Pretty cool at a time when almost no one had ever touched an actual computer. In the end I learned absolutely nothing about computers. But I learned to solder really, really well from it. Moral of the story: if you can't learn to code, at least learn to solder. :)

  • @no-ld3hz

    1 year ago

    Never too late to learn :) Start with arduinos, they're amazing little chips that can do wonders. Even something simple like a temperature sensor might be fun.

  • @cdreid9999

    1 year ago

    lol that's amazing

  • @Kenshiroit

    1 year ago

    whats a solder?

  • @mikey92362

    1 year ago

    @@Kenshiroit Google will help you. But basically it's how you fuse wires together to form a bond for passing electricity.

  • @moshjustice4988

    1 year ago

    @TheMurchMan then turn it into a hobby... Maybe do it for fun on weekends or just let it be your side hustle if your location allows it

  • @carstenpxi
    2 years ago

    Analog computers are actually a hardwired set of circuits "programmed" for a particular task. They excel at massive parallelism and true real-time performance. In addition to analog circuits built using transistors or tubes, optical devices such as prisms (or rainbows) do real-time spectrum analysis at light frequencies, and have real-time color displays. To duplicate the performance of an optical prism at those frequencies using digital circuitry would require massive arrays of digital hardware multiplier/accumulators. I did the calculation once in the mid-1990s, and at that time it would have required about 600 MW of power. Early spectrum analysers developed for military applications took audio or radio waves, upconverted them to light, and used a prism to do the spectrum analysis.
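For contrast, the digital route mentioned above, banks of multiplier-accumulators, is what the FFT organizes efficiently; a prism performs the equivalent decomposition instantly at optical frequencies. A standard digital sketch (the test tones and sample rate are made up for illustration):

```python
import numpy as np

# Digital spectrum analysis: what a prism does instantly with light,
# a computer does with an FFT, i.e. a structured pile of multiply-accumulates.
fs = 1000                    # sample rate (Hz)
t = np.arange(0, 1, 1 / fs)  # one second of samples
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

# The two strongest bins land exactly on the two tones we mixed in.
peaks = sorted(freqs[np.argsort(spectrum)[-2:]].tolist())
print(peaks)  # [50.0, 120.0]
```

An N-point FFT still costs on the order of N log N multiply-accumulates per frame, which is the hardware bill the commenter was estimating; the prism pays none of it.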

  • @LeTtRrZ

    2 years ago

    If need be, could analog computers be made to go digital temporarily? If so, it would mean that they can perform accurately for a time and then go back to analog for complexity.

  • @srpenguinbr

    2 years ago

    @@LeTtRrZ I don't think so, they are so fundamentally different it would be hard to integrate them. It would be easier to have 2 circuits

  • @andreafedeli3856

    2 years ago

    @@LeTtRrZ There are tons of studies on reconfigurable architectures, and the theory of how good a digital computer can be at representing analog behaviors (so, I reckon, the opposite of what you were conjecturing) is well known, as are the implied constraints. But the question remains of which set of bricks should make up the reconfigurable architectural fraction, and in what abundance each. As soon as you fix the number of components available and the maximum degree of connection reconfigurability, you define a limit on what you can represent in a given amount of time. As a passage in the video suggests, there are studies on architectures with digital boundaries between analog slices, but the possibility of error correction is very often, I'd dare say always, a consequence of knowing what you want to represent, simply because if you don't know what you're representing, you cannot tell whether you're doing it right or wrong. At best you may exploit some underlying characteristics of the representational space: e.g., if you know your values should fall on one element of a grid, you may correct an analog result by choosing the nearest grid element, which means, somehow, re-digitizing the result. But knowing what you're representing poses a constraint on the freedom of the intermediate representations.

  • @LeTtRrZ

    2 years ago

    @@andreafedeli3856 Why not just allow the computer to ration between digital and analog based on the demand of the task it’s attempting?

  • @smithsmithington

    2 years ago

    @@LeTtRrZ He says that in the video. It's exactly what they do. @ 18:56

  • @mikegiles1821
    1 year ago

    Very informative. Thanks for posting!

  • @dickslocum
    11 months ago

    Looks just like the breadboard programming process I was taught in the late 60s in my Introduction to Digital Programming associate degree program. If you are using electricity to produce the output, it is not an analog computer; it is using digital technology to regulate the voltage, current, and amperage for your O-scope.

  • @Crowald
    2 years ago

    So, this was Harold Finch's solution in Person of Interest. His ability to create an autonomous, observant AI to identify dangerous behavior was the result of Rosenblatt's work, and he did it 15 or 20 years before anyone else would even attempt it. Missed an opportunity to mention him in PoI. Von Neumann was the mathematics, Turing is the father of modern computing, but Rosenblatt was a maverick on the nature of neural networks.

  • @johndawson6057

    1 year ago

    Oh my god, thank you for bringing this up. Ever since I watched that show, I have been set on learning everything and anything about AI. It inspired me and set me on my current course in Comp Science.