How Nvidia Won AI

When we last left Nvidia, the company had emerged victorious from the brutal graphics-card battle royale of the 1990s.
Very impressive. But as it entered the 2000s, Nvidia set out to do more: to build an entirely new kind of microprocessor, and to unlock the multi-billion-dollar market that came with it.
In this video, we look at how Nvidia turned the humble graphics card into a platform that dominates one of tech's most important fields: artificial intelligence.
Links:
- The Asianometry Newsletter: asianometry.com
- Patreon: / asianometry
- The Podcast: anchor.fm/asianometry
- Twitter: / asianometry

Comments: 616

  • @Asianometry · 2 years ago

    What would you like to see on the channel?

  • @bjliuyunli · 2 years ago

    Thanks a lot for the video! Would be great to see a video about power semis like IGBTs and silicon carbide.

  • @2drealms196 · 2 years ago

    You've covered Nvidia; could you cover Nuvia and Nivea?

  • @masternobody1896 · 2 years ago

    Yes, more gaming videos is what I like.

  • @fulcrumR6 · 2 years ago

    Can't wait to see. You should do a video on the companies behind modern-day tanks, no matter the country. The history of many of these companies (General Motors, etc.) taking the time to design tanks and make them functional is very interesting. I'd love to see a video on that.

  • @screwcollege8474 · 2 years ago

    Marvell Technology pls

  • @okemeko · 2 years ago

    From what a professor at my university told me, they didn't just "work closely" with researchers. In some cases they straight up gifted cards to research centers. This way, not only did Nvidia provide a good platform, but all the software written was naturally written for CUDA.

  • @eumim8020 · 2 years ago

    My master's thesis supervisor has 5 professors each submitting a request for a GPU. NVIDIA papers over its monopolistic, anticompetitive core with a whole program for helping public university systems. If I'm lucky, my final DL model will be trained on those GPUs in his little office server.

  • @slopedarmor · 2 years ago

    I think I remember that Nvidia gifted a GTX 980 Ti to the developers of Kingdom Come: Deliverance (a Kickstarter computer game), supposedly to help them with development? Haha.

  • @monad_tcp · 2 years ago

    Ah, that old Microsoft trick of gifting goodies. Like giving away Office licenses, or the entire Internet Explorer for free if you bought Windows.

  • @steveunderwood3683 · 2 years ago

    If you don't provide some help to early adopters, how are you ever going to build a thriving environment? Providing cards, software, training and support to academics was a good thing. The sleazy stuff they did was to cook studies to make the benefits of a GPU look much greater than they really were, in applications where the benefits were marginal at best. GPGPU is great for some things and weak for others. The early Nvidia-sponsored papers were so heavily rigged that it took some serious analysis to figure out where GPGPU was a real boon, and how big that boon might be.

  • @monad_tcp · 2 years ago

    @@steveunderwood3683 Yeah, it's the environment. The benefits stated in those papers were rigged in Nvidia's favor, but they would have been achievable with an open environment for study. The industry is just too locked into CUDA and Intel x86. At least now things are going to change a bit, as if we could say ARM is different...

  • @e2rqey · 2 years ago

    Nvidia does a really good job of identifying burgeoning new industries where its products could be leveraged, then integrating itself into the industry so early that, as the industry matures, Nvidia's products become essential to its functioning. I remember visiting a certain self-driving-car company in California about 4 years ago and seeing a literal wall of Nvidia 1080 Ti GPUs. They had at least a couple hundred of them, apparently all gifted by Nvidia. I've heard Nvidia will also send its engineers out to work with companies and help them optimize their software, to get maximum performance out of the GPU for whatever they're using it for.

  • @zerbah · 2 years ago

    Nvidia has great support for AI and game development. When I was talking with a small indie game studio about their game, they confirmed that Nvidia sent them two top-of-the-line Founders Edition cards for development free of charge and offered to optimize drivers for their game once the final build was ready. Meanwhile, the AMD cards were crashing and black-screening because of buggy drivers, making it a complete pain to test the development build on them...

  • @aamirsiddiqui9957 · 2 years ago

    @@zerbah How long will AMD take to be as good as Nvidia?

  • @cyranova9627 · 2 years ago

    I remember one case where a game developer actually got invited to dinner by an Nvidia rep to talk about developing their game on Nvidia GPUs, not AMD ones. All they do is sweet-talk game developers.

  • @tweedy4sg · 2 years ago

    True, they do... but it's not exactly successful every time. Remember how they entered the mobile AP (application processor) market with the Tegra series, which now seems to have fizzled out into oblivion.

  • @graphicsRat · 1 year ago

    @@tweedy4sg Yes, not every bet will win. In fact most bets will fail. But the 1 out of 5 that succeeds will more than pay for the failures, and then some. That's how investments work. Venture capitalists, for example, know this all too well. Not all their investments pay off, but every now and then they invest in tomorrow's Google-scale company, and that's where they make their money.

  • @zombielinkinpark · 2 years ago

    Although both Google and Alibaba Cloud developed their own NPUs for AI acceleration, they are still buying large quantities of Nvidia Delta HGX GPUs as their AI development platform. Programming for CUDA is far easier than for their own proprietary hardware and SDKs. Nvidia really put a lot of effort into the CUDA SDK and made it the industry standard.

  • @scottfranco1962 · 2 years ago

    Nvidia is a real success story. The only blemish (as illustrated by Linus Torvalds famously giving them the middle finger) is their completely proprietary stance on development. Imagine if Microsoft had arranged things so that only their C/C# compilers could be used to develop programs for Windows. CUDA is a closed shop, as are the graphics drivers for Nvidia's cards.

  • @janlanik2660 · 2 years ago

    But MSVC can be used only on Windows.

  • @theairaccumulator7144 · 2 years ago

    @@janlanik2660 Imagine using Windows, much less MSVC.

  • @scottfranco1962 · 2 years ago

    @@janlanik2660 I think you misread what I said. Microsoft (or any OS maker, Apple included) could easily have made it so that only their compilers could be used on their systems: no GCC, no independent developers. That is what Nvidia has done.

  • @janlanik2660 · 2 years ago

    @@scottfranco1962 OK, sorry for the misinterpretation. But even so, I have only used CUDA, which is indeed Nvidia-only. I believe there are some cross-platform solutions, e.g. OpenCL, so you don't have to use proprietary tools to run something on Nvidia. Or am I wrong?

  • @Ethan_Simon · 2 years ago

    @New Moon You don't need something to be proprietary to pay your engineers to work on it.

  • @0MoTheG · 2 years ago

    CUDA was originally not targeted at machine learning or deep neural networks, but at molecular dynamics, fluid dynamics, financial Monte Carlo, financial pattern search, MRI reconstruction, deconvolution, and very large systems of linear equations in general. AI is a recent addition.

  • @TheDarkToes · 1 year ago

    Back in the day, we would have 64 CUDA cores and think we were hot shit hitting 800 MHz. Look how far it's come.

  • @christopherpearson8637 · 1 year ago

    You stumble into the right choices sometimes.

  • @deusexaethera · 2 years ago

    Ahh, the time-honored winning formula: 1) Make a good product. 2) Get it to market quickly. 3) Don't crush people who tinker with it and find new uses for it.

  • @heyhoe168 · 2 years ago

    Nvidia doesn't really follow (3), but it has a very strong (2).

  • @cubertmiso · 1 year ago

    @@heyhoe168 Agree with that comment. 3) Corner the market. 4) Raise prices.

  • @peterweller8583 · 1 year ago

    @@heyhoe168 That's too bad, because (3) is where most of the honey comes from.

  • @shmehfleh3115 · 1 year ago

    @@heyhoe168 Neither does Apple, unfortunately.

  • @locinolacolino1302 · 1 year ago

    3) Create an accessible proprietary toolkit (CUDA) that becomes the mainstream for legacy content, and crush anyone who tries to leave the Nvidia ecosystem.

  • @mimimimeow · 2 years ago

    I think it's worth mentioning that a lot of recent advances in GPU computing (Turing, Ampere, RDNA, mesh shaders, DX12U) can be traced to the PlayStation 2's programmable VU0+VU1 architecture and the PlayStation 3's Cell SPUs. Researchers did crazy stuff with these, like real-time ray tracing, distributed supercomputing for disease-mechanism research, and the USAF's space monitoring. The PS3 Folding@home program reached 8 petaflops at one point! Sony and Toshiba would be like Nvidia today if they had provided proper dev support to exploit these chips' capabilities and kept developing them, rather than just throwing the chips at game devs and saying "deal with it". I feel like Sony concentrated too much on selling gaming systems and didn't realize what monsters they had actually created. Nvidia won by actually providing a good dev ecosystem with CUDA.

  • @dhargarten · 1 year ago

    Didn't Sony at one point encourage and support using PlayStations for scientific computing, only to later block it completely? With the PS4, if I recall correctly?

  • @FloStyle_ · 1 year ago

    @@dhargarten It was the PS3, running Linux natively on the console. That later enabled exploits and hacks of the hardware, and Sony closed the ecosystem really fast. That triggered a lawsuit that took until well into the PS4's lifespan to conclude.

  • @Special1122 · 11 months ago

    @@FloStyle_ geohot?

  • @yadavdhakal2044 · 1 year ago

    Nvidia didn't invent the graphics pipeline. It was invented by Silicon Graphics, or SGI. SGI developed the OpenGL language as far back as 1992. They mainly targeted the cinema and scientific-visualization markets, manufacturing entire workstations with their own OS (IRIX) as well as specialized servers. What Nvidia did was target the personal entertainment market, which made it competitive through lower overall unit cost. Later, operating systems such as Linux were able to run these GPUs in clusters, and here too SGI lost out. SGI could easily have been like Nvidia if they had been on the right track. SGI's legacy now lives on mainly through SIGGRAPH and peer-reviewed research programs, and it still contributes to computer graphics, especially through the OpenGL and Vulkan API specifications!

  • @lookoutforchris · 10 months ago

    The original GeForce card was so groundbreaking that Nvidia was sued by Silicon Graphics for copying their technology. SGI won and Nvidia paid royalties to them. Everything Nvidia had came from SGI 😂

  • @PhilJohn1980 · 1 year ago

    Ah, geometry stages with matrices. I remember my comp-sci computer graphics class in the 90s, where our final assignment was to do all the maths by hand and plot out a simple 3D model on paper. Each student had the same 3D model defined, but different viewport definitions. Fun times.
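
The by-hand geometry-stage math described above can be sketched in a few lines. This is an illustrative toy with invented matrix and viewport values (not the commenter's actual assignment): project a camera-space point through a simple perspective matrix, do the perspective divide, then map the result into pixel coordinates.

```python
# A minimal sketch of fixed-function geometry-stage math: perspective
# projection + perspective divide + viewport transform. All values and
# function names here are invented for illustration.

def mat_vec(m, v):
    """Multiply a 4x4 matrix (row-major nested lists) by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def project_to_screen(point, width, height, d=1.0):
    """Project a camera-space point [x, y, z, 1] onto a width x height viewport.

    Uses a pinhole projection with the image plane at z = d:
    x' = d*x/z, y' = d*y/z, then maps [-1, 1] to pixel coordinates.
    """
    # Simple perspective matrix: scales x, y by d and copies z into w,
    # so the later divide-by-w performs the x/z, y/z foreshortening.
    persp = [
        [d, 0, 0, 0],
        [0, d, 0, 0],
        [0, 0, 1, 0],
        [0, 0, 1, 0],
    ]
    cx, cy, cz, cw = mat_vec(persp, point)
    # Perspective divide: normalized device coordinates in [-1, 1].
    ndc_x, ndc_y = cx / cw, cy / cw
    # Viewport transform: NDC -> pixel coordinates (screen y grows downward).
    sx = (ndc_x + 1) * 0.5 * width
    sy = (1 - ndc_y) * 0.5 * height
    return sx, sy

# A point 2 units right, 1 up, 4 in front of the camera, on an 800x600 viewport:
print(project_to_screen([2.0, 1.0, 4.0, 1.0], 800, 600))  # (600.0, 225.0)
```

The GeForce 256's "hardware T&L" moved exactly this transform arithmetic off the CPU and into dedicated silicon.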

  • @BaldyMacbeard · 1 year ago

    The secret of their success for many years was working closely with developers and customers to gain an advantage over competitors. For instance, Nvidia would give free cards to game developers and send out evangelists to help optimize the game engines, obviously resulting in a strong developer bias towards Nvidia cards. That is how and why they outperformed AMD for many years. In the machine learning space, they are extremely generous in their public relations with academia, once again giving away tons of free GPUs and helping developers out. It's a fairly common tactic to bring students on board so that once they graduate and go to work at tech companies, they carry a strong bias towards the software and hardware they're familiar with. In the server market, Nvidia has collaborated closely with most manufacturers while offering its DGX systems in parallel. They also have a collaboration with IBM that puts Nvidia GPUs on IBM's POWER8 machines, giving a ginormous boost to bandwidth between GPU and CPU, plus PS5-like storage access. And don't forget the Jetson boards; those things are pretty amazing for edge-computing use cases like object recognition in video. They dominate like they do by not trying to sell a single product, but by offering tons of solutions for every single market out there.

  • @409raul · 1 year ago

    Genius move by Nvidia. Jensen Huang is the reason Nvidia is where it is today. One of the best CEOs in the world (despite the greed, LOL).

  • @TherconJair · 1 year ago

    It's quite easy when your main competitor was nearly extinguished by the anti-competitive measures of its much larger rival, Intel, and had to stay afloat somehow while bleeding money. Nvidia made so much money on gaming cards while AMD couldn't compete, for lack of R&D funds, that it had an extremely calm "blue ocean" to work in and could comparatively cheaply build up its de facto monopoly in the space. AMD will need to invest a lot of money to break into the now very "red ocean" of Nvidia's CUDA monopoly. I don't see them surviving in the long term against two much larger rivals, and we'll all be losers for it.

  • @Magnulus76 · 1 year ago

    Yeah, Nvidia offered a lot of support. I know there are a lot of fanboys who think Nvidia must have some kind of secret sauce, but the truth is that CUDA's performance isn't necessarily any better than OpenCL's. And I say that as somebody who owns an Nvidia card. Nvidia just spent a lot on support and generated a lot of influence and hype.

  • @Quxxy · 2 years ago

    I don't think you're right about what "clipping" means at 2:56. Occlusion (hiding things behind other things) is done with a Z-buffer*. As far as I recall, clipping refers to clipping triangles to the edge of the screen, to avoid rasterising triangles that fall outside the visible area, either partially or fully. As far as I'm aware, no one ever did occlusion geometrically on a per-triangle basis. The closest would be some engines that rasterise a simplified version of the scene to generate an occlusion buffer, but that's not handled by the geometry engine; it's just regular rasterisation.

    *Except on tile-based rasterisers like the PowerVR lineage used in the Dreamcast and some smartphones, notably the iPhone. (Not a graphics programmer or expert, just an interested gamer.)

    *Edit*: Also, regarding 7:46 and the fixed-function pipeline being totally gone: from what I remember this is not entirely true. GPUs still contain dedicated units for some of the fixed functionality; from memory, that includes texture lookups and blending. Reminds me of an old story from someone who worked on the Larrabee project, who mentioned that one of the reasons it failed to produce a usable GPU was that they tried to do all the texturing work in software, and it just couldn't compete with dedicated hardware.

  • @Asianometry · 2 years ago

    Thx. I'll look into this and see if a clarification is needed.

  • @Quxxy · 2 years ago

    @@Asianometry I doubt it. It's an inconsequential detail that doesn't change anything about the substance of the video. I mean, I doubt anyone is watching a video about Nvidia's AI dominance looking for an in-depth technical description of the now long-obsolete fixed-function pipeline. :)

  • @musaran2 · 2 years ago

    Clipping is the general removal of what does not need rendering: view volume, backfaces, occlusion…

  • @tma2001 · 2 years ago

    Yeah, I was about to post the same nitpick. Also, the setup and window-clipping part of the fixed-function pipeline is still there in hardware; it's just not programmable (nor should it be). The raster-ops backend is not programmable either, just configurable. The painter's algorithm is an object-based visibility test that clips overlapping triangles against each other, whereas the Z-buffer is an image-based, per-pixel visibility test.

  • @vintyprod · 1 year ago

    @@Quxxy I am
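
The distinction drawn in this thread, geometric clipping at the viewport edge versus per-pixel Z-buffer occlusion, can be sketched as follows. This is a toy software-rasteriser fragment; all names and buffer sizes are invented for illustration (real GPUs do this in fixed-function hardware):

```python
# Toy sketch: clipping discards fragments outside the visible area
# geometrically, while occlusion is resolved per pixel by a depth test.

WIDTH, HEIGHT = 4, 3

def make_buffers():
    """Fresh depth buffer (initialized to 'infinitely far') and color buffer."""
    depth = [[float("inf")] * WIDTH for _ in range(HEIGHT)]
    color = [[None] * WIDTH for _ in range(HEIGHT)]
    return depth, color

def clip_to_viewport(x, y):
    """Clipping: reject anything outside the visible area before rasterising."""
    return 0 <= x < WIDTH and 0 <= y < HEIGHT

def draw_fragment(depth, color, x, y, z, rgb):
    """Z-buffer occlusion: keep a fragment only if it is the nearest so far."""
    if not clip_to_viewport(x, y):
        return
    if z < depth[y][x]:        # nearer than anything drawn at this pixel yet
        depth[y][x] = z
        color[y][x] = rgb

depth, color = make_buffers()
draw_fragment(depth, color, 1, 1, 5.0, "red")    # far fragment drawn first
draw_fragment(depth, color, 1, 1, 2.0, "blue")   # nearer fragment replaces it
draw_fragment(depth, color, 1, 1, 9.0, "green")  # farther fragment: rejected
draw_fragment(depth, color, 9, 1, 0.1, "white")  # off-screen: clipped away
print(color[1][1])  # blue
```

Note the draw order doesn't matter for correctness; that order-independence is exactly why the Z-buffer displaced geometric approaches like the painter's algorithm.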

  • @CarthagoMike · 2 years ago

    Oh nice, a new Asianometry video! Time to get a cup of tea, sit back, and watch.

  • @TickerSymbolYOU · 1 year ago

    This is literally the best breakdown on YouTube when it comes to Nvidia's dominance of the AI space. Love your work!

  • @409raul · 1 year ago

    Nice to see you here, Alex! Nvidia for the win!

  • @prashantmishra9985 · 1 year ago

    @@409raul Being a fanboy of a corporation won't benefit us.

  • @havkacik · 1 year ago

    Totally agree 👍 :)

  • @hgbugalou · 2 years ago

    I would buy a shirt that says "but first, let me talk about the Asianometry newsletter".

  • @Sagittarius-A-Star · 2 years ago

    I don't want to know how much effort it took to put all this information together. Thanks and thumbs up. P.S.: Nvidia's lineup is insane. Just try to find out which GPU you have, how it compares to others, or whether it's CUDA-capable... you will end up digging through lists of hundreds or thousands of cards.

  • @killerhurtalot · 2 years ago

    That's the thing, though. Nvidia usually only manufactures 6-7 actual chips per generation, not tens or hundreds of distinct GPUs... Due to manufacturing defects, the chips are binned with different sectors enabled. The 3090 and 3080 are actually the same chip; the 3080 just has around 15% fewer pipelines/CUs and fewer tensor cores enabled...

  • @Baulder13 · 2 years ago

    This man has no quit! The amount of research he puts in and how much content he has been putting out is ridiculous.

  • @Hobbes4ever · 2 years ago

    @@killerhurtalot Kind of like what Intel does with their Celeron.

  • @marksminis · 2 years ago

    @@killerhurtalot Yes, that is correct. A large silicon wafer is a huge investment. By testing each core, defective cores can be disabled, so you still have a working chip to sell. Throwing out a large, expensive chip just for having a few bad cores would be insane. Only a small percentage of chips coming off the wafer are totally perfect, and those are mostly near the center of the wafer.

  • @RandomlyDrumming · 2 years ago

    A small mistake, right at the beginning: the GeForce 256 hit the market in 1999, not 1996. In the mid-90s, Nvidia was more or less just another contender, chipping away at the market dominance of the legendary 3dfx. :)

  • @shoam2103 · 2 years ago

    So theirs wasn't the first GPU? I think the PlayStation had one, albeit a very basic one...

  • @shoam2103 · 2 years ago

    Okay, 5:55 clears it up a bit.

  • @RandomlyDrumming · 2 years ago

    @@shoam2103 Well, technically it was, as it handled the entire pipeline. Interestingly, the first *programmable* graphics chip for PC was the Rendition Vérité V1000 (RISC-based), released back in 1995, if I'm not mistaken. :)

  • @DM0407 · 1 year ago

    Yep, I bought a RIVA TNT2 to play Asheron's Call in 1999. I guess the 256 was out by then, but I couldn't afford it, and the TNT2 was still a massive jump in performance. Going from a choppy software renderer to "hardware accelerated" graphics was amazing at the time. The paths had textures! Who knew? I don't remember the original GeForce being that big of a deal, but I remember lusting after the GeForce 2 AGP.

  • @conradwiebe7919 · 2 years ago

    Long-time viewer and newsletter reader; love your videos. I just wanted to mention that the truncated graph @ 15:28 is a mistake, especially when you then put it next to a non-truncated graph a little later. The difference between datacenter and gaming revenue is greatly exaggerated by this choice of graph. I feel it actually diminishes your point that datacenter is rapidly catching up to gaming.

  • @BenLJackson · 2 years ago

    I felt some nostalgia; good vid 👍 Deciphering all this back in the day was so much fun. Also, I love your explanation of AI and what it really is.

  • @ted_1638 · 2 years ago

    Fantastic video! Thank you for the hard work.

  • @Doomlaser · 2 years ago

    As a game developer, I've been waiting for a video like this. Good work.

  • @DavidSoto90 · 2 years ago

    Such a valuable video, great work as usual!

  • @Matlockization · 1 year ago

    Thank you for explaining some of the details in the beginning.

  • @ministryofyahushua3065 · 2 years ago

    Love your channel, very well presented.

  • @NeilStainton · 2 years ago

    Thank you for your excellent work in condensing and analysing NVIDIA's progress.

  • @dipankarchatterjee8809 · 2 years ago

    A very well researched presentation. Thank you, bro.

  • @supabass4003 · 2 years ago

    I have spent more money on Nvidia GPUs in the last 20 years than I have on cars, lol.

  • @mspy2989 · 2 years ago

    Goals

  • @heyhoe168 · 2 years ago

    Same. Btw, I don't have a car.

  • @wazaagbreak-head6039 · 1 year ago

    I have no reason to replace my ancient Corolla. It's a piece of crap, but it gets me to work each day.

  • @LimabeanStudios · 1 year ago

    I have only purchased one of each, and same, lmao.

  • @prateekpanwar646 · 1 year ago

    @@wazaagbreak-head6039 Is it a 750 Ti / 760?

  • @rzmonk76 · 2 years ago

    Subscribed, really nice presentation!

  • @helmutzollner5496 · 1 year ago

    Excellent overview. Thank you.

  • @Magnulus76 · 1 year ago

    Neural networks were used in computer games even back in the early 90s, to a limited extent (mostly in a few strategy games). The reason there's hype about neural nets now is that the raw computing power of a GPU allows companies to develop neural networks that can mimic human visual perception and pattern recognition.

  • @ttcc5273 · 1 year ago

    Thank you for this video; it was informative, digestible, and I learned more than I expected to. 👍

  • @Meta5917 · 2 years ago

    Great video. Keep it up, proud of you.

  • @drewwollin3462 · 2 years ago

    Very good as always. A good explanation of how graphics cards work and how they have evolved.

  • @jmk1727 · 2 years ago

    Man, your videos are always amazing... PERIOD.

  • @skipsteel · 2 years ago

    Thanks, really well done; you made the complex simple.

  • @Zloi_oi · 1 year ago

    This is really interesting! Thanks for your work, sir!

  • @punditgi · 2 years ago

    First-rate information, well presented! 👍

  • @Socrates21stCentury · 1 year ago

    Nice job, very informative!!!

  • @christakimoto8425 · 6 months ago

    This is an outstanding and informative video. Thank you so much!

  • @MohammadSadiqurRahman · 2 years ago

    Insightful. Loved the content.

  • @richardm9934 · 5 months ago

    Fantastic video!

  • @jdevoz · 3 months ago

    Amazing video!

  • @harrykekgmail · 2 years ago

    A classic in your stream of videos!

  • @screwcollege8474 · 2 years ago

    How did you post 2 months ago?

  • @2drealms196 · 2 years ago

    @@screwcollege8474 Patreon members get access to his videos first; later on he makes the videos public. Another way is through his college partnership program.

  • @AaronSchwarz42 · 2 years ago

    Excellent analytics on market diffusion of COTS //

  • @kokop1107 · 9 months ago

    This is a very good and accurate explanation.

  • @ChristianKurzke · 1 year ago

    I love this: very well researched, and the correct level of technology for the average executive... who isn't a math genius. ;)

  • @LimabeanStudios · 1 year ago

    Just found this channel the other day and it's amazing. One thing I don't see mentioned in the comments is that Nvidia is often rated one of the best companies in the world to work at. It's a lot easier to do big things with happy employees, lol.

  • @nailsonlandim · 2 years ago

    Excellent video. Funny fact: I spent the day dealing with CUDA and a CV application I'm working on.

  • @BartKus · 2 years ago

    You do really good work, sir. Much appreciated.

  • @BillHawkins0318 · 2 years ago

    What I never understood is why Nvidia attempted to cool the heatsink with a 3-cent fan, especially from 2001 to 2010. When that 3-cent fan dies, the GPU can burn up 🔥, as passive cooling has never been enough. We don't have to worry about AI; 3-cent fans will make short work of it.

  • @Tartar · 2 years ago

    GPU fans are still very cheap these days. Hopefully the Asus/Noctua collaboration is a sign of things to come: GPUs with premium, quiet cooling solutions.

  • @rayoflight62 · 2 years ago

    A good percentage of computers from that period died because of a failed fan. The transparent plastic melted all over the GPU heatsink, and the GPU subsequently failed, usually shorting the +5 V bus...

  • @musaran2 · 2 years ago

    By the time it happens, they consider the card obsolete and you are supposed to upgrade. I hate it.

  • @Bialy_1 · 2 years ago

    "As passive cooling has never been enough." Nope, some cards got good passive cooling, and you can always replace the fan on your own when it starts to make noise...

  • @GBlunted · 2 years ago

    This is cool content; I liked this video! I like the explanation of low-level processes as well as the history lesson of how it all evolved to where it is today...

  • @ADHD55 · 2 years ago

    Nvidia is what happens when the CEO is an engineer, not a short-term-thinking MBA.

  • @user-lx7kx1dd3q · 2 years ago

    It's in Chinese blood. An engineer.

  • @xraymind · 2 years ago

    @@user-lx7kx1dd3q Correction: Taiwanese blood.

  • @user-lx7kx1dd3q · 2 years ago

    @@xraymind There's no such thing as Taiwanese blood. Taiwan is a land, not a race.

  • @ADHD55 · 2 years ago

    @@user-lx7kx1dd3q Huh? NVIDIA is an American company, not Chinese.

  • @user-lx7kx1dd3q · 2 years ago

    @@ADHD55 I never said Nvidia isn't an American company. You talked about its CEO. And who do you think its CEO is? It's Jensen Huang, born in Taiwan and now a US citizen. Are you a kid, that I need to spell everything out for you?

  • @hc3d · 2 years ago

    Wow, amazing analysis.

  • @FrancisdeBriey · 2 years ago

    Subscribed!

  • @Bianchi77 · 2 years ago

    Nice video, thank you for sharing it :)

  • @emulegs5 · 2 years ago

    Please remember to leave links to the previous video and to the first in the series; I would have clicked through to them based on your intro alone.

  • @Campaigner82 · 2 years ago

    You make such good videos! I'm intrigued by the pictures. You're doing a good job!

  • @valenganev5774 · 2 years ago

    What do you think about the Fujitsu Celsius PC? Where do you place it among other PCs? What is the future of Fujitsu?

  • @zodiacfml
    @zodiacfml2 жыл бұрын

    beaten me to this critique which is the most important part of Nvidia's luck/success. I recall it took years before Nvidia finally got to CUDA support/programming. Researchers using GPUs is also the reason why AMD bought Ati. There was a whitepaper from AMD that computing will move/focus to graphics from then on, they were just more than a decade way too early with that prediction. Another thing to note, it is the gamers/consumers that made all this possible paying for the R&D of graphics cards that will be used to sell products for the datacenter. Ray tracing hardware for example is a poor feature for use in gaming currently but it is excellent for industrial use.

  • @markhahn0

    @markhahn0

    2 жыл бұрын

    in some ways, it's remarkable how poorly AMD has done. they've never delivered on anything like a sleek cpu-gpu-unified infrastructure, even though they have all the pieces in hand (and talked about things like HSA). it'll be ironic if Intel manages with oneAPI, since for so long, they were defending the CPU like a castle...

  • @zodiacfml

    @zodiacfml

    2 жыл бұрын

    Agreed. Though the hardware in the latest gaming consoles was impressive when announced, it was just OK by the time the consoles became available. AMD also doesn't have a foothold in Arm, where Nvidia has the Nintendo Switch and Apple has the M1. My last two PCs are an Intel i3-8100 and, recently, an i3-12100, since I have some use for the iGPUs.

  • @PlanetFrosty
    @PlanetFrosty2 жыл бұрын

    Good presentation

  • @markhahn0
    @markhahn02 жыл бұрын

    Important to point out that no one really uses CUDA directly for AI - they use PyTorch or TensorFlow. That means Nvidia doesn't have any real lock on the market - alternatives are highly competitive.
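
    The comment above is about frameworks abstracting the backend away from user code. As a toy sketch of that dispatch idea (all class and method names below are invented for illustration; this is not PyTorch's actual implementation), a framework exposes one tensor API and routes each op to whatever backend is present:

    ```python
    # Toy model of framework-level dispatch: user code targets one API,
    # and the framework routes each op to an available backend
    # (CUDA, ROCm, or a CPU fallback). Names are invented for illustration.

    class Tensor:
        def __init__(self, data, device="cpu"):
            self.data, self.device = data, device

        def to(self, device):
            # A real framework would copy the buffer to GPU memory here.
            return Tensor(self.data, device)

        def __add__(self, other):
            # Same user-facing op everywhere; a real framework would pick
            # a CUDA kernel, a ROCm kernel, or a CPU loop based on device.
            return Tensor([a + b for a, b in zip(self.data, other.data)],
                          self.device)

    a = Tensor([1, 2, 3]).to("cuda")
    b = Tensor([4, 5, 6]).to("cuda")
    print((a + b).data)  # [5, 7, 9]
    ```

    This is also why the portability argument cuts both ways: user code is backend-agnostic, but the kernels underneath are still tuned per vendor, which is the point the replies below make.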

  • @Stef3m

    @Stef3m

    Жыл бұрын

    That is an important point that is too rarely brought up

  • @kotokotfgcscrub

    @kotokotfgcscrub

    Жыл бұрын

    ML frameworks came into existence later and were built upon CUDA and cuDNN, and they remain far more optimized for Nvidia even after starting to support other hardware.

  • @jem4444
    @jem4444 Жыл бұрын

    Extremely well done!

  • @shmehfleh3115
    @shmehfleh3115 Жыл бұрын

    This video filled in a lot of gaps for me. I work with the things and I wasn't sure how GPUs evolved into general computing devices.

  • @cfehunter
    @cfehunter Жыл бұрын

    "Early graphics processing broke scenes up into triangles".... they still do.

  • @Palmit_
    @Palmit_2 жыл бұрын

    thank you John. :)

  • @etherjoe505
    @etherjoe5052 жыл бұрын

    Single Instruction Multiple Data 👍👍👍
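
    The term in the comment above means one instruction applied in lockstep across many data elements. A toy Python sketch of the idea (real SIMD hardware runs the lanes in parallel; this sequential loop only models the programming pattern):

    ```python
    # SIMD in miniature: one operation ("multiply by 2, add 1") applied
    # uniformly across a whole vector, the way a GPU runs the same shader
    # over thousands of pixels at once. A Python loop stands in for the
    # parallel hardware lanes.

    def simd_map(op, data):
        """Apply one operation to every element of a data vector."""
        return [op(x) for x in data]

    pixels = [10, 20, 30, 40]
    print(simd_map(lambda x: x * 2 + 1, pixels))  # [21, 41, 61, 81]
    ```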

  • @doug184

    @doug184

    9 ай бұрын

    AMD?

  • @GhostZodick
    @GhostZodick2 жыл бұрын

    Your videos always have a low-frequency pounding sound in the background. Would you mind looking into that and trying to fix it in future videos? At first I thought it was something pounding in my house, but later I realized it was in your video, because I only hear it at certain parts.

  • @gregsutton2400
    @gregsutton2400 Жыл бұрын

    great info

  • @birseyleryap
    @birseyleryap Жыл бұрын

    that popping sound @8:53 from the lips

  • @green5270
    @green52702 жыл бұрын

    excellent video

  • @Jensth
    @Jensth Жыл бұрын

    You were spot on with this one. After this came out everyone bought NVIDIA stock up like crazy.

  • @igorwilliams7469
    @igorwilliams74692 жыл бұрын

    Thinking about that elevator analogy a bit too much... Are there ANY elevators with level tiers midway (like bottom and top) for riders to decamp? While it would obviously add complexity to a system that is almost plug and play, it could certainly be interesting!

  • @19smkl91
    @19smkl91 Жыл бұрын

    6:24 I've seen people rubber banding when stepping on and even bug off at half way up, usually receiving hurtings.

  • @nabeelhasan6593
    @nabeelhasan65932 жыл бұрын

    I always wish there were a unified framework like CUDA for all platforms; NVIDIA's absolute monopoly on deep learning really makes things hard

  • @Corei14

    @Corei14

    2 жыл бұрын

    OpenCL. Now, making it work as well as the others is a different question

  • @joshuagoldshteyn8651

    @joshuagoldshteyn8651

    Жыл бұрын

    How does it make things really hard? Simply use an Nvidia GPU with high batch sizes, or any CPU with low batch sizes?

  • @estebancastellino3284
    @estebancastellino3284 Жыл бұрын

    I remember when the Nvidia software graphics accelerator card was the cheap option for those of us who couldn't afford a Voodoo card, the one that did come with a hardware accelerator. Voodoo was on about its fifth version by the time Nvidia put out the GeForce chip.

  • @bhavintoliabg4946
    @bhavintoliabg49462 жыл бұрын

    This one video made me respect NVIDIA's work more than any advert ever would.

  • @IntangirVoluntaryist
    @IntangirVoluntaryist2 жыл бұрын

    I still have several old-gen cards: TNT cards, a Banshee, a Voodoo, a first-gen GeForce, and some early-gen ATI cards too. I also have some old Sound Blaster cards :)

  • @screwcollege8474
    @screwcollege84742 жыл бұрын

    Great video, now I understand why Nvidia is worth 600 billion in market value

  • @johnl.7754

    @johnl.7754

    2 жыл бұрын

    Would have never thought that it would be worth over 3x Intel or any other cpu manufacturers.

  • @swlak516
    @swlak5162 жыл бұрын

    These videos make me feel smarter than I really am. And I feel like you're one of the few KZread content creators in the space who can do that. Thank you.

  • @Speed001

    @Speed001

    2 жыл бұрын

    This is definitely a bit above me with tech terms I don't care to learn.

  • @johnaugsburger6192
    @johnaugsburger61922 жыл бұрын

    Thanks so much

  • @davidbooth8422
    @davidbooth84222 жыл бұрын

    Hi John. I love your videos! Do you have any possible connections that might want to manufacture a much better and cheaper smoke detector than is available today? I would love to explain how easy that would be to any technical person who would listen. I am not trying to make money either, just save lives.

  • @michaelhulcy6680
    @michaelhulcy66802 жыл бұрын

    "Triangles. Triangles all the way down baby." Dat was a good one. Making Duke Nukem in 96 jealous man.

  • @allcouto
    @allcouto Жыл бұрын

    You guys completely forgot DOJO!

  • @MikkoRantalainen
    @MikkoRantalainen Жыл бұрын

    Great documentary as usual, but the bar graphs not starting from zero around 15:30 weren't very cool. The illustration makes it appear that Gaming is more than double the Data Center revenue, but when you compare the actual numbers, 3.22 vs 2.94, you'll quickly see that the difference is actually only about 9%!
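
    The arithmetic behind the comment above can be checked directly. The 2.8 axis baseline below is a hypothetical value chosen for illustration, since the video's exact axis start isn't given:

    ```python
    # With a y-axis anchored at 0, the two bars (3.22 vs 2.94) differ by
    # about 9%. If the axis instead starts at a non-zero baseline, the
    # *visible* bar heights exaggerate the gap dramatically.
    gaming, datacenter = 3.22, 2.94  # revenue figures quoted in the comment
    baseline = 2.8                   # hypothetical non-zero axis start

    true_ratio = gaming / datacenter
    visual_ratio = (gaming - baseline) / (datacenter - baseline)
    print(round(true_ratio, 2), round(visual_ratio, 2))  # 1.1 3.0
    ```

    With this assumed baseline, the Gaming bar would look three times taller even though the true ratio is about 1.1, which is exactly the distortion the comment complains about.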

  • @minecraftdonebig
    @minecraftdonebig2 жыл бұрын

    If i was in charge all chip processes engineers and associated people would be required to wear wizard hats because this shit is insane magic

  • @Aermydach
    @Aermydach2 жыл бұрын

    Another great presentation. Watching this got me thinking that I wasted my time studying agricultural and wine science at Uni. Instead, I should've studied computer science/engineering. . .

  • @dankuruseo9611

    @dankuruseo9611

    2 жыл бұрын

    Someone has to feed us 👍

  • @pinkipromise

    @pinkipromise

    2 жыл бұрын

    didnt know farmers have degrees

  • @Aermydach

    @Aermydach

    2 жыл бұрын

    @@pinkipromise They typically don't. The degrees are for researchers, technical support/agronomists (for fertilisers, pesticides, crop and livestock nutrition etc) and other specialist support roles.

  • @Tigerbalm338
    @Tigerbalm338 Жыл бұрын

    To paraphrase a popular SNL skit: "Triangles baby! MORE TRIANGLES!"

  • @johndvoracek1000
    @johndvoracek1000 Жыл бұрын

    I was wondering if you would mention Apple; then you did at the end but not in the way I anticipated. Isn't Apple's M chip a move in the same chip capabilities and architecture as Nvidia, etc.?

  • @isaacamante4633
    @isaacamante46332 жыл бұрын

    At 15:31 the graphic on the left is not anchored at zero.

  • @Cythil

    @Cythil

    2 жыл бұрын

    And it's not clear that it isn't. Generally it's good form to indicate this.

  • @royfpga
    @royfpga Жыл бұрын

    Thanks!

  • @JohnKobylarz
    @JohnKobylarz2 жыл бұрын

    Excellent video. As someone who remembers when the GeForce 256 launched, it's amazing to reflect on how far they've come and how influential their tech has been on the world. Before GPUs, PC gaming was a much different affair. Even looking at JPEGs was a somewhat intense system task before GPUs became the norm. I learned a lot from this video and enjoyed it. It helps me connect the dots regarding how AI learning works.

  • @timswartz4520
    @timswartz4520 Жыл бұрын

    That GeForce 256 made me very happy for a long time.

  • @AtaGunZ
    @AtaGunZ2 жыл бұрын

    I'm saying this as an AMD fanboy: ROCm sucks. It's not properly supported for use with their RDNA/2 cards. I wanted to try it out, since CUDA was pushed down my throat during HPC courses, and see what my brand new RX 6900 XT could do with the HPC knowledge I acquired. Turns out it can't do jack, because the card did not support ROCm (or the other way around) at launch, only vague promises of future support with no announced dates (and apparently the earlier RX 5000 series cards were not supported either, even 2+ years after their launch).

    I tried to learn more about it, but all I could get from the outdated and poorly documented GitHub page was that ROCm was intended for CDNA, so not all parts of ROCm were present for RDNA2; converting my CUDA knowledge into HIP and running with it was out of the question. I looked it up again now to see if there have been any improvements; the latest info I can find is a random Hacker News thread from 6 months ago, where a user working for AMD on ROCm reports that there is unofficial support for some ROCm 4.3 libraries... How are we supposed to track this info?

    I understand that there are architectural differences between RDNA and CDNA, but how does AMD expect sustained growth in this market when not one of my peers or professors writes computational software for AMD GPUs? (Meanwhile, we are expected to be proficient with CUDA to get a passing grade in our graduate courses.) I am not even taking any ML/AI courses; I know the situation is more dire there. I'm still just a student without much industry knowledge, so my perspective might not reflect how the industry really works, but I can't see a world where new graduates wouldn't stick to the platform they are familiar with when starting their career. That being said, I hope I can get that summer internship at AMD :P I am so hyped for them, especially after the Xilinx acquisition.

  • @justice929

    @justice929

    2 жыл бұрын

    Are you a Stanford or MIT student? Lisa Su a MIT grad, Nvidia CEO Stanford grad

  • @AtaGunZ

    @AtaGunZ

    2 жыл бұрын

    ​@@justice929 wish I was :P Doing my masters at Technical University of Munich (TUM) at the moment.

  • @RainKing048

    @RainKing048

    2 жыл бұрын

    Yeah, this is what I don't understand about AMD. They won't be able to expand their market share if they don't even 'support' their products. Even something like the lowly GT 710 from Nvidia had day-one CUDA support. Meanwhile, AMD only gives vague hints (and sometimes those have to come from the community) when you first try to find out whether you could even enter the AI/ML scene using their products.

  • @ravindertalwar553
    @ravindertalwar5532 жыл бұрын

    Congratulations 👏 and lots of love and blessings ❤️

  • @ylstorage7085
    @ylstorage70852 жыл бұрын

    No offense... but the Y-axis @15:24... dude... that's what I call making a 3 look 3 times bigger than another 3

  • @final0915
    @final0915 Жыл бұрын

    12:35 haha i wonder what images they collected for non-hotdogs

  • @tonyduncan9852
    @tonyduncan98522 жыл бұрын

    Thanks for that. :)

  • @clarkkent7973
    @clarkkent7973 Жыл бұрын

    While mentioned in passing, it should be emphasized how bitcoin miners bought huge amounts of Nvidia hardware. They literally used Nvidia hardware to create money.

  • @hunter8980
    @hunter89802 жыл бұрын

    How many polygons does an RTX 3080 process per second?