Multzinity

Explaining tech one video at a time.

NPUs Explained (2024)

Intel GPUs Explained

Monitors Explained

AMD Radeon GPUs Explained

Comments

  • @TylerDurden-oy2hm · 1 day ago

    Hope there'll be some Voodoo card love in this... but I think the first Voodoo was from '96.

  • @draciano · 2 days ago

    W video gang

  • @blacksongoku312 · 5 days ago

    Upgrading to an RX 7900 GRE from the GTX 1660 Ti

  • @blacksongoku312 · 5 days ago

    Loved my MSI 1660 Ti for the last 5 years, but I'm finally upgrading to an RX 7900 GRE for that sweet RT

  • @multiversestudios1 · 4 days ago

    Yep, as you said with switching out your i7-8700K, it's a night and day difference!

  • @blacksongoku312 · 5 days ago

    Upgraded to an R9 5900X from an i7-8700K. Night and day difference 😊

  • @toma-st1jy · 6 days ago

    Microsoft is just good, let them do what they think. Ryzen AI will demolish Intel!

  • @skivvywaver · 6 days ago

    You entirely skipped Nvidia's horrendous FX series. It should be remembered; it made ATI into the competitor it became. Nvidia screwed up, and nobody ever wants to talk about how horrible the FX lineup was. It made an ATI customer out of me. I did own a 6800GT after the FX card, but that washed-out experience moved me away for good. The colors were awful.

  • @JoeZelensky · 8 days ago

    Good video.

  • @shengyi1701 · 9 days ago

    Nice, but some GeForce 6 and 8 series cards missed out. I used a GeForce 2 Ultra in 2001 and a 6800 GS in 2006, then went to a Radeon HD580. But it ran hot! So I went back to an Nvidia GTX 780 Ti in 2013, then replaced it with a GTX 1060, 8GB version. But I'm still deciding in 2024. I still have my 1060 after my MB and PSU died! And we all forgot about the GTX Titans!

  • @Totallynotmwa · 10 days ago

    Probably the most useless feature on your PC, only being used for AI.

  • @stephanverbeeck · 13 days ago

    Neat overview, thanks

  • @VOYSTAN · 15 days ago

    Nice video, BUT at around 4:30 you said that NVIDIA won that generation because of AI, and I have to disagree if you're looking from a gamer's perspective. The price difference was huge but performance was basically the same, and even now their AI gaming technology that isn't upscaling (DLSS 3) is only in 35 games in 2024. I'm not sure when FSR 2.0 came out, but it can be used in every game, unlike DLSS 2.0. If you look at the market right now with the same prices as me (I live in Poland), and IF you're looking for a GPU for multiplayer games (no ray tracing) and no workloads (Blender and stuff), then from the 3060 up to the 4080 (I didn't check below the 3060), AMD always wins on price-to-FPS. I don't know if I got my point through, but I probably made a lot of grammar errors, sorry.

  • @VOYSTAN · 15 days ago

    I want to add that what I said applies only if you don't think a "win" is basically just the stronger card (the 3090 beats the RX 6900 XT, but only in certain workloads, and a lot of people don't use the things it's better at; still, it's the better card if you forget the price).

  • @multiversestudios1 · 13 days ago

    @@VOYSTAN I definitely think that AMD offers a much stronger price to performance ratio, but I just used the most expensive card of the generation and used its overall performance because that's a bit more fair than just using gaming.
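
    The price-to-performance framing in this exchange can be made concrete with a one-line metric. A minimal sketch; the card names, prices, and FPS figures below are placeholders, not benchmark data:

    ```python
    def fps_per_dollar(price_usd: float, avg_fps: float) -> float:
        """Average FPS delivered per dollar of purchase price."""
        return avg_fps / price_usd

    # Hypothetical entries -- substitute real local prices and benchmark averages.
    cards = {
        "Card A": (500.0, 100.0),  # (price in USD, average FPS across a test suite)
        "Card B": (650.0, 110.0),
    }

    for name, (price, fps) in cards.items():
        print(f"{name}: {fps_per_dollar(price, fps):.3f} FPS/$")
    ```

    With these placeholder numbers, Card A delivers 0.200 FPS/$ against Card B's 0.169, which is the shape of the argument VOYSTAN is making: a cheaper card with similar performance wins this metric even if the pricier card is "stronger."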

  • @dan_729 · 15 days ago

    great vid! keep it up!

  • @connor5124 · 15 days ago

    cool vid keep it up brotha

  • @Sota_eth · 15 days ago

    I didn't realize how powerful the 4090 was compared to the previous generation until recently. What a monster.

  • @Talkative_Ducky · 15 days ago

    Amazing video, I hope you get more popular!

  • @jennytrimberger9120 · 15 days ago

    Actually underrated.

  • @gamingxrelic · 15 days ago

    Great video!

  • @AlvaroMartinez98 · 16 days ago

    Nice video. Surprisingly good graphics for the size of your channel. Good narration of the story you were trying to tell. My only suggestion is working on a more "narrative" voice. Your microphone is good, but you sound like just a kid. You could sound more like a narrator kid 😉

  • @r.m8146 · 16 days ago

    Great video

  • @silbersoule7579 · 18 days ago

    Any thoughts on Battlemage on the Intel side?

  • @multiversestudios1 · 13 days ago

    I think that if Intel makes a card that competes with something like the 4070 for $400-500, they can take some market share from AMD.

  • @Creeper-xz1eg · 18 days ago

    lol I think I'm the first viewer of this, because it just says 1 view

  • @MrMosga · 19 days ago

    You praise unreleased/unavailable Intel and Snapdragon products too much and ignore Ryzens that have been fully available for almost a year now. At least in my nearly year-old laptop with the 7840HS there is an NPU, but there is not enough information on how to actually use it!

  • @RSV9 · 20 days ago

    How many TOPS does a mid-range and a high-end GPU have?

  • @shmookins · 22 days ago

    Very nice summary. I think I saw an Intel CPU slide that said "100 TOPS" (Arrow Lake?). I think the NPU part of that CPU is 45+ TOPS but using the whole processor it reaches 100 TOPS. Or something like that. I may have misunderstood the slide. Regardless, it is exciting times for CPU tech.

  • @multiversestudios1 · 22 days ago

    Yep, you're right! I'm just using NPU TOPS since this video is only about Neural Processing Units.

  • @auritro3903 · 23 days ago

    3:04 Note that the 77 TOPS isn't from the NPU alone. The NPU by itself is going to deliver 3x the performance of the current generation, as said by Lisa Su, which works out to around 45 NPU TOPS, similar to Qualcomm and Intel.
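
    For readers wondering where headline figures like 45 or 100 TOPS come from: the number is typically just 2 × (MAC units) × (clock rate), since each multiply-accumulate counts as two operations. A rough sketch; the unit count and clock below are hypothetical, chosen only to land on a familiar figure:

    ```python
    def peak_tops(mac_units: int, clock_ghz: float) -> float:
        """Peak trillions of operations per second.

        Each MAC (multiply-accumulate) counts as 2 ops: one multiply, one add.
        """
        ops_per_sec = 2 * mac_units * clock_ghz * 1e9
        return ops_per_sec / 1e12

    # Hypothetical NPU: 11,250 INT8 MAC units at 2.0 GHz -> 45.0 TOPS.
    print(peak_tops(11_250, 2.0))
    ```

    This is also why TOPS figures from different vendors aren't directly comparable: the same formula gives different answers depending on whether INT8, INT4, or sparsity-doubled throughput is being counted.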

  • @josevictor7361 · 23 days ago

    I've been using a 3700X since 2019. With 32GB of RAM, I've had no issues so far. 😅

  • @josevictor7361 · 23 days ago

    Let's see the software support for this kind of hardware. Keep up with the good work, Multzinity. Cheers from Brazil!

  • @Hyp3rSon1X · 23 days ago

    Don't be fooled by the "TOPS" number. It means very little in the realm of AI and inferencing against local models when the bottleneck is memory bandwidth. GPUs can have bandwidths of 1000+ GB/s. Apple's M chips range from 100 to 800 GB/s. The Snapdragon X Elite was said to have about 135 GB/s according to its official specifications. Also: NPUs are currently often proprietary, meaning only limited kinds of applications can make use of them. It is a step in the right direction in general, being able to do AI tasks locally rather than being bound to some server and service, but currently their existence means little. There's a good chance that currently released NPUs will become obsolete very quickly because of the restrictions they impose on the models that can be run.
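
    The bandwidth point is worth quantifying. In autoregressive LLM decoding, every generated token must stream roughly the whole model through memory once, so tokens/sec is bounded by bandwidth divided by model size, no matter how many TOPS the chip claims. A back-of-the-envelope sketch; the bandwidth figures echo the ones in the comment above, and the 4 GB model size is an arbitrary example:

    ```python
    def max_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
        """Upper bound on decode speed when memory bandwidth is the bottleneck."""
        return bandwidth_gb_s / model_size_gb

    MODEL_GB = 4.0  # example: a small quantized local model

    for device, bw in [("~135 GB/s (NPU-class SoC)", 135.0),
                       ("~800 GB/s (high-end Apple M chip)", 800.0),
                       ("~1000 GB/s (discrete GPU)", 1000.0)]:
        print(f"{device}: <= {max_tokens_per_sec(bw, MODEL_GB):.1f} tokens/s")
    ```

    The compute units can be idle while weights stream in, which is why a chip with fewer TOPS but far more bandwidth can still decode tokens faster.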

  • @balintbalazs4087 · 23 days ago

    And what if I tell you that NPU engineers can put SRAM/DRAM inside the chip, and that with the proper NN compiler software they can reduce the usage of external memory? Also, the main reason to develop NPUs is the cases where you have to deal with power-performance questions. Think about self-driving solutions: every manufacturer is going to choose an NPU over any GPU, because while most Nvidia GPUs do not care about power consumption (the latest cards use 200-300 W), a well-built NPU could need only 20-30 W for the same result. Some explanation of "same result": most modern neural networks contain a lot of complex math operations with large GMAC requirements. If you build hardware that helps parallelize the exact operation, you need fewer TOPS to execute it than when you run it on a GPU. (GPUs can solve matrix multiplications efficiently, so their compilers try to convert everything into matrix multiplication and optimize it with various algorithms. But in the end you have replaced an operation with a more "expensive" one.)
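
    The efficiency argument in this exchange reduces to a TOPS-per-watt comparison. A minimal sketch; the TOPS and wattage figures are illustrative placeholders in the spirit of the numbers quoted above, not measured specs:

    ```python
    def tops_per_watt(tops: float, watts: float) -> float:
        """Efficiency metric commonly used to compare NPUs and GPUs."""
        return tops / watts

    # Hypothetical devices: a discrete GPU vs. a purpose-built NPU.
    gpu_eff = tops_per_watt(250.0, 300.0)  # e.g. 250 TOPS at 300 W
    npu_eff = tops_per_watt(45.0, 25.0)    # e.g. 45 TOPS at 25 W

    print(f"GPU: {gpu_eff:.2f} TOPS/W, NPU: {npu_eff:.2f} TOPS/W")
    # The NPU delivers more work per watt despite far fewer raw TOPS,
    # which is the commenter's point about getting the "same result"
    # at a fraction of the power budget in embedded settings.
    ```

    For a plugged-in desktop this hardly matters; for a battery-powered laptop or an in-car computer with a fixed thermal budget, it decides which chip gets designed in.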

  • @RobloxianX · 23 days ago

    Put into a sentence, an NPU does what a GPU does, but worse.

  • @TamasKiss-yk4st · 22 days ago

    Not exactly. Why do you think doing something with just 1-2 W of power is worth less than doing it with 300-400 W? An NPU can only do AI things, but it's efficient at them. The only reason it's not worth stacking them infinitely is that they can't do anything else, while a GPU can do graphical calculations when it's not doing AI things.

  • @RobloxianX · 22 days ago

    @@TamasKiss-yk4st A GPU uses 300-400W only if it's specifically the 4090 doing heavy operations. I can sense the "pulled out of my ass" already.

  • @stachowi · 23 days ago

    Great job on your videos, really enjoy them. I'm in software, so I found this accurate. Keep it up!

  • @owis2001 · 23 days ago

    We got a new awesome tech YouTuber: nice, short, informative.

  • @multiversestudios1 · 22 days ago

    Thanks! I'm trying my best!

  • @BeyondEllisBeck · 24 days ago

    Intel's Lunar Lake Core Ultra was actually said to have a total of 100 TOPS.

  • @auritro3903 · 23 days ago

    Yeah, but the NPU itself is delivering 45+ TOPS.

  • @10baggerog98 · 29 days ago

    keep up the good work bro

  • @Creeper-xz1eg · a month ago

    These videos are so underrated ngl

  • @MrExdous69 · a month ago

    Yeah, but the 7950X is more powerful than the 7950X3D. Like, by a lot.

  • @multiversestudios1 · 29 days ago

    Yes, I would agree, but the 3D V-Cache can give it an edge in some scenarios. Honestly, the 7800X3D is the best, but by "most powerful" I just meant the most expensive processor from each company, instead of going into the nitty-gritty world of benchmarks.

  • @stachowi · 23 days ago

    @@multiversestudios1 you know your stuff!

  • @kevinleroy3155 · a month ago

    Pretty incredible that YouTube recommends small YouTubers nowadays, but I had to say you've done good work on this video.

  • @sprdxx · a month ago

    ❤❤

  • @lowbudgetpilot1898 · a month ago

    🗿🗿🗿🗿🗿🗿🗿🗿🗿🗿🗿🗿

  • @vyrgozunqk · a month ago

    The A750 trades blows with the RX 7600 now. You have probably watched very old reviews with the initial drivers. Especially at 1440p with slight RT, it's better for sure.

  • @w3w3w3 · a month ago

    interesting video

  • @Karti200 · a month ago

    As a daily Intel Arc user (one card in a home lab and another in my main gaming rig), let me help with some explaining, since you've left some things out and seem to base your information on very outdated sources.

    #1 - Naming scheme, example: A750. The letter is the generation name, so A = Alchemist (Battlemage cards will start with "B"). The first digit is the overall performance tier, literally the equivalent of i3, i5 and i7. The remaining digits are the performance rank within that tier: the higher, the better (just like A750 vs A770, or A310 vs A380).

    #2 - Low-profile cards. The A310 is more of a transcode card than a gaming card. I mean, you CAN game on it, but why would you when you can get an A380 in a low-profile variant? And in the low-profile market at this price range (A380 vs GTX 1650 vs RX 6400), the A380 is the best card you can get. You are not forced onto an x4 Gen4 PCIe link, and you are not stripped of the standard encoders and decoders, so it already beats the RX 6400. You get more features than the GTX 1650 (XeSS at the hardware level and even small RT capabilities). And as a cherry on top, you get 2GB extra VRAM: both the RX 6400 and GTX 1650 have only 4GB, while the A380 has 6GB.

    As a non-low-profile card, comparing the A380 to the RX 580 / RX 590 is just a bit wonky; it's like comparing an RTX 3050 to an RTX 4060. The A380's price and performance are aimed at the tier of the GTX 1650 and RX 6400 (in both the low-profile and full-size markets).

    And now the biggest bear in the house: drivers and overall performance. Most people (and sorry to say, but based on your video, you too) are still looking at Arc through the prism of its bad launch. Yes, the launch drivers and software were bad, no lie there, but that has already been fixed, and Intel is keeping its promise and updating those drivers hard, left and right. Just look at the "update" videos from Gamers Nexus or Hardware Unboxed. In one year Intel managed to fix so much that AMD should be envious, because like it or not, getting random black screens with AMD cards is still as common now as it was with Polaris cards during 2014-2017.

    Though it is a bit silly how both the A750 and A770 are "meh" cards at 1080p, but at 1440p and 4K they actually get a magnificent boost thanks to their wide bus, compared to other manufacturers' cards. The A750 at 1080p is at best on the level of the RTX 3060 and RX 6600, while at 1440p it is actually the best price-to-performance card you can get; it swaps punches with the RX 6650 XT and RX 6700.

    The biggest fail in all this, however, is the A770: outside of 16GB of VRAM (a good boost from the A750's 8GB), it does not really bring much of an upgrade over the A750. That is why you should NEVER buy the 8GB A770 (there were SOME of those, but AIBs cancelled them fast; Intel never released one), since you just pay more for a performance uplift we could call a test error.

    Now in 2024, well after launch, the only issue I have with my Arc cards is the overall power draw. But I am very optimistic about the Battlemage launch, and I will for sure buy the top available model.

    P.S. An interesting note in Intel's favor: their mobile GPUs use the same core count as the desktop ones (A770M = A770, A750M = A750). Why is this worth noting? Because Nvidia cuts the core count in its mobile GPUs very heavily. For example: RTX 4080 = 9728 CUDA cores; RTX 4080 Mobile = 7424 CUDA cores, which is a core count between the desktop 4070 Super and 4070 Ti.

    P.S. #2: sorry for the long comment XD
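
    The naming scheme described in #1 is regular enough to parse mechanically. A small illustrative sketch (the scheme itself is as the commenter describes; the function and its output format are my own):

    ```python
    # Generation letter -> architecture codename, per the scheme above.
    GENERATIONS = {"A": "Alchemist", "B": "Battlemage"}

    def parse_arc_name(model: str) -> dict:
        """Split an Intel Arc model name like 'A750' into its parts."""
        letter, digits = model[0].upper(), model[1:]
        return {
            "generation": GENERATIONS.get(letter, "unknown"),
            "tier": int(digits[0]),   # performance tier (3/5/7, like i3/i5/i7)
            "rank": int(digits[1:]),  # higher = faster within the tier
        }

    print(parse_arc_name("A750"))
    # -> {'generation': 'Alchemist', 'tier': 7, 'rank': 50}
    ```

    So A770 outranks A750 within the same tier-7 bracket, and A310 vs A380 is the same comparison one tier-bracket down, exactly as the comment lays out.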

  • @multiversestudios1 · a month ago

    Thanks for correcting me! I didn't realize I had so much wrong. :{ I think the only decision I can defend is the comparison between the A380 and the RX 580/590 GME. I know they are nothing alike in performance, but I like comparing things at the same price instead of the same performance. P.S.: If you ever want to correct me again, please do. I'm growing my channel, and because of people like you, I can keep my videos accurate. Thanks!

  • @JoeZelensky · 8 days ago

    That is a good write-up and really helped with the info I have been looking for.

  • @Hash-6624 · a month ago

    Underrated vid

  • @Vincent_Koech · a month ago

    My Arc A770 works well for what I do.

  • @zeotuk1.051 · a month ago

    Thank you, YouTube recommendations, for bringing me here

  • @opposedscroll7596 · a month ago

    Intel GPUs currently feel like the cool third competitor that just isn't quite there yet

  • @lowbudgetpilot1898 · a month ago

    NEW VID :OOOOOOOOOO

  • @multiversestudios1 · a month ago

    Thanks for being an AI man!

  • @silbersoule7579 · a month ago

    If the Battlemage 980 can live up to the rumors of being as good as a 4070 Ti to 4080 at $450-500, I would definitely pick it up: partly out of a desire for price-to-performance, and partly to support another competitor in the GPU market. After all, when there's competition, the consumers win. Also, I've enjoyed your videos so far and appreciate the quick but effective explanations for those who aren't familiar with all these computer parts and what they mean, because a lot of these numbers, letters, and names are not intuitive and have to come from research.

  • @mikebarrett4104 · a month ago

    Currently using a 50-inch LED TV, 60Hz at 4K, as my main monitor. Kinda old, but gaming at 1440p it still looks good. Secondary monitor: 60Hz at 1080p, just to display temps and stuff. I'm researching an upgrade, so great video, lol! Thanks!

  • @multiversestudios1 · a month ago

    You're welcome!

  • @opposedscroll7596 · a month ago

    Nice video, m8. You explained the basic concepts really well and were informative about what each card can do.