Your GPU is Useless Now

Science & Technology

JOIN THE DISCORD! / discord
Your GPU is Useless Now. It's been a growing trend that more and more games are using your CPU in a PRETTY dramatic way. You would think that since graphics are getting better, your GRAPHICS CARD would have to work proportionally as hard, but that doesn't seem to be the case. General-purpose computing on the CPU has become the go-to. Even a top-end CPU from just 3 years ago can't push 60 FPS in some games.
On top of that, most people don't want to be upgrading CPUs frequently, because it is a lot more expensive than buying just a new GPU. You might also have to upgrade your motherboard and RAM at the same time. Is this only going to get worse?
VV RX 6800XT I tested as well (affiliate links) VV
PowerColor Red Dragon rx 6800xt 16gb: amzn.to/3YhV3oU
RX 6800xt (in general): amzn.to/3s05BNm
HUB: • Insane Gaming Efficien...
Nvidia: • DirectX 12 Ultimate on...
BLAndrew575: • 2300+ Hour Satisfactor...
Daniel Owen: • AAA Unreal Engine 5 ga...
• Baldur's Gate 3 PC Opt...
Digital Foundry: • Immortals of Aveum PC ...
Unreal Sensei: • How to Create a Game i...
0:00- WTH is going on??
1:34- The Irony of CPU Utilization
2:47- New Features are More "CPU Demanding"
4:37- CPUs are for General-purpose Computing
5:44- What Inefficient CPU Usage ACTUALLY Means
8:20- We'll see if things get better
10:04- CPUs are becoming MORE important than GPUs
11:04- Silver-lining

Comments: 2,700

  • @CaptToilet · 10 months ago

    So it comes down to 2 things: are CPUs just not good enough, or are developers just not good enough? The answer should be obvious. Hint: it isn't the CPU.

  • @bigben9056 · 10 months ago

    It's the CPU; people just don't understand how heavy RT is on the CPU.

  • @upfront2375 · 10 months ago

    Exactly! I've been PC gaming for around 25 years now and I've never seen a time when the CPU was nearly this important for gaming, except for old networked EA games of course.

  • @2528drevas · 10 months ago

    @upfront2375 Same here. I built my first gaming PC in 1998 to play "Half-Life", and it did fine with an AMD K6 300, because the 16MB Voodoo Banshee handled the load.

  • @Leaten · 10 months ago

    This seems like a way deeper issue to me. Whenever I game, the CPU is only used at around 4%, but on the software side, I recently upgraded my GPU and it's not even fully utilized (no bottleneck in general) and I can't even reach my monitor's refresh rate 🤦 I have no idea what is causing this anymore. Why would a game not fully use my PC's hardware when it's available, and instead leave me with lower framerates LOL

  • @bl4d3runn3rX · 10 months ago

    It would be interesting to see how a game performs on a 5900X on release day versus 2 years later, fully patched, on the same system. Has it improved, or is it still the same?

  • @user-nq5hy7vn9k · 10 months ago

    Based on the Steam Hardware Survey, the GTX 1650 is still the most used GPU. If developers want a wider audience to be able to play their games, they had better focus on optimization really soon.

  • @redshiftit8303 · 10 months ago

    Unfortunately, their target audience is now consoles, where they can get away with 30 fps and the peeps still pay up....

  • @user-nq5hy7vn9k · 10 months ago

    @redshiftit8303 With such crappy optimization, I doubt even the consoles will be able to give 30 fps for too long.

  • @danavidal8774 · 10 months ago

    @user-nq5hy7vn9k If I remember correctly, Remnant II runs at 720p upscaled to 2K on consoles. It is wild.

  • @InnuendoXP · 10 months ago

    @user-nq5hy7vn9k Nah, this is where the optimisation starts. It takes time, and games are taking longer than ever to develop. When they hit the wall on performance, that's when they start pulling out tricks. Though if a Zen 2 3700X equivalent is limiting games to 30 FPS, you'll need 2x performance per clock to maintain 60, and we still don't have CPUs doing that. At the very least, the Series S might keep a lid on VRAM requirements.

  • @Leaten · 10 months ago

    We know this. In case you didn't notice, only AAA developers make demanding games, because gamers can justify a hardware upgrade for them.

  • @radioleta · 9 months ago

    As a graphics programmer: Vulkan and DirectX 12 shouldn't lead to more CPU utilization by themselves. In fact, the whole point of the modern APIs is to reduce CPU overhead. The new APIs do allow you to use multithreading, so multiple cores can record rendering commands, but that actually helps saturate the GPU! Yes, I think it might have something to do with the lack of optimization. I agree that the improvements in realism are not worth the performance impact most of the time :)
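
For illustration, here is a toy C++ sketch of the pattern @radioleta describes: each worker thread records its own command list in parallel, and the main thread submits them in order. The string "commands" are stand-ins, not a real Vulkan/DX12 API; real code would record into per-thread command allocators/pools.

```cpp
#include <iostream>
#include <string>
#include <thread>
#include <vector>

int main() {
    const int workers = 4;
    // One command list per worker: no shared state, so no locks are needed.
    std::vector<std::vector<std::string>> commandLists(workers);
    std::vector<std::thread> threads;

    for (int t = 0; t < workers; ++t) {
        threads.emplace_back([t, &commandLists] {
            // Each thread records draw commands for its slice of the scene.
            for (int i = 0; i < 3; ++i)
                commandLists[t].push_back("draw object " + std::to_string(t * 3 + i));
        });
    }
    for (auto& th : threads) th.join();

    // Submission order stays fixed on the main thread, keeping results deterministic.
    for (const auto& list : commandLists)
        for (const auto& cmd : list)
            std::cout << cmd << '\n';
}
```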

  • @Mefistofy · 9 months ago

    Just a thought: could it also be memory bandwidth, as everything becomes bigger and open-world?

  • @ayliniemi · 9 months ago

    As CPUs, GPUs and game engines become more complex with each generation, I would think it gets harder and harder to master optimization while at the same time creating a game and bringing it to market in a profitable timeframe.

  • @Mefistofy · 9 months ago

    @ayliniemi Did not think about it, but complexity is definitely something that exploded in past decades. I work with ML, and getting the GPU to high utilization can be quite hard sometimes, depending on architecture. All the new shiny libraries offer a lot of comfort but are sometimes badly documented. If you want to do something specific, finding a way around a framework can be cumbersome, and sometimes you waste a little processing time just to get the damn thing to work. I guess there might be similarities in games. Hardware has been developing fast for decades; software is barely keeping up.

  • @ayliniemi · 9 months ago

    @Mefistofy So you're saying that because of time constraints the code isn't as perfect/efficient as it could be? Like you could go back and recode Mario Bros on the Nintendo, and keep seeing how you could make the game run more efficiently on the NES processor, am I right? You'll probably run into a dead end at some point.

  • @zuffin1864 · 9 months ago

    You know what isn't realistic? Low frames dangit!

  • @HoD999x · 9 months ago

    (Former game) developer here. You can usually get things calculated a lot faster (2x-10x in my experience) than a first prototype by thinking long and hard. The thing is that this process is usually very expensive and sometimes would mean you run out of funds while rewriting half your engine. On top of that, you become less flexible, because only the special cases you optimized your code for are fast. Your game would run better, but you will not have nearly as much content.

  • @richr161 · 9 months ago

    As far as Spider-Man: it's such a CPU hog because it streams a ton of data off the drive. It wasn't an issue on PS5 because the PS5 has hardware to assist with decompressing the data. When it was ported to PC, it functions the same but relies on the CPU to decompress the data, leading to high CPU usage. If you monitor the SSD usage you'll see it loading huge amounts of data off the drive. This is a game that could really use GPU decompression, like Ratchet & Clank on PC.
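
A minimal sketch of the CPU-side cost described above, assuming a stand-in decompress() (a real port would call a codec such as zlib or Oodle): every streamed chunk occupies a CPU worker, which is exactly the work the PS5 hands to its dedicated decompression hardware.

```cpp
#include <cstdint>
#include <future>
#include <vector>

// Hypothetical placeholder: stands in for a real decompression codec.
std::vector<uint8_t> decompress(std::vector<uint8_t> chunk) {
    return chunk; // identity "decompression", for illustration only
}

int main() {
    // 64 streamed chunks of 1 MiB each, as they might arrive off an SSD.
    std::vector<std::vector<uint8_t>> chunks(64, std::vector<uint8_t>(1 << 20));
    std::vector<std::future<std::vector<uint8_t>>> jobs;

    // Fan the chunks out across CPU threads; this is the load that shows up
    // as high CPU usage while the game streams assets.
    for (auto& c : chunks)
        jobs.push_back(std::async(std::launch::async, decompress, c));
    for (auto& j : jobs)
        j.get(); // decompressed assets, ready for upload to the GPU
}
```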

  • @TraktorTarzan · 9 months ago

    Is it an engine issue? Cause with modded Skyrim I'm playing a game that's essentially 500GB, and it runs decently on my 1080. But modern games with way less content and similar graphics (I usually play them on medium/high) end up running at similar fps, even though they're less than 100GB.

  • @richr161 · 9 months ago

    @TraktorTarzan I assure you the graphics detail isn't similar. Skyrim's graphics weren't great back then and they're definitely outdated now. The size of the game doesn't have much to do with graphical fidelity: Cyberpunk is an open-world RPG and only comes in at 55GB. It's definitely the best-looking game out when you turn on path tracing and all the modern effects.

  • @TraktorTarzan · 9 months ago

    @richr161 I said modded Skyrim, not the base game; that isn't even close. Look up "Skyrim in 2023 | Ray Tracing Shader" or "skyrim ultima". Also I said compared to modern games, not THE best-looking modern game.

  • @QuandariusHuzz-bq1jn · 8 months ago

    A lot of PC games these days have problems with asset streaming and resource management.

  • @josephl6727 · 10 months ago

    It's time to quit high-end gaming. Lol. They are ripping everyone off.

  • @vextakes · 10 months ago

    Imagine without the highest end CPUs, it would be even worse

  • @theplotthickens6313 · 10 months ago

    I play on 1080p ultrawide so I'm even more CPU bound with the same system lol

  • @Ay-xq7mj · 10 months ago

    This. Me be playing Allied Assault PvP on a 3070 Ti.

  • @takehirolol5962 · 10 months ago

    Back to the 90s folks...you youngsters missed an awesome decade... But super expensive as a PC gamer...

  • @brucerain2106 · 10 months ago

    Always have been

  • @bobsteven2363 · 10 months ago

    As a game dev (not a programmer) I will lay out some fun facts. Making a high-poly mesh with loads of detail can take a day. You can then auto UV-unwrap it and start texturing the next day. Super easy. But the poly count for a single character can easily reach one million with all the parts. Games can't handle that, so you need to spend a week manually retopologizing, baking, UV unwrapping and texturing that model so that it has a low poly count and still looks amazing. Unreal Engine releases auto LODs. Yay, now I can skip the manual retopologizing phase and just make LODs with the click of one button. Game sizes are now way bigger. Unreal Engine releases Nanite. Oh wow, I can just import models directly from ZBrush and Fusion 360? Cool, now I don't need to worry about optimizing at all since the game engine can handle it. Every year, making games becomes easier and less time-consuming at the cost of performance. You can still optimize, but why would you if you already finished what you were tasked with?

  • @curvingfyre6810 · 9 months ago

    More importantly, they're under pressure from their bosses to churn them out faster. The crunch is insane, and the quality suffers across the board.

  • @ralphwarom2514 · 9 months ago

    Yup. Pretty much.

  • @MrThebigcheese123 · 9 months ago

    So it's OK to release a half-baked product and spend over 70% of the development time on pre-planning while leaving real development until 2 years before release? Mkay then...

  • @curvingfyre6810 · 9 months ago

    @@MrThebigcheese123 I think the point is to blame the direction and money side of things, not the average programmer. They have to get through the day and survive the frankly insane crunch time. It's up to the directors and producers to approve of the level of work that they have forced out of their engineers in the allotted time, and to choose whether more time is needed.

  • @Born_Stellar · 9 months ago

    Good to know, interesting to hear from inside the industry.

  • @Robwantsacurry · 9 months ago

    You've missed something important in the PC space: copy protection. Many PC titles are encrypted, sometimes with nested encryption. With Denuvo, for example, code is decrypted on the fly, because unencrypted code could be ripped from memory. This pushes CPU usage up massively.

  • @saurelius5217 · 9 months ago

    More reason not to buy new AA games.

  • @JodyBruchon · 9 months ago

    I talked about this in "everything is a f***ing prototype." Software is a gas that expands to fill its container. I own a bunch of old XP-era embedded hardware and netbooks. Most of them have one core. Look up the PassMark for the Atom N435 or VIA C7-M 1200MHz, then look up the most basic CPU for a Windows 10 laptop, like the Celeron N3050. The difference in computing power is huge. The old wimpy chips could do all the basic tasks anyone needed in a laptop at a reasonable speed, but instead of the software getting faster as hardware has exploded in speed, everything is written in Python or JavaScript, and everything uses huge frameworks and pulls in huge dependencies for small things. I recall an old article that disassembled and analyzed bloat in the Facebook app for iOS, which was notoriously fat at the time; the most ridiculous piece of bloat was an entire library of functions pulled into the app just to use a single function that does something like getting a date. Any competent programmer could have spent a day at most writing it themselves, but they opted to pull in a library to do that single thing. It's disgusting.
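
For scale, the "getting a date" case above needs nothing beyond the C++ standard library; a sketch of the kind of small utility that doesn't justify a dependency:

```cpp
#include <cstdio>
#include <ctime>

int main() {
    // Format today's date as YYYY-MM-DD using only <ctime>.
    std::time_t now = std::time(nullptr);
    char buf[11]; // 10 characters plus the terminating NUL
    std::strftime(buf, sizeof buf, "%Y-%m-%d", std::localtime(&now));
    std::puts(buf); // e.g. "2024-05-01"
}
```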

  • @nemesisrtx · 10 months ago

    2023 has been the worst year for PC gaming, with games releasing unfinished and terribly optimized, etc. Nowadays people have very good systems, but not even current hardware is capable of keeping up with how demanding games are. Hopefully game devs understand that it is better to have a playable game on day 1 and not rush their projects because of hype, AND, most importantly, that most PC gamers don't have a high-end PC lol

  • @brucerain2106 · 10 months ago

    Truuue

  • @Not_interestEd- · 10 months ago

    Actually, there's a surprising answer to the issues we've seen recently. DX12 has been out for a while, and it's great: it has a lot of drivers doing the heavy lifting, so your games are usually optimized even with minimal effort put into actual optimization. The problem comes with DX12 "Ultimate", which is theoretically a better branch of DX12, but most of the "useless" drivers have been ripped out, and I guess in the lot removed, something was doing WAY more work than expected; thus, poor optimization. This, in conjunction with bad management and impossible deadlines (think about the Mick Gordon vs Bethesda incident), makes game development a harsh environment for AAA titles. I really want people to stop blaming the devs. Blame the management that screws around with the millions of dollars they sit on, paying the actually working devs a fraction of what they get.

  • @Ghostlynotme445 · 10 months ago

    The RTX 4090 doing 40 fps at 4K is real. What a $1600 experience.

  • @Patrick-bn5rp · 10 months ago

    Pretty good year for emulation on PC, though.

  • @mimimimeow · 10 months ago

    Welcome to the transition period. It happens in every console generation except the 8th gen, because the PS4/XB1 were objectively very weak for their era.

  • @XieRH1988 · 10 months ago

    "Things aren't being optimised properly" sums it up fairly well. The current period is one of transition, where developers are fumbling and stumbling their way to mastering DX12 and other things. It'll probably get worse before it gets better.

  • @deality · 10 months ago

    I believe DX12 is not optimized, but it's the future and it should get better.

  • @he8535 · 10 months ago

    Poor optimization, more complex compression, and still missing all the features a good game would have.

  • @americasace · 10 months ago

    Sadly, I vehemently agree with your assessment...

  • @borone1998 · 9 months ago

    Todd Howard would beg to differ.

  • @Gramini · 9 months ago

    I wonder how long the transition period will be, given that D3D12 is over 8 years and Vulkan over 7 years old now.

  • @ZTrigger85 · 9 months ago

    I’m so glad I watched this. I have a Ryzen 5 5600X and I’ve been CPU bottlenecked lately. I was considering upgrading to a Ryzen 9, but seeing that it didn’t solve the problem for you saved me a lot of money. Are hardware manufacturers paying game developers to make their games as demanding as possible? I think about this often.

  • @MartinMaat · 9 months ago

    I don't think they struck a deal with each other; rather, developers will utilize hardware to the max to make their game look better than the competitor's game. So "whatever the hardware guys come up with, the software guys will piss away". Their interests align nicely though.

  • @a7mdftw · 9 months ago

    Can I ask what your GPU is?

  • @ZTrigger85 · 6 months ago

    @@a7mdftw Sorry, I just got the notification for this. No clue why. I’m running a 4070.

  • @xTheN8IVx · 6 months ago

    This is mostly a game development issue, the 5900x is still a great CPU for plenty of games. Paired with a 4070, that’s a great setup. There just needs to be more optimization on the game devs side.

  • @dangelocake2635 · 5 months ago

    I'm not a dev, but it seems more like an industry issue. What I mean is, in order to make a game run as smooth as possible, you need time to optimize. The more time you need, the more people you're keeping on the payroll, so it costs money. But a game doesn't generate most of its money until it is released, so you must balance your timeframe against funds, time and potential profit. Companies would usually rather push a poorly optimized game than pay for a highly polished one, and fix it later. You could improve engines over time, but every game needs to step up in terms of graphics, so there's only so much work you can reuse. So I don't think it's a conspiracy, because you have games like the RE4 remake that ran smooth from day one, because it's a Capcom engine they'd developed for years, or even Rockstar games that are sometimes demanding, but you can see why, in terms of tech.

  • @surnimistwalker8388 · 10 months ago

    The way I see it, it's not going to get better. Publishers are pushing game dev studios to pump out games as fast as they can. UE5 enables this, as you described: being able to pump out a game that looks pretty but relies on hardware to brute-force programming "issues." This is a problem with the quick-cash-grab mentality in the gaming industry, and it's not going to go away until the whole bubble being created bursts.

  • @abeidiot · 9 months ago

    Also, game dev pays peanuts and has insane working hours. The brightest computer science talent is no longer interested in working in game dev, unlike in the 90s.

  • @onyachamp · 9 months ago

    This is true. It's like an artist pumping out work for money rather than for love of their art. On a sinister note, I personally think studios are running silent companies to sell cheats for the games they make. A person buying hacks for $10-20 a month is spending $120-240 a year on cheats, and they would be so much easier to develop than the game itself. From a business perspective it is easy money, and that's exactly and almost entirely what the industry has become.

  • @surnimistwalker8388 · 9 months ago

    @@onyachamp You know Rockstar has been caught how many times now distributing pirated versions of their own games. I wouldn't put it past them to do that.

  • @BygoneT · 9 months ago

    @surnimistwalker8388 I've literally never heard of this? When?

  • @Jakob178 · 9 months ago

    The bubble is called normies who literally buy shit every year; the quality doesn't matter.

  • @ErockOnTech · 10 months ago

    The takeaway for me here is that modern games aren't optimized. I've said this in videos. So has HUB. But good job going in depth on CPU usage. You're right about reviewers using higher-end CPUs. I'm guilty of that myself.

  • @vextakes · 10 months ago

    I don't think anything's wrong with using fast CPUs, because the goal is to show the overall GPU performance. However, a lot of ppl might not be able to get that performance just because of the CPU they own. So it's a mixed bag. It requires a lot of testing, but should prolly be pointed out depending on what games are reviewed. Maybe in CPU reviews we could show whether they give reasonable performance as well, compared to relative GPU power, if we're talking about gaming.

  • @Soraphis91 · 10 months ago

    One easily neglected issue: UE and Unity are kinda general-purpose engines. Yeah, UE has a history in FPS games and is really good at that, but you can do basically any game with the engine. This means those engines have a lot of abstractions, and many game developers, when they choose one of those engines, usually don't have the manpower to go deep enough into the engine code to optimize it to the last bit for the concrete game they are working on. Just check how many engine developers are hired for projects that are developed on in-house engines at AAA studios. So, with the comfort of taking a ready-to-go engine, you also lose some control.

  • @NeovanGoth · 9 months ago

    Totally agree. UE and Unity are awesome as they allow even smaller teams to use state-of-the-art graphical effects they could never have written themselves (and usually perform really well), but they also seem to encourage using them in improper ways.

  • @hermit1255 · 7 months ago

    I actually feel like the march AAA game devs are on towards more and more demanding games will eventually be forced to a stop, as I think most people on mid to lower systems will just stop buying their games. A bright future for underdog indie stuff.

  • @italoviturino6386 · 5 months ago

    The "look at how real everything inside this game looks, but pls ignore what is in it" approach will be replaced with games like Hi-Fi Rush, where the art style makes it age better while demanding less from the hardware.

  • @metalface_villain · 4 months ago

    This has already begun tbh. Everyone seems to be playing more indie stuff and ignoring big AAA titles unless they are something as great as Elden Ring, for example. This of course is not only because of how demanding the games are, but also because AAA games are becoming just money grabs full of microtransactions, while the more indie stuff focuses on a great gaming experience.

  • @imbro2529 · 10 months ago

    Tbh I think it's an optimization issue with the games themselves, not the hardware. Because really, we have the 30 and 6000 series GPUs and Intel 12th-13th gen and Ryzen 5000 CPUs, probably the best hardware ever, only to barely run shitty ports like The Last of Us, CyberBug, Hogwarts Legacy (it was quite buggy on release), Darktide (a shitstorm of poor optimization), Forspoken, etc. All of these came out poorly built because the devs probably have difficulties with all this new software and are being pushed from the top to release a product. So I don't think it's a hardware issue; more like a software issue that doesn't correctly utilize the true potential of our components.

  • @QQ-xx7mo · 10 months ago

    This is just what happens when the masses get access to a medium (games, cinema, TV, internet): it becomes shit.

  • @dugnice · 10 months ago

    I think it's a concerted effort to destroy the PC gaming market so that only the wealthiest of people can indulge in PC gaming and everyone else gets a console, but I'm probably totally wrong. 🤷🏻‍♂️

  • @knasiotis1 · 9 months ago

    @QQ-xx7mo What.

  • @pliat · 10 months ago

    DX12 can be optimised far better than DX11, but that requires time and, most importantly, skilled devs who understand low-level coding. The devs just are not good enough.

  • @_GLXC · 10 months ago

    Maybe those devs are actually around, they're just still working on a game.

  • @thechezik · 10 months ago

    This has everything to do with how things are degrading in America: developers are not paid like they should be while the cost of living is skyrocketing, and this is happening across different industries. Basically, competition has been killed, there is no more loyalty to the consumer base, the focus now is on AI, and most importantly the wealthy vs the middle class vs the working class: it's gone!!!

  • @AwankO · 10 months ago

    Perhaps those developers are not given enough time to optimize properly.

  • @henryyang802 · 10 months ago

    I wouldn't say the devs aren't good enough; maybe the larger game-developer community is still figuring out how to use DX12 the best way possible? Fine-grained control means more fragmentation and more to learn about what's going to work best. Probably there isn't only one optimal way of using it; there are many?

  • @s1p0 · 10 months ago

    make game first

  • @Oni_XT · 9 months ago

    I'm not a game dev but this popped into my head since I'm constantly seeing texture streaming options. Is it possible more games are integrating this and it's effectively pooping on CPUs?

  • @SkyAnthro · 9 months ago

    Have you tried turning on hardware accelerated GPU scheduling? It might help with relieving some of the work load on the CPU ^-^

  • @DankyMankey · 10 months ago

    Another issue is that most popular games are not well multi-threaded, so a single thread/core of your CPU ends up bottlenecking your GPU.

  • @mikfhan · 10 months ago

    Yep, this is the issue we've had with games since the turn of the millennium. A 5GHz boost is needed because games rely so much on a main thread, but CPUs can't boost forever. We have plateaued around 4.5 GHz now, and it will take miracles to go beyond that in a stable manner. Parallel programming is difficult. The best way to improve your gaming experience from now on is not hardware, but rejecting games that only use 4 cores effectively. Wait for performance reviews; make publishers care about optimization instead of microtransactions.
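
A toy C++ sketch of why that main thread caps frame rate: if per-entity work can be made independent (the hard assumption in practice), it can be striped across all cores instead of serializing on one.

```cpp
#include <thread>
#include <vector>

// Stand-in for one entity's per-frame work (physics, AI, animation).
void simulateEntity(int& entity) { entity += 1; }

int main() {
    std::vector<int> entities(10000, 0);
    unsigned workers = std::thread::hardware_concurrency();
    if (workers == 0) workers = 4; // fallback if the core count is unknown

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&entities, w, workers] {
            // Each worker takes a strided slice. This is only safe because
            // simulateEntity touches no shared state, the property that is
            // hard to guarantee in a real game and the reason so many
            // engines keep a single main thread.
            for (std::size_t i = w; i < entities.size(); i += workers)
                simulateEntity(entities[i]);
        });
    }
    for (auto& t : pool) t.join();
}
```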

  • @minnidot · 10 months ago

    You really nailed some good points. CPUs used to be one of the last things we had to worry about upgrading. I posted a vid on how to lower CPU usage in Battlefield 2042, and even users with 13900K systems have commented that it helped. That's one of the top pieces of hardware available, and it's struggling with a game from 2021.

  • @ericimi · 10 months ago

    That video was yours? I literally just used it the other day for Battlefield. My 7700X was using about 50 percent; then I made the user.cfg file and it uses about 33 percent. Pretty big difference.

  • @ralkros681 · 10 months ago

    "But older games are not optimized for new hardware" is the most BS excuse I have ever heard. Had to say it before someone else did.

  • @minnidot · 10 months ago

    @@ericimi that was mine. I'm so glad it helped you!

  • @mttrashcan-bg1ro · 10 months ago

    It's BS that every new GPU generation you need a new CPU to be able to utilize it. The 5900X pushed a 3080 and 3090 really well in 2020, but it's virtually a potato in new games in 2023. In some cases with my 4090 at 4K it wasn't even an upgrade over the 3080, except that now I can turn DLSS off, which barely makes a difference, because a 5900X is just a trash CPU for new games. I wanna see what Ryzen 8000 and Intel 14th gen are like. I would like to stay with AMD, but I really want to upgrade from a 12-core to a 16-core when a 12-core is being nearly fully used in some games. I want at least 20 or 24 cores, but I doubt AMD will increase core counts, whereas Intel probably will.

  • @RickyPayne · 10 months ago

    @mttrashcan-bg1ro Unless you're a heavy, heavy multitasker like a game streamer, core count for gaming isn't very important past 6-8 cores, because almost all games are designed with 8-core consoles in mind. Cores 9+ are normally only helpful for doing extra non-gaming tasks while gaming. For strictly gaming you're better off with an 8-core AMD X3D model over anything else since, at 6+ cores, processor speed and cache are more important. Once you hit 8 cores/16 threads, unless you're a developer, game streamer, content creator, Gentoo user, etc., you're better off with more cache and single-core processing power over more cores and threads.

  • @F1Lurker · 9 months ago

    Very insightful and concise video, thank you for making it

  • @Thanatos2996 · 9 months ago

    Remnant 2 is actually less CPU bound than the first game, if you can believe it. The first was less visually impressive, but the shaders were so badly optimized on the CPU side that some areas struggled to stay above 70 on my 5800/3080ti rig at 1440 with shadows turned on. They’re both stellar games, but Gunfire could really stand to work on their optimization.

  • @pablolocles9382 · 9 months ago

    It's called: bad optimization.

  • @beansnrice321 · 10 months ago

    I think the main issue in most of these games is that their animation is all being handled by one thread on the CPU. Many 3D content creation programs have the same issue, with Maya being one of the few multi-threaded animation engines in the industry.

  • @Gramini · 9 months ago

    I don't think that animations are that heavy. My guess is on the actual game logic (including AI) and maybe physics simulation.

  • @blindfire3167 · 9 months ago

    @Gramini AI and shadows/lighting are the heaviest pieces on the CPU (mostly AI), while physics depends on what type of simulation you're doing (some can be handled mostly by the GPU, like rain/water or destruction). Ray tracing (although very heavy on the GPU) requires a very beefy CPU to handle it, since you still need the CPU to calculate light, which is also needed to calculate shadows and reflections (though reflections are usually just handled by the GPU, it can still be kept waiting for the CPU to finish the other processes). And I'm not 100% on this, but it always feels like games that use RT have it running on multiple cores but not multiple threads. I know I'm just a lowly peasant with my 8700K (I just barely graduated and am still waiting for that dream job to kill all my enthusiasm for any industry lol), but every game I've tried RT on with my 3080 has run horribly, with core 0 taking most of the load on thread 0 and then, on cores 2 and 4, only the first thread being maxed out.

  • @Gramini · 9 months ago

    @blindfire3167 Shadows/lighting are also done on the GPU. Or do you have some specific case in mind that isn't? Good point on the physics. Some pieces of it can be delegated to the GPU, but not all, and not all games do that. It also taxes the GPU, which might be more important for the rendering. Also, multiple cores = multiple threads. To be more specific: to do things in parallel, a program creates a new (software) thread. It's then up to the OS to schedule the thread onto a physical core/thread of the CPU. The program can also give a hint that it should run on another core. The situation you described, with only every second physical thread/virtual core being used, is quite an interesting one. That _might_ make sense, because those two virtual cores are still only one physical core, which cannot do the same thing twice in parallel. So some programs hint that only every second CPU thread should be used. From what I know/was told by a consultant, it's usually best not to do that and just leave it to the OS to schedule/balance.

  • @toddwasson3355 · 9 months ago

    @@blindfire3167 Multiple threads means running on multiple cores. There's no difference.

  • @blindfire3167 · 9 months ago

    @toddwasson3355 Nope, it *can* mean the same thing, but you can have something running on multiple cores yet not multiple threads per core; it would just run on the first thread of each core alone.

  • @justarandomgamer6058 · 10 months ago

    If I recall correctly, data centers had the same issue, and they innovated by producing hardware specifically for handling the transfer of large volumes of data, instead of using the CPU, which is designed more for all-purpose tasks.

  • @_GLXC · 10 months ago

    You would think that with all this hubbub about "tensor cores" or "ray tracing cores" the operation would be GPU-bound, but no. v_V

  • @mttrashcan-bg1ro · 10 months ago

    You know it's sad when the latest GPU is bottlenecked by a CPU that is newer than it. Every CPU bottlenecks a 4090 with RT at some point in these newer games.

  • @BlindBison · 10 months ago

    Yeah, consoles took that route too. The PS5 has dedicated hardware for asset streaming and decompression. PC is getting DirectStorage, so maybe that'll help, but basically no games even use it yet.

  • @abeidiot · 9 months ago

    Fun fact: this has been available on PCs for a long time. It just wasn't catered to spoon-feed game developers; Linux even has a system call to handle it. Now Microsoft is adding DirectStorage to DirectX to make it easier for game devs to implement.

  • @vidmantaskvidmantask7134 · 7 months ago

    Good voice work. :) You are skilled. It's interesting to listen to.

  • @bulletwhalegames1575 · 9 months ago

    Just wanted to drop this here: modern RT solutions are not poorly optimized. Ray tracing is extremely demanding on both CPU and GPU; the only way we can do it right now is to cleverly discard a lot of rays and bounces so that we can get a somewhat decent framerate. How we do this is by building a structure that contains information about where geometry is located (too much to really get into, but there are plenty of good sources). Building this structure is what costs you CPU performance: the more geometry you need to update in this structure per frame, the more your CPU will be doing. For example, skinned meshes are extremely expensive and most of the time will be ignored (you can see this in games: a lot of the characters will not receive bounce lighting, for example). Modern games do perform well on normal settings once you disable ray tracing, most of the time. Ray tracing is not really far enough along at this point, since there is still no hardware to accelerate the construction of these structures (and it might stay like this for quite some time).
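
A hedged sketch of the per-frame bookkeeping described above, with illustrative names (Mesh, refitEntry, rebuildEntry are not a real API): only geometry that moved or deformed gets its acceleration-structure entry touched, and skinned meshes are the expensive case.

```cpp
#include <vector>

struct Mesh {
    bool skinned;        // deforming meshes (characters) are the costly case
    bool movedThisFrame; // rigid motion can get away with a cheap refit
};

void rebuildEntry(Mesh&) { /* full per-mesh rebuild: heavy CPU work */ }
void refitEntry(Mesh&)   { /* cheaper refit for rigid transforms */ }

void updateAccelerationStructure(std::vector<Mesh>& scene) {
    for (auto& m : scene) {
        if (m.skinned)
            rebuildEntry(m); // or skip it entirely, losing bounce light on
                             // characters, as the comment above notes
        else if (m.movedThisFrame)
            refitEntry(m);
        // Static geometry needs nothing, which is why it's cheap.
    }
}

int main() {
    std::vector<Mesh> scene = {{true, true}, {false, true}, {false, false}};
    updateAccelerationStructure(scene); // one frame's worth of CPU-side work
}
```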

  • @FreelancerND · 6 months ago

    So basically prebaked lighting still runs better and looks better in most cases? =)

  • @colbyboucher6391 · 4 months ago

    Yeah... this channel is just a dude who has no idea what he's talking about talking as if he's authoritative, which people love because it validates what people already feel.

  • @burnzybhoy9110 · 10 months ago

    Personally I feel like PC game optimisation has become an afterthought as of late. I don't feel like DX12 is stable enough in its current form, and we should have the option to run DX11 if we choose. Also, the strange memory leaks we are suffering these days make me ask a lot of questions: is DX12 the issue, or are game devs targeting RAM and CPU? All I know is I want more accessibility with DX options.

  • @KainniaK · 10 months ago

    Because of FSR and DLSS

  • @jjcdrumplay · 9 months ago

    How do you monitor memory leaks?

  • @burnzybhoy9110 · 9 months ago

    @jjcdrumplay To monitor memory leaks, not only do we have to do our research but also watch RAM usage via Task Manager, under Performance; from there you can see RAM usage over time.

  • @mihailmojsoski4202 · 5 months ago

    @jjcdrumplay Valgrind helps, tho it makes your program/game run like shit while testing because it's technically an x86_64 emulator.

  • @jokerxd9900 · 10 months ago

    I think we are in a weird state. We just jumped to next gen, and there will be mistakes and bad optimization, but after a while they will master it and things will get better. If they are open to learning and not lazy.

  • @deality · 10 months ago

    I have a theory that Unreal Engine and Unity are just trying to kill the gaming industry, given how unoptimized those applications are. Also, DirectX 12 is poorly optimized, which is why you get shitty FPS using it.

  • @Juanguar · 10 months ago

    @deality DirectX 12 does not optimize anything, because it offloads the optimization to game developers. Some devs even said it themselves and explained why some games ran better on 11 than on 12: it's because devs were lazy af and over-relied on the auto-optimizations that 11 offered.

  • @raskolnikov6443 · 9 months ago

    Next gen has been out for 3 years….

  • @vairaul · 8 months ago

    The problem is most games depend heavily on one or two CPU cores, using the others as support: a situation where, for example, cpu0 = 100%, cpu1 = 80%, cpu2-X = 30%. I believe today's problem is a software bottleneck in utilizing most of a modern GPU.

  • @D1EGURK3 · 9 months ago

    I can relate a lot to this topic... I have a 4070 Ti and a Ryzen 7 3800X... I'm CPU-limited in basically every newer game... but not so much in older games...

  • @danos5048 · 10 months ago

    As I understand it, the CPU usage stat is actually based on the number of threads being used. Each thread in use contributes a percentage based on how many there are. 80% on a 12-core, 24-thread chip means that at that moment about 19 threads are being used: 0.8 × 24 ≈ 19. All 12 cores are actually being used, and some of those cores are into hyper-threading/SMT.

  • @tadeuferreira5705 · 10 months ago

    Wrong bro, CPU utilization is about CPU time, not about threads. Any modern OS with programs and services running in the background has hundreds if not thousands of active threads at any given time.

  • @cagefury3789 · 10 months ago

    @tadeuferreira5705 You're talking about OS threads or process threads. Those are just concurrent instruction streams running within the same process that share memory. He's talking about hardware threads, which are also sometimes called logical cores. You're right in that it's about time, but utilization takes threads into account as well. You can have 1 thread constantly doing work 100% of the time, but your overall CPU utilization will be very low if you have a lot of threads/cores.

  • @maverikshooter · 10 months ago

    @tadeuferreira5705 50% = 12 cores used. And 100% = 24 threads.

  • @margaritolimon3683 · 10 months ago

    @tadeuferreira5705 No, it's about usage over time, which is why it goes up and down depending on the scene. For a normal CPU (no E-cores; they make the math hard), 50% being used means all the cores; anything higher means it's now using hyper-threading. Also, it's trying to explain everything with one number. If only 6 cores of a 12-core CPU are at 100% (no hyper-threading) and 6 are at 0%, it will show up as 25%; all cores at 50% (no hyper-threading) will also show as 25%.

  • @VDavid003 · 10 months ago

    50% on a 24-thread CPU could mean 100% on 12 threads with 0% on the rest, or 50% on all 24 threads, or anything in between.
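
A worked version of the arithmetic in this thread: overall utilization is busy time averaged across all hardware threads, so very different workloads can report the same percentage, which is @VDavid003's point.

```cpp
#include <cstdio>

int main() {
    const int hwThreads = 24; // e.g. a 12-core chip with SMT

    // Case A: 12 threads pegged at 100%, the other 12 idle.
    double caseA = (12 * 1.00 + 12 * 0.00) / hwThreads;

    // Case B: all 24 threads ticking along at 50%.
    double caseB = (24 * 0.50) / hwThreads;

    // Both print 50%, even though case A is likely bottlenecked hard.
    std::printf("A: %.0f%%  B: %.0f%%\n", caseA * 100, caseB * 100);
}
```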

  • @brucerain2106 · 10 months ago

    Remember the PS3 with its 256MB of RAM, and somehow it ran TLoU, GTA 5 and Uncharted? And now we have components that cost as much as a whole console and they can't even be utilised to their full potential. Wow, very cool.

  • @slaydog5102 · 10 months ago

    I thought we were "advanced" though?...

  • @BruceLee-qc2lm · 6 months ago

    Thanks for the video. I have a 3080 with a 5900X, and I was trying to figure out the bottleneck even after using Ryzen Master. Might upgrade to a 5800X3D and maybe a 4080 Ti when it gets released at the end of the month.

  • @Nukeaon · 6 months ago

    thank you for this video! new sub :)

  • @CyberJedi1 · 10 months ago

    One of the biggest problems, I think, is that CPU advancement is nowhere near GPU advancement, especially in single-threaded performance. From the 3090 to the 4090 we got almost double the performance; from the 5800X3D to the 7800X3D it's not much higher than 10%, and games don't utilize much more than 8 cores, so it doesn't matter how many you have in the CPU. Also, CPUs are kind of stuck at 6GHz; I heard somewhere that it will be really hard to push past that.

  • @MitremTheMighty · 10 months ago

    3090 to 4090 double the performance? 😂 It's closer to 60%, which is good, but nowhere near double.

  • @korinogaro · 10 months ago

    Yes and no. The main problem is that engines suck ass "out of the box" at thread utilization. Devs need to actively write code to utilize more cores and threads, but they don't give a fuck and go with the engine's default settings in as many places as possible. And the 2nd problem is that companies switched from the race for GHz to a masturbation contest over the number of cores (because after they figured out how to glue CPUs together, it is easier and gives good results in synthetic benchmarks). So it is a combination of these two problems: CPU manufacturers give us more cores, but improvements in the *umpf* of every core are not impressive, and devs don't give a fuck about optimization for more cores.

  • @user-eq2fp6jw4g · 10 months ago

    The R7 5800X3D is still insanely good value for money for budget, somewhat future-proof 1440p gaming, taking into account how expensive the AM5 platform still is. Especially good motherboards.

  • @nugget6635 · 10 months ago

    Actually the CPU has much better scalar performance than the GPU. The GPU is a vector processor (highly parallel). So for single-threaded work the CPU is actually better; for parallel work the GPU is thousands of times better.

  • @andreabriganti5113 · 10 months ago

    At a certain point, just increasing the clocks will not lead to great improvements. It has happened already. Having tons of fast cache is a better solution for gaming.

  • @livingthedream915 · 10 months ago

    Honestly, it's possible to avoid the CPU utilization pitfall by simply not using any kind of ray tracing; in addition, it's well known that Nvidia's drivers are more CPU-heavy than AMD's.

  • @skorpers · 10 months ago

    Yeah, people are really stubborn about ray tracing. Ghostwire: Tokyo is a pretty good example of a game that looks absolutely fine without ray tracing on. And in some cases you might not even know there's a difference, because you have to point the camera at certain angles for there to be any shortcomings of screen-space reflections.

  • @chiari4833 · 10 months ago

    Yep, I don't get why ppl are buying into this expensive marketing trick. It may look good, but it's not ready. It needs massive optimization.

  • @livingthedream915 · 10 months ago

    You fellas in the comments get it. This tech is simply too soon out of the gate to be seriously considered; anyone who bought a 2000-series GPU for ray tracing got royally exploited, and we're still in a situation where it's almost always a better experience to have it off.

  • @skorpers · 10 months ago

    @livingthedream915 The main thing that tipped me off about RTX being BS is that the developers of the games promoting it the most stopped even attempting to use imposter techniques to add flavor to the scene in a cheap way, i.e. the examples they used, like reflections simply not existing on glassy surfaces with RTX off, when the PS2 could literally do it. Also how Metro looked extraordinarily washed out with RTX off, when colored lighting was done on old consoles as well.

  • @GDRobbye · 10 months ago

    Games without RT can look as good as games with RT, it just requires more work from the designers. So to some degree, RT is a time-saver and going forward, we'll probably see more focus on RT and less focus on traditional illumination/shadowing, which in turn means that, gradually, games without RT will simply look worse. Hopefully, by that time, RT won't be such a big performance hog.

  • @eliasalcazar6554 · 8 months ago

    I think after Xbox One and PS4 reach EOL we'll see some great strides in optimization. I agree also that the cost/performance of graphics isn't scaling nicely. It seems that the move toward "realism" isn't worth the hardware tax when it comes to final presentation. But, I do see GPU manufacturers serious about gaming including more specialized hardware on their chips, like Nvidia and AMD with RT Cores. Unfortunately I see this raising the price of GPUs even further in the foreseeable future. It will definitely be interesting to see how a mid-gen "Pro" refresh of the consoles will shake up the PC landscape as well. I'm guessing the consoles will be aiming at 7800xt levels of performance.

  • @ErwinLiao · 9 months ago

    Hey bro, thinking about upgrading. Do you know which is cheaper, a prebuilt or building your own PC?

  • @MarikoRawralton · 10 months ago

    Tim Sweeney once said that the main failing of Unreal Engine was that everything runs on one thread, at least back when he was discussing it. He actually praised EA's Frostbite for being able to multithread logic (but admitted it was a harder engine to use).

  • @kevinerbs2778 · 10 months ago

    Can the Frostbite engine use mGPU?

  • @progste · 10 months ago

    I believe that was UE3 though

  • @mttrashcan-bg1ro · 10 months ago

    The Frostbite engine was good up until Battlefield 1, where it just rooted everyone's CPU, so the i7 6700K at the time was bottlenecking 1070s and 1080s. By Battlefield 5 there were some 6- and 8-core options around, but clock speed and IPC still mattered more, and that issue was amplified with 2042, where it's able to push a 24-thread CPU to 100% at times despite the visuals not being improved at all over BF1. Frostbite is a gorgeous engine, but it's the second most CPU-intensive engine I can think of, after the one the newer AC games use.

  • @MarikoRawralton · 10 months ago

    @@kevinerbs2778 If you mean multi GPU, I have no idea. I've seen conflicting reports.

  • @MarikoRawralton · 10 months ago

    @@mttrashcan-bg1ro BF1 ran great on my old PC and I was running an 8320E back then. That was a terrible CPU. Targeted 60fps on the old PS4/Xbox One and those both had notoriously weak CPUs. I think it benefited more from core count.

  • @anarchicnerd666 · 10 months ago

    Nice vid Vex :) I'm not a dev so I can't comment really, but I'm with you, things can only get better. The big thing to remember with DX11 and DX12 is the whole reason that transition happened in the first place: devs complaining about the anemic performance of the X1 back in the day and the API having too much overhead. That's a big reason WHY DX12 is so focused on stripping out guardrails and handing control back to developers. It's also worth noting we're in the middle of a weird transition period again: the move to AM5, Intel's paradigm of E-cores and P-cores, the Windows 11 rollout, the move to DDR5, etc. The sheer breadth and scope of hardware available on the open and used markets for people to build systems with is awesome for consumer choice and value, but a nightmare for devs who need to optimize for an almost limitless combination of hardware. ...Man, I sure picked a hell of a time to build a PC and join the enthusiast community XD My poor little R7 5700X is gonna get completely outpaced by upcoming games. How ya liking the 5800X3D? Very curious what your thoughts are going to be for editing with it versus the 5900X, but it's clearly a monster at gaming...

  • @mathewbriggs6667 · 10 months ago

    I got the Ryzen 5950X over the 5800X3D. I needed the extra cores and clock speed, but the X3D poops on it lol

  • @mv223 · 10 months ago

    @@mathewbriggs6667 It truly is a badass chip.

  • @handlemonium · 10 months ago

    Lol my 10700 is gonna be bottlenecking so hard when I upgrade to the 8700XT or 5070 in 18 months.

  • @yonghominale8884 · 10 months ago

    As an ancient dinosaur who played the original DOOM on a 3dfx and the OG Pentium, I can attest that things come in cycles. The issue is consoles, and transitioning from one generation to another is always rough. I still have nightmares about Crysis.

  • @mathewbriggs6667 · 10 months ago

    @yonghominale8884 It's gonna be a while longer than 18 months.

  • @jjcdrumplay · 9 months ago

    I remember finding a second-hand Core 2 Duo E7500 for a Linux/winblows build by shopping on eBay for under 5 bucks. At the time it was only about 7 years old, and the thing worked for 5 years plus! Would it be worth the same buying second-hand AMD chips if you'll save hundreds, just taking the risk that they were overused?

  • @Marco_My_Words · 5 months ago

    I purchased a CPU with 24 cores, the Intel i9-13900KF, for this specific purpose. My objective was to ensure the CPU wouldn't become a performance bottleneck, and it's achieving that goal, but it's also overheating all the time. However, investing 600 euros in a CPU seems excessive, particularly when high-end GPUs are already priced above 1500 euros. Modern games have become increasingly resource-intensive. While I appreciate the advancements and realism in games, the escalating costs are becoming really challenging to handle. The amount spent on a top-notch gaming PC could be enough to buy a decent car.

  • @Aryzon13 · 10 months ago

    It will not get better with time. Games were more optimized when DX12 and Vulkan were just rolling in, and since then it has gone downhill, as you mentioned. And it will only keep getting worse until people stop buying. And since people will never stop buying, it will keep getting worse indefinitely, and you will keep buying better hardware to compensate for the devs' incompetence or straight-up malicious intent.

  • @Leaten · 10 months ago

    AI generated code is here to save us lol

  • @Boris-Vasiliev · 10 months ago

    Graphics became the most important feature for marketing new games. Need better graphics? Buy new hardware. It's inevitable. DX12 and Vulkan are just tools for using new GPU features; they are not forcing devs to make ultra hi-res textures and a ton of post-processing effects. It's still possible to make low-poly models in a closed room and get 300+ fps. But marketing needs an open world, filled with people and ready for a nice-looking screenshot at any moment.

  • @kada0420 · 10 months ago

    Vulkan was great. Gave more life to my old PC at the time.

  • @Leaten · 10 months ago

    @kada0420 Because it was still just as basic as OpenGL.

  • @JN-qj9gf · 10 months ago

    @Boris-Vasiliev Graphics became the most important marketing feature for games 40-something years ago.

  • @dragons_advocate · 10 months ago

    Here's one downside of these ever more detailed meshes and textures I have seen nobody talk about yet: the high graphical fidelity only really comes to shine when not in motion, or in very slow motion. Fast movements quickly turn all those details into noise, particularly with video compression (like, say, a YouTube video), but even 60 Hz can be low enough for our eyes to see a smearing effect. Meaning, paradoxically, with the insane level of detail games are capable of nowadays, you would need MORE fps (and a high-refresh-rate monitor, natch) to really enjoy it (again, mainly talking about fast-moving scenes and games here). And may god have mercy on your soul if you play with motion blur on. There is a specific circle in hell reserved for those people.

  • @DashMatin · 9 months ago

    fr

  • @XxWarmancxX · 9 months ago

    In defense of motion blur: racing games. For them specifically, it's hard on my eyes with motion blur off, making my eyes ache at worst on my desktop monitor.

  • @dragons_advocate · 9 months ago

    @XxWarmancxX Out of curiosity, at which framerate?

  • @NeovanGoth · 9 months ago

    Having good per-object motion blur should be a basic requirement for every game. When motion blur becomes problematic, it's mostly camera motion blur. Many games, for example, don't scale the simulated shutter speed with the actual frame rate, causing it to look perfectly OK at 30 fps, but like a blurry mess at everything above.
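
A small sketch of the fix @NeovanGoth describes, with illustrative constants: derive blur length from a simulated shutter time tied to the actual frame rate, so blur shrinks as fps rises instead of staying tuned for 30 fps.

```cpp
#include <cstdio>

// Blur length in pixels for an object moving at velocityPxPerSec, given a
// shutter open for shutterFraction of each frame (0.5 = a "180-degree" shutter).
float blurPixels(float velocityPxPerSec, float fps, float shutterFraction) {
    float shutterTime = shutterFraction / fps; // seconds the shutter stays open
    return velocityPxPerSec * shutterTime;
}

int main() {
    // Same object speed at two frame rates: the blur halves as fps doubles.
    std::printf("30 fps: %.1f px\n", blurPixels(1200.f, 30.f, 0.5f)); // 20.0 px
    std::printf("60 fps: %.1f px\n", blurPixels(1200.f, 60.f, 0.5f)); // 10.0 px
}
```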

  • @trog69 · 9 months ago

    I bought this rig last August (i7-12700k/3080/32gb 3733 xmp/2tb NVMe) and all my games run great, but that's more because I'm an exceptionally slow player. It takes me months to finish a game. So, I'll wait on the newest games, hoping for optimization patches to roll in.

  • @serg331 · 5 months ago

    I went to where you were in Cyberpunk. Ton of mods (none for performance), basically max settings except a few insignificant ones. I walked around and I was getting 55-65% CPU utilization and 96% GPU utilization. I have a 5800X CPU and a 3080 GPU. Also, I had my browser open and other stuff. It was 1080p, so that's probably why, but my point is maybe resolution plays a big part in this.

  • @XScythexXx · 10 months ago

    Great video and points. Growing up, I always played on low-end PCs years behind current games, in a country with a poor economy, but with years of work I could finally save up to upgrade my own PC to a decent level several years ago. Nowadays, whenever a major title releases, I feel like if you don't have the latest hardware, everything you've put into your PC has just become obsolete. It's insanity. I feel like I'm back in 2010, playing random games at 800x600 just to get 30 fps, given the state of all recent titles.

  • @NeovanGoth · 9 months ago

    I think that's connected to the current-gen consoles having much beefier CPUs than their predecessors. The PS4 and Xbox One were much more limited on the CPU side, so it soon became easy even for lower-end PCs to reach and surpass that baseline. A lack of GPU performance can easily be mitigated by just lowering the resolution or reducing the quality of graphical effects, but if the CPU can't keep up, there often isn't much that can be done.

  • @agenerichuman · 10 months ago

    I've been really impressed with how you've been covering this. You're one of the few people sounding the alarm without being all doom and gloom. You're right that there is hope, but you're also right that this is a problem, and it's one not being talked about enough. I feel bad for all the people building PCs now who are going to be hit with a rude awakening. Upgrading a CPU sucks for someone who isn't familiar with the process (or even someone with shaky hands). It's not hard, but things can very easily go wrong. Part of the push for GPUs was to take some of the load off the CPU. Seems like we're moving backwards. Also, I think AI upscaling is great, but I don't like the trend of using it to optimize games. Upscaling causes visual distortions. In some games it's not bad, but in others it's awful. And some people will always notice, no matter how good it is. It's clear we're at a crossroads in PC gaming. I hope you're right that good developers will rise to the top and things will get better.

  • @kenhew4641 · 10 months ago

    "Part of the push for GPUs was to take some of the load off the CPU. Seems like we're moving backwards." The GPU and CPU have different uses, and even the way they process data is different. The CPU is generic while the GPU is task-specific. If the GPU is tasked with rasterization, calculating ray-tracing paths, or data simulation, it does it many orders of magnitude faster than the CPU: if rendering a single frame takes an hour on the CPU, the GPU can render that scene in just one second. The problem is that games are now getting more complex and more data-intensive, with much bigger file sizes, hence needing even bigger memory that can transfer data at much higher speeds. It's not just about rendering the visual output anymore: the CPU needs to process AI behaviour and logic, simulate physics, simulate soft-body collisions like clothing and hair, and simulate particle systems like smoke, clouds or fluids, ALL at the same time in a typical gameplay session of your typical AAA game. One single fabricated chip the size of a condom pack is not going to be able to carry all that load, especially when gaming is now going beyond 1080p. It might be enough if you scale down to 720p, but new games coming out now don't even have the option for 720p anymore. We're not moving backwards; we're moving ahead too fast, so the data has nowhere to go to get processed and ends up landing on the poor old already-overworked CPU's doorstep, waiting in a long queue to be processed.

  • @taxa1569

    @taxa1569

    9 ай бұрын

    DLAA is the best thing to come out of the release of DLSS. Upscaling is hot garbage otherwise

  • @abeidiot

    @abeidiot

    9 ай бұрын

    @@kenhew4641 it has nothing to do with resolution

  • @arcadealchemist
    @arcadealchemist7 ай бұрын

    Currently the new development meta is streaming assets and persistence. The graphics side is as good as it needs to be for now; the CPU work is all about moving assets to be loaded, etc. A lot of these open-world games seem to need worlds large enough for the gameplay while still allowing the caching and offloading of assets.

  • @nbrown5907
    @nbrown59076 ай бұрын

    I have stepped up from an RTX 2080 Super to a 6800 XT to an RTX 3090 24GB OC to an RTX 4090 24GB OC, and all is fine; I've gotten better performance every step of the way. The first two cards were on an i9-10900K and the last two were on an i9-11900K, so I got the PCIe 4 bus with that minor upgrade, lol. Remember that the MSI tool (or whatever you use for GPU settings) can limit your card's performance too.

  • @arioch2004
    @arioch200410 ай бұрын

    Regarding the CPU decompressing and handling textures: that's what resizable BAR and DirectStorage are aimed at. DirectStorage lets the GPU retrieve and decompress textures without involving the CPU much, while resizable BAR lets the CPU address the whole of VRAM at once instead of a small window. So if you have a recent graphics card and a motherboard that has the resizable BAR feature in the EFI/BIOS, enable it.

  • @imcrimson8618

    @imcrimson8618

    9 ай бұрын

    instructions unclear, my pc turned into a nuclear reactor and now its fallout 4

  • @dafyddrogers9385
    @dafyddrogers938510 ай бұрын

    I'm glad you made a video on this. I was feeling a bit sad after learning the new 4080 I bought would be held back by my CPU when ray tracing in games like Spider-Man, The Witcher 3, etc. I bought a high-power GPU so I could experience these new features, and yet I can't do that now because I need a £500 brand-new CPU and platform :(

  • @avanx7699

    @avanx7699

    10 ай бұрын

    I've got the same problem. For me a new GPU basically means building a brand-new platform as well, and the worst part is that I bought some new matching parts only a few months ago for my current GPU, which has now decided to give up on me. No matter how I spin the wheel, it sucks from every point of view right now.

  • @kevinerbs2778

    @kevinerbs2778

    10 ай бұрын

    The Witcher 3 is ported from DX11 to DX12, which doesn't work well. Ground-up DX12 engine builds work better.

  • @deality

    @deality

    10 ай бұрын

    You need it anyway

  • @eliascence
    @eliascence7 ай бұрын

    I am using an R9 5900X with an MSI RTX 3060 12GB Gaming X in an MSI B550 Gaming Plus motherboard, and I've got no issues at all. Maybe it's down to your NVIDIA drivers? Microsoft hasn't pushed an update past version 472.88 for my graphics card, and any more recent driver is likely to be the problem. What RAM are you using? AMD Ryzen AM4 CPUs are very powerful. I'm very happy with my R7 5700X and R9 5900X; I've also got an R7 5800H in my gaming laptop and an R7 5700G in a mini gaming PC. AMD is overtaking Intel, that's for sure! Fantastic performance.

  • @davidlefranc6240
    @davidlefranc62409 ай бұрын

    Decent video. One thing I know is that 12GB GPUs are the new 8GB GPUs, and a 10GB 3080 just isn't going to push much FPS in a triple-A title because of that lack of memory. I feel like the 12GB model will give different results! I almost forgot: it's the same for RAM. 32GB is the new 16GB, period.

  • @imnotusingmyrealname4566
    @imnotusingmyrealname456610 ай бұрын

    This is one of your best videos. Upscaling can't make the CPU process games faster.

  • @saricubra2867

    @saricubra2867

    10 ай бұрын

    CPUs can't brute-force bad game optimization. Right now, with my i7-12700K, I can get over 117 fps in Crysis 1 (higher frames than the 5800X3D, too), and that's DX10.

  • @jmporkbob
    @jmporkbob10 ай бұрын

    One of the biggest issues is consoles, I think. The PS4/XB1 released with a laughably pathetic CPU, even at the time, much less several years later. With that being not only the lowest common denominator but also kind of the central hardware of the industry, the CPU requirements of games became a non-issue for essentially the past decade. The PS5/XSX released with a solid CPU at the time (basically an underclocked 3700X), and it's still respectable a few years later. Given that they are targeting 60 (and sometimes even 30) fps on that CPU, it's going to be pretty difficult to hit very high fps on PC CPUs from around that same period. Can more CPU optimization be done? I strongly think so. But the generational leap as we move out of the cross-gen period is driving a lot of it.

  • @JaseHDX

    @JaseHDX

    10 ай бұрын

    DF made a video on the Series X CPU; it performs similarly to a 1700X, not even close to an underclocked 3700X.

  • @jmporkbob

    @jmporkbob

    10 ай бұрын

    @@JaseHDX it's an 8 core zen 2 design, just underclocked compared to the 3700x. dunno what to tell you, it's literally that architecture lol

  • @ppsarrakis

    @ppsarrakis

    10 ай бұрын

    1700x is still like 5 times faster than xbox one cpu...@@JaseHDX

  • @mimimimeow

    @mimimimeow

    10 ай бұрын

    @@JaseHDX You can't really use the DF data: Windows and PC games aren't tailored for that specific hardware config, so the Xbox CPU may be bottlenecked in areas Windows doesn't utilize. It's like how Android runs like ass on an overclocked hacked Switch compared to a Shield, despite both having the same chip.

  • @ImplyDoods

    @ImplyDoods

    10 ай бұрын

    ​@@mimimimeow Xboxes literally run Windows already, just with a modified GUI.

  • @Voklesh85
    @Voklesh859 ай бұрын

    Well done, a very interesting video. I agree with almost everything you said, but in my humble opinion one important element is missing. Modern gaming doesn't force you to change CPU every year, but you do have to predict when it's time to change it. For example, until September 2022 I had an Intel 7820X 8-core CPU. I used that CPU for almost 8 years without problems, and I even managed to pair it with an RTX 3080 Ti without bottlenecks at 3440x1440. But then the new console generation truly began, and that's when I needed to change CPU. Those who own an Intel 13th-gen or a Ryzen 7000 today will not have to change their CPU in 2024, but they will certainly have to when the next consoles arrive, regardless of the market segment of the CPU they own. Then there's the separate matter of having a balanced PC. Obviously there is always the question of optimization, but contemporary developers spend only a small part of the budget optimizing titles for PC, partly because of the huge number of component combinations in circulation.

  • @matsv201
    @matsv2018 ай бұрын

    Some modern games have DirectStorage access. A DX12 game can use it to read data from the drive directly to the GPU with almost no input from the CPU (apart from the northbridge logic inside the CPU), and decompress the data there too. General game data never needs to be decompressed by the CPU (unless you're saving, in some games).
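
The two loading paths differ roughly as below. This is only a sketch with hypothetical helper functions, not the real DirectStorage API (which actually works through DX12 queues):

```cpp
#include <cstdint>
#include <vector>

// All four helpers are hypothetical stubs standing in for real I/O and GPU APIs.
std::vector<uint8_t> readFile(const char*)                 { return {1, 2, 3}; }
std::vector<uint8_t> cpuDecompress(std::vector<uint8_t> v) { return v; }
void uploadToVram(const std::vector<uint8_t>&)             {}
void gpuLoadAndDecompress(const char*)                     {}

// Classic path: every byte flows through system RAM, and the CPU burns
// cycles on decompression before the GPU sees a single texel.
void loadTextureClassic(const char* path) {
    uploadToVram(cpuDecompress(readFile(path)));
}

// DirectStorage-style path: the request goes near-directly from NVMe to
// VRAM and the GPU decompresses, leaving the CPU to do little beyond
// submitting the request (and, as the reply below notes, some allocation).
void loadTextureDirect(const char* path) {
    gpuLoadAndDecompress(path);
}

int main() {
    loadTextureClassic("rock_albedo.tex"); // hypothetical asset name
    loadTextureDirect("rock_albedo.tex");
    return 0;
}
```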

  • @portman8909

    @portman8909

    7 ай бұрын

    "no input from the CPU at all" is not true. It will just use less of the CPU, but the CPU will still have to allocate.

  • @PedroAraujo-vg7ml
    @PedroAraujo-vg7ml10 ай бұрын

    Yeah, Starfield only gets up to 90 FPS max in demanding scenes with THE BEST CPUs, like the 13900K and the 7950X3D. That's actually crazy. You can't even get a constant frame rate with the best hardware you can buy right now.

  • @deadhouse3889

    @deadhouse3889

    10 ай бұрын

    Are you running it on an SSD? It's a requirement.

  • @MrDabadabadu
    @MrDabadabadu10 ай бұрын

    The core-count era is done. We are now in the frequency, IPC and cache era. A 16-core from 3 years ago is crushed by an R5 7600X. 8 cores can be beneficial vs 6, so you, a gamer, will be buying every single CPU generation from now on until something changes.
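
The arithmetic behind that claim, as a rough sketch: a game bound by one main thread tracks IPC × clock, not core count. The IPC and clock figures below are illustrative guesses, not measurements:

```cpp
#include <cstdio>

int main() {
    // Hypothetical relative IPC and boost clocks, for illustration only.
    double ipcOld = 1.00, clkOld = 4.6;  // e.g. a Zen 2-era 16-core
    double ipcNew = 1.35, clkNew = 5.3;  // e.g. a Zen 4 6-core
    double speedup = (ipcNew * clkNew) / (ipcOld * clkOld);
    // If the game hangs off one main thread, the 6-core still wins:
    std::printf("Single-thread speedup: %.0f%%\n", (speedup - 1.0) * 100.0);
    return 0;
}
```

Under these assumptions the newer 6-core is roughly 55% faster on the thread that actually gates frame rate, which is why the extra ten cores don't save the older chip.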

  • @anthonyperez87
    @anthonyperez877 ай бұрын

    Thank you for talking about the CPU. 😢 I'm wondering if changing out the GPU, or just the CPU, on my prebuilt would make a significant improvement?

  • @lukiworker
    @lukiworker9 ай бұрын

    I agree with the points made, with video editing as clear evidence. I'm not certain whether AV1 encoding and decoding for video streaming can be done on CPUs too. There are reasons why graphics cards have become so huge despite GDDR6, and why the lack of video memory causes problems even with the latest cards and current games on high and ultra settings. Has the memory traffic of games increased enormously, to the point where memory bandwidth can't keep up? Or has a memory bottleneck been reached on the GPU that nobody notices with GDDR6?

    Let me explain: I think the load distribution is analysed a bit off. As soon as a bottleneck appears in the GPU due to limited memory bandwidth, the GPU can no longer manage the high traffic, and the work goes to the next problem solver, the CPU. But CPUs being what they are, they can solve every kind of problem, just not quickly and not as much at once as a video game demands. That could explain the low GPU loads: the memory bandwidth isn't wide enough. That comes down to an architecture design choice made years ago that looks like a mistake today. The consequences are performance drops, because the loads are shifted to the CPU, yet magically the game doesn't crash. The architecture also brings the high power consumption baked into GDDR6, which would certainly help explain the cards getting so huge.

    How could things have looked different before RDNA? Back at Vega, it was cheaper to rely on proven, well-understood GDDR memory than to set up a new infrastructure with HBM and continuously optimise it with each generation. AMD hoped HBM would become the de facto new industry standard, and even shipped it across two generations, the R9 Fury and RX Vega, then dropped it in the next RDNA generation in favour of GDDR6, which has exactly the same scaling issues when adding more VRAM that Nvidia's cards have. GDDR can't widen its memory bandwidth much further; that is an architectural limit of GDDR when scaling up. The bottleneck is memory bandwidth, which HBM doesn't (or wouldn't) have.

    I'm not sure whether it really is due to high traffic on the GPU's memory bus; I haven't found a metric, like fps, that could measure that traffic. I recommend the following sources: kzread.info/dash/bejne/pZlttrWOlr2cmZM.html www.makeuseof.com/tag/high-bandwith-memory-really-need/ kzread.info/dash/bejne/oaGesatqnr26n7g.html Linus tech tips: kzread.info/dash/bejne/jH2et6-edqbaoco.html 3 Klicksphilips: kzread.info/dash/bejne/h6ScxMh7gK7fk8o.html Niktek: kzread.info/dash/bejne/qmuGmaptXcK6g8Y.html
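
On the bandwidth question: there's no in-game fps-style counter for memory traffic, but the theoretical ceiling is simple arithmetic, which makes the GDDR-vs-HBM comparison concrete. A quick sketch using typical datasheet values (not measurements):

```cpp
#include <cstdio>

// Peak bandwidth in GB/s = data rate per pin (Gbps) * bus width (bits) / 8.
double peakGBs(double gbpsPerPin, int busBits) {
    return gbpsPerPin * busBits / 8.0;
}

int main() {
    // Typical datasheet numbers, for illustration only.
    std::printf("GDDR6 16 Gbps @ 256-bit:  %.0f GB/s\n", peakGBs(16.0, 256));
    std::printf("HBM2   2 Gbps @ 2048-bit: %.0f GB/s\n", peakGBs(2.0, 2048));
    return 0;
}
```

Both land at 512 GB/s here: GDDR reaches it by clocking a narrow bus very fast (which costs power), while HBM reaches it with a slow, very wide bus, which is the tradeoff the comment above is describing.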

  • @rollingrock5143
    @rollingrock514310 ай бұрын

    I've noticed this a lot in Flight Sim 2020. It gets so bad sometimes, and in other modern games as well. My GPU is an upgrade from my last one, and it stagnates behind a 3-year-old CPU. Great point to bring up, Vex.

  • @dylanzachary683
    @dylanzachary68310 ай бұрын

    I had a 6700K up until this year. I wore that thing out and it was great. I upgraded to a 12700K and I already feel like I need another upgrade…

  • @rusudan9631
    @rusudan96319 ай бұрын

    Cyberpunk puts a decent load on my Ryzen 3600. Normally it's fine, a stable 60 fps at 1440p with my 3060 Ti, but in big crowded city areas it drops to 45-55. Makes me wonder if I should upgrade to a 5700X or maybe even a 7700X (the latter is the bigger pain, since I'd have to replace the mobo and RAM too).

  • @sunfirehell
    @sunfirehell9 ай бұрын

    Hey, I've got a 10GB 3080 with a Ryzen 5 PRO 5650G, and I'm trying to find the right 240 Hz monitor. I've tried benchmarks etc. to figure out whether I'd also have to upgrade the CPU, RAM, motherboard, fans... My CPU has lower performance than yours, but are you aware of the performance I'd get at 1440p? Also, I've been looking around for the best IPS screen with great HDR (HDR1000) but couldn't find a flicker-free one; if you have any clues, please share.

  • @georgeindestructible
    @georgeindestructible10 ай бұрын

    2:47 Very well said. A lot of people think that just because they get the best CPU or GPU (especially the CPU), it's not going to give them any issues; how inexperienced they are. The worst part is, we have more than enough horsepower in most modern CPUs to handle almost everything at a constant 60 FPS at least, but "it's hard to code for that," devs usually say.

  • @Sp3cialk304
    @Sp3cialk30410 ай бұрын

    It's almost like the PS4-generation consoles had terrible CPUs, so devs didn't have much to work with. Now we have the PS5, which has the equivalent of 16 Zen 2 cores: 7 cores that devs can use from the CPU, plus the I/O unit that streams and decompresses assets, which is equal to about 9 Zen 2 cores. The massive increase in geometry alone has bumped CPU requirements, not to mention volumetrics, lighting, particle effects, etc. Also, with DDR4 systems you have a massive memory bandwidth bottleneck. The consoles use shared high-speed GDDR6 that can be streamed to and from the SSD in real time.

  • @NeoHorizonLabs
    @NeoHorizonLabs9 ай бұрын

    Just one line: big greedy companies with tight deadlines and mentally destroyed developers. It's just that they're too rushed to optimize, because these companies want games done quick.

  • @brunoperugini6299
    @brunoperugini62999 ай бұрын

    Which software is that, monitoring the computer's performance? I've always wanted to know.

  • @ProfRoxas
    @ProfRoxas10 ай бұрын

    Unfortunately, ray tracing is yet another bottleneck, which is why it doesn't fully show up as GPU usage even though it's running on the GPU. The same can be said of CPU usage for more complex NPC AI, logic, or physics, like a destroyed object falling to pieces while the physics of each fallen chunk is calculated. DLSS won't help there, because it's the same amount of chunks regardless of your resolution, be it 4K or 240p.

    For comparison: if you enable V-Sync (say 90-100 fps -> 60 fps), your GPU usage probably falls, because the GPU has to wait for the specific timing of your monitor (just as it would have to wait for ray tracing to finish).

    Using a general-purpose game engine like Unreal or Unity doesn't necessarily mean a game is more optimized; it's more that developers don't have to implement the basic functions. Using a custom engine can improve performance, but it would take years or a decade to develop one, so we've sacrificed a small amount of performance (say 10-15% fps) for highly simplified and faster development.
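
The common thread in those examples is that delivered frame time is the slower of the CPU's and the GPU's, and upscaling only shrinks the GPU term. A minimal sketch with hypothetical timings:

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    double cpuMs = 12.0;          // physics chunks, AI: resolution-independent
    double gpuMsNative   = 16.0;  // hypothetical GPU cost at native resolution
    double gpuMsUpscaled = 9.0;   // hypothetical GPU cost with upscaling on
    // The frame ships when BOTH sides are done, so the slower one sets the pace.
    std::printf("Native:   %.0f FPS\n", 1000.0 / std::max(cpuMs, gpuMsNative));
    std::printf("Upscaled: %.0f FPS\n", 1000.0 / std::max(cpuMs, gpuMsUpscaled));
    // Upscaling helped, but only until the CPU's 12 ms became the wall (~83 FPS).
    return 0;
}
```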

  • @jakubgiesler6150
    @jakubgiesler615010 ай бұрын

    Game engine dev here: it's hard to utilize the full potential of a computer precisely because every PC is very different, and you need to deliver somewhat comparable performance on all architectures.

  • @prashantmishra9985

    @prashantmishra9985

    6 ай бұрын

    How to become like you?

  • @TragicGFuel

    @TragicGFuel

    6 ай бұрын

    I always wondered: why can't devs detect what the CPU, GPU and RAM are, and let the game choose the settings best suited to that amount of processing power?

  • @jakubgiesler6150

    @jakubgiesler6150

    5 ай бұрын

    @@TragicGFuel Absolutely, your curiosity touches on a fundamental challenge in game development. While it's true that developers can detect the hardware specifications of a user's system, ensuring optimal performance across a vast range of configurations is more complex than it may seem.

    Firstly, there's the issue of variability within a single type of hardware. For instance, two computers with the same CPU, GPU, and RAM might still have differences in other components such as the motherboard, storage speed, and cooling systems. These variations can affect performance.

    Secondly, user preferences also come into play. Some gamers may prioritize graphics quality over frame rate, while others may prefer the opposite. Developing a one-size-fits-all solution that satisfies everyone's preferences is challenging.

    Moreover, constantly adapting game settings based on detected hardware can introduce a level of complexity that may impact the overall gaming experience. Quick adjustments during gameplay could lead to interruptions or fluctuations in performance.

    Despite these challenges, many developers are actively working on solutions. Some games do employ automatic detection of hardware specifications to recommend optimal settings. However, striking the right balance between customization and simplicity remains an ongoing challenge in the dynamic world of game development.
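
A first pass at the detection being discussed could look like this. The VRAM probe and the thresholds are invented for illustration; as described above, the hard part is calibrating them per title:

```cpp
#include <cstdio>
#include <string>
#include <thread>

// Hypothetical probe; a real engine would query DXGI/Vulkan for this.
unsigned detectVramMiB() { return 8192; }

std::string pickPreset(unsigned cores, unsigned vramMiB) {
    // Illustrative thresholds only; real presets are tuned per game.
    if (cores >= 12 && vramMiB >= 12288) return "Ultra";
    if (cores >= 8  && vramMiB >= 8192)  return "High";
    if (cores >= 6  && vramMiB >= 6144)  return "Medium";
    return "Low";
}

int main() {
    unsigned cores = std::thread::hardware_concurrency(); // logical CPUs
    std::printf("Suggested preset: %s\n",
                pickPreset(cores, detectVramMiB()).c_str());
    return 0;
}
```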

  • @dIggl3r
    @dIggl3r22 күн бұрын

    Are there *any* apps other than Afterburner to see GPU/CPU usage in games?

  • @megapunkkk
    @megapunkkk9 ай бұрын

    The fun part is that CD Projekt just claimed that, after the update, Cyberpunk 2077 will be way more CPU-heavy.

  • @user-eu5ol7mx8y
    @user-eu5ol7mx8y10 ай бұрын

    Does this mean future graphics cards should have special accelerators for tasks currently done by the CPU?

  • @roklaca3138

    @roklaca3138

    10 ай бұрын

    DirectStorage comes to mind.

  • @bl4d3runn3rX
    @bl4d3runn3rX10 ай бұрын

    I think AM5 is a good investment; I'm hoping for 3 CPU generations on it. So you buy a motherboard and DDR5, which is already very cheap, and you can upgrade twice... not bad if you ask me. An interesting comparison would also be the 5900X vs the 7800X3D with a 3080... can you do that?

  • @onomatopoeia162003

    @onomatopoeia162003

    10 ай бұрын

    At least for AM5, you'd just have to update the UEFI, etc.

  • @Nicc93

    @Nicc93

    10 ай бұрын

    Going from a 5900X to a 5800X3D, I've seen almost 50% higher average fps in some games, but it really depends on the game. Higher clock speeds will benefit some games; others will benefit from the cache.

  • @ozanozkirmizi47

    @ozanozkirmizi47

    10 ай бұрын

    Hard pass on AM5! I say that as an extremely happy AM4 user... I'll check out the Ryzen 8000 and Ryzen 9000 series in the future. I may consider building an additional system if I see components worthy of my hard-earned money. Until then: "Thanks! But, no thanks!" I'm good...

  • @SrApathy33

    @SrApathy33

    10 ай бұрын

    I got the 5800X3D on launch day. It was a massive upgrade over my 3600, boosting my GTX 1080 in Cyberpunk by 50% more fps. The jump to a 3070 Ti, which should be over twice the GPU, didn't improve Cyberpunk's fps by 50%. That worries me for the longevity of my 5800X3D, which I planned on keeping for several years. The massive depreciation on that CPU doesn't help either (same depreciation as my 3070 Ti, by the way). I could buy a Ryzen 7600/7700 platform with 32GB of DDR5 for the value I lost on my 5800X3D and 3070 Ti in one year.

  • @UKKNGaming

    @UKKNGaming

    10 ай бұрын

    ​@@SrApathy33 Buy a 6800 XT; it'll work a lot better with the 5800X3D. The 3070 Ti is a dying GPU. The 6800 XT is competing against a 3080 Ti right now for way less money.

  • @Amariachiband
    @Amariachiband8 ай бұрын

    10:27 THANK YOUUUU. I've been telling people it's a waste of money to get any high-end GPU before upgrading their CPU.

  • @zzsmkr
    @zzsmkr8 ай бұрын

    I have a Razer Blade with the same specs as your PC, and I got really scared that the seller had ripped me off and something was wrong with the laptop while playing Miles Morales, but it really just came down to the Ryzen 9 5900HX not being able to push the 3080 😢

  • @Bsc8
    @Bsc810 ай бұрын

    It's called bad optimization by devs due to marketing pressure: they get far too little time to deliver a good game experience! UE5 is a CPU eater because it's not optimized at all yet, since it's not well understood yet (just like early UE4 games). It's like running a demo in the engine editor, not a final release.

  • @kizurura

    @kizurura

    10 ай бұрын

    Man, Unreal Tournament 2004 looks gorgeous, and it's a technical miracle that it looked great while running on literally anything at the time. That's optimization.

  • @mimimimeow

    @mimimimeow

    10 ай бұрын

    Most new engines are CPU eaters because a lot of the logic is done on abstraction layers, so devs don't have to program everything manually; that's precisely the purpose of a game engine. Games are also way more complex than they were 10 years ago. It's a tradeoff: if more things were optimized manually, cost and time would go through the roof, money better spent on game content and QC that brings more revenue. Alternatively, you make a simpler game. Take Elden Ring vs Armored Core 6, for example. They both run on the same engine, but Armored Core's linear design and non-organic graphics are easier to optimize and QC, so it runs way better than Elden Ring for a given dev cycle. The reality is that game companies are corporations, and they all have financial targets to meet.

  • @Bsc8

    @Bsc8

    10 ай бұрын

    @@kizurura yes very good example!

  • @Bsc8

    @Bsc8

    10 ай бұрын

    @@mimimimeow I know it works like that, but making content for a game that runs like ass isn't a good thing. Let's say I'm interested in a new game and I'm thinking about buying it because I built a beast of a PC two years ago, but the game can't be played smoothly on my hardware!? I'll pass and probably never play it. What's the point of good ideas/content for new games that people can't enjoy at all? That's the main reason I can't get hyped about anything anymore, and when I'm wishlisting something I always have the fear of not being able to run it. _(edit) The only games I play now are the older ones still waiting in my libraries, plus whatever comes free from the stores or heavily discounted._

  • @mimimimeow

    @mimimimeow

    10 ай бұрын

    @@Bsc8 Heh, ask the genius executives who make those decisions. We should have this, we should have that, because the market research says so; here's the suboptimal budget; deliver it before the next fiscal year. It's OK if it's buggy at release, as long as we hit the revenue target first. If the market analysis was spot on, people will put up with it anyway. Job done.

  • @hatchetman3662
    @hatchetman366210 ай бұрын

    I know I've been struggling with a 3700X. It often doesn't even max out my 2070 Super anymore. I can only imagine how bad it would be with a 3080. You pretty much hit on everything I've been preaching for the past few years.

  • @alaa341g

    @alaa341g

    10 ай бұрын

    Try maxing out the features and graphics settings that are heavier on the GPU; at least that way you can be sure it's working at 99% XD. Fuck the modern gaming market.

  • @CurseRazer

    @CurseRazer

    10 ай бұрын

    Don't even think about it xd. My 4070 Ti with a 3800X only works at 60-70% maximum most of the time. There are instances where it's maxed out, but very few, sadly. I don't even know what to buy next... a 5800X3D or a 5900X... it looks like it doesn't matter.

  • @hatchetman3662

    @hatchetman3662

    10 ай бұрын

    @@CurseRazer Well, the 3700X and 3800X don't perform much differently in games. If I can afford it, my next upgrade is going to be a 5600X3D or 5800X3D. It should help in games regardless. But it is disheartening to see that nothing has gotten better and there are no solutions currently.

  • @jordanlazarus7345

    @jordanlazarus7345

    10 ай бұрын

    I've got a 1070 and a 3900X, and even I'm not topping out my GPU in some games, lol. I'll probably go for the 5800X3D at some point.

  • @hatchetman3662

    @hatchetman3662

    9 ай бұрын

    @@aqcys6165 My PC is as "optimized" as it can be without a faster processor and GPU.

  • @Nico-ci9qb
    @Nico-ci9qb9 ай бұрын

    As someone who doesn't know the exact specifics: if the CPU is struggling with these tasks, would a new specialised component fix the issue? Something like an MPU, a "mesh processing unit"?

  • @AcuraAddicted
    @AcuraAddicted9 ай бұрын

    IDK, my 3800X from four years ago is still chugging along. Honestly, it doesn't make much sense to upgrade to the next gen because of how small the gain is. Even the latest Ryzen generation is barely a 30% gain over mine, and that upgrade would be astronomical, since I'd have to buy a new motherboard and memory too.

  • @andreabriganti5113
    @andreabriganti511310 ай бұрын

    Lowering some options such as "crowd density" can help a lot in games like Spider-Man and Cyberpunk. It isn't the ultimate solution, but it helps. EDIT: It's also worth keeping an eye on the GPU control panel to see what kinds of workloads are assigned to the GPU. I had such issues in a few games with my 5800X3D alongside my 4070 Ti, and a few adjustments solved those minor issues. Hopefully this helps. Have a good day.

  • @CollynPlayz

    @CollynPlayz

    10 ай бұрын

    Which settings do I change in the NVIDIA Control Panel?

  • @andreabriganti5113

    @andreabriganti5113

    10 ай бұрын

    @@CollynPlayz Make sure PhysX is off or, at worst, handled by the GPU instead of the CPU. Then look into "Manage 3D settings" and crank up the video settings. This will NOT benefit the CPU directly, but it will make the GPU do more work, helping to balance the usage between GPU and CPU. After that, lower the CPU's background workload in Windows and make sure power management is disabled. This helped me, but again, it works when the performance difference between CPU and GPU is minor AND the CPU is almost always at 97-98%+. If, for example, you have a Phenom and a 3090, the gap can't be helped much.

  • @richardsmith9615

    @richardsmith9615

    10 ай бұрын

    @@andreabriganti5113 Would you recommend the 5800x3d as an upgrade path from a 5600g for 1440p gaming? Or do you think it's better to hold off for a future socket instead? Currently I'm using an Arc a770

  • @tomomei

    @tomomei

    10 ай бұрын

    Yes, get the 5800X3D; it will give you a massive improvement in your 1% lows and make games stable. Also, the 5600G is a PCIe Gen 3-only CPU, and with the 5800X3D you'll get Gen 4. @@richardsmith9615

  • @mv223

    @mv223

    10 ай бұрын

    @@richardsmith9615 It's worth every bit if you don't want to revamp your whole system, especially with all the issues the new processors are showing. I have the 5800X3D and the 4090, and a lot of the time it will max out the 4090. No need for anything more for a WHILE. Also, I game at 3440x1440 @ 240Hz.

  • @astreakaito5625
    @astreakaito562510 ай бұрын

    I'm building my new 7800X3D system tomorrow, and this is why. Although you've got to remember other bottlenecks can exist: it could be a cache issue, a memory issue, a GPU memory bandwidth issue on the RTX 4000s, and sometimes the engine itself simply can't cope with poor code and will fail to use hardware resources for seemingly no reason. Also, if a thread is maxed, it is maxed; no amount of moar cores will help, since that task can't jump to another thread that's free. That's why even CP2077, which is very well multi-threaded for a video game, still won't use your 5800 at a full 100%.

  • @2528drevas

    @2528drevas

    10 ай бұрын

    I'm skipping this generation and riding my 5800X3D and 6900XT for at least another year. I'm curious to see what AMD has up their sleeve by then.

  • @NippyNep

    @NippyNep

    10 ай бұрын

    bro that can last u years@@2528drevas

  • @Revnge7Fold
    @Revnge7Fold5 ай бұрын

    DUDE, this is so true! I experienced this first-hand. I was on an AM4 platform with a 2700 CPU and a GTX 1070, then upgraded the 1070 to a 3070 (I always skip a gen). But my gaming experience was so bad in the newer games I played, especially Cyberpunk 2077 and Stalker GAMMA: the games were just choppy with lots of frame dips, and I got 40-50 fps max no matter my settings. So I read a Cyberpunk reddit thread about 3070 performance, and some guys with older CPUs were experiencing the same thing! There were examples of guys with weaker CPUs saying that upgrading their CPU made a massive difference. So I decided to get myself a 5800X and OMG... it made a MASSIVE difference, basically DOUBLING my FPS... a CPU upgrade has never made such a big difference in my life!

  • @dedanieldd
    @dedanieldd7 ай бұрын

    Just want to appreciate your mic setup. Despite having the mic that close to your mouth, your voice sounds clear; it doesn't sound choked, compressed or over-filtered like most of the live streamers I've watched.

  • @macronomicus
    @macronomicus10 ай бұрын

    It's good to get a sense of the required hardware up front before making a game purchase, and to see what others are saying. Calling out badly optimized games vocally could give devs some crowd support to push back on management and budget for proper optimization; otherwise they're throwing away millions of potential sales.

  • @thseed7
    @thseed710 ай бұрын

    I think it's good practice to develop games accessible to more than just the top 2-5% of gamers with the highest-end systems. Optimization is important as well: if your new features tank high-end CPUs and GPUs, they probably aren't ready yet.

  • @GmanGavin1
    @GmanGavin19 ай бұрын

    4:10 Higher-resolution textures use more VRAM on your GPU. The only thing the CPU does there is manage what is stored in VRAM and what isn't.
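
For scale, the VRAM cost of a texture is easy to estimate, which is why texture quality mostly pressures GPU memory rather than the CPU; the roughly 1/3 mip-chain overhead is the standard geometric-series result (1 + 1/4 + 1/16 + ... = 4/3):

```cpp
#include <cstdio>

int main() {
    // One 4096x4096 RGBA8 texture, plus its full mip chain (~1/3 extra).
    double baseBytes = 4096.0 * 4096.0 * 4.0;  // 4 bytes per pixel
    double withMips  = baseBytes * (4.0 / 3.0);
    std::printf("~%.0f MiB per uncompressed 4K texture\n",
                withMips / (1024 * 1024));
    return 0;
}
```

That's about 85 MiB for a single uncompressed 4K texture, before block compression; a handful of material sets at that quality eats an 8GB card quickly.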

  • @kennethnash598
    @kennethnash5989 ай бұрын

    Can you try forcing the integrated GPU with max RAM usage versus the discrete GPU, and see if anything changes?

  • @Lust4Machine
    @Lust4Machine10 ай бұрын

    While I agree there's been a trend of poorly optimized games, I don't think the requirements for running games with good graphics should shift between components as graphics technology develops. I would also consider that the massive leaps in GPU performance may have outpaced CPU development.

  • @knockbitzhd

    @knockbitzhd

    7 ай бұрын

    Yeah, I'm running Fortnite at medium/high settings in DX12 with a Ryzen 7 5700X (came out a year ago) and an RTX 4060, and GPU utilization sits at 88%; no matter what I do it won't hit 99% xd

  • @TheIndulgers
    @TheIndulgers10 ай бұрын

    Part of the issue is Nvidia's driver overhead with their software scheduler. Hardware Unboxed did some excellent videos about the severity of this with older CPUs. Now that many newer games are eating up additional CPU resources (doubly so for RT), Nvidia's lack of foresight is compounding the problem. This can be seen in games like Hogwarts Legacy, Jedi: Survivor, and Spider-Man.

  • @giantninja9173

    @giantninja9173

    10 ай бұрын

    Yeah, but Nvidia dropped the charade this gen, showing they couldn't give two s***s about the audience that made them, and went all-in on enterprise AI cards.

  • @PersonaArcane
    @PersonaArcane9 ай бұрын

    You can enable hyper-threading, but this is a parallel-processing issue. Only one core gets used by a game unless it's properly programmed to run in parallel across multiple cores (which can be HARD!).
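
A rough sketch of what "properly programmed" means here, using only the standard library and a hypothetical workload: independent slices fan out across threads, but the combine step (and anything touching shared state) stays serial, and that serial part is what's hard to eliminate:

```cpp
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<int> damage(100000, 1);           // hypothetical per-entity work
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 4;                            // fallback if unknown
    std::vector<std::thread> pool;
    std::vector<long long> partial(n, 0);
    const size_t chunk = damage.size() / n;
    for (unsigned t = 0; t < n; ++t) {
        size_t lo = t * chunk;
        size_t hi = (t + 1 == n) ? damage.size() : lo + chunk;
        // Each thread owns its slice: no shared writes, so no locks needed.
        pool.emplace_back([&damage, &partial, lo, hi, t] {
            partial[t] = std::accumulate(damage.begin() + lo,
                                         damage.begin() + hi, 0LL);
        });
    }
    for (auto& th : pool) th.join();
    // The combine step is inherently serial; dependencies like this are why
    // games rarely scale cleanly across every core.
    long long total = std::accumulate(partial.begin(), partial.end(), 0LL);
    std::printf("total = %lld using %u threads\n", total, n);
    return 0;
}
```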

  • @DiizRupT
    @DiizRupT6 ай бұрын

    Yes, this is an issue for sure. One thing I do is either go 4K or up the resolution scale.

  • @asmongoldsmouth9839
    @asmongoldsmouth98399 ай бұрын

    *The truth is, the difference between ULTRA and mixed HIGH/MEDIUM settings is nearly indistinguishable. ULTRA is a setting for people who don't know what each setting does. I have a 3080 10GB and a 7800X3D, and I have zero issues with AAA titles on HIGH with ULTRA textures.*

  • @joeyazzata29
    @joeyazzata2910 ай бұрын

    What's up Vex. I have a 5800X3D / RTX 4070 Ti, and I run Fortnite on UE5, High preset with Nanite, Lumen and DLSS Balanced, at 1440p 165Hz. I'm targeting a stable 158 fps (165Hz + NVIDIA Reflex cap).

    Over the weekend I had the chance to test a 4080 in my system, using Intel's PresentMon. The 4070 Ti was rendering at 6.3 ms = 158 fps. The 4080 was rendering the game at 5.5 ms, indicating it can run the game at around 180 fps. However, because the 5800X3D can only run the game at 7 ms, I saw just 140-150 fps with both GPUs. The 5800X3D bottlenecked the 4080 to the point of being completely indistinguishable from my 4070 Ti.

    Nanite and especially Lumen GI are extremely CPU-demanding! You also have to consider there's not a ton of detail going on in Fortnite compared to other games; Fortnite might be the best CPU-optimized game we'll see on UE5. Check it out using Intel's PresentMon.
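
Plugging those reported frame times into fps = 1000 / ms reproduces the result: both cards collapse to the CPU's roughly 143 fps ceiling. A quick check:

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    const double cpuMs   = 7.0;        // 5800X3D frame time, as reported above
    const double gpuMs[] = {6.3, 5.5}; // 4070 Ti, 4080
    std::printf("4070 Ti alone: %.0f FPS\n", 1000.0 / gpuMs[0]); // ~159
    std::printf("4080 alone:    %.0f FPS\n", 1000.0 / gpuMs[1]); // ~182
    // Delivered fps is governed by the slower of the two frame times:
    for (double g : gpuMs)
        std::printf("With this CPU: %.0f FPS\n", 1000.0 / std::max(cpuMs, g));
    return 0;
}
```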

  • @joseyparsons7270

    @joseyparsons7270

    10 ай бұрын

    What's the fps difference when you use hardware RT vs software Lumen? And when using hardware RT, is the CPU render time the same?

  • @joeyazzata29

    @joeyazzata29

    10 ай бұрын

    @@joseyparsons7270 It's a small difference, 10-15 fps for me. Either Lumen is already doing most of the legwork or the RT effects themselves aren't that demanding; I'm not sure, but it's very light. I think AMD cards even run the RT better in this game, if I recall correctly.

  • @marcoferri716
    @marcoferri7169 ай бұрын

    What software is used in this video to check all of the stats?

  • @theslicefactor4590
    @theslicefactor45909 ай бұрын

    What's the program you're using to show those stats?
