Why Reviewers Benchmark CPUs @ 1080p: Misconceptions Explained

Science & Technology

Asus: www.asus.com/au/events/infoM/...
Support us on Patreon: / hardwareunboxed
Join us on Floatplane: www.floatplane.com/channel/Ha...
Buy relevant products from Amazon, Newegg and others below:
GeForce RTX 3050 - geni.us/fF9YeC
GeForce RTX 3060 - geni.us/MQT2VG
GeForce RTX 3060 Ti - geni.us/yqtTGn3
GeForce RTX 3070 - geni.us/Kfso1
GeForce RTX 3080 - geni.us/7xgj
GeForce RTX 3090 - geni.us/R8gg
Radeon RX 6500 XT - geni.us/dym2r
Radeon RX 6600 - geni.us/cCrY
Radeon RX 6600 XT - geni.us/aPMwG
Radeon RX 6700 XT - geni.us/3b7PJub
Radeon RX 6800 - geni.us/Ps1fpex
Radeon RX 6800 XT - geni.us/yxrJUJm
Radeon RX 6900 XT - geni.us/5baeGU
Video Index:
00:00 - Welcome back to Hardware Unboxed
02:14 - Sponsor Spot
03:05 - The Testing
04:18 - Assassin’s Creed Origins (1080p Ultra High)
06:10 - Warhammer: Vermintide 2 (1080p Extreme)
07:10 - Watch Dogs: Legion (1080p Very High)
12:20 - Watch Dogs: Legion (4K Very High)
13:48 - Call of Duty Modern Warfare 2 (1080p Basic)
14:59 - Call of Duty Modern Warfare 2 (4K Basic)
15:36 - The Riftbreaker (1080p High)
16:39 - The Riftbreaker (4K High)
16:55 - Mixed Data (13th gen vs 12th gen CPUs)
19:08 - Final Thoughts
Read this article on TechSpot: www.techspot.com/article/2618...
Why Reviewers Benchmark CPUs @ 1080p: Misconceptions Explained
Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
FOLLOW US IN THESE PLACES FOR UPDATES
Twitter - / hardwareunboxed
Facebook - / hardwareunboxed
Instagram - / hardwareunboxed
Outro music by David Vonk/DaJaVo

Comments: 2,700

  • @puokki6225 (a year ago)

    Anyone saying 1080p is "dead" needs a serious reality check. According to Steam hardware stats, ~65% of their userbase still uses it. 1440p is at 11% and 4K at a tiny 2.7%. For reference, 1366x768 is at 5.9%. The high end market, which realistically almost all of the DIY PC market currently is, is very niche even when it comes to gaming in general.

  • @vsaucepog7829 (a year ago)

    Just people looking for confirmation bias lol. 1080p was and still will be king for a long time.

  • @SpartanDusk (a year ago)

    I blame the console advertisements and TVs: consoles advertise running games at 4K, which, while sometimes true, is not always the case. That said, my greedy mind would want a PC that outperforms a console, so a PC with at least a 5600X/12100 and a 3060 Ti, with either a 1080p or 1440p monitor.

  • @dante19890 (a year ago)

    1080p is dead for enthusiasts unless you're heavily into competitive esports with a 360Hz monitor. It's enthusiasts, not the average consumer, who are interested in these benchmarks.

  • @vsaucepog7829 (a year ago)

    @@dante19890 yeah, but that's a very small minority of the community.

  • @lucianocortes8636 (a year ago)

    I bet those same idiots use a 23" monitor to play at 1440p or higher, rofl.

  • @chrishexx3360 (a year ago)

    This video should be classed as a mandatory watch before buying new CPUs. Absolute gold standard content.

  • @mircomputers1221 (a year ago)

    and GPUs

  • @MoChuang343 (a year ago)

    19:48 I also understand why viewers want to see what kind of performance upgrade a new CPU will provide for their specific configuration, but you can quite easily work that out with just a bit of digging. All you have to do is work out how much performance your graphics card will deliver in a chosen game when not limited by CPU performance. This can be achieved by looking at GPU reviews: simply check the performance at your desired resolution, then cross-reference that with the CPU data from the same game. Can you please clip this and post it in all of your video descriptions?
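
To illustrate the cross-referencing described above, here is a minimal Python sketch. The part names and FPS figures are hypothetical placeholders, not measured results; the idea is simply that a pairing delivers roughly the lower of the CPU-limited cap (from a CPU review at 1080p with the fastest GPU) and the GPU-limited cap (from a GPU review at your resolution with the fastest CPU).

```python
def estimate_fps(cpu_cap_fps: float, gpu_cap_fps: float) -> float:
    """The slower of the two caps roughly sets the delivered frame rate."""
    return min(cpu_cap_fps, gpu_cap_fps)

# Hypothetical caps for one game, pulled from separate CPU and GPU reviews.
cpu_caps = {"CPU A": 160.0, "CPU B": 250.0}                  # 1080p, fastest GPU available
gpu_caps = {"GPU X @ 1440p": 140.0, "GPU Y @ 1440p": 220.0}  # your resolution, fastest CPU

for cpu_name, cpu_fps in cpu_caps.items():
    for gpu_name, gpu_fps in gpu_caps.items():
        fps = estimate_fps(cpu_fps, gpu_fps)
        limiter = "CPU" if cpu_fps < gpu_fps else "GPU"
        print(f"{cpu_name} + {gpu_name}: ~{fps:.0f} fps ({limiter}-limited)")
```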

  • @seppomuppit (a year ago)

    mandatory before getting your YouTube benchmarking commenter's licence

  • @TechRIP (a year ago)

    @@MoChuang343 So you want people to do the work for the reviewers? HUB gets paid to do reviews. Hold them accountable for doing them correctly instead of being lazy.

  • @MoChuang343 (a year ago)

    @@TechRIP I would argue it's HUB's job to provide the resources and data for consumers to make an informed decision, and that is exactly what they do. Reviewers cannot possibly make a video on every single possible hardware configuration. Even with just 3 generations of CPU and GPU releases, that would be an insane combinatorial number of possibilities. But by making videos that show the upper limit of a CPU, and the upper limit of a GPU (at given settings), you can easily determine whether a system with that combo is CPU or GPU limited in that game. And with that info, you can pretty quickly determine whether a CPU or GPU upgrade will help.

  • @antonyhermens9985 (a year ago)

    Wow Steve... you have an enormous amount of patience explaining these tests. Thank you for the effort, and don't get burnout, man. Thanks again.

  • @seylaw (a year ago)

    This saves him a tremendous amount of explanations in the comments though. :D

  • @justinvanhorne9958 (a year ago)

    @@seylaw It really doesn't. The new human standard of internet dweller being targeted here is simply incapable of listening. As I stated elsewhere, you could produce 100 of these videos consecutively, and the moment one benchmark gets released the comments would be flooded. This is what happens when you attempt to assign explanation and logic to situations that ask for none. People are stupid; gamers that walk into a shop and drop 6k USD on a prebuilt are by definition ignorant. It's not their fault, and it's not their entire existence, it's just a way of making you understand what you are up against here. With the ever-increasing gap between knowledgeable techies and those who were simply brought up in it, believe me, I can look at my kids and see it. Intelligence, logic, and complex thoughts can take many forms; when things are taken for granted, people just want them "to work". Now if you'll excuse me, that's enough PC time for today, time to go back to the real events of impending financial collapse of "first world" economies and all the other good things that go with capitalistic societies.

  • @seylaw (a year ago)

    @@justinvanhorne9958 Sure, but he can always point to this video. You can't be safe from ignorant people, right? But I support that he tries to explain this, to educate people who are willing to learn and to listen.

  • @justinvanhorne9958 (a year ago)

    @@seylaw That is the unfortunate thing: those educated enough to know better probably aren't passionately spouting off in the comments about something that is the equivalent of putting balls in holes.

  • @bestrum1188 (a year ago)

    Videos like this are so important and I just want to thank you for all the content you produce. This has been my favorite hardware channel for many years and it is so great to see you getting close to the 1mil milestone.

  • @SashimiSteak (a year ago)

    In a nutshell, we are trying to compare the "maximum performance potential" of CPUs, so we need to do our best to create a CPU bottleneck scenario. Edit: to elaborate: the test is an artificial scenario designed to compare, not to show "real world performance". Because CPU/GPU utilisation varies from game to game too much, "real world performance" swings up and down and will never show any meaningful patterns. The CPU you'd want to buy depends wholly on what games you play, so it's impossible for reviewers to tailor-make their review to your needs. "Maximum performance potential" is a far more universally meaningful metric. You can actually extrapolate real-world scenarios by taking the "maximum performance potential" shown here for the games you play, finding a review for your GPU somewhere to see how many frames it can push in those games, then matching the closest-performing CPU/GPU. This is one way to optimise your purchases. But you still need to do some calculations; you can't expect the data to be spoon-fed to you.

  • @jason6296 (a year ago)

    ^^^^^^ ??????? lol

  • @chadhowell1328 (a year ago)

    @roenie I'm sorry, but what?

  • @GewelReal (a year ago)

    @Roenie It will absolutely matter. Games not only get more GPU demanding but also more CPU demanding. Otherwise we'd still be using Pentium IIIs.

  • @araqweyr (a year ago)

    @Roenie It's not like games only use the GPU, or rather, the CPU isn't needed only to shove data into the GPU. There are also game-related computations on the CPU, and they have to be done for each frame. We might not be able to see the difference right now, but if (or rather when) developers start better utilizing the CPU, the gap between CPUs will start to widen.

  • (a year ago)

    @Roenie Here in the PAL region we used to play at 24 fps in 576i, so we've not just come a long way up in graphics quality and resolution, but in framerate as well (I usually target 60 fps on my PC, but my screen can go up to 144). Different starting framerate and resolution in NTSC, but the same history.

  • @Cinetyk (a year ago)

    The thing to understand is that a CPU review/benchmark is one thing, and building a "reasonable" configuration (CPU+GPU) to see what kind of performance you get is another. They're two different kinds of "studies", looking into related but different things. Like Steve said, you can combine CPU and GPU reviews to work out what kind of performance you can get with your desired pairing. I hope you continue to do the "scaling" videos where you see how a CPU scales with increasingly powerful GPUs and vice-versa - this truly makes it possible to get a clear picture of performance.

  • @markhackett2302 (a year ago)

    It is far worse than that. WHAT GPU is there? "Oh, it has to be an NVidia one, because they have like 10x the market share of AMD!!!", and since the speed of a 6600XT and a 3060 is about the same, there won't be an AMD card on the roster. And any balanced system will spend money on a GPU rather than a CPU, and bottleneck to "the CPU is largely irrelevant", because even in "cpu tests", the GPU bottlenecks first. Indeed the GPU is rated not to the CPU but the monitor, so the CPU+GPU combination is 100% irrelevant, it is the GPU+Monitor that should be combined. So it isn't "who buys a low end 120$ CPU to use with the 1600$ 4090", it is "who buys a 4090 but runs it on a 1080p monitor?"

  • @Spido68_the_spectator (a year ago)

    @@markhackett2302 they used the 6900 XT for 1080p testing for a while

  • @aratosm (a year ago)

    No one likes you.

  • @MarcABrown-tt1fp (a year ago)

    @Mark Hackett You simply don't understand that in order to find out how powerful any given Cpu is in single core or multi core performance, you can't be GPU bound. The resolution is irrelevant in Cpu testing.

  • @GeneralFuct (a year ago)

    ​@@MarcABrown-tt1fp the data he presented literally shits on everything you just said

  • @AM-qc3vk (a year ago)

    The Call of Duty Modern Warfare 2 benchmark has a cool feature that shows you how many fps the cpu delivers vs the gpu. Pretty cool to see that data directly in a benchmark and to see where the bottleneck is in your system.

  • @invalid8774 (a year ago)

    Well, it's usually not that hard to spot the bottleneck. If your GPU is at over 95% load, it's a GPU bottleneck. If not, it's most likely a CPU bottleneck.
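
A small sketch of the heuristic described above, assuming you can sample GPU utilisation from any monitoring overlay or logging tool; the 95% threshold and the sample values are illustrative, not a hard rule.

```python
def classify_bottleneck(gpu_util_samples, threshold=95.0):
    """Return a rough verdict from sampled GPU utilisation percentages."""
    if not gpu_util_samples:
        raise ValueError("need at least one utilisation sample")
    avg_util = sum(gpu_util_samples) / len(gpu_util_samples)
    return "GPU bottleneck" if avg_util >= threshold else "likely CPU bottleneck"

print(classify_bottleneck([98, 99, 97, 99]))   # GPU bottleneck
print(classify_bottleneck([70, 65, 80, 72]))   # likely CPU bottleneck
```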

  • @pendulum1997 (a year ago)

    @@invalid8774 What?! My 3070ti is a bottleneck with your logic then?

  • @ungoyone (a year ago)

    @@pendulum1997 Why wouldn't it be? It's not a high-end card.

  • @dnalekaw4699 (a year ago)

    @@pendulum1997 well yeah of course it is

  • @MoltenPie (a year ago)

    @@pendulum1997 Depends on the resolution and settings. Your 3070 Ti can bottleneck even an i5-2500K if you push 8K ultra. And vice versa: if you set 720p with 25% resolution scale on low settings, even an i9-13900KS would probably become a bottleneck with your 3070 Ti.

  • @PK-lk5gs (a year ago)

    Great piece of testing! "The games you play aren't the same games everyone plays" - I'm shocked to realize how this very basic point is not obvious to everyone. Thank you very much, Steve!

  • @flameshana9 (a year ago)

    It is a problem when _everyone_ assumes you need 90+ fps to game.

  • @masudparvejsani (a year ago)

    @@flameshana9 It is a problem when everyone assumes you need to max out everything you see in graphics settings.

  • @TheDude50447 (a year ago)

    As an RTS player I know this better than most. Some people find it hard to believe that I generally favor cpu over gpu performance for gaming.

  • @Machinens (a year ago)

    ​@@TheDude50447 same thing with MMO's, I need more CPU horsepower for VFX and loading into the areas and players as well

  • @4zur3 (a year ago)

    The concept is to never limit the tested part by its counterpart. Always test a GPU with the most performant CPU, and vice versa for CPUs, testing at low resolution so you are not limited by the GPU. This concept has held for decades. Every gamer should take the time to learn and deeply understand it.

  • @r1pperduck (a year ago)

    Takes no time at all, it's a very simple concept and you explained it perfectly.

  • @meurer13daniel (a year ago)

    Agree. If you are entering the pc gaming market, you should at least know how hardware works. It doesn't take time to learn and you will buy the best product for you.

  • @sammiller6631 (a year ago)

    performant? shouldn't that be "performing"?

  • @YurandX (a year ago)

    @@sammiller6631 artist can be performing, cpu is just most performant. idk if its correct but it makes sense to me

  • @meurer13daniel (a year ago)

    @@sammiller6631 Performing = verb. Performant = adjective. "Performant" is the correct word in this context.

  • @The_Noticer. (a year ago)

    I can't believe we're still having this conversation. I'm so glad I'm no longer on computer-related web forums, because I'd have pulled out my hair trying to explain this to people. I've done it so many times, made the same tired arguments... but people see a result they don't like and they immediately revert back to this grug mentality. Same with the "bottleneck" argument. Brutal, just absolutely brutal, trying to tirelessly explain this. Bless your patience; I would have lost that patience long ago.

  • @umamifan (a year ago)

    Computer and tech forums are the worst places to be on the internet. Too many armchair experts who get upset the moment a disagreement happens...

  • @YeaButCanUDoABackflp (a year ago)

    For real. It's not even that complicated.

  • @Bareego (a year ago)

    People are not stupid, they're just not as rich as you are to make that argument. Until you get this, there's no point arguing.

  • @mcbellyman3265 (a year ago)

    @@Bareego They are stupid. It doesn't require money to understand basic logic. In fact, the less money you have, the more important it is to understand this stuff, as it requires you to be more selective in buying the appropriate equipment. Rich people can just buy the best CPU and best GPU and ignore all this.

  • @The_Noticer. (a year ago)

    @@umamifan Just like with most things now, people are hardwired to see people with different opinions as an enemy beyond reproach. There are SOME academic arguments against using 1080p data, such as it not being very good for measuring the ABSOLUTE performance of a specific CPU in a specific title, as increasing the resolution also has a detrimental effect on CPU load - it has to process more geometry data, especially now with ray tracing. However, when used as a RELATIVE test against other CPUs under the same load, it's still perfectly valid, as they are all tested under the same conditions. But when something as simple as what Steve is explaining is lost on these people, the finer points will be lost on them even more so... As I said, I'm glad I no longer dwell on LTT-like forums.

  • @darthdadious6142 (a year ago)

    The absolute most useful info was his recommendation to cross reference cpu vs gpu charts to match up your cpu/gpu combo. I was really wanting to know how to find the maximum GPU my CPU will benefit from. Now I know how to do that. I just wanted a simple chart with all cpu/gpus combos to make it easy. Is that too much to ask? lol, kidding. That would be a nightmare for anyone to compile every time a new cpu/gpu came out... But a quick look, my 3800x wouldn't need more than a 3070ti or a 6800 (16 game average at 137 fps at 1080p). If I upgraded my cpu from my 3800x to a 5800x, I could utilize a 6950 or a 3090 ti (to get a 180 fps 16 game average at 1080p). While I'd have to jump to a 7900xt or 4070 ti to get the 200 fps 16 game average on a 5800x3d. Sadly, I couldn't find any recent benchmarks to see how my 5700xt handled today's games. But based off of the original review of the 3070ti, I'd see a 45% improvement at 1440p or 55% at 4k. But no 1080 numbers, which is what I game at...

  • @TheAllhailben7 (a year ago)

    Honestly, you don't want to be CPU bottlenecked in a game. If you're GPU bottlenecked, the FPS might go from like 90 to 80 sometimes. If you're CPU bottlenecked, you could be bouncing back and forth between 75 and 100 while your gpu struggles to be fully used constantly which is not a good experience. If you're running anything like a 3080/6800xt or better, at least a 1440p monitor is recommended because you get all of the performance you're paying for.

  • @qlum (a year ago)

    While I was fully aware that there is a difference in CPU performance, this really does show it's time to upgrade my aging 8700K @ 5 GHz now that I've upgraded my GPU to an RX 7900 XT. Especially since a lot of the games I play may be more CPU bound than the games regularly benchmarked. However, I do tend to spread out my upgrades a bit so as not to spend a load of money at the same time, so it's probably something that's gonna wait till the middle of the year. The GPU upgrade was definitely more needed (GTX 1080), not only for performance but also because Nvidia's Linux drivers are hot garbage.

  • @MrHeHim (a year ago)

    Generally CPU bottlenecks can be felt/seen in game as hard stutters and GPU bottleneck of course would be lower FPS but still relatively smooth. I personally would match it with a 7700x, if you're lucky enough to live next to a Micro Center they are giving away 32GB DDR5 6000 if you buy one. BUT the motherboards are over 250 so it comes out to around $600 still.. BUT (more butts) they have a $600 bundle with a 7900x, G.Skill 32GB DDR5 6000, and a ASUS B650E-F ROG STRIX GAMING WIFI 😅 I'm personally happy with my 6800 XT and 3700x, so although i almost made the purchase twice i backed out last second because then I'm going to want to get a 4080 or 7900 XTX to match it 🥲

  • @qlum (a year ago)

    The Paradox games I play benefit greatly from X3D. As for Micro Center, I live in the Netherlands, so there is nothing of the sort here. In checking I did notice the 7700X dropping quite a bit in price to €300 in the last few days, and that does look interesting. Still, with the first X3D parts around the corner, I can at least wait for reviews/benchmarks of those. With a decent AM5 motherboard and a decent kit of memory, that would be around €720 including VAT, or roughly $635 USD excluding VAT. Not too terrible either, with no bundles.

  • @MA-90s (a year ago)

    This guy doesn't pull his punches and delivers outstanding content. We need this. Thank you.

  • @RNG-999 (a year ago)

    Doesn't*?

  • @MA-90s (a year ago)

    @@RNG-999 Does not*?

  • @zenstrata (a year ago)

    He's wrong though, because eventually 1080p will be a useless test benchmark, because nobody will use it. Otherwise they should be using 144p or even 480p for 'testing'. If he was entirely right, then he should go to the extreme and be 'more right'. But instead he's only 'partially' right, and that won't last, because eventually 1080p will be consigned to the dustbin of history, just like 144p and 480p are today.

  • @MA-90s (a year ago)

    @@zenstrata "Eventually" doesn't cut it. This is the here and now. Right now, he is 100% correct.

  • @Kuriketto (a year ago)

    @@zenstrata No, he's not wrong, and you've completely missed the point of the video. Testing at 1080p with top-end GPUs gives the audience an idea of how much a given CPU might scale with GPU upgrades years down the line, because it's by far the resolution that most gamers use today and will be using even several years from now, likely even after multiple GPU upgrades. Games will continue to become more demanding, and as an example, somebody building on a budget may need to keep upgrading their GPU just to maintain an acceptable level of performance at 1080p, while their CPU remains untouched. The idea that 1080p might one day fall out of mainstream usage is irrelevant.

  • @JarrodsTech (a year ago)

    Great explanation, I will definitely be linking to this video in a number of my own comments 👍👍

  • @pf100andahalf (a year ago)

    Hi Jarrod!

  • @martineyles (a year ago)

    Only half helpful though. In the desktop space, it's good to test the CPUs with a high end GPU. However, reviewers have acknowledged that there is increased CPU workload when using ray tracing at 4k, so it's possible a low end CPU might not provide the framerate we want at 4k, and the CPU benchmarks at 4k are still useful.

  • @tyre1337 (a year ago)

    always better to link to reviewer that can be trusted and isn't manipulated by nvidia

  • @grizzleebair (a year ago)

    Great video! I did know this, but bless you for doing this video for your newer followers. That is a heck of a lot of repeated testing. For you new followers, stick around, these guys know their stuff and do quite extensive testing. They also have a good working relationship with some other of the top Tech YTers, and aren't afraid to do collabs or refer us to someone they trust if it helps us out or helps them reinforce a point they have made. On another note, I know Australia is big, and this may not be feasible, but it would be nice to see a collab with several of the Aussie Tech YTers. Maybe with just the ones in your corner of Aussie Land. Just a thought.

  • @jaymacpherson8167 (a year ago)

    Thanks Steve for making it clear how the hardware compares using a baseline standard. That apples-to-apples comparison empowers the individual PC builder when mixing and matching their hardware.

  • @Beefy-Tech (a year ago)

    I tested a 4090 at 1440p competitive settings in Warzone 2 with a 5800X3D, to test CPU bottlenecks, and I, a guy with 169 subs, got comments saying the 4090 is not meant for 1440p but for 4K only... as if 360Hz 1440p does not exist, or is it assumed that most professionals play at 4K rather than 1080p/1440p high refresh xD? The point of a CPU test IS to bottleneck it, to see which one's faster or how far in FPS it can take the GPU.

  • @elirantuil5003 (a year ago)

    Probably children. 2080ti was also a "4k only" card, let's see them gaming on 4k with it now.

  • @mrbobgamingmemes9558 (a year ago)

    @@elirantuil5003 especially on Forsebroken (Forspoken)

  • @Threewlz (a year ago)

    360Hz is copium, isn't it? Hell, even 240Hz is kinda overkill. In how many games are you getting 240+ fps (apart from Valorant and CS:GO) without spending thousands of dollars on a PC? 144Hz is enough and should be the norm a few years from now, I believe.

  • @TheGreenReaper (a year ago)

    @@Threewlz They tested this with fighter pilots. 144Hz is not the limit of human perception. 240 Hz is closer. 360 Hz is likely beyond a noticeable return.

  • @BlahBleeBlahBlah (a year ago)

    @@Threewlz ​ projecting much? Sure it isn’t a concern for you (or me TBH). However I understand there are people who want and can afford the hardware to give them super high frame rates @ 1080p.

  • @lawliot (a year ago)

    I buy CPUs based on the number of Xs in its name, no benchmarks needed. But seriously, I see at least one comment on every CPU video (not just on HUB) about 1080p and it felt pointless to bother replying to them.

  • @Eidolon2003 (a year ago)

    XFX RX 7900 XTX

  • @assetaden6662 (a year ago)

    @@Eidolon2003 That's a GPU tho. The most X's a CPU can have is one.

  • @siewkimng1085 (a year ago)

    Great explanation, while I certainly understand the need to avoid GPU bottlenecks while testing CPU performance, it’s great to see it articulated in such a clear manner.

  • @reaperofwaywardsouls (a year ago)

    Great video. I think that including 4k benchmarks is a great way to remind some folks that they don't necessarily need a CPU upgrade (people that game at 4k). No need for the full suite of benchmarking games though, maybe 2 or 3 should get the point across. You may be surprised about the amount of people who look at 1080p graphs and transpose it to 4k in their minds, and then are disappointed when they don't see a comparable performance uplift when they upgrade (and game at 4k).

  • @charlestwoo (a year ago)

    I just want cpu gaming benchmarks to have 3 games shown in 4k so I can see it clearly, that is all. I know 1080p is where u lean on the cpu the most but I still want to see 4k damn it.

  • @Strix-gp9xg (a year ago)

    That is exactly why they should always include a 4K benchmark results when testing CPUs.

  • @jtnachos16 (a year ago)

    I've never understood the confusion around testing cpus at lower resolutions with powerful GPUs. I've ALWAYS understood the reason for it to be that it serves as an indicator of what kind of scaling performance you can expect going forward as the cpu ages, by highlighting the point at which the processor starts to become the bottleneck. This is very important info for those that can't afford to update every couple years and may be sticking with a baseline rig with only minimal upgrading possible for 5 years or more. This is especially a factor with CPUs, because manufacturers tend to change socket requirements there FAR more frequently than any other part gets requirement changes. This is also ignoring that 1080p is still a very common screen resolution for people to stick to, even when on hardware that can in theory handle 1440p just fine.

  • @jtnachos16 (a year ago)

    @Roenie Except it WILL become the bottleneck, because those newer games also hit cpus and other parts harder as well. Your argument here is so deeply flawed it isn't even funny. You are basically arguing that a new gpu that can keep up with the newer game will become a bottleneck faster than an old cpu that can't keep up with increased demands. That is beyond faulty logic. Especially given that graphics improvements are starting to hit limits these days and that you can turn down graphics settings, but your options for reducing cpu util are FAR more limited. CPU replacement to a newer gen is THE most involved upgrade in a system, under normal circumstances, as it often necessitates a new mobo (+ potentially RAM) and even CPU cooler (or at least an updated mounting kit) while you are at it, as opposed to GPU upgrades which are generally plug and play and even power supply upgrades which just involve unplugging things to replug them in with the new PSU. Those who are upgrading systems piece by piece are liable to wind up with RAM and CPU as the oldest parts in their system, the ones showing the most length in the tooth. Knowing how your cpu scales with more powerful graphics cards can be a big, big factor in choice at time of purchase, simply because it is generally much easier to afford 300 dollars here and there over the years than it is 600+ to upgrade cpu, mobo, and potentially RAM all at once.

  • @domm6812 (a year ago)

    Good vid, but I suspect a lot of the people complaining about 1080 testing may not understand the definition of what a bottleneck is in this instance, how the CPU becomes one, and the fact that CPU performance is masked/hidden at higher resolutions. In short ...go back to the very basics so there can be no confusion.

  • @pf100andahalf (a year ago)

    Exactamundo. He needs to make a 60 second short too. It probably wouldn't hurt to make several follow-up videos to this one showing the concept in different ways for those in the back... of the short bus.

  • @UnStop4ble (a year ago)

    My problem with this video is that it could've been at least 50% shorter and still would've gotten the point across, maybe even better because of people's low attention spans.

  • @moldyshishkabob (a year ago)

    @@UnStop4ble Frankly, I think the people with such a low attention span that they don't understand this may not want to put in the effort to understand it in the first place. Because why are they watching a channel that has an average run time on videos that's well beyond 15 min? There's a reason LTT has millions more subscribers than HUB or GN, as S-tier as their testing is.

  • @JACS420 (a year ago)

    @@UnStop4ble Yeah, tbh I've built enough PCs. I've sold graphics cards that, once overclocked, performed like the next-tier non-Ti/Super variant. Meanwhile, other GPUs/CPUs can't handle 150+ MHz OCs. One customer I sent along the way with a 3060 Ti boosting up to 2050 MHz; I warned him, but he popped into my Discord not too long ago claiming it's still kicking.

  • @huggysocks (a year ago)

    Nope I have literally zero respect for people commenting on how people do their work when they haven't bothered to learn the most basic things about the topic. It's like backseat driving from a toddler only dumber cause they are older and should know better.

  • @thisguy317 (a year ago)

    Awesome video guys. We need more videos like this I'd love to see more GPU reviews include old CPUs mixed in to the results, just to get a feel for what the FPS-cost is associated with not upgrading my CPU in tandem.

  • @KevinM0890 (a year ago)

    u can do it on your own. create a cpu limit on your system and u know if your cpu needs to be upgraded or not to reach the FPS numbers u need :)

  • @codyl1992 (a year ago)

    @@KevinM0890 not really because that doesn’t take into account everything. Clock speed and core count isn’t everything for this stuff.

  • @5i13n7 (a year ago)

    I find the 'mixed data' chart very interesting. I would greatly appreciate it if you guys could do a video showcasing the expected gains from historical CPU architectural upgrades at various resolutions in some of the more popular gaming titles, e.g. a 4090 with each of the 2700X, 3700X, 5700X, 7700X, 8700K, 9700K, 10700K, 12700K, 13700K. Even just a couple of games to compare would be great, to see the actual expected gains from a CPU generation upgrade at a user's target resolution. I, and likely many others, tend to stretch my PC build as long as its legs will carry it, and a lot of the charts for new hardware include only the last generation or two to find the generational uplift. Enthusiasts may be upgrading with each generation, but I'd wager more people are considering upgrades from several generations back, and could use this information to choose the most suitable upgrade for their use case.

  • @tortuga7160 (a year ago)

    1080p is still alive and healthy. 65% on steam survey, and LG is coming out with a 500hz 1080p monitor. I'm pretty sure LG did the calculus before deciding to make that panel.

  • @OutrunCid (a year ago)

    The Steam survey is far from representative. Its 1080p numbers are more likely to be pushed upwards by ancient PC's and laptops, than by ultra high framerate monitors. Unfortunately it does not measure the refresh rate used. I think the numbers shown by Steve are more representative for those looking into high-end hardware than those presented by Steam though.

  • @alsamuef (a year ago)

    Moronic product.

  • @Hyperus (a year ago)

    @@alsamuef How so?

  • @Kuriketto (a year ago)

    @@OutrunCid How is the Steam Survey far from representative when it literally just pulls system specs? It might be nice to know what refresh rate is used by most gamers, but the hardware survey largely focuses on computer hardware and baseline software, not in-game settings or monitor makes and models. LG also is likely very aware that extreme refresh rate monitors are a niche category and haven't churned out millions of them, but one also needs to recognize that with today's technology, producing a 500Hz monitor is only achievable at a max resolution of 1080p anyway, and while using a TN display at that. In that context, the Steam hardware survey resolution results are slightly irrelevant, but the survey as a whole can at least inform what percentage of the userbase has hardware capable of driving a 500Hz display to its full potential, though realistically only a handful of games can even achieve 500+ FPS given modern hardware. And as far as benchmarks on this channel go, it has people with ancient hardware, someone like a pro player who wants the absolute best and everybody in between, covered. As Steve pointed out, just do a little cross-referencing.

  • @CelestialGils (a year ago)

    @@OutrunCid Not really. You have the misconception of steam doing the survey automatically on every computer that has steam installed. That's not how it works. The steam hardware survey is random and you have to accept it in order for your system to be part of it. You can trigger it of course, but most people don't do that. You also can see the data more in detail, based on what hardware or resolution makes up that month, not just the change where all the surveys are taken into account.

  • @chrzonszcz323 (a year ago)

    For the future I would suggest including strategy games like Stellaris, RimWorld or Factorio. These are very CPU-intensive games where you will be able to see the difference between CPUs much more easily.

  • @rENEGADE666JEDI (a year ago)

    But in Factorio, processors with 3D cache are in a different league. Although, for example, Star Citizen would open people's eyes to how much this technology gives ;)

  • @RafaHuel (a year ago)

    I'm tired of watching AAA games in CPU comparisons when we know there are indie games with an insanely hungry ass wanting a chonky CPU.

  • @Daeyae (a year ago)

    My R5 1600 and my friend's R5 5600X are night and day in Stellaris; his is orders of magnitude faster.

  • @thentil (a year ago)

    What metric exactly do you want them to show for these games? The productivity charts should give you a good idea of how they'll scale in these games.

  • @saricubra2867 (a year ago)

    @@RafaHuel Indie games don't have good optimization unless they do.

  • @mlqvist (a year ago)

    Thanks for videos like these Steve, they are the one opportunity I have to feel smart. How do people not get this stuff?

  • @ComputerGeeks-R-Us (a year ago)

    Love the work you folks are doing! I've been in IT for decades, including benchmarking and standardizing client hardware platforms for a Fortune 250 for over 20 years. The understanding of the interdependencies and limitations between components isn't exactly straight-forward. I've also run across some really weird inconsistencies that the OEMs couldn't even explain. It's nice to see Hardware Unboxed doing the hard work and sharing their knowledge. Keep up the good work!

  • @jimatperfromix2759 (a year ago)

    I agree with @gregorymchenry1464 and many others that this is both a well-needed and extremely well done video. Most people with IT careers like Gregory and myself understand the general principle espoused in this video very well. But the average Joe or Jane buying gaming hardware typically doesn't have exposure to this general principle of performance analysis. Just to reinforce that Hardware Unboxed is right on target in this video, let me more formally state the general principle. (a) In any computerized system there are typically multiple system components, each of which *might* contribute to slowing down the response time of the entire system to a low enough level such as to disappoint the user(s) of the system. (b) Given any specific configuration of components into such a computer system, it is almost always the case that just one of the several system components contributes the most to slowing down system response time. (c) Within the discipline of computer performance analysis and system tuning, point (b) is generally very useful and is called the "bottleneck principle." (d) The rare exception to having a single bottleneck is actually literally the goal of perfect system configuration, namely to have all potentially bottlenecking components hit bottleneck state at the same system load - that is, you're not overbuying on any given component and all components sort-of run out of gas at the same time. (e) It's fairly hard to perfectly tune a system (as per (d)), yet we can use the bottleneck principle to try to at least ensure that one system component is not "horribly bottlenecked" relative to the other system components. (f) So although configuring computer systems is somewhat of an art (for which you can perhaps hire Gregory to help you out), you can't mess up too horribly if you at least keep the bottleneck principle well in mind. (g) In a computer system designed for gaming, as long as you also take care of important extra issues such as making sure you have enough system memory and making sure your primary disk (plus as many other disks as possible) are SSD disks, then your main decision has to be focused on the two major potential bottlenecking points, namely the speed of the CPU and the speed (and to a lesser extent video memory) of the GPU. (h) In looking at reviews of potential CPU/GPU purchases to configure in your gaming system, look at a wide range of reviews, but take with a grain of salt any reviews that do not properly respect the bottleneck principle as apropos. (i) For CPU and GPU (respectively) reviews in the context of gaming PCs, that means in practice that CPUs should always be tested with "best available" GPUs, and GPUs should always be tested with "best-available" CPUs. (j) Other types of sub-reviews within a broader review article are sometimes useful to provide additional scope, but if you smell that the reviewer doesn't understand the usefulness of the bottleneck principle in doing hardware reviews, run away. (k) A recent example of a review that totally botched the bottleneck principle (and from a normally highly respected reviewer) was when LTT did a gaming oriented review of new 7000X series AMD CPUs, but they didn't have the patience to wait 2 more weeks to get their hands on a new Nvidia 4090 card to do those CPU reviews with, such that the 3090 they used wasn't fast enough for a proper review of those very-fast CPUs, and they came to erroneous conclusions about the lack of speedup given by the new AMD CPUs. 
Bear in mind what this Hardware Unboxed video teaches, and you're much better equipped to understand the implications of all CPU and GPU reviews. I would, however, toss out one somewhat more obscure point that requires a bit more nuance to understand. By nuance I mean that it's a bit more complicated and might be worth additional discussion (whereas the bottleneck principle is a slam dunk). Specifically, what I've got in mind here is the additional recommendation to do primary testing at 1080p - and I would like to modify that slightly. Substitute for 1080p the most-common low-end resolution "of your era" or perhaps do double testing on both the most-common low-end resolution of your era, plus the most-common middle-end resolution of your era. This is a sliding scale across macro time, since in the next era the low-end resolution will be at 1440p, and way down in the future we might see such great CPUs and GPUs that 4K gaming would then become the "minimal" resolution for testing. I mean, if nearly everyone is actually gaming at 2K or 4K, there is no point in differentiating 1080p performance of 1000 frames per second from 1080p at 500 frames per second. That having been said, for this "current era," it looks like testing at 1080p will be fine for at least a while longer. I want to quote in its entirety the comment in this thread by @puokki6225, who says "Anyone saying 1080p is 'dead' needs a serious reality check. According to Steam hardware stats, ~65% of their userbase still uses it. 1440p is at 11% and 4K at a tiny 2.7%. For reference, 1366x768 is at 5.9%. The high end market, which realistically almost all of the DIY PC market currently is, is very niche even when it comes to gaming in general." So in our era, we DIYers can safely ignore the testing needs of the 5.9% still running at 1366x768 (since presumably they want to upgrade to at least 1080p), cover 65% of the current userbase by focusing on 1080p testing, and for those 13.7% of DIYers aiming at either 1440p or 4K gaming, they should be able to apply some reduction factor to guess approximately what 1440p and 4K performance might look like.

  • @andrewcross5918 (a year ago)

    I just wish there were more non-FPS metrics tested, like tick rates in Stellaris or turn times in Civ 6 and other 4X titles. I also wish there was a wider array of game types tested, like late-game ARPGs such as Path of Exile, or MMOs. The kind of load these titles put onto systems is totally different from many AAA-style games, so seeing how these perform would add a lot of breadth.

  • @HexerPsy (a year ago)

    MMOs are particularly tricky to test due to the uncontrollable variable of the other people in the game. No busy city is the same, and testing in an empty field is kinda pointless. As for the rest... idk - if enough people want Civ 6 in the comments, then it'll happen? I guess the reviewer mostly tests the games that the audience wants to see / plays.

  • @lexiconprime7211 (a year ago)

    I, for one, would love MMO testing in raids and dungeons (WoW, FF14, BDO, Gw2, etc...). But I think the reason they don't is because there's no way to eliminate variables that could potentially skew the data.

  • @rk9340 (a year ago)

    @@lexiconprime7211 It's a bit pointless. You just lower the settings where it can run 60/144 fps lows consistently. Things like random texture loading times are influenced by your Harddisk more than anything else and the rest of your system can actually freeze to a halt while waiting for that data.

  • @andrewcross5918 (a year ago)

    @@rk9340 In plenty of MMOs there are no GPU settings that will do that in large raids or PvP, because the CPU is the bottleneck. Same for Stellaris et al. Low tick rates mean you need to have the patience of a saint or stick to smaller maps with fewer other civs for it to remain playable to end game and beyond. Just checking Steam, Civ 6 is 22nd in the top player list, Hearts of Iron 4 (Paradox grand strategy) is 34th, Cities is 37th, Path of Exile is 42nd. All ahead of CP2077 in 54th. Then you have EU4 in 69th, Stellaris in 74th and CK3 in 88th. I don't see Hitman 3 or Rainbow Six Siege or Tomb Raider or Horizon Zero Dawn in the top 100, yet they get tested over far more-played games. Edit: Adding up how many people are playing those 4 grand strategy titles (I believe they use the same base engine), you are in and around the player count of Call of Duty at 80k-ish. So on the whole very, very popular.

  • @lexiconprime7211 (a year ago)

    @@HexerPsy That's kind of why it would have to be primarily a cpu test in controlled environments like raids or dungeons, where the number of players and NPCs is always the same. But even then, you would need a consistent crew of dozens of people all repeating the same actions ad nauseum, which is why I think no one would do it. It'd be VERY useful for someone like me though, since I play a lot of MMOs.

  • @russellmm (a year ago)

    This was a great video! Nice job. I think this also shows why doing productivity testing is so important, as it also isolates the CPUs from the GPU.

  • @zakelwe (a year ago)

    Exactly. All the CPU testing in games this site does is rather wasted, because if it is not GPU limited, who cares if it is 200 or 150 fps, apart from a very small minority? What's more important is how long it takes to do something useful, such as video conversion etc. If you are playing games, then the vastly more important thing is to spend money on the GPU. The 5800X3D was better at gaming but worse at everything else, and cost more. It was just not balanced.

  • @albundy06 (a year ago)

    @@zakelwe Everything you post is a waste. Don't watch it. Don't comment.

  • @spladam3845 (a year ago)

    This was a great idea for a video, and a good resource for new viewers, well done Steve and team.

  • @clickbait0190 (5 months ago)

    Well... choosing a modern game within your preferred genre, comparing the performance of various CPUs and GPUs, and then selecting components that fit together is one of those ideas that seem so obvious once you've heard them, you wonder why you didn't think of it yourself...

  • @dandylion1987 (a year ago)

    Uh oh. You do realize all viewers here have 150IQ or more and know the intricacies of PC technology inside out, right ?

  • @ozicryptoG (a year ago)

    Well neither do they.

  • @Eternalduoae (a year ago)

    Dude, I have an IQ of above 500 at 360p!

  • @godblessbharat708 (a year ago)

    @@Eternalduoae sounds like frame rate

  • @lukeshackleton4775 (a year ago)

    Sounds like a joke that went over a head

  • @stephenbull2026 (a year ago)

    @@lukeshackleton4775 Sounds like a joke which, if you don’t get, means your IQ is the bottleneck. 😂

  • @aaronwhelan1 (a year ago)

    So true!! Moving from an i7-2600K to an R7 7700X with an older 980 Ti felt the same as when I moved from a GeForce 680 to a 980 Ti with the older 2600K.

  • @OutrunCid (a year ago)

    Now that 2600k is an oldie of course. Will be making the switch to the same R7 soon whilst still on an R9 Fury. Not expecting wonders to happen, but I know that in those cases where I was CPU limited, improvements will be experienced. I just need to listen to my GPU right now to know which games these are :D

  • @alexm7777 (a year ago)

    A i7 2600k and a 980 ti would make for a nice budget rig.

  • @Daeyae (a year ago)

    @alex m sadly its just cheaper to get an old ryzen or new i3 most of the time

  • @MalHerweynen (a year ago)

    @@Daeyae Depends what's on the used market; like, rn I can't find any reasonably priced Ryzen parts that aren't like 90% of the price of buying new.

  • @Daeyae (a year ago)

    @Malakai Hawaiian True, in my area a 2600K would be like £15 and an 1155 mobo about £30, and DDR3 is super cheap too. A 1600 is about £30-£55 with a cooler, A320 boards are about £25+, DDR4 is cheap but not as cheap, and Ryzen loves fast RAM so you'd want decent stuff. I guess it also depends how much £15 is to your budget: if you're trying to spend under £300, £15 is 5%, which is quite a bit, but is it worth losing out on 2 cores and a small speedup?

  • @pokemon1666 (a year ago)

    @Hardware Unboxed apparently rebar is not enabled in some games by default and you have to turn it on in nvidia profile inspector, would you take a look at it?

  • @philipreininger2549 (a year ago)

    Really great video, I 100% get where people are coming from with the "unrealistic pairing argument" but for testing it's just a different story than for actually purchasing parts and recommending certain combos. Keep up the great work :)

  • @tkllluigi (a year ago)

    When someone can explain something simply, it means that they fully understand the topic. One of the best hardware education videos, well done!

  • @kodysmith8897 (a year ago)

    I was stuck with a 1080p monitor for some time so I appreciated the information. Now I still think of it as academically interesting, shows historical scaling, and allows me to help inform my non-pc literate friends and family.

  • @earvin4602 (a year ago)

    I agree with everything said here about CPU benchmarks, and I'm still adamant that a CPU "scaling" benchmark is incomplete and misleading without being paired with corresponding GPU scaling benchmarks. Let's take Steve's example about wanting to retire the 8700K: while the 4090 benchmark suggests a 98% uplift potential in Watch Dogs when switching to a 13900K, only 6% will be achieved without retiring the paired 3060 alongside it. And for that information, the presumably meaningless GPU-bound benchmark is actually very valuable.
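
To make the arithmetic behind this point concrete, here is a short sketch with purely illustrative numbers (not data from the video): the same CPU swap produces a large uplift when the GPU's cap is high, and almost none when a mid-range GPU caps the frame rate first.

```python
def delivered_fps(cpu_cap, gpu_cap):
    # Whichever component is slower caps the delivered frame rate.
    return min(cpu_cap, gpu_cap)

# Hypothetical CPU caps (1080p, top GPU) for an older vs a newer CPU.
old_cpu, new_cpu = 80.0, 158.0
# Hypothetical GPU caps at the user's own resolution and settings.
gpus = {"top-end GPU": 400.0, "mid-range GPU": 85.0}

for gpu_name, gpu_cap in gpus.items():
    before = delivered_fps(old_cpu, gpu_cap)
    after = delivered_fps(new_cpu, gpu_cap)
    uplift = (after / before - 1) * 100
    print(f"{gpu_name}: {before:.0f} -> {after:.0f} fps (+{uplift:.0f}%)")
```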

  • @michaelrobinson9643 (a year ago)

    In a world where people say "I feel" as an expression of thought, I'm not surprised many do not understand scientific or engineering rigour. The concept of isolating variables to objectively examine each is too horrible to think about for many I expect.

  • @Encraftonline (a year ago)

    Thanks for adding 1080p in your benchmarks. Most fortnite pros are still using 1080p and likewise for other competitive titles.

  • @fiendhappy6964 (a year ago)

    Yeah, they support 1080p for competitive gameplay.

  • @conkcreet (a year ago)

    @@terraincognitagaming facts

  • @Encraftonline (a year ago)

    @Terra Incognita Gaming Brah, I think you're still lagging 2 years behind. In November 2022 we had the FNCS Invitational LAN, and this year they have the FNCS Global Championship with a prize pool of $10 million.

  • @fiendhappy6964 (a year ago)

    @@terraincognitagaming no bruh , pubg , dota and warzone

  • @CanIHasThisName (a year ago)

    I would like to say that I really appreciate that you also include midrange GPUs in your CPU benchmarks even though it's a lot of extra work for you. It helps the viewer to quickly get an idea where the difference still matters and where it doesn't. Theoretically you could use the presented graph and compare it to another graph that focuses on testing the GPU, but then you also need those benchmarks to be done under the same conditions in the same games and sometimes you'll need more than one outlet for that which really complicates things. And then you have what's seen at 7:53 - based on the 3080/4090 data you'd assume that the 2600X would get you the same result as 8700K with an RTX 3060, except it doesn't.

  • @rocketsurgeon1349 (a year ago)

    Good explanation. And I'm sure there are plenty of videos talking about reasonable pairings of CPUs with GPUs, now or if you're upgrading.

  • @dame666999 (a year ago)

    Hi, I'm newish to PC gaming. I built a PC with an 11700K and a 3080, but have since replaced the 3080 with a 4080 (still on the 11700K for now), and it's running Resident Evil Village at 100 to 117 fps at 4K; my LG CX TV is 120Hz and I cap my fps at 117. Would a 13700K or 7700X be OK for a 4080 for mainly playing at 4K, or would a lower-end CPU be a better choice, or is the 11700K OK in this situation? Any advice would be great as I don't want to waste my money. I know people will say crap choices for my CPU and GPU, but both run great for me.

  • @jdlpeke1 (a year ago)

    A perfect explanation of the difference between CPU performance and CPU+GPU performance, because although they go together, they are worlds apart. You can see perfectly how things change when there is no GPU bottleneck.

  • @matrix3509 (a year ago)

    Seems like most people just refuse to understand what the purpose of benchmarking even is. SPOILERS: benchmarking is about finding out the true performance characteristics of whatever part you're testing, thus it makes sense to eliminate other variables. Benchmarking is the computer equivalent of taking a car to a race track to find its lap time. Then some dipshit comes into the comments and talks about how nobody cares about lap times because the real world has speed limits.

  • @speedracer9132 (a year ago)

    I want to see more channels benchmarking resource intensive games like MSFS2020 instead of low resource games that get hundreds of frames per second

  • @3Dsirius (a year ago)

    What a great video! Congratulations Steve, thank you for the hard work and clear presentation, and keep posting.

  • @kjgasdfduhigsdauf (a year ago)

    Didn’t realize people questioned this practice. Once again shows how little common sense gamers have

  • @mauree1618 (a year ago)

    I guess it’s easy to forget how much of what we know is built on prior knowledge. We need more videos like these for the beginners.

  • @DREAMERkun (7 months ago)

    What budget CPU should I pair with a 7800 XT for 1080p video editing in Premiere Pro?

  • @AirborneHedgehog (a year ago)

    Great video, it was amazing to see how wide the gap in performance is when you get the GPU out of the way. One minor thought: at 11:24, you question how valuable the data from the 13900K vs 13100 with a 3060 is. I would argue that it is valuable when paired with the charts above it: If I'm gaming at 1080p, and all I can afford is a 3060 for the foreseeable future, then I know I'm throwing my money away if I pay for anything more than a 13100. I won't see any benefit unless and until I upgrade the GPU. But to your point: if I wanted to future-proof my build against an eventual upgrade? Yeah, that chart doesn't help much. (I'm in a bit of a weird case in that my preferred genre - flight simulation - is EXTREMELY limited by single-threaded CPU performance, so I did notice a bump when I updated from a Ryzen 3600 to a 5800X when paired with my RTX 2060. Prior to FS2020? I don't think I would have bothered much with the CPU.) Edit: Looks like you address that in the final thoughts. Well done. I'm convinced. 👍

  • @halistinejenkins5289 (a year ago)

    16:56 is why i started watching your channel. not really complaining but this kind of scaling has been lost it seems over the past two years. i love the cpu/gpu scaling content, because it shows you the cut off point for certain platforms at 1440p high frame rate gaming (144-165) with different gpus. i guess it's hard to get it all in, so everyone can't get what they want. dunno, maybe it's just me, but the inner nerd gets excited when i see how far the 8700k can go with today's gpus before you start leaving noticeable performance on the table. regardless, cheers! been watching since the beginning and will continue to do so.🙂

  • @jouniosmala9921
    @jouniosmala9921 Жыл бұрын

    Some graphics quality settings affect the CPU limit significantly in some games. I learned that during the year when I was using 2070 super with i7 920. (My GPU upgrade wasn't for gaming but for a project that needed it, 2060 super was ideal choice but 2070 super had a quieter partner model.) It would be interesting to see if that's actually case for your normal set of games tested. Pick fastest GPU, slowest CPU and check if quality settings affect performance.

  • @cellanjones28

    @cellanjones28

    Жыл бұрын

    Graphics quality settings would impact the CPU by changing how much work it needs to do (I think)

  • @calisto2735

    @calisto2735

    Жыл бұрын

    No shit, Sherlock? Quick! We need to call NASA!

  • @AerynGaming

    @AerynGaming

    Жыл бұрын

    They certainly do. Resolution is a good knob to tweak because it almost never has any kind of meaningful effect on the CPU, but usually massively changes the GPU throughput.

  • @jouniosmala9921

    @jouniosmala9921

    Жыл бұрын

    @@cellanjones28 Yes, some settings do. Others I was able to keep at maximum since they only affected the GPU load. What was interesting is that some graphics settings turned games that had been playable on my old GPU into something completely unplayable, and I had to tune them down to find playable settings, though all my games at the time were older titles. It would be an interesting topic worth further exploration when they're not busy working on hardware that's about to release or has just released.

  • @cellanjones28

    @cellanjones28

    Жыл бұрын

    @@jouniosmala9921 just depends on the resolution. i run 1440P so it's all gpu based.

  • @programmier
    @programmier Жыл бұрын

    Great video once again Steve! You made every thought process very clear! Great work! 😀

  • @Kuriketto
    @Kuriketto Жыл бұрын

    As someone who's still rocking an i7-8700K, just upgraded from a Vega 64 to a 7900XTX and has plans to upgrade my CPU platform to hit a performance target of at least 144 FPS in most if not all modern shooters at 1440p, this video has been fairly vindicating.

  • @darkywarky
    @darkywarky Жыл бұрын

    Thank you for also testing the 8700k. I was wondering if I should upgrade it playing at 1440p with a 3070.

  • @InternetListener

    @InternetListener

    Жыл бұрын

    No. And you don't need a 3070, because new AMD cards with the same or better performance (and higher efficiency, which will save you a fair amount on your electricity bill) are cheaper even than used ones. Just check prices on the 6700 series (a 6800, 6900 or even a 7900 with a proper discount could let you play at 2K or 4K). As long as you can get more than 90 fps consistently, you can always use the upscaling or downscaling options on new GPUs to make sure your GPU works at 100%, even if you'd otherwise be in a CPU-bottleneck scenario in a particular game.

  • @GewelReal

    @GewelReal

    Жыл бұрын

    @@InternetListener AMD cards do not have RTX features tho 🤷

  • @meurer13daniel

    @meurer13daniel

    Жыл бұрын

    I had a 9700F (non-K) and had to upgrade to a 13600K. I don't know how your 8700K is performing now, but I was running into a lot of bottleneck scenarios in modern games with a 3070. Spider-Man, Cyberpunk and A Plague Tale: Requiem are clear examples. For 2023 games I suppose it would be even worse, such as Hogwarts Legacy.

  • @darkywarky

    @darkywarky

    Жыл бұрын

    @@InternetListener I've already had the 3070 for a long time, but I was wondering if I was leaving performance on the table by using an 8700K in combo with a 3070.

  • @khalidzouzal8417

    @khalidzouzal8417

    Жыл бұрын

    I got the 8700 non-K paired with an RTX 3080 Ti, and I basically play any game at Ultra 2160p above 60 fps. Like they already mentioned in the video, the higher your resolution is, the less your CPU matters. I don't play competitively, so I'm okay with 60+ fps at 4K. And it works flawlessly for me.

  • @ferasamro9735
    @ferasamro9735 Жыл бұрын

    To be honest, as enthusiasts we assume all of your viewers understand what you do, but these comments are a good thing because they mean non-experienced people are watching your videos. That's very good news for the hardware community and for you as a whole. Thanks as always for your hard work.

  • @myfakeaccount4523

    @myfakeaccount4523

    Жыл бұрын

    Enthusiast = spends too much money.

  • @riba2233
    @riba2233 Жыл бұрын

    Awesome video Steve... I am shocked at how many people don't get that. And some of us want to know how CPUs perform in esports titles, and whether they can drive 360 or 500 Hz monitors.

  • @2ndtlmining
    @2ndtlmining Жыл бұрын

    Great content as usual. Keep it up mate!

  • @TriXtieR
    @TriXtieR Жыл бұрын

    Great video! I’m currently running a Ryzen 5700G and an RTX 3060, but I’m about to upgrade both my CPU and my GPU. I wanted to do it cheaply and get the best performance possible out of my B550 until Intel drops their next platform, at which point I’ll upgrade my whole system, so I went with a 5800X3D and a 4070 Ti. How do you feel they will run together?

  • @MicaelAzevedo

    @MicaelAzevedo

    Жыл бұрын

    Lol

  • @cosmic_cupcake

    @cosmic_cupcake

    Жыл бұрын

    If you're getting a whole new Intel system next year then why do you need to blow money on a 5800 X3D? Is the 5700G really so bad for you that you can't hold on to it for a bit longer?

  • @TriXtieR

    @TriXtieR

    Жыл бұрын

    @@cosmic_cupcake I plan on handing it down and want them to have a decent setup when I do.

  • @HexerPsy

    @HexerPsy

    Жыл бұрын

    Sounds like a good combo! You might be on the 5800X3D for a long time, depending on the type of game you play, thanks to its cache. But you left out your target resolution. If you target 1080p 60 fps, for example, that's overkill, so we also can't advise you on whether the 4070 Ti is reasonable...

  • @TriXtieR

    @TriXtieR

    Жыл бұрын

    @@HexerPsy That’s what I was thinking, it seemed like a decent combo! I’m going to run 1440p most of the time.

  • @DarkReturns1
    @DarkReturns1 Жыл бұрын

    Can you please do an 8700K revisit? That CPU was game-changing at the time and hugely popular. Five years is a pretty normal time after which to upgrade a CPU, and I know a lot of 8700K owners looking to upgrade this year.

  • @marka5968
    @marka5968 Жыл бұрын

    Great video and an insightful look into CPU testing. This is why I tell people to take CPU testing differences with a lot of thought. You won't notice the difference between a high-end CPU and a lower-end one until years into the future. At that point, the upgrade would be much, much cheaper to do than overspending today to keep the part for 6 years.

  • @MikeSharpeWriter
    @MikeSharpeWriter Жыл бұрын

    This is why I like to see the non-gaming benchmarks results as well. Mind you I often do multi year gaps between upgrades so performance jumps should be noticeable.

  • @flameshana9

    @flameshana9

    Жыл бұрын

    >multi year gaps You make it sound like upgrading every year is the norm. And if it is that's just a horrible waste of money.

  • @1BlinkwithAngels82
    @1BlinkwithAngels82 Жыл бұрын

    Really appreciate you guys making videos like this, because since COVID began there are a lot of new people in the PC gaming space who are almost entirely clueless and borderline delusional with their logic regarding topics such as this - for example, quite a few of the replies to your tweets that I've seen both Tim and Steve facepalm over, LOL.

  • @GamingLovesJohn
    @GamingLovesJohn Жыл бұрын

    I am one of those Steam users that plays at 1080p 165 Hz with a 3900X and a 3080. I haven’t seen the need to upgrade my monitor, as most games I play run at high refresh rates anyway. I find that when switching to 1440p, you still need a higher-tier card. At 1080p, it’s easier for me to hit far higher than 144 FPS.

  • @Outmind01

    @Outmind01

    Жыл бұрын

    Serious question because I have yet to experience refresh rates so high - can you really tell the difference between 144 and 165Hz?

  • @assetaden6662

    @assetaden6662

    Жыл бұрын

    @@Outmind01 Going beyond 144 has diminishing returns. It will give you an advantage, up until the point where you are the bottleneck. There's no reason to buy a 360 Hz monitor when your reaction time is 400 ms.

  • @KontrolStyle

    @KontrolStyle

    Жыл бұрын

    @@assetaden6662 This is a great point: if you're not Shroud or a pro streamer/gamer there's no reason to get 360 Hz. @Outmind01 No, you can't tell a difference.
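
    As a rough illustration of why the step from 144 Hz to 165 Hz is hard to perceive, the frame-interval arithmetic is easy to check yourself. This is just a back-of-the-envelope sketch in Python; the ~200-250 ms reaction-time figure is a commonly quoted ballpark, not a measured value:

        # Time between refreshes (in milliseconds) for common refresh rates.
        for hz in (60, 144, 165, 240, 360):
            print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per refresh")

        # 144 Hz -> 6.94 ms and 165 Hz -> 6.06 ms: the step is under 1 ms,
        # tiny next to a typical human reaction time of roughly 200-250 ms.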

  • @nsacc5600
    @nsacc5600 Жыл бұрын

    This is an awesome video. I have a 12600K with an RTX 4090 and I'm still banging my head over whether I should upgrade the CPU for 4K resolution. Maybe on the Raptor Lake refresh. You guys got any advice?

  • @fr0zen1isshadowbanned99

    @fr0zen1isshadowbanned99

    Жыл бұрын

    No. And don't think about upgrading the CPU and GPU for another 5 years.

  • @jessterman21
    @jessterman21 Жыл бұрын

    Thanks for this, Steve! Explaining that the 99th-percentile framerate is often bound by the CPU would be a helpful addition. You mentioned this, but using a strong enough CPU to maintain your preferred MINIMUM framerate in games is extremely important.

  • @nepnep6894

    @nepnep6894

    Жыл бұрын

    To a degree (basically with most midrange CPUs and up), your RAM and how you configure it make a bigger difference for the minimums/lows than the CPU.
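
    Since "1% lows" come up throughout this thread, here is a rough sketch of how they are commonly derived from captured frame times, assuming a simple per-frame log (capture tools differ in the exact definition they use):

        # Compute average FPS and a "1% low" FPS from a list of frame times (ms).
        # Here the 1% low is the average FPS of the slowest 1% of frames; some
        # tools report the 99th-percentile frame time instead.
        def fps_stats(frame_times_ms):
            fps_per_frame = sorted(1000.0 / t for t in frame_times_ms)
            worst_count = max(1, len(fps_per_frame) // 100)
            avg_fps = sum(fps_per_frame) / len(fps_per_frame)
            low_1pct = sum(fps_per_frame[:worst_count]) / worst_count
            return avg_fps, low_1pct

        # Mostly smooth frames with a few CPU-side spikes (hypothetical data).
        frames = [7.0] * 990 + [25.0] * 10
        avg, low = fps_stats(frames)
        print(f"average: {avg:.0f} fps, 1% low: {low:.0f} fps")

    With these made-up numbers the average barely moves (~142 fps) while the 1% low collapses to 40 fps, which is why a CPU that avoids those spikes feels so much smoother.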

  • @nhkbeast3884
    @nhkbeast3884 Жыл бұрын

    At 1080p my CPU is being used at like 100 percent and the GPU at like 30 percent, but when I switch to 4K both are at 100 percent. Most people don't realise that you guys are actually looking for CPU bottlenecks, and that's why y'all test CPUs at 1080p.

  • @dante19890

    @dante19890

    Жыл бұрын

    the cpu bottlenecks will show up in higher resolutions too but not as pronounced

  • @assetaden6662

    @assetaden6662

    Жыл бұрын

    @@dante19890 Yeah, when the GPU is bottlenecked harder, CPU bottlenecks can only be seen in the 1% and 0.1% lows.

  • @qubes8728
    @qubes8728 Жыл бұрын

    Funny this popped up. I was commenting on another video about my gains in WZ with the 5800X3D. One guy asked how a CPU can increase frames and didn't even know what L3 cache was! Another guy with a 3090 Ti / 5800X3D asked if he should enable XMP. He also wondered if he should reduce VRAM usage in WZ because that's what he saw on KZread. I told him he's got 24 GB of VRAM so it won't be an issue. I think some guys get confused when AMD calls L3 cache "3D V-Cache". They think it's something completely different, or didn't even know all CPUs have L3 cache.

  • @absolutelysobeast

    @absolutelysobeast

    Жыл бұрын

    Just curious, what were your gains in WZ with the 5800X3D? I upgraded to mine a few months back before I was playing Warzone, so I don't have an idea of what I could have expected before. When I run the MW2 benchmark I get a 1 percent CPU bottleneck result. I am running it with a 4080 at 1440p max settings. But I seem to have a lot higher 1% lows than the friends I play with, though they are all on 30-series cards ranging from a 3060 up to a 3090, so it's hard to really get a reference point on what's CPU and what's GPU, you know?

  • @GewelReal

    @GewelReal

    Жыл бұрын

    Doesn't matter what AMD calls it. The average Joe has no in-depth idea about PCs. You could explain to him that the CPU is like a gearbox in a car: a better CPU = longer gears, aka a higher top speed (if the engine, aka the GPU, can drive at such speeds).

  • @qubes8728

    @qubes8728

    Жыл бұрын

    @@absolutelysobeast I was averaging 120 fps with a 5600X in demanding areas; now I'm averaging at least 150 fps. So around a 30 fps gain with the 5800X3D and my 3070 Ti. A 1% CPU bottleneck is good. Jufes from Framechasers noticed higher 1% lows in WZ with the 5800X3D as well. Check out his 5800X3D video. Seems to be a WZ thing with the 5800X3D, but if they have X3Ds then it could be the 4080 combo. Don't stress about lows unless it doesn't look smooth. Your lows are probably higher than most guys' highest fps anyway. Enable the GeForce fps overlay in game and see how the lows go.

  • @Lurker-dk8jk
    @Lurker-dk8jk Жыл бұрын

    Very well presented. Hoping this can clear up how potential differences between CPUs can help when upgrading later down the line when faster GPUs are available. My guide for parts shopping for a new PC involves checking the maximum performance data for each of the desired CPU and GPU, then using the lower of the two resultant framerates to see what my expected maximum performance will be for that combo. The CPU feeds the GPU. The slower of the two is the limiting factor.
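
    A minimal sketch of the cross-referencing approach described in the comment above: take a CPU-limited result from a CPU review and a GPU-limited result from a GPU review for the same game, and the lower of the two is the ceiling for the combo. The hardware names and FPS figures below are placeholders, not benchmark results:

        # Estimate the frame rate a CPU + GPU pairing can deliver in one game.
        # cpu_limited_fps: from a CPU review (low resolution, flagship GPU).
        # gpu_limited_fps: from a GPU review (your resolution, flagship CPU).
        def estimate_fps(cpu_limited_fps, gpu_limited_fps):
            return min(cpu_limited_fps, gpu_limited_fps)

        cpu_data = {"CPU A": 150, "CPU B": 210}   # hypothetical 1080p CPU-test results
        gpu_data = {"GPU X": 90, "GPU Y": 220}    # hypothetical 1440p GPU-test results

        for cpu, cpu_fps in cpu_data.items():
            for gpu, gpu_fps in gpu_data.items():
                limit = "CPU" if cpu_fps < gpu_fps else "GPU"
                print(f"{cpu} + {gpu}: ~{estimate_fps(cpu_fps, gpu_fps)} fps ({limit} limited)")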

  • @sniper0073088
    @sniper0073088 Жыл бұрын

    This data is extremely useful. It makes it very easy to choose the best GPU for a given CPU. This kind of benchmark would be very nice to have in every CPU review; it would make getting the most out of a budget CPU without overspending much easier.

  • @smoketinytom
    @smoketinytom Жыл бұрын

    I don’t understand why some guy decided to test E-core performance in modern games and then went on about how poorly they did versus the P-cores. That was like asking James May to set a lap time on the TG Test Track in a basic 90 BHP Golf diesel designed for motorway miles and normal driving, versus the Stig in a 320 BHP Golf R designed for maximum performance and speed on the track, which is where the comparison was performed.

  • @damasterpiece08

    @damasterpiece08

    Жыл бұрын

    watch more tech videos and you'll understand eventually

  • @smoketinytom

    @smoketinytom

    Жыл бұрын

    @@damasterpiece08 That they’re testing stupid scenarios for content and clicks, when it’s never going to be better or comparable, or done by the people who watch this channel?

  • @Takashita_Sukakoki

    @Takashita_Sukakoki

    Жыл бұрын

    It was an academic benchmark of whether, IF you only play games, you should even care about E-cores.

  • @smoketinytom

    @smoketinytom

    Жыл бұрын

    @@Takashita_Sukakoki And a poorly labelled and executed benchmark that had literally no value aside from “Let’s film it to generate clicks because I’ve got no content”.

  • @Takashita_Sukakoki

    @Takashita_Sukakoki

    Жыл бұрын

    @@smoketinytom I found it interesting. The 12600 (on a cheaper B-series board) has no E-cores; if you don't want to OC a 12600K and you're a gamer, take a 12600/12500 all day. It saves you on RAM/mobo/CPU.

  • @RongGuy
    @RongGuy Жыл бұрын

    First of all I would like to say I agree with your testing methodology. That being said, I think it would be cool to see "upgrade guide" type videos with new GPU generations: e.g. take 5-6 popular CPU's that are a couple of generations old, and benchmark them with 2-3 GPUs people are most likely going to buy (like xx60 (Ti), xx70 (Ti) and xx80).

  • @PainterVierax

    @PainterVierax

    Жыл бұрын

    I remember such videos in the past, testing different GPUs on a single popular midrange CPU to show the bottleneck. But those videos are made when there are fewer launches… and when Steve doesn't have to do such a basic explanation for angry newbs.

  • @allxtend4005

    @allxtend4005

    Жыл бұрын

    Believe me, the i5 was way more popular than the i7 because of how high the prices of Intel's products and motherboards were. And back when the Ryzen 2600X was selling well, most people had an Intel i7-3770K / 4790K and so on, not an 8700 (not the most common CPU among gamers at all).

  • @CertifiedSlamboy
    @CertifiedSlamboy Жыл бұрын

    💡 moment for me in this video. Great stuff.

  • @TostonDePana
    @TostonDePana Жыл бұрын

    Absolutely fantastic video. I had some understanding but this made everything crystal clear. Thank you!

  • @Jansn
    @Jansn Жыл бұрын

    I can feel the comments you guys get; I'm getting the same stuff. Those people will always be there, don't worry, just keep doing what you guys are doing.

  • @OverseerPC
    @OverseerPC Жыл бұрын

    This sort of testing is really important especially for competitive-style gamers, wherein in most instances the CPU IS THE BOTTLENECK.

  • @zoltanlaky4907

    @zoltanlaky4907

    Жыл бұрын

    Which is quite frustrating for gamers like me, who exclusively play GPU intensive single player titles and are happy with 60 to 90 fps. I skim CS:GO, Fortnite, Overwatch, Apex, PUBG and all other similar software in every benchmark. I want to know which CPU is capable of working well with my GPU without being needlessly overpowered, so CPU benchmarks are generally not useful for me. That is not to say I don't understand why they are set up the way they are, it's just not useful for me.

  • @shagadelic3000

    @shagadelic3000

    Жыл бұрын

    @@zoltanlaky4907 Then go watch "gpu" benchmarking instead of "cpu" benchmarking, 5head.

  • @OverseerPC

    @OverseerPC

    Жыл бұрын

    @@zoltanlaky4907 In your case mate, just get a decent CPU (modern 6-8 cores) and then dump most of your budget on the GPU. Personally, I like playing at 60-100 fps high fidelity on single player games because it gives a more 'realistic' feel for immersion. Getting more frames is good but sometimes I feel like it takes out the immersion. For me, high FPS really matters when gaming multiplayer or competitively wherein you don't want to enjoy the graphics but focus more on winning with less visual noise and faster response.

  • @assetaden6662

    @assetaden6662

    Жыл бұрын

    @@zoltanlaky4907 An overpowered CPU will serve you longer than one that's merely "good enough for 90 fps". It's future-proof, just like the i7-8700K shown in the video. You will hit a bottleneck by year 5, but by then you'll have enough money to just purchase a better CPU, or be happy with its performance.

  • @whosnick6506
    @whosnick6506 Жыл бұрын

    I would like to see ray tracing focused CPU benchmarks, kind of like you did with Spider-Man, but with more games that are known for becoming CPU-limited with RT turned on. Many non-RT games are mostly GPU limited at higher resolutions, but games like Spider-Man, Callisto Protocol, Cyberpunk, The Witcher, Dying Light 2 etc. are known to be CPU-heavy with RT on. It would be interesting to know whether some CPUs are "better" at RT than others or whether it just scales with rasterization performance.

  • @ZeroUm_
    @ZeroUm_ Жыл бұрын

    I'd say we're having an XY problem: what the people asking for a matching CPU and GPU benchmark really want to know is the best match of the two components so they can get the best performance for their budget. Most reviewers do proper CPU and GPU benchmarks, but they usually don't give direct answers about which GPU matches well with a given CPU.
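
    A toy sketch of the "best match for the budget" idea raised above: brute-force every CPU + GPU pair that fits the budget and keep the one with the highest min(CPU-limited FPS, GPU-limited FPS). All prices and FPS figures here are made up for illustration, not real data:

        # Pick the CPU + GPU pair under a budget that maximises the combined
        # frame rate, approximated as min(CPU-limited, GPU-limited) FPS.
        cpus = {"CPU A": (150, 140), "CPU B": (230, 190), "CPU C": (420, 230)}  # (price, fps)
        gpus = {"GPU X": (330, 95), "GPU Y": (520, 160), "GPU Z": (1200, 240)}  # (price, fps)

        def best_combo(budget):
            best = None
            for cpu, (cpu_price, cpu_fps) in cpus.items():
                for gpu, (gpu_price, gpu_fps) in gpus.items():
                    if cpu_price + gpu_price > budget:
                        continue
                    fps = min(cpu_fps, gpu_fps)
                    if best is None or fps > best[0]:
                        best = (fps, cpu, gpu)
            return best

        print(best_combo(900))   # -> (160, 'CPU B', 'GPU Y') with these example numbers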

  • @blasterrr
    @blasterrr Жыл бұрын

    Great video for educating some viewers. For multiplayer titles with non-GPU-bound graphics settings, the performance differences between CPUs can be even higher. Unfortunately I don't know of any reviewers who test the most demanding scenes of a game. As a result it looks like games run super fluidly these days and the GPU is way more important than the CPU, while in reality a demanding multiplayer scenario can run at 1/4 or 1/3 of the framerate you see in the review, especially in the 1% lows. So for multiplayer the CPU is actually way more important if you go for high framerates rather than visuals. It would be great if reviewers could test the demanding multiplayer scenarios of games more. I know that this is very difficult and not very reproducible, but you could do a more subjective gameplay analysis for multiplayer scenarios. Great CPU-bound games are Warhammer: Vermintide 2, PlanetSide 2 and Star Citizen, of course only in the right gameplay scenarios. But those are actually the scenarios that matter, not the ones where there isn't much load on the CPU anyway. It would be super great if you could do a video about this topic, to further fight the misconception that CPUs don't really matter for most games, and to show us multiplayer-focused people how much we would really benefit from a CPU upgrade.

  • @YeaButCanUDoABackflp

    @YeaButCanUDoABackflp

    Жыл бұрын

    True. Hard to measure but very important. Battlefield is the worst example as far as I know.

  • @rjy87

    @rjy87

    Жыл бұрын

    This also applies to CPU-demanding sim racing games, which are often neglected by reviewers. Especially at wide resolutions, the CPU most often becomes the bottleneck.

  • @sceptikalmedia
    @sceptikalmedia Жыл бұрын

    I had the chance to test the 5600 non-X against the 8700K. I sold the 8700K just because of the 1% lows; the AMD part ran smoother every time, with 1% lows more than 20% higher. I don't miss Quick Sync.

  • @mohammedrashid1910
    @mohammedrashid1910 Жыл бұрын

    I have been looking to upgrade from my Ryzen 3200G since I recently got a GTX 1660 Ti. I am really confused between the Ryzen 3600 and 2600. Is the newer generation worth it? (I know the 3600 is not that new.) Should I consider the Ryzen 3500 over the 2600? I also do some 3D modelling, but it's mostly for gaming. 4000+ models are hard to find at good prices here in Pakistan.

  • @thedreadedgman

    @thedreadedgman

    Жыл бұрын

    The 3000 series is much better for gaming than the 2000 series... but the 5000 series is much better than both, and the 5600 is pretty "cheap".

  • @Raglarak
    @Raglarak Жыл бұрын

    I feel that all these 1080p tests still make sense even with the lowest settings, for all the competitive and esports players who game for a living. Some competitive games like COD Warzone at the lowest settings at 1080p still can't give you solid 240+ FPS 0.1% lows, because you're still GPU bound even with a 4090 paired with a 13900K, while other competitive games like Fortnite still can't give you solid 240+ FPS 0.1% lows, but this time because you're CPU bound with a 13900K paired with a 4090. So yeah, thank you Hardware Unboxed for also including 1080p benchmarks.

  • @lukeearthcrawler896
    @lukeearthcrawler896 Жыл бұрын

    I wish you also did performance reviews of 4X games, like the built-in benchmark in Civilization VI. They are perfect for CPU benchmarking, as these games are CPU heavy and less GPU heavy.

  • @saricubra2867

    @saricubra2867

    Жыл бұрын

    CPU heavy? Laughs in PS3 and Switch emulators.

  • @lukeearthcrawler896

    @lukeearthcrawler896

    Жыл бұрын

    @@saricubra2867 Yes, CPU heavy. Run the GS AI benchmark in Civilization 6 and see how many seconds you wait between turns.

  • @antonschaf4088
    @antonschaf4088 Жыл бұрын

    Nice, we haven't seen the i7-8700K in a while. Impressive performance from the i3-13100 compared to the old i7. You really should compare the i7-8700K, i9-9900K and maybe the i9-10900K to 13th gen in the near future! Edit: At the end of CPU benchmark videos you could mention a realistic pairing for the tested CPUs, so the "realistic pairing" dudes get the answer they need.

  • @jonsummers3453

    @jonsummers3453

    Жыл бұрын

    The 8700K has really held up well over time. Six cores is a sweet spot, and that CPU could be overclocked massively, helping keep it relevant for a long time.

  • @hypermatrix8999
    @hypermatrix8999 Жыл бұрын

    Out of curiosity, does higher fps at lower resolutions correlate directly to lower latency/frame times in a GPU-limited scenario? In other words, is it possible for a CPU to get a higher frame rate at 1080p but not have the quickest frame preparation time when you're GPU bound at 4K, so that a higher CPU frequency could outperform something like a 5800X3D that would otherwise have beaten it at 1080p?

  • @junkerzn7312

    @junkerzn7312

    Жыл бұрын

    A frame prep... well, there is not really a frame prep issue. The CPU doesn't prep the whole frame and then hand it off to the GPU. Not with a good game engine anyway. The CPU will break the scene down into static elements, semi-static elements, and MOBs (player controlled, NPCs, and AI controlled). The CPU has almost no work to do prepping the static elements and very little work to do prepping the semi-static elements. The GPU, on the other hand, spends most of its time on the static and semi-static elements and very little time rendering the actual MOBs. This means that the CPU can provide the most up-to-date MOB data just prior to directing the GPU to render the MOBs, which in turn is just prior to the GPU's frame flip. So regardless of the FPS, the MOBs tend to get rendered with up-to-date information just prior to the end of the frame generation and not the beginning of the frame generation. Almost zero latency. However, since games almost universally double-buffer, there can be up to a one-frame delay before it shows up on the display. Hence the latency is approximately (1.0 / FPS) regardless of the resolution, based almost solely on the FPS. At 1080p there are other problems, though. When the CPU becomes a bottleneck, that tends to mean that multiple threads in the game engine are competing with each other... including threads related to processing internet comms, mouse and keyboard action, and updating MOB information. This can lead to far lower determinism even at higher FPS rates. You get better determinism... smoothness and recognition of your inputs... at slightly lower FPS rates where the CPU is not bottlenecking. When the CPU is not bottlenecking, that means it can process mouse, keyboard, and internet comms immediately. So you are right with regards to determinism in the latency... 1080p is not as good. Or more to the point, a bottlenecked CPU is not as good. If one games at 1080p and the CPU is not bottlenecked, then it is just as good, though the lower resolution can cause problems in other ways. --- One also has to take into account the internet comms latency for the entire round-trip reaction time between two players, which is massive. 100 FPS is only 10 ms. Best-case two-player comms latency over the internet is at least 30 ms just by itself (15 ms round-trip for each player), and more typically much higher... 60 ms or so with 30 ms round-trip latencies per player. Meaning that the FPS becomes irrelevant very quickly above 100 Hz in most cases. People always like to believe that they need more, but the reality is... what players need above 100 Hz is determinism, not FPS. Going too high hurts the determinism by allowing glitches that would otherwise be absorbed at lower FPS to create distractions at higher FPS. When comm latencies dominate, FPS latencies become far less important. (Game engines on the client side usually try to predict where a MOB would actually be by projecting movement forward based on the known comms latency, which makes internet play less annoying, but it is still just a guess and might not agree with where the server has marked the MOB's actual position. These guesses don't really confer any advantage to the player; the full 2 x player comm latency is still the limiting factor.) -Matt
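
    To put rough numbers on the frame-rate versus network-latency point above, here is a quick back-of-the-envelope sketch; the 60 ms round-trip figure is only illustrative:

        # Compare latency contributed by the frame rate (roughly one frame with
        # double buffering) against a typical online round-trip time.
        def frame_latency_ms(fps):
            return 1000.0 / fps

        network_rtt_ms = 60.0  # illustrative two-player round-trip figure
        for fps in (60, 100, 144, 240):
            frame = frame_latency_ms(fps)
            print(f"{fps:>3} fps: ~{frame:.1f} ms frame latency, "
                  f"~{frame + network_rtt_ms:.1f} ms including the round trip")

    Past roughly 100 fps the frame contribution is only a few milliseconds against a round trip several times larger, which is the point being made about diminishing returns.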

  • @Skarfar90
    @Skarfar90 Жыл бұрын

    Benchmarks are all about finding system bottlenecks - which component will be the limiting factor for overall performance. *CPU tests:* CPU testing is best done at a lower resolution (like 1080p or even 720p) to eliminate the probability of a GPU bottleneck. This gives a much better overview of which CPUs will provide the highest framerate with the best available GPU. *GPU tests:* These are done at greater resolutions (1440p, 4K or even 8K) in order to eliminate or reduce the probability of a CPU bottleneck. The best available CPU is typically used in these tests.

  • @DavidMiller-dt8mx
    @DavidMiller-dt8mx Жыл бұрын

    Fascinating. While I did understand why you tested with vastly overpowered gpus, I'm a more casual gamer, and this video showed me that I'm good with a lot less power - I also have multiple windows open while playing most games, and the range of cpus I'm looking at should be just fine. Excellent video.

  • @flameshana9

    @flameshana9

    Жыл бұрын

    Having windows open while gaming is irrelevant, just in case you're wondering. They've done a video on that specific concern because so many people think you need 12+ threads as a minimum.

  • @DavidMiller-dt8mx

    @DavidMiller-dt8mx

    Жыл бұрын

    @@flameshana9 I do it all the time, on four. Trust me, if it's taking significant memory and/or thread cycles, it matters.

  • @ChaosPootato
    @ChaosPootato Жыл бұрын

    It boggles the mind that this is still a thing. The CPU testing methodology is very well proven at this point, every single (serious) outlet uses low resolution because that's what makes sense. These people don't want a CPU benchmark, they want a benchmark of their specific targeted complete system and targeted resolution. I'll attribute to ignorance what could very much be attributed to selfishness

  • @CriticalPotato

    @CriticalPotato

    Жыл бұрын

    Excellent name, Potato brethren.

  • @SaltyMaud
    @SaltyMaud Жыл бұрын

    Oh my god thank you for mentioning performance targets. That's exactly how I view it. If I can get the game to run at _at least_ 120 FPS, I'll make the game run at _at least_ 120 FPS, no matter if that means high, medium, or below minimum quality. You can always lower the graphical settings, but you can't make the CPU run games faster than it can. I wish outlets would take this even further, the standard seems to be to test every resolution at ultra everything - I wish 1080p low was a standard test as well just to see what kind of performance target is achievable in the best case scenario. For competitive titles I'd like as high FPS as my monitor can display. 240, in my current setup. Not just average 240, I'd like lows to stay above 240 too for perfectly consistent gameplay. GPU limited data in CPU testing is completely useless unless that happens to be your exact use case and build.

  • @SaltyMaud

    @SaltyMaud

    Жыл бұрын

    On a somewhat related note, 20:48 is a funny anecdote for me. I kept my GTX 1080 through 3 CPUs until a year ago before upgrading my GPU because, like I alluded to earlier, you can always lower your graphical settings to make the game go faster, but if you want more frames from your CPU, you need a more powerful CPU.

  • @DbDBlackJack
    @DbDBlackJack Жыл бұрын

    I love this type of test. I'll always spend as much as I can on a GPU and like to know what CPU can push it. Great job Steve! @24:27 that DOES tell me which CPU is better: the cheaper one. If I have a 3080 already and am looking for which CPU will push it to the limit, I know to get the 13100 (price to performance). At that point, I know if I want any more, I have to upgrade both GPU and CPU, in which case I usually upgrade the GPU and then the CPU. I see great value in these tests, Steve. Thanks again!

  • @okesik
    @okesik Жыл бұрын

    What people also do not realize is that there are plenty of purely CPU-bound games, mostly simulators or physics-heavy games (Space Engineers, Stormworks - practically speaking, most building games have a CPU limit due to draw calls), where you eventually reach a point where more GPU != better regardless of screen resolution. Even some recently released games are still CPU bound on a single thread (OGL/DX11 titles on the render thread, sims on the physics thread, etc.).

  • @tortugatech

    @tortugatech

    Жыл бұрын

    Heavy ray tracing titles like Spider-Man and Crysis Remastered, sims like Microsoft Flight Simulator, and milsims like Arma, Squad and Hell Let Loose are heavily CPU dependent; there's been a proliferation of CPU-heavy games in the last 5 years!

  • @mrbobgamingmemes9558

    @mrbobgamingmemes9558

    Жыл бұрын

    True. I play Cities: Skylines with several mods and the GPU feels almost idle; Teardown is even worse with modded weapons. As for stock/vanilla games (no modding), I managed to drop to 20 fps flying over London City and maybe Tokyo, again due to a CPU bottleneck, in X-Plane 12.

  • @saricubra2867

    @saricubra2867

    Жыл бұрын

    I bought an i7-12700K with DDR5 for music production stuff and console emulators; it also has AVX-512. Performance in those emulators is on par with the best Zen 4 Ryzen 9s, at least going by TechPowerUp's testing and results. The Ryzen 7 5800X3D got destroyed - 3D cache doesn't matter there, raw CPU compute does. Literally, the CPU bottleneck for PS3 games is so big that using the Intel UHD graphics inside the CPU is fine.

  • @assetaden6662

    @assetaden6662

    Жыл бұрын

    Also, people tend to forget that you can be CPU bottlenecked even if RivaTuner shows 30% usage. It means that 6 of your 18 cores are being used while the other 12 are just chilling.

  • @tortugatech

    @tortugatech

    Жыл бұрын

    @@assetaden6662 exactly, you need to look at gpu utilization to see cpu bottlenecks
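
    A rough sketch of the diagnostic heuristic described in this thread: aggregate CPU usage hides single-thread limits, so per-core load and GPU utilisation are better tells. The thresholds and monitoring readings below are arbitrary examples, not values from any specific tool:

        # Crude bottleneck check from monitoring readings (per-core CPU % and GPU %).
        # Six busy threads on an 18-core part can read as under 40% "CPU usage".
        def diagnose(gpu_util, per_core_util):
            if gpu_util >= 95:
                return "GPU limited"
            if max(per_core_util) >= 95:
                return "CPU limited (at least one core/thread is saturated)"
            return "Neither fully loaded (engine limit, I/O or a frame cap?)"

        cores = [98, 97, 96, 95, 90, 85] + [10] * 12   # hypothetical 18-core reading
        print(f"aggregate CPU usage: {sum(cores) / len(cores):.0f}%")
        print(diagnose(gpu_util=70, per_core_util=cores))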

  • @MarkHyde
    @MarkHyde Жыл бұрын

    Controversial ;) - love you taking the trouble to do this. More power to HU!!!

  • @naswinger

    @naswinger

    Жыл бұрын

    glorious hungary!

  • @Oceanborn712
    @Oceanborn712 Жыл бұрын

    Excellent showcase. Thank you for this. I can't count the number of arguments I've had so far with people going "Lol who's gonna play like this? That's not realistic AT ALL!!!!". This video should be linked under each of your future testing videos.

  • @xenon787
    @xenon787 Жыл бұрын

    Cross referencing the same game performance in the CPU and GPU review for my system is such a revelation. Thanks for your incredible computer journalism and testing, you guys are such an incredible resource.

  • @CZ_Aesthetic
    @CZ_Aesthetic Жыл бұрын

    Would love to see 3440x1440 ultrawide benchmarks :)

  • @KevinM0890

    @KevinM0890

    Жыл бұрын

    It's between 4K and 1440p; you can do the math on your own.

  • @CZ_Aesthetic

    @CZ_Aesthetic

    Жыл бұрын

    @@KevinM0890 Doesn't change anything. I'd still love to see 3440x1440 ultrawide benchmarks :)

  • @josephkelly4893
    @josephkelly4893 Жыл бұрын

    Great video, shows that matching your GPU with a similar performing CPU is so important

  • @CyberneticArgumentCreator

    @CyberneticArgumentCreator

    Жыл бұрын

    For a given budget, yes, but its primary purpose is to show why buying a more powerful CPU will give you years more relevant life and how to figure out which CPU is in fact better and by how much. Woe to the people who bought that 2000 generation of Ryzen CPUs instead of an Intel i7. The 8700k still crushes gaming and the 2X00 Ryzens are slower per core than the older 7700k. That's the kind of information a good CPU benchmark can expose so people spend their money wisely. Typically, people roll their rigs forward and only buy a "full rig" of components on the first build. So knowing which CPU is the fastest is hands down more important to the person upgrading their socket and CPU than knowing what GPU to pair it with. They just want the fastest CPU for their budget at the time of buying it. Then a few years later they will want to know the fastest GPU for their budget. Rinse, repeat.

  • @aliancemd

    @aliancemd

    Жыл бұрын

    Wouldn't you also match a monitor? Once you spend $2k+ on GPU + CPU, wouldn't you switch away from 1080p?

  • @HammerHand83

    @HammerHand83

    Жыл бұрын

    @@aliancemd Yes, and that's exactly the point of this video: choose your target FPS, which is often determined by your (future) monitor's refresh rate, and then select your desired CPU and GPU based on your budget. The thing is there are hundreds of different combinations and a tech channel like HU can't possibly cover all that. Ultimately, some of the thinking and decision-making is left to the individual.

  • @pixels_per_inch

    @pixels_per_inch

    Жыл бұрын

    Always go for a better CPU if you can. You always want to be GPU-bound but never CPU-bound. Lower framerate is much better than stuttering, hitching or any sort of inconsistencies with the framerate. Not to mention Nvidia's Reflex relies on your CPU as well: the lower the CPU time, the more you benefit from Reflex.

  • @c99kfm

    @c99kfm

    Жыл бұрын

    @@CyberneticArgumentCreator Wouldn't someone who went for a 2000 generation Ryzen be able to upgrade to a 5800X3D today? For WAY cheaper than someone who's now looking for an entirely new platform? How's that "woe"-ful?

  • @hillardbishop8755
    @hillardbishop8755 Жыл бұрын

    @Hardware Unboxed My rule of thumb, if I have to make a compromise, is to go for a higher tier on the CPU because it is easier to swap out the graphics card later. It doesn't matter how powerful your graphics card is if it is being limited by your CPU. Ideally, I want to match the CPU and GPU for the build, but sometimes due to budget constraints one must compromise while minimizing the tradeoffs and keeping future upgrade paths in mind. I agree: 1080p testing shows which CPU performs better, GPU testing shows which graphics card performs better. With those results in mind, one is better equipped to pair them appropriately in a build.

  • @TheGerudan
    @TheGerudan Жыл бұрын

    The only thing I'm missing in those CPU tests is tests with RT on. It might sound contradictory, but RT generally increases the performance demand on the CPU as well.

  • @SiberdineCorp

    @SiberdineCorp

    Жыл бұрын

    Very true, I've only seen Digital Foundry test the RT CPU penalty. Cyberpunk 2077 with RT needs at least a 5600X to reach 60 fps.
