3080 12GB vs 4070: The Ultimate Comparison

Science & Technology

The RTX 4070 has been compared to the RTX 3080 in many reviews, but not to the 3080 12GB, which launched a bit later and was honestly a much more interesting GPU if you could find it at a similar price to the 10GB model. These GPUs have the same VRAM capacity, but the 4070 has half the memory bus width, paired with a much larger L2 cache, a more energy-efficient process node, and new features like frame generation. How does the 4070 stack up against a 3080 12GB? Let's find out.
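
For context on the bus-width point above, here is a minimal sketch of how peak memory bandwidth falls out of the numbers; the bus widths and GDDR6X data rates are assumptions taken from public spec sheets, not anything measured in this video.

```python
# Peak memory bandwidth from spec-sheet values (assumed, not measured here):
# GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RTX 3080 12GB": (384, 19.0),  # 384-bit bus, 19 Gbps GDDR6X
    "RTX 4070":      (192, 21.0),  # 192-bit bus, 21 Gbps GDDR6X
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {peak_bandwidth_gbs(bus, rate):.0f} GB/s")
# ~912 GB/s vs ~504 GB/s; the 4070 leans on its much larger L2 cache to offset that gap.
```
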
Buy an RTX 4070: amzn.to/3pm7TW3
Buy an RTX 3080 12GB: amzn.to/3LOPNTU
Test system specs (Resize BAR ON):
CPU: Ryzen 7700X amzn.to/3ODM90l
Cooler: Corsair H150i Elite amzn.to/3VaYqeZ
Mobo: ROG Strix X670E-a amzn.to/3F9DjEx
RAM: 32GB Corsair Vengeance DDR5 6000 CL36 amzn.to/3u563Yx
SSD: Samsung 980 Pro amzn.to/3BfkKds
Case: Corsair iCUE 5000T RGB amzn.to/3OIaUsn
PSU: Thermaltake 1650W Toughpower GF3 amzn.to/3UaC8cc
Monitor: LG C1 48 inch OLED amzn.to/3nhgEMr
Keyboard: Logitech G915 TKL (tactile) amzn.to/3U7FzA9
Mouse: Logitech G305 amzn.to/3gDyfPh
What equipment do I use to make my videos?
Camera: Sony a6100 amzn.to/3wmDtR9
Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
Camera Capture Card: Elgato CamLink 4K ‎amzn.to/3AEAPcH
PC Capture Card: amzn.to/3jwBjxF
Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
Greenscreen: Emart Collapsable amzn.to/3AGjQXx
Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
RGB Strip Backlight on desk: amzn.to/2ZceAwC
Sponsor my channel monthly by clicking the "Join" button:
/ @danielowentech
Donate directly to the channel via PayPal:
www.paypal.com/donate?hosted_...
Disclaimer: I may earn money on qualifying purchases through affiliate links above.
Chapters:
0:00 The 3080 12GB vs 10GB vs 4070 specs
1:10 Which models of GPU am I testing?
1:26 Frame Generation, Ray Tracing Ultra, DLSS Quality Cyberpunk 2077 1440p
2:33 Ray Tracing Ultra (no upscaling) Cyberpunk 2077 1440p
3:40 Cyberpunk 2077 1440p Ultra
4:46 Unreal Engine 5 Fortnite 1440p Epic
5:30 UE5 Ray Tracing Fortnite 1440p Epic RT On
6:12 UE5 Fortnite 1440p High
6:54 UE5 Fortnite 4K Epic
7:36 UE5 Fortnite 4K RT On DLSS Quality
8:19 UE5 Fortnite 1080p Epic RT On
9:01 A Plague Tale Requiem 1440p Ultra
9:39 A Plague Tale Requiem 4K Ultra
10:10 A Plague Tale Requiem 1080p Ultra
10:43 The Last of Us Part 1 1440p Ultra
11:17 The Last of Us Part 1 4K Ultra
11:52 The Last of Us Part 1 1080p Ultra
12:25 Resident Evil 4 Remake 1440p Max Preset (Includes RT)
12:52 Resident Evil 4 Remake 4K Max Preset (Includes RT)
13:21 Resident Evil 4 Remake 1080p Max Preset (Includes RT)
13:52 Forspoken 1440p Ultra-High (Includes RT)
14:16 Forspoken 4K Ultra-High (Includes RT)
14:41 Forspoken 1080p Ultra-High (Includes RT)
15:06 Call of Duty Modern Warfare II 1440p Balanced
15:31 Call of Duty Modern Warfare II 4K Balanced
15:53 Call of Duty Modern Warfare II 1080p Balanced
16:16 Returnal 1440p Epic
16:39 Returnal 4K Epic
17:01 Returnal 1080p Epic
17:25 The Callisto Protocol 1440p Ultra
18:09 The Callisto Protocol 4K Ultra
18:52 The Callisto Protocol 1080p Ultra
19:36 Final Thoughts

Comments: 1,200

  • @abdullahjester9798
    @abdullahjester9798 1 year ago

    last generation's performance at today's prices, Good job Nvidia.

  • @lukasr1166

    @lukasr1166

    1 year ago

    Where I live the 3080 12gb barely exists and is far more expensive than a 4070 the few places where you can buy it. So the 4070 isn't all bad.

  • @puffyips

    @puffyips

    1 year ago

    @@lukasr1166still not worth it.

  • @lukasr1166

    @lukasr1166

    1 year ago

    @@puffyips It's certainly a better deal than overpriced 3000 series cards. 3070s, 80s, 80 12gbs are so overpriced.

  • @raulitrump460

    @raulitrump460

    1 year ago

    @@lukasr1166 4060/60ti should get 3080 perf not 4070

  • @DiggetyDank

    @DiggetyDank

    1 year ago

    ​@Lukas R buy AMD

  • @pf100andahalf
    @pf100andahalf 1 year ago

    The best thing about the 4070 is that it's driving down prices of used 3080's.

  • @Kage0No0Tenshi

    @Kage0No0Tenshi

    1 year ago

    Yeah, thats why I like 4070 🤣

  • @pdmerritt

    @pdmerritt

    1 year ago

    is it? i sure can't tell

  • @kaimojepaslt

    @kaimojepaslt

    1 year ago

    @@pf100andahalf So what if it went down 50? It will add an extra 50 to your power bill lmao, and that adds up every month. You kids don't use your brains properly.

  • @astreakaito5625

    @astreakaito5625

    1 year ago

    Barely, and I don't think it's worth it because frame gen is a big deal you definitely want that option just in case.

  • @pf100andahalf

    @pf100andahalf

    1 year ago

    @@astreakaito5625 What I want first is price reductions. I would much rather have a used $400-450 3080 than a new $600 4070. Better performance for $150 less, and about to be $200 less. So far I haven't needed frame generation since I play at 1440p, and the 3080 is a beast of a card.

  • @commandertoothpick8284
    @commandertoothpick8284 1 year ago

    It aint just about length. Width matters too

  • @weirddylan7721

    @weirddylan7721

    1 year ago

    you say it

  • @puffyips

    @puffyips

    1 year ago

    More so depending on the shorty

  • @arc00ta

    @arc00ta

    1 year ago

    GIRTH is WORTH

  • @megadeth8592

    @megadeth8592

    1 year ago

    ;)

  • @iseeu-fp9po

    @iseeu-fp9po

    1 year ago

    . . . ;

  • @sbtg2688
    @sbtg2688 1 year ago

    Guys, remember that Nvidia initially launched the 4070 Ti as 4080 12GB? So they were originally planning to release this 4070 as 4070 Ti and charge us even more

  • @BlackJesus8463

    @BlackJesus8463

    1 year ago

    Would've been the same result,,, nobody buying. Jensen has lost touch with reality.

  • @shanez1215

    @shanez1215

    1 year ago

    That's probably where that 750 rumor came from lol.

  • @casweeden

    @casweeden

    1 year ago

    The 4070 is really a 4060 ti. The 4070 ti is really a 4070. They just wanted to move their product stack up a belt loop and hope no one noticed. Didnt work for the 4080 "12 gb" and they had to pivot, but they still got what they wanted with the remainder of the stack.

  • @Tripnotik25

    @Tripnotik25

    1 year ago

    AMD will set things straight and release a 7700XT with the performance of 69xx, we already have the 800$ 7900 XT thats a whopping 10% faster than 6950XT.

  • @kornelobajdin5889

    @kornelobajdin5889

    1 year ago

    ​@@Tripnotik25 4 more gigs 10% more performance and 100-150 bucks more expensive. Well pick which one is better.

  • @rarigate
    @rarigate 1 year ago

    Remember, this was supposed to be the 4070 Ti

  • @IsraelSocial

    @IsraelSocial

    1 year ago

    This is a 4060ti

  • @BlackJesus8463

    @BlackJesus8463

    1 year ago

    @@IsraelSocial lolz They're using TI as a whole ass performance tier instead of added value.

  • @math3capanema

    @math3capanema

    1 year ago

    @@IsraelSocial no, it's supposed to be a 4070 ti, and 4070ti is supposed to be 4080 12gb

  • @Some-person-dot-dot-dot

    @Some-person-dot-dot-dot

    1 year ago

    @@alexsmirnoveu Deprived? They didn't "deprive" us of anything lmao. They messed up big time this generation though.

  • @alexsmirnoveu

    @alexsmirnoveu

    1 year ago

    @@Some-person-dot-dot-dot By "deprive" I meant that we won't get GPU that was supposed to be 4070 Ti originally. I agree that Nv messed up. My point was that the 4070 isn't the original 4070 Ti.

  • @mattheww729
    @mattheww729 1 year ago

    I miss the days when VRAM usage correlated with how good the game looked instead of how lacklustre the code is

  • @andersbjorkman8666

    @andersbjorkman8666

    1 year ago

    When running everything in RDR2 on max settings it looks amazing, and it doesn't go over 7GB of VRAM ^^ And some games that look inferior in every way are hogging VRAM like crazy. Lazy and incompetent developers when it comes to optimization.

  • @mattheww729

    @mattheww729

    1 year ago

    @@andersbjorkman8666 I agree with you there. RE4 doesn't look substantially better to me than games I ran on my 2gb 960 yet it uses 10gb

  • @__-fi6xg

    @__-fi6xg

    1 year ago

    what do you expect, nvidia funded cyberpunk...

  • @ICCraider

    @ICCraider

    1 year ago

    @@andersbjorkman8666 You don't even have to compare trash ports to rdr2. Just try comparing them to "the vanishing of Ethan Carter" it'll probably look better than the latest resident evil but will run on a GTX650 just fine. Modern PC ports are a joke. And the guys who ports them are clowns.

  • @countdespin66

    @countdespin66

    1 year ago

    @@mattheww729 RE4 has very high res textures all over the place on two highest tex detail settings

  • @michaellvoltare
    @michaellvoltare 1 year ago

    That 4070 power draw is crazy efficient. But no way I'm giving my money to Nvidia.

  • @bookbinder66

    @bookbinder66

    1 year ago

    As a 1060 owner, I am.

  • @Tripnotik25

    @Tripnotik25

    1 year ago

    Even crazier is that the supposed "Blackwell" Nvidia 5xxx gets another shrink in 2024; the 5090 is to be 300W TDP (vs the 450W 4090), and the 5070 will be a 120W card.

  • @Cristianxf1

    @Cristianxf1

    1 year ago

    Ngreedia

  • @HeLithium

    @HeLithium

    1 year ago

    @@Tripnotik25 possibly single fan 5070?

  • @Tripnotik25

    @Tripnotik25

    1 year ago

    @@HeLithium Cards around 160W and lower tend to have single fan models, 1060, 1070, 2060, 3060. As the shrink will be another power efficiency jump from current Lovelace, 5070 will 99% chance have tiny single fan cards for the ITX builds.

  • @nicoshah6288
    @nicoshah6288 1 year ago

    This is such a cool and informative video. I have a 3080 12GB and there aren't many quality videos like this. Keep doing what you are doing😊

  • @butcherrengar3222
    @butcherrengar3222 9 months ago

    Completely underrated; your breakdowns and even the tests you run are perfect. Great job.

  • @merlingt1
    @merlingt1 1 year ago

    Thank you for doing this comparison. I have a 3080 12GB and all reviewers seem to pretend it doesn’t exist.

  • @davepianist84

    @davepianist84

    1 year ago

    Tbh not many people got it.

  • @lukasr1166

    @lukasr1166

    1 year ago

    Because it was overpriced, still is overpriced and I doubt it had much supply.

  • @rakon8496
    @rakon8496 1 year ago

    To answer your request about wanted content: I would appreciate more educational testing setups that investigate behaviour like the bus width/bandwidth in this video, and awareness of how easily you can leave performance on the table or even screw up your experience in the Nvidia (AMD) control panel... basically an evergreen for a changing audience. E.g. I would appreciate videos explaining single settings in depth and testing their performance/experience implications in several different game engines, alongside the usual comparative testing. That could add to your benchmarking imo. THX💙

  • @Plasmacat91
    @Plasmacat91 1 year ago

    Great content, brother. You have quickly become one of the most influential voices in the community.

  • @vallin411
    @vallin411 1 year ago

    I am glad I got my hands on a brand new PNY 3080 12GB for 7490 SEK at black friday last year, it's about equivalent to $540 US. Was a big upgrade from 1070 and seeing the price/performance from the 40-series I do not regret it one bit. EDIT: Since a lot of people comment on it: The price comparison is with VAT, import fees and other addons for the swedish price. Generally you can take the USD price*14 today to get a rough estimate in the swedish retail prices.

  • @farbenseher2238

    @farbenseher2238

    1 year ago

    You need a beefy cooler though. 375W of power consumption are no joke.

  • @mickaelsflow6774

    @mickaelsflow6774

    1 year ago

    That's about the price of a 4070 @ Webhallen though. So same same. I'm curious: how do you feel the PNY perform? Build quality and noise, mostly.

  • @oxfordsparky

    @oxfordsparky

    1 year ago

    having owned a 40 series for a few months I'd take a 4070 over a 3080 12GB every single time, very similar base performance to the point where you wouldn't notice in reality but with the option of FG and using way less power makes it a no brainer if they are both available at the same time.

  • @Mattiedamacdaddy

    @Mattiedamacdaddy

    1 year ago

    Feels bad my dad bought a 3080 Zotac for 1200 last year and they’ve dropped to like $500 used oof

  • @kaimojepaslt

    @kaimojepaslt

    1 year ago

    @@farbenseher2238 he enjoys beefy power bill also lmao.

  • @bliglum
    @bliglum 1 year ago

    I miss the days when a 70 range card delivered top-tier 80Ti or Titan level performance, for a mid-range price. Now, we get barely 80 range performance, for a top-shelf price.

  • @poison7512

    @poison7512

    1 year ago

    To be fair that didn't really happen until Maxwell.

  • @Noob._gamer

    @Noob._gamer

    1 year ago

    Go get a PlayStation 5 instead of this box with crap stuff inside and RGB lights that cannot get 30 fps at 4K and costs something like $1,000.

  • @rogoznicafc9672

    @rogoznicafc9672

    1 year ago

    @@Noob._gamer no bcz i pirate everything so its still cheaper xD (allegedly tho)

  • @Chironex_Fleckeri

    @Chironex_Fleckeri

    1 year ago

    ​@@poison7512 Yes and it's glossed over. The 8800 GTX and that architecture was Nvidia's big break. I remember people running SLI 8800 GTX to run Crysis (1). But it wasn't until after Fermi that Nvidia really started making some big leaps. Maxwell and Pascal were culmination of many years of R&D. From 2012-2015 Nvidia didn't really have, in hindsight, good product lineups in terms of price to performance. The issue is that Nvidia has gone for margin. It's likely Nvidia is preparing their business for integration into things like national security. SpaceX is another tech company doing the same. Microsoft. AMD seems to be more business as usual in the graphics segment and their long term contracts with customers like Sony, Microsoft, and all the APU devices that have AMD inside. While their footprint is smaller than Nvidia, it just seems like Nvidia doesn't really need consumer confidence in their historical revenue driver. They're acting that way and investors are aware. But Nvidia will flip right back if mining takes off again. Long term they don't want to make a 1070 or a 1080ti type of product available anymore. They're crunching their product mix in a way that resembles smartphones. The graphics card as a 2 year "can you afford it" check.. not so nice, but Nvidia has business opportunities that make them not need us as much. Perhaps there is some technology (chiplets? SOCs? The client device becoming a streaming platform for gaming?) I don't know what they're working on, but I don't think Intel would've done Arc if Nvidia had continued to make everything a Pascal and then spacing out time between generations. This would be friendly to consumers but bad for Nvidia. It's just cold business man

  • @Peppeppukii

    @Peppeppukii

    1 year ago

    ​@@Noob._gamer we all know that console are generally cheaper but they're not so great in some games, like rts or competitive fps..., also you can't really setup some highres music player on it, to your liking and a lots more...

  • @Neonloverx
    @Neonloverx 3 months ago

    Hey, does anyone know if the 4070 has any latency increase due to it using less power? I don't know which stat to look at for this... mainly for competitive FPS games.

  • @jeroendelaat6899
    @jeroendelaat6899 1 year ago

    Hi daniel! I just wanted to say thank you for all the videos. 6 months ago I built my pc based off of your tips and am still happy with the result now. Im commenting now though, because I wonder how much of your audience, like myself, only watch you for a month or so, get their information they need and leave straight after. Thats why I figured an appreciation post is in place. Have a good one!

  • @LeonardPutra
    @LeonardPutra 1 year ago

    Recently I got a secondhand 3080 12GB with 2.5yrs warranty left for ~$400. One of the best purchase for my PC.

  • @thatguyDean05

    @thatguyDean05

    1 year ago

    damn, where did you buy it, bro? The green store?

  • @oxfordsparky

    @oxfordsparky

    1 year ago

    unless you have the original receipt you don't have any warranty at all.

  • @CoCo.-_-

    @CoCo.-_-

    1 year ago

    try undervolting, ampere undervolts so well tbh, my 3090 performs the same as a 3090 ti in games while consuming 320w at most with rt and shit, usually around 300w without if not a little less (stock it was 390w with lower clocks due to power limits) doing a oc ad undervolt at the same time is really nice, my performance seems way more stable as well, i could probably have same performance as stock while consuming 280w or a little under but want that extra 5-8% increase in perf.

  • @LeonardPutra

    @LeonardPutra

    1 year ago

    @@CoCo.-_- yep, mine got a real nice silicon, 800mV 1800MHz, or ‘eco’ mode 750mV 1590MHz. 220 watts on 1440p Hogwarts Legacy with nice framerate.

  • @LeonardPutra

    @LeonardPutra

    1 year ago

    @@thatguyDean05 Yes, the green store. NJT warranty, 3 yr

  • @TRONiX404
    @TRONiX404 1 year ago

    They went from a 384bit memory bus to 256bit on the 4080, looking like another intentional bottleneck.

  • @BleedForTheWorld

    @BleedForTheWorld

    1 year ago

    Cutting costs = more profit.

  • @xxovereyexx5019

    @xxovereyexx5019

    1 year ago

    sometime gpu architecture is much more important than bus width etc

  • @Zettoman

    @Zettoman

    1 year ago

    they have to leave room for a super/ti version

  • @niebuhr6197

    @niebuhr6197

    1 year ago

    Because the new gen G6X at 256b offers comparable bandwidth to 1st gen G6X at 320b, using 1/3 of the power, plus 10 times more of on-die cache? Nephews are hilarious

  • @ians_big_fat_cock5913

    @ians_big_fat_cock5913

    1 year ago

    bit bus doesn't matter as much as total bandwidth. The design of the memory itself will matter more in many cases.

  • @johndelabretonne2373
    @johndelabretonne2373 1 year ago

    Daniel, as usual, I completely agree with your assessment! I'll wait to see what kind of price/performance AMD gives us at 7700 & 7800 levels. I'd probably be most inclined to get a 7900 XT if it were at or under $700...

  • @galacticdoge5996
    @galacticdoge5996 1 year ago

    finally got the video i was looking for! thanks

  • @maag78
    @maag78 1 year ago

    I think this looks great. I paid 900usd for my ROG Strix 3080 10gb and that was 100 less than MSRP. Yes that's the most expensive model but wasn't the cheapest 3080 700? I really don't get peoples issue with the 4070. I'm getting great performance with it in 1440p and it sits at around 180 watts and is cool as a cucumber and extremely quiet.

  • @pascaldifolco4611
    @pascaldifolco4611 1 year ago

    4070 is 95% of the 3080 but with 40% LESS power consumption (200W to 340W), which is quite insane

  • @danmckinney3589

    @danmckinney3589

    1 year ago

    It's a typical Nvidia grift. They spec'd the bus for the 3080 way higher than the memory could actually utilize for better paper numbers that only give you a higher electric bill in practice. The next gen they cut down the bus and drop the tier to show a paper efficiency improvement on what is effectively the same product. I've lost count of how many times I've seen Nvidia pull that play.

  • @onik7000

    @onik7000

    1 year ago

    @@danmckinney3589 200w to 340w - is like 7-7.5 usd per month if you use it 24/7.

  • @hariyanuar8222

    @hariyanuar8222

    1 year ago

    @@onik7000 this is a real calculation?

  • @noThing-wd6py

    @noThing-wd6py

    1 year ago

    @@onik7000 that 140w difference is at 24/7 for 30 days ~45€ per month in Germany. We pay 0,45€/kwh but that ranges from 0,30€/kwh to 0,60€/kwh.

  • @danmckinney3589

    @danmckinney3589

    1 year ago

    @@onik7000 At the average electric rate in the US of about $0.20/kWh, the 140 watt difference is costing $0.03/h. 8 hours of gaming a day puts the additional cost at $7.20 a month. Let's just say you were running at full tilt round the clock. That additional 140W would cost you an extra $260 a year. Now take that $260, divide by 24, then multiply by however many hours of gaming you do a day. That will give you your additional cost. Mind you, that's just the difference between the two. It's not including the $0.04/h you're spending on the first 200W, nor the extra $0.015/h you're paying because both of these GPUs actually operate 50W over TGP. In total, that would leave a 3080 gaming around the clock costing $740 a year to operate. Now, the actual point of my comment was to say that this is how Nvidia operates. They over-spec one generation beyond what the hardware can handle, then they down-spec the next gen. They use the "efficiency increase" as an excuse to keep the price high, knowing full well that the energy saving won't even be noticeable to the average consumer. That's what makes it a grift.
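
To make the arithmetic in this thread easy to rerun, here is a small sketch of the same estimate; the wattage gap, hours per day, and electricity prices are assumptions taken from the comments above, so swap in your own numbers.

```python
# Rough running-cost estimate for a given extra power draw (all inputs are assumptions).
def monthly_cost(extra_watts: float, hours_per_day: float, price_per_kwh: float, days: int = 30) -> float:
    kwh = extra_watts / 1000 * hours_per_day * days  # energy over the period in kWh
    return kwh * price_per_kwh

delta_w = 340 - 200  # approximate 3080-vs-4070 gap cited in this thread

print(f"US (~$0.20/kWh, 8 h/day): ${monthly_cost(delta_w, 8, 0.20):.2f}/month")
print(f"DE (~0.45 EUR/kWh, 8 h/day): {monthly_cost(delta_w, 8, 0.45):.2f} EUR/month")
print(f"US, running 24/7: ${monthly_cost(delta_w, 24, 0.20) * 12:.0f}/year")
```
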

  • @pt6998
    @pt6998 1 year ago

    Saving now for Blackwell next year. Getting a card with atleast 16gb of vram on it. Not making the same mistake I did with my 3070. Hopefully by then, competition from AMD and Intel brings Nvidia back to reality on pricing.

  • @Battleneter

    @Battleneter

    1 year ago

    The 3070 was always suspect right from the start, current consoles can spare up to 10GB (out of 16GB) of shared memory for VRAM after the game data and OS. Around 10GB is basically the upper target for game developers for 4K and below, excluding the odd fringe case 10 &12GB will be fine for the next 3-5 years, it's not going to be the same situation as 8GB.

  • @YashPatel97335

    @YashPatel97335

    1 year ago

    @@Battleneter You are conflating VRAM accounting on consoles and PCs. Remember, a console has only one purpose, to play games, and developers will definitely optimise their game so it runs perfectly on console. PCs, on the other hand, do more than just gaming, so it works differently. 10GB of VRAM on a console can't be considered the same as 10GB of VRAM on a PC; a PC might need more VRAM to perform the same as a console. And that is why game developers are not bothered by VRAM consumption on PCs, and PC ports are launching without proper optimisation. Judging by the current trend of AAA releases, we can get games that will eat more than 12-16GB of VRAM on PC in THIS generation alone, and this VRAM consumption will increase with newer generations of consoles. So yeah, if you have the budget, go for higher VRAM and native performance rather than software like DLSS and FSR. That choice will last you longer. So 10GB of console VRAM doesn't equal 10GB of PC VRAM.

  • @MegaLoquendo2000

    @MegaLoquendo2000

    1 year ago

    ​@@YashPatel97335The situation's so bad that Jedi survivor has a native sub 720p resolution on ps5, I wouldn't be surprised if current gen consoles (ps5 and xsx) end up using the equivalent to low settings before the generation ends.

  • @demetter7936

    @demetter7936

    11 months ago

    Same as me. I'm putting aside £3000 so I can completely upgrade my PC next year, i'll sell my current parts to get back roughly half the cost.

  • @Battleneter

    @Battleneter

    11 months ago

    @@demetter7936 You can argue old consoles also have value, but again consoles are just toys. It's like comparing a screw driver (consoles) to the price of a Swiss army knife (PC), sure both will play games but the PC does a crap ton more.

  • @nguyentrananhnguyen7900
    @nguyentrananhnguyen7900 1 year ago

    hello, what's your take on the news about someone soldering more VRAM to the RTX 3070 turning it to a 16GB VRAM card?

  • @mathesar
    @mathesar 1 year ago

    It's actually kind of crazy how many anti-consumer moves Nvidia has pulled with the 40 series cards. I'm gonna hold off until the RTX 50 series and hope things get better especially with VRAM. also recently learned they're going to release a 4060 Ti 16GB, What a mess lol.

  • @Hexenkind1

    @Hexenkind1

    1 year ago

    Would not be surprised if they also release another 4080 with more VRAM again.

  • @retrofizz727

    @retrofizz727

    1 year ago

    where did you see this information

  • @Elleozito

    @Elleozito

    1 year ago

    Same, I was using a 1660 Ti and got a 3060 Ti for my 1660 Ti + some money. I'll use it until, idk, the 5070 Ti comes out, cuz this generation was dogshit; without the new Nvidia 'frame generation' tech literally nothing changed... but they increased the price, said Moore's law is dead, and also, as you said, VRAM? Bus width? Like, hello Nvidia? Are we stuck in 2018?

  • @memoli801

    @memoli801

    1 year ago

    You know , there are other companies out there?

  • @retrofizz727

    @retrofizz727

    1 year ago

    @@memoli801 no

  • @ARobotIsMe
    @ARobotIsMe 1 year ago

    Great work Owen!

  • @kalestra4198

    @kalestra4198

    1 year ago

    Michael oweeeeen

  • @technologicalelite8076
    @technologicalelite8076 1 year ago

    1:10 I like how the name of the card is now see through with your computer screen, some interesting innovation!

  • @hackmaster4953
    @hackmaster4953 1 year ago

    1:13 wow it's limited edition with transparent box, nice catch Daniel ;D

  • @blackcaesar8387
    @blackcaesar8387 1 year ago

    Now we know for sure that frame gen will never come to 30 series. Its literally the only thing selling the 4070 now. I imagine that would be the same for every lower tier card still to come.

  • @Horendus123
    @Horendus123 1 year ago

    Great video. My suggestion for another video: 3080 10GB vs 3080 12GB on current-gen titles. Has the extra 2GB been of much benefit?

  • @cbdeakin

    @cbdeakin

    1 year ago

    Yes, it would be interesting to see how much the extra 2GB of VRAM helps at 1440p/4K. Particularly in demanding games like the Witcher3 next gen.

  • @adaeptzulander2928

    @adaeptzulander2928

    3 months ago

    It's not just the extra 2 GB, it's the bus size increase, 320 -> 384 bit.

  • @WilFitzz
    @WilFitzz 1 year ago

    Have you considered doing any monitor reviews? I know you have a lot on your plate already, but it would be cool for someone to pick out some value monitor picks!

  • @SongOfLife_
    @SongOfLife_ 1 month ago

    Hi Daniel, I have a question. The 3080 used less VRAM most of the time; does that mean a wider memory bus means less VRAM usage?

  • @YetMoreCupsOfTea
    @YetMoreCupsOfTea 1 year ago

    I recently picked up a used RX 6800 non-xt, which after some mild overclocking is performing in 3080 territory (ray tracing off, of course). It only cost me around 300 USD.

  • @Birawster
    @Birawster 1 year ago

    The most noticeable difference between the 30 & 40 series is the power draw; even though you get the same fps performance, you use less power, which is pretty good to consider for your monthly electric bill.

  • @Bdot888

    @Bdot888

    1 year ago

    Thats the main reason I went for a 4070ti. It will pull a little over half the power of the 3080 and temps are a lot better. That was the main selling point overall for me

  • @Noob._gamer

    @Noob._gamer

    1 year ago

    The performance difference is huge compared to the 30 series; you are comparing a 70-class card to an 80-class card of the last generation. When you compare the 4080 to the 3080 it's a big jump in performance, and the 4070 Ti equals last gen's powerful GPU, the 3090. If you compare the 4090 to the 3090 it is also a big jump in performance. The AMD 7900 XTX also performs similar to the 4080, but prices are higher for that performance; if it came at the same prices as the 30 series it would sell like cupcakes.

  • @lethanhtung4972

    @lethanhtung4972

    1 year ago

    @@Noob._gamer Please remember that the 3070 = 2080ti, the 4070 should be at least on the 3090 level not 3080

  • @Noob._gamer

    @Noob._gamer

    1 year ago

    @@lethanhtung4972I agree

  • @robertcarhiboux3164

    @robertcarhiboux3164

    1 year ago

    We now wait for a 150-watt card at a price of 150/200 dollars that can max out any title at 1080p, just like it used to be.

  • @necro4468
    @necro4468 11 months ago

    My boy just got a 4070 build from a sponsor and was like "I have the best pc in the group now." I have a 3080 12 gb and said nah your just up here with the big dogs

  • @onik7000
    @onik7000 1 year ago

    I don't get that Frame Gen. More FPS, fine. But do I get more "control fps"? If my game is running at 10 fps (laggy as hell) and Frame Gen makes it like 100 fps - what will happen when I move my mouse? Will it lag?
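
A rough way to reason about the question above (this is my own back-of-the-envelope model of interpolation-based frame generation, not anything measured in the video): generated frames are inserted between rendered frames, so the presented frame rate roughly doubles, but input is still only sampled on the rendered frames.

```python
# Back-of-the-envelope model (assumption: frame gen interpolates one extra frame
# per rendered frame and has to hold the newest rendered frame back to do so).
def frame_gen_feel(rendered_fps: float) -> dict:
    frame_ms = 1000 / rendered_fps
    return {
        "presented_fps": rendered_fps * 2,  # what the FPS counter shows
        "control_fps": rendered_fps,        # how often your mouse input is sampled
        "extra_latency_ms": frame_ms / 2,   # very rough: ~half a rendered frame of delay
    }

print(frame_gen_feel(10))  # looks smoother, still feels like 10 fps (plus added delay)
print(frame_gen_feel(60))  # ~120 fps presented with the responsiveness of ~60 fps
```
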

  • @91plm
    @91plm 1 year ago

    this is more confirmation that the rtx 4070 should have been a 4060 TI at most, delivering a 10% worse performance in most scenarios than an rtx 3080 12 gb

  • @raulitrump460

    @raulitrump460

    1 year ago

    4060 because 3060 ti faster than 2080 super

  • @pdmerritt

    @pdmerritt

    1 year ago

    that was NEVER going to happen....think about it....we had the 4080 12GB before it became the 4070ti so this card was ALWAYS slotted for the 4070 position no matter whether it was spec'd more like a 4060ti or not.

  • @andreabriganti5113

    @andreabriganti5113

    1 year ago

    With those specs and performances, this is actually the 4060 TI and should be priced below 499$.

  • @IsraelSocial

    @IsraelSocial

    1 year ago

    It was a 4060 Ti, and the 4070 Ti is the 4070, but they couldn't justify the price rise! Imagine buying the 4070 for $800 😂 That is what is actually happening, but everybody thinks they are buying the Ti version.

  • @pdmerritt

    @pdmerritt

    1 year ago

    @@andreabriganti5113 oh I agree it should be 499 or below but like it or not....this is definitely what we got as a 4070 this time around. Nvidia got WAYYYY greedy!

  • @MrIsmaeltaleb85
    @MrIsmaeltaleb85 1 year ago

    Love my 3080 12GB, Suprim X model. Very good silicon in mine. Been running it since September 2022 at 1980MHz @ 0.875V, +1200 on memory. Pushed to its limits at 430W it can hold 2295MHz in Unigine Heaven 4K.

  • @xxovereyexx5019

    @xxovereyexx5019

    1 year ago

    The 3000 series is very power hungry; the 4070 only needs ~200 watts.

  • @Kage0No0Tenshi

    @Kage0No0Tenshi

    1 year ago

    Nice undervolt for 1980MHz. I stick to stock voltage, cap it to stock, and was able to overclock to a stable 2115MHz on my RTX 3070 MSI Gaming X Trio; sadly Nvidia caps the wattage at 255W and I cannot pass 260W whatever I do. Unigine Heaven is pretty bad nowadays; I would recommend using Fire Strike at least, or even Time Spy. I can do a stable 2200MHz or so on Unigine Heaven too, but it does not stress the GPU. Lucky you that your GPU can pull 430W; mine is stuck at 255W and below.

  • @Kage0No0Tenshi

    @Kage0No0Tenshi

    1 year ago

    4070 is taking about 50W less than rtx 3070. Not big deal and 4070 have really bad overclocking potential vs 4080 and 4090

  • @franks8127

    @franks8127

    1 year ago

    I have a 3080 12G that takes an undervolt at 0.835V / 1875 MHz and plus 500 on the memory... it uses about 280W and is really stable... it's a Gigabyte OC card.

  • @MrIsmaeltaleb85

    @MrIsmaeltaleb85

    1 year ago

    @xXOverEyeXx i have 4 profiles in MSI Afterburner. If i run the efficient mode (for my kids) 0.825v @1860 with stock memory i can keep it around the 200w in most games. Not far off the efficiency of a 4070 with about the same performance. Mind you, you can also undervolt a 4070.

  • @charlesgoin8217
    @charlesgoin8217 4 months ago

    I appreciate this video. I have an EVGA 3080 12GB FTW Ultra Hybrid and wasn't sure if it was going to be worth going up. Even though this isn't a comparison between the 4070 Super and mine, it gives me something comparable to work with. I think I will wait till the 50xx series comes out, as it seems you have proven that the big jump would be the 4090, and if I am going to get a 4090 I will get a hybrid... and, well, wait for the price to come down on that.

  • @diablo29
    @diablo29 1 year ago

    wait, what issues does the MSI Ventus X3 cooler design have for the RTX 4070?

  • @soppingclam
    @soppingclam 1 year ago

    The 12gb 3080 was a significant upgrade from the 10gb version. Lots of differences

  • @rohanb2711

    @rohanb2711

    1 year ago

    It's gonna age way way better than 10 gb model

  • @Kage0No0Tenshi

    @Kage0No0Tenshi

    1 year ago

    It's exactly the same; the only diff is the VRAM, and maybe it's pushed higher on stock clocks.

  • @WayStedYou

    @WayStedYou

    1 year ago

    ​@@Kage0No0Tenshi extra bus width 2gb extra vram more cores and SMs

  • @IsraelSocial

    @IsraelSocial

    1 year ago

    5% for 17% more money no thanks

  • @nicane-9966

    @nicane-9966

    1 year ago

    @Jesus is Lord Wrong. Wider bus, the same as the 3090. That's why it literally performs toe to toe with the 3080 Ti...

  • @karehaqt
    @karehaqt 1 year ago

    Tbh the only thing I see as a big difference is the power draw. DLSS 3 isn't a big thing for me as I never turn it on as it just doesn't feel good, personally it feels janky as hell so I only ever use DLSS for it's image reconstruction, not the frame gen.

  • @Greez1337

    @Greez1337

    1 year ago

    Wow. You're the only person I've see remark the jankiness of DLSS3. A lot of people think it's gonna save them from a 40fps stutterfest experiences in the AAA goyslop released now instead of exacerbate the input delay compared to the smoothened frame rate they see.

  • @CoCo.-_-

    @CoCo.-_-

    1 year ago

    @@Greez1337 most of the people using it are 4090 owners playing at 4k or 4070 to 4080 owners playing 1080p mostly and some 1440p with the base fps being atleast around 90fps, its not surprising they don't notice the jank from it, i bet they would if the base fps was 60 or lower tho, honestly it just reminds me of how motion blur makes it look like more frames xD

  • @karehaqt

    @karehaqt

    1 year ago

    @@Greez1337 For me it feels the same as turning on motion interpolation on a modern TV.

  • @PainWrld
    @PainWrld 1 year ago

    Just snagged a used 3080ti rog strix for 600, W or L? Wonderin if its still worth it considering the 12gb vram cuz I do wanna play most new and old games at either 1440p high frame rates or at least 4k 60 but im wondering if 4k 60 is still achievable with this card on new games cuz a lot of times in benchmarks Im seein like 4k 40s-50s. Ik with older games Ill be more than good but it seems like this vram and poor optimization issue is gonna be more prevalent overtime. Idk yall lmk, I still think its a good buy cuz its still stronger than 4070 and the price on a used rog strix 3080 ti is still pretty damn high (min $675-$700 and avrg bein like $750 plus)

  • @soplam9555
    @soplam9555 1 year ago

    Daniel Owen, would it be a bad idea to use one GPU to play games like MMOs and/or competitive genres at a lower TDP, and on occasion use a GPU with a high TDP for AAA, story-driven games? Would swapping the GPU out of the slot every now and then wear out the PCIe slot?

  • @rohanchooramun7288
    @rohanchooramun7288 1 year ago

    What's the point of having supposedly better and more powerful RT cores if you then decrease the number of RT cores on the RTX 4070?

  • @Impossibly-Possible

    @Impossibly-Possible

    3 months ago

    Because the 70 series is really the 60 series, and so on; if you look at it like that you understand the performance. They pushed the slider over by changing the names on everything: the 90 is the 80, the 80 is the 70, the 70 is the 60, and the 60 is the 50. Prices went up for what would have been the real product stack, and then they slid the names over, making profits go up many, many times. Selling a 60-series card that should have been 250 dollars as a 70 series for 600 is an INSANE markup in price to performance.

  • @col.hanslanda2013
    @col.hanslanda2013 1 year ago

    The 3080, 3080 Ti, 3090 and 3090 Ti had like a 5% difference between each other when overclocked a bit. Unless the game required massive amounts of VRAM, of course.

  • @Kage0No0Tenshi

    @Kage0No0Tenshi

    1 year ago

    Imagine 4 years later the RTX 4070 Ti and 3080 Ti and below cannot run AAA games because of low VRAM. I would pick the RTX 3090 any day over the 4070 Ti, or even the RX 6950 XT.

  • @wasd-moves-me

    @wasd-moves-me

    1 year ago

    Wrong

  • @CoCo.-_-

    @CoCo.-_-

    1 year ago

    @@wasd-moves-me yeah, i think the perf difference at lets say 4k between the 3080 and 3080 ti is atleast 15% while stepping up each after is 5% depending on model

  • @CoCo.-_-

    @CoCo.-_-

    1 year ago

    @@InnerFury666 3090 and 3090 ti are within a few percent of each other depending on the 3090 model, mine runs about on par with one for example here is a video kzread.info/dash/bejne/gI51pax_gc-we5s.html, you can achieve the same result with an undervolt (depending on lottery) as out of all of ampere the 3090 is the most power limited, if the clocks are on par with the 3090 and 3090 ti, they are within 2-3% of each other, but fe vs fe due to the higher power limit on the 3090 ti its about 10% yeah

  • @ketrub
    @ketrub 1 year ago

    i managed to cop a 4070 a decent deal below MSRP (for my country, EU so all prices are fucked, basically) on launch day, so combined with the lower energy usage and the fact i don't use 4K i think it was an okay option i do agree with your conclusion though, not very happy with Nvidia but for what it's worth 4070 happened to align perfectly where i live

  • @IStrikerXLI
    @IStrikerXLI 1 year ago

    FYI, the replay feature in Fortnite has options to set and lock the time of day. Could be helpful in eliminating variance between runs.

  • @TheSlowEcoboost
    @TheSlowEcoboost 1 year ago

    somehow my 4070 uses even less watts than my 3060 ti and when you undervolt it a tiny bit the card just becomes the best choice for older power supplies or bills

  • @andersbjorkman8666

    @andersbjorkman8666

    1 year ago

    I got a 4070 for a good price in EU and did some maths, and I'm saving about 40 dollars a year compared to if I'd have gotten a 6950xt :P

  • @jesusbarrera6916

    @jesusbarrera6916

    1 year ago

    If you can afford a 4070 you can afford a better PSU

  • @TheSlowEcoboost

    @TheSlowEcoboost

    1 year ago

    @@jesusbarrera6916 it cuts down costs to save an old one

  • @froznfire9531

    @froznfire9531

    1 year ago

    @@jesusbarrera6916 I mean ofc you can but say 100 bucks for PSU and the money a 3080 will cost you over the years, this will stack up. It only makes sense at a very big discount in my eyes

  • @hakdergaming
    @hakdergaming 1 year ago

    its the nerfed CUDA cores on the 4070 that cause non-RT rasterized performance to drop significantly while they peddle frame generation that uses the SAME tensor cores as previous generations, but they softlock it to 4000 series in drivers. while also giving you a bit more RT cores to polish the turd of the rest of the GPU.

  • @MaxIronsThird

    @MaxIronsThird

    1 year ago

    nah, it's just the bus width. 4070 RT cores are better than the 3080's, so even with LESS RT cores, it performs better, they don't matter in raster though.

  • @hakdergaming

    @hakdergaming

    1 year ago

    @@MaxIronsThird sorry for the misunderstanding. when I mentioned they "nerfed" the CUDA cores, I meant they reduced the CUDA core count. with the RTX 4070 only having 5888 CUDA cores, and the 3080 12GB having 8960 cores. yes i know the efficiency of the 4070 is MUCH better, and even though the difference in cuda cores is around 35% it manages to perform only 8-18% worse in raster graphics while also consuming much less wattage. however this drop in raster performance, especially for the price of the card and the fact that nvidia has historically released new 70 class cards that are supposed to match an 80 class card of last gen. makes this newer card a much worse deal when raw performance is considered. however, if you need less power consumption, and energy in your area is rather costly, than the 4070 would make more sense. and frame generation is still a pretty decent technology even though frame interpolation has been out for years, and this tech is legit just frame interpolation with some AI smoothing to prevent the image from looking unnatural. which again, uses the SAME EXACT tensor cores that every nvidia card has been using for DLSS since the 2000 series
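
To put the core-count comparison above in perspective, here is a quick sketch of theoretical FP32 throughput (2 FLOPs per CUDA core per clock); the boost clocks are spec-sheet assumptions, and real-game performance obviously does not track paper TFLOPS one-to-one.

```python
# Theoretical FP32 throughput = 2 * CUDA cores * boost clock (spec-sheet clocks assumed).
def fp32_tflops(cuda_cores: int, boost_clock_ghz: float) -> float:
    return 2 * cuda_cores * boost_clock_ghz / 1000

print(f"RTX 3080 12GB: {fp32_tflops(8960, 1.71):.1f} TFLOPS")  # ~30.6
print(f"RTX 4070:      {fp32_tflops(5888, 2.48):.1f} TFLOPS")  # ~29.2
# Similar paper FLOPS despite ~35% fewer cores, because the 4070 clocks much higher;
# the raster gap shown in this video likely owes more to memory bandwidth than to compute.
```
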

  • @ESKATEUK
    @ESKATEUK 9 months ago

    Now this is a proper comparison. So many just stick DLSS 3.0 on and frame gen, just showing the 4070 perform better. You’ve covered everything which is great.

  • @jasonh4534
    @jasonh4534 1 year ago

    If it was solely memory bus that is causing the drop in relative performance at higher resolutions, I would think there would be more correlation to total graphics memory usage. It seems that the amount of memory in use doesn’t play a big role. Games that use 5-6gb exhibit the same behavior as ones using 10-11gb.

  • @sethdunn96
    @sethdunn96 1 year ago

    It's interesting to see NVidia do this. But going from 1080 -> 2070 things were on par, but maybe a 3%-5% edge in favor of the 2070, then a 3060 was on par w/ the 2070 with about a 3%-5% boost in favor of the 3060. And now, you would expect a 3080->4070 to be on par with a slight advantage of 3%-5%, but it's actually at a disadvantage of 3%-5%. Very sad.

  • @LeonardPutra
    @LeonardPutra 1 year ago

    You should test the undervolting capability of your 3080 12GB. Usually it runs fine at 800-850mV with a 1750-1920MHz GPU clock. At 850mV, 1920MHz, it is pretty much at stock frequency, with a lot lower power consumption at 240-280 watts. But then, the 4070 undervolts really well too; I've seen it undervolted to 140-150 watts while maintaining pretty much the same performance. That's GTX 1070 level!
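
As a rough sanity check on why those undervolts save so much power: dynamic power scales roughly with frequency times voltage squared. The stock voltage and clock below are assumptions about typical Ampere boost behaviour, not measurements of this particular card.

```python
# Approximate dynamic-power scaling: P ~ f * V^2 (stock values below are assumptions).
def relative_power(v_new: float, f_new_mhz: float, v_stock: float, f_stock_mhz: float) -> float:
    return (f_new_mhz / f_stock_mhz) * (v_new / v_stock) ** 2

# 0.85 V @ 1920 MHz vs an assumed stock of ~1.05 V @ 1950 MHz
print(f"~{relative_power(0.85, 1920, 1.05, 1950):.0%} of stock dynamic power")
# -> roughly 65%, which lines up with the 240-280 W vs ~350 W figures in this thread.
```
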

  • @Checkout17

    @Checkout17

    1 year ago

    Indeed, I have a Rog Strix RTX 3080 12GB (OC version) and run it at 1900Mhz with 900mV, runs cooler and stay between 250-280 watt! Thats almost 100 watts less power consumption without losing much performance. The default clock of this card is 1860MHz. Also, this card has same chip as the 3080ti/3090. And same cool block as the 3090, (for the ASUS ROG STRIX version) They had to do this of course :-D

  • @Krenisphia

    @Krenisphia

    1 year ago

    I undervolted my 4090 to 825mV @ 2400Mhz. Power consumption dropped to 220-250W for all my games. That's a pretty big drop from 400W stock, so I'm very happy. Temps-wise, GPU core stays around 40 degrees and vram / hotspot gets to 50 degrees. Never even touches 60. The best part is performance loss was very small, less than 10% which is not even noticeable for me. :)

  • @LeonardPutra

    @LeonardPutra

    1 year ago

    @@Krenisphia That's cool! (pun intended). That really show the power of TSMC 4nm process node!

  • @PaulLemars01
    @PaulLemars01 1 year ago

    Quick question, how does an RX6800XT compare to the 4070? I know that the AMD board is EOL but they are currently going for just over $500 with 16GB of VRAM and a much bigger bus. It's just a thought and I know that the RX6800XT stock is being burnt through fairly quickly.

  • @OCX600RR

    @OCX600RR

    1 year ago

    Literally neck and neck with performance, with some games favoring AMD over Nvidia and vice versa.

  • @ce5834
    @ce5834 1 year ago

    thanks very much for this video, been waiting ages for it.

  • @gramostv_official
    @gramostv_official 1 year ago

    So 3080 is outperforming 4070 at 1440p and higher? Nvidia 🤣

  • @juanpaulocabugsa771
    @juanpaulocabugsa771 1 year ago

    Bought RTX 3080 2 weeks ago for 450usd. prices of RTX 4070 here in Japan is around 740usd which is too damn high. Thanks for this video 😊

  • @Rodrigo-rr6ym
    @Rodrigo-rr6ym 1 year ago

    Tech noob here. So, if frame generation is usable if you start with a 60 frames base line, then there's no benefit for someone with a 60hz display if you use Vsync, right?

  • @Elevator_Inc
    @Elevator_Inc 1 year ago

    At 3:40 with 1440p ultra no RT, 3080 reserves 6Gb of VRAM while 4070 reserves 8Gb. Is that a settings mistake?

  • @rileyhance318
    @rileyhance318 1 year ago

    4070 doing all of that at 180w vs 340w on the 3080. glad to see some effort is going into efficiency. never the less i will be keeping my 3080 12gb

  • @BlackJesus8463

    @BlackJesus8463

    1 year ago

    Still a $600 1440p card tho. I'd rather have the power to run 4K TBH

  • @MaxIronsThird

    @MaxIronsThird

    1 year ago

    not effort, just a better node TSMC's N4 vs Samsung's 8nm

  • @rileyhance318

    @rileyhance318

    1 year ago

    @@BlackJesus8463 3080 was a 1300-2000 dollar card for over 50% of its life. 600 is not bad

  • @NostalgicMem0ries

    @NostalgicMem0ries

    1 year ago

    @@BlackJesus8463 1440p and 4k differs very little and monito/tv upgrade costs a lot. not to mention 4070 can run most games at 4k 60 fps , for 4k 120 144hz gaming you will need 4090 that is super expensive and 7900 xtx can match that too

  • @BlackJesus8463

    @BlackJesus8463

    1 year ago

    ​@@NostalgicMem0ries Not really. 1440p monitors are often more expensive than 4K tvs, especially oled. Everything else you said is bs too.

  • @nipa5961
    @nipa5961 1 year ago

    Luckily new 6800XT and 6950XT cards are still available. Nvidia is not an option at these prices right now.

  • @nr1771

    @nr1771

    1 year ago

    Depends what you want to use the card for. Nvidia's drivers are still better than AMDs for a lot of things.

  • @nipa5961

    @nipa5961

    1 year ago

    @@nr1771 AMD's drivers are also still better than Nvidia's for other things. It's not worth paying a few hundred more for Nvidia cards.

  • @nr1771

    @nr1771

    1 year ago

    @@nipa5961 AMD's drivers are not better than Nvidia's for anything. For a few things, they are about as good, but if you use your card for anything besides standard raster gaming, you will start to appreciate just how far behind AMD's software is compared to Nvidia's. In pure raster, they've improved, but they still have issues like massive idle power draw or dysfunctional multidisplay support. To take just one example of the many software issues AMD still has, just talk to all the people who bought a 7900XTX to use for VR and then returned it because it either A) refused to recognize their headset, B) crashed upon launching any VR platform, or C) performed worse than a 3070.

  • @nipa5961

    @nipa5961

    1 year ago

    @@nr1771 I've made very different experiences it seems. AMD drivers were much more stable for me than Nvidia's. Also, a friend just "upgraded" to a 3060 and has massive problems with his monitors not waking up properly now and is forced to blindly restart his machine a few times a day. Just one example. Nvidia also has no equivalent feature to Radeon chill. Speaking of power consumption RDNA was way more power efficient than Ampere. Strange how everyone seemed not to care last gen but brings it up since last fall. XD They both have their pros and cons, but in the end they are very equal. So again, sadly AMD is the only option right now, since all equivalent Nvidia cards are much more expensive and lack VRAM.

  • @nr1771

    @nr1771

    1 year ago

    @@nipa5961 I'm glad you've had a good experience with AMD's drivers, but it's definitely not shared by a lot of people out there. All you have to do is look at any AMD forum and you'll see a lot of people talking about driver issues still. And I'm not saying this to fanboy Nvidia. I hate the way they've priced this generation of cards. I'm just saying that if you care about anything other than standard raster gaming, AMD still has a lot of issues (if all you care about is standard raster gaming, AMD is the better value and you should buy AMD).

  • @frankguy6843
    @frankguy6843 1 year ago

    Grabbed a barely used 3080 12GB last year and have not regretted it at all, the recent drama around cards being limited at 8GB are of no concern, and the 40 series has nothing that is compelling comparatively. Not that NVIDIA couldn't produce something compelling, but they chose not to and here we are.

  • @axeivy
    @axeivy 1 year ago

    With new AMD and used 30 series cards costing as much as, and at times way more than, the 4070 where I live, it's hard to persuade myself not to replace my 1070 Ti with the 4070. Luckily enough, I came across learning how to undervolt and overclock my GPU, which should keep me distracted before I make the final decision.

  • @arditm2178
    @arditm2178 1 year ago

    Hello Daniel. Is it possible to make any benchmarks with AI tools? Like stable diffusion. 3080 vs 4070. I'm kinda new to AI, but it seems like nvidia is the way to go.

  • @ladrok97

    @ladrok97

    1 year ago

    You can't go AI (i.e. image upscaling) on AMD. In image upscaling like 90% of models is on CUDA and only +/-% is based on Vulkan

  • @Suilujz

    @Suilujz

    1 year ago

    ​@@ladrok97 saying you can't go AI on amd is just a lie, sure it isn't as easy as cuda but there's directml and vulkan implementations for a lot of projects. AI is also much more than "image upscaling"

  • @eqrmn3934

    @eqrmn3934

    1 year ago

    @@Suilujz no its not worth it getting an AMD for AI. Too much work to get it running, slower speed than nvidia cards and most tutorials on youtube would be using software that support nvidia. Its a shame though because AMD cards have more vram

  • @ladrok97

    @ladrok97

    1 year ago

    @@Suilujz Maybe it's a lie, but yesterday I wanted to test other upscaler to get X4 from 480x360 and majority is blocked by cuda and brute forcing it with 6600xt it's pointless. I plan to upgrade to 7900xt (or maybe wait for 8800), maybe then brute forcing this limit will work. But if someone wants to use AI, then it's far easier going with Nvidia than AMD and I doubt situation of "max 20% work on Vulkan" won't apply to most of AI use case

  • @Suilujz

    @Suilujz

    1 year ago

    ​@@ladrok97 I got my 6950 XT today, just upscaled a 512x512 image by 4x in two seconds with r-esrgan 4x+. Don't know which one you trying to use but there's perfectly working ones out there

  • @xTurtleOW
    @xTurtleOW 1 year ago

    A bigger price than a used 3080 and less than 3080 performance, very nice.

  • @mr.cookie8265

    @mr.cookie8265

    1 year ago

    well it uses about half the amount of electricity. The 4000 cards are super efficient

  • @victorxify1

    @victorxify1

    1 year ago

    @@mr.cookie8265 wow u can save $6 a year on your power bill, thanks Nvidia

  • @whiteerdydude

    @whiteerdydude

    1 year ago

    ​@@mr.cookie8265 They better be. They are on an a significantly more efficient transistor node. The fact that this gens 4070 can't consistently beat last gens 3080 (12 gb, but all 3080's should have been this for the price) in raw performance is really sad. And to top it off, Nshitia wants 100 bucks more for this card than last gens 3070. It sucks. If this card was matching the 3080 ti like how the 3070 was matching the 2080ti this wouldn't even be a discussion.

  • @fafski1199

    @fafski1199

    1 year ago

    Like anything used, it's always a risky gamble. Most of those used 3080's listed on EBay, will no doubt be ex-mining cards or will have had a fair amount of wear and tear. Having the safeguard and peace of mind of a 3 year warranty, in itself is worth paying out an extra $100, at thier price range. Secondhand will always be cheaper, with the potential of getting a better bargain. However there is always several risks and cons involved. You could find yourself 2 months later with a dead GPU, with no way to get a refund and be $500 out of pocket.

  • @mr.cookie8265

    @mr.cookie8265

    1 year ago

    @@victorxify1 the 4000 series cards are also alot cheaper than the 3000 series cards (at least in europe) the 4070 costs about 600€, the 3080 10GB costs about 770€ and the 3080 12 GB costs about 2270€

  • @MateusSilva-ps6xr
    @MateusSilva-ps6xr 10 days ago

    I'm here in Brazil considering buying a 4070. Here the prices are very high due to the conversion of the dollar into local currency and the inhumane taxes. Congratulations on the particular analysis. One more subscriber.

  • @fabiozannettino
    @fabiozannettino 1 year ago

    @danielowentech Can I suggest you set the frequencies of the custom models to the values of the Founders Edition? The result would be less variable and more representative of the effective difference between the official SKUs. Anyway, GG Daniel. 🙂

  • @dwedj
    @dwedj 1 year ago

    that power draw tho

  • @cks2020693

    @cks2020693

    1 year ago

    10C temp difference too

  • @emanuelacosta2541

    @emanuelacosta2541

    1 year ago

    Exactly man, this generation is so efficient. I have a RX 6800 and is a monster at low wattage, but this 4070 is something else.

  • @cks2020693

    @cks2020693

    1 year ago

    @@emanuelacosta2541 there is a chinese tech guy that UV 4070 to 100W and took all the FANS OFF, and the highest temp was like 81C, while still maintaining 85%+ performance

  • @FenrirAlter

    @FenrirAlter

    1 year ago

    ​@@cks2020693 u can do that with an 3060 too. Almost like the 4070 is a 4060

  • @emanuelacosta2541

    @emanuelacosta2541

    1 year ago

    @@cks2020693 Damn now I want to do the same with my RX 6800, I need to try, that's incredible!

  • @siyzerix
    @siyzerix 1 year ago

    So now we're just paying for software upgrades basically and reduced power draw (which is something you EXPECT from a new gen of gpu's. Its like saying the new gen will be faster). So now we're paying for software upgrades. Brilliant. Keep defending it guys. You're soon going to see performance uplifts be a thing of the past.

  • @raulitrump460

    @raulitrump460

    1 year ago

    Power isn't really reduced; this 4070 is a 4060 at 190-200W. The 3060 12GB is 170W.

  • @niebuhr6197

    @niebuhr6197

    1 year ago

    ​@@raulitrump460 this gpu has 46 SM, it barely consumes more than 28 SM, but it's 90% faster. 4060 lmao

  • @HosakaBlood

    @HosakaBlood

    1 year ago

    I mean, this is not supposed to replace a 3080 12GB; it's still an uplift compared to the buyers they are targeting, which is the 3070 and below. There's a reason the 4090 has a huge gap this gen: maybe because every single review talked crap about the 3090 being a scam for being 15% faster than a 3080 at 2x the price. Maybe that's why Nvidia nerfed the lower tiers this gen, so reviews stop talking shit. But guess what, the 7900 XTX was barely an uplift for the same price, so who is there to blame?

  • @siyzerix

    @siyzerix

    1 year ago

    @@HosakaBlood Nvidia and AMD are just colluding at this point. The xx60 class generally have performance close to the last gen xx80 class GPU. This ''rtx 4070'' is that. Thats basically what the 4060ti should be at the minimum. So, what nvidia is doing is selling us at best a 4060ti for $600. Its a pathetic performance jump. I mean its barely 15% faster than the rtx 3070ti. And the vram, its been long overdue. We should be having 16gb vram at this price at minimum and realistically having 4070ti performance at minimum for what is suppose to be a 4070.

  • @siyzerix

    @siyzerix

    1 year ago

    @@raulitrump460 Yeah, thats pretty much true. At best its a 4060ti.

  • @Vis117
    @Vis117 1 year ago

    Does DLSS 2 lower vram usage since it’s rendering at a lower resolution?
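
Partly, at least for the render targets: DLSS 2 renders internally at a reduced resolution and upscales to native, so buffers tied to the internal resolution shrink while the output buffers and many assets stay native-sized. A small sketch of the commonly cited per-axis scale factors (treat the exact factors as assumptions, since they can vary per game and DLSS version):

```python
# Commonly cited DLSS 2 per-axis render-scale factors (assumed, may vary per title/version).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

print(internal_resolution(2560, 1440, "Quality"))      # -> (1707, 960)
print(internal_resolution(3840, 2160, "Performance"))  # -> (1920, 1080)
```
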

  • @jetkat6935
    @jetkat6935 1 year ago

    For some reason 4070 footage looks laggy and riddled with stutters even though the frame times are very similar. Something wrong with the recording of 4070 results?

  • @several.
    @several. 1 year ago

    DLSS and Frame generation is going to be the saving grace of the 40 series. It's just insane tech, and theoretically only going to get better and wider supported.

  • @Vespyr_

    @Vespyr_

    1 year ago

    It's software. It's software they could push to previous generations to make them better if they wanted to. But they won't and lock it to their latest cards for profit. Unlike AMD which allows even their oldest cards support their newest technology.

  • @froznfire9531

    @froznfire9531

    1 year ago

    @@Vespyr_ Let's see how good FSR 3 will be. Currently, FSR 2 is clearly behind DLSS 2, so I wouldn't expect wonders. I like that AMD releases it open source, but if it works better on Nvidia, you just can't help it.

  • @Vespyr_

    @Vespyr_

    1 year ago

    @@froznfire9531 When a company patronizes a market in this manner, it is specifically targeting established customers of previous generations of their brand. New customers specifically are unaffected by this exclusivity, until the next model releases. Nothing stops them from backward porting this technology. They won't even do it by one cycle, just a few years apart. There is more to a company, than performance. They did this with Gsync too, until sales forced them to concede.

  • @DeadPhoenix86DP
    @DeadPhoenix86DP Жыл бұрын

    Too bad the 3080 would not have worked with my current PSU, so I went with the 4070 instead, at MSRP. Compared to my older GPU I only draw about 30 watts more, while getting over double the performance and VRAM.

  • @kaimojepaslt
    @kaimojepaslt Жыл бұрын

    And that's what smart people do; they don't end up paying double the power bill every month.

  • @thatbritishgamer
    @thatbritishgamer Жыл бұрын

    @@kaimojepaslt Smart people don't buy a 4070, as they know it's a rip-off.

  • @Rodrigo38745
    @Rodrigo38745 Жыл бұрын

    @@thatbritishgamer If you want a new GPU at that price, what's the better option then? Exactly.

  • @jesusbarrera6916
    @jesusbarrera6916 Жыл бұрын

    @@Rodrigo38745 A used 6950 XT and a better PSU...

  • @Rodrigo38745
    @Rodrigo38745 Жыл бұрын

    @@jesusbarrera6916 I said a new card. A lot of people don't want used cards. Also, like me, some of us use the GPU for work, and Nvidia is miles ahead in most cases.

  • @skromee
    @skromee Жыл бұрын

    Thanks for showing the difference. I have an A770 right now and I'm looking to upgrade. I play Rainbow Six Siege; can you add that game to your benchmarks next video? I want my lowest FPS to be 360 at 1080p Ultra; right now my lowest is 245 with the A770 at 1080p Ultra.

  • @korosoid
    @korosoid Жыл бұрын

    Have you tried the overlay from Riva Tuner? It's way better IMO.

  • @lughor82
    @lughor82 Жыл бұрын

    For everyone interested in why the bandwidth got smaller: VRAM is added to graphics cards as chips, and every GDDR6/GDDR6X/GDDR7 chip has 32 data pins to connect to, which is called a 32-bit-wide bus. A memory bus can be shared between two memory chips, but I don't think there are recent examples of such shared-bus designs (the 660 Ti would be an example). Usually you can simply multiply the 32-bit width of a chip by the number of chips to get the total bus width, and multiply that by the memory's data rate to get the bandwidth.
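    For reference, a quick sketch of that arithmetic in Python. The chip counts and data rates below (12 x 1GB chips at 19 Gbps for the 3080 12GB, 6 x 2GB chips at 21 Gbps for the 4070) are assumed typical GDDR6X configurations, not figures stated in the video:

        # Rough memory bus / bandwidth math: each chip contributes a 32-bit slice
        # of the bus, and peak bandwidth is bus width (in bytes) times the data rate.
        def memory_specs(chips, gb_per_chip, gbps_per_pin):
            bus_bits = chips * 32                          # 32 data pins per chip
            capacity_gb = chips * gb_per_chip
            bandwidth_gbs = bus_bits / 8 * gbps_per_pin    # GB/s
            return bus_bits, capacity_gb, bandwidth_gbs

        # Assumed configurations (illustrative, not from the video):
        cards = {"RTX 3080 12GB": (12, 1, 19), "RTX 4070": (6, 2, 21)}
        for name, cfg in cards.items():
            bus, cap, bw = memory_specs(*cfg)
            print(f"{name}: {bus}-bit bus, {cap} GB, ~{bw:.0f} GB/s")

    That works out to roughly 912 GB/s versus ~504 GB/s for the same 12GB capacity, which is why the 4070 leans so heavily on its larger L2 cache.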

  • @lughor82
    @lughor82 Жыл бұрын

    I have found the method used to give more VRAM without making the bus bigger. It is called clamshell mode. There is an x16 mode (two 16-bit channels) and an x8 mode (two 8-bit channels). The x8 mode is the clamshell mode: you can fit double the VRAM without using a bigger memory controller. The VRAM chips still work in parallel, but you only get half the bandwidth per chip.
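    A small illustration of that clamshell trade-off, using a hypothetical 192-bit controller (the configurations are mine, for illustration only):

        # Clamshell mode: two chips share one 32-bit channel, each driving 16 pins,
        # so capacity doubles while total bus width (and peak bandwidth) stays the same.
        def board_config(channels_32bit, gb_per_chip, clamshell=False):
            chips = channels_32bit * (2 if clamshell else 1)
            capacity_gb = chips * gb_per_chip
            bus_bits = channels_32bit * 32                 # unchanged by clamshell
            pins_per_chip = 32 // (2 if clamshell else 1)
            return chips, capacity_gb, bus_bits, pins_per_chip

        print(board_config(6, 2))                  # (6, 12, 192, 32): 12GB on a 192-bit bus
        print(board_config(6, 2, clamshell=True))  # (12, 24, 192, 16): 24GB, same bus width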

  • @DrearierSpider1
    @DrearierSpider1 Жыл бұрын

    We all know the cache and memory bandwidth of the 4070 were meant to gimp its performance at 4K, so you'll be upsold to a higher tier GPU.

  • @justlamb
    @justlamb Жыл бұрын

    In the UK the 3080 12GB costs £100 more than the 4070, uses double the power, and has no DLSS 3 features. But good comparison.

  • @AvaDonKos
    @AvaDonKos Жыл бұрын

    Thanks for the benchmarks! So for 1440p, 12GB is actually enough?

  • @captainalex157
    @captainalex157 Жыл бұрын

    Yeah, I think we're safe for the foreseeable future. Dropping $1,200 on a 4080, at twice the price of a 4070, just isn't worth it for 1440p IMO.

  • @autoglobus
    @autoglobus Жыл бұрын

    The 3080 12GB doesn't just have more memory and a wider bus to support that memory; it also has more CUDA cores, more Tensor cores, and a higher TDP. The 3080 12GB is just another entry in the long line of Nvidia examples where two different products got the same name even though most of the specs differ. It's the exact same thing they tried to pull with the 4080 12GB vs the 4080 16GB, before they backed off and eventually called it the 4070 Ti. And it's not like the 4060 Ti 8GB vs 4060 Ti 16GB, where memory is the single difference between them, which is somewhat excusable.

  • @atomicfrijole7542
    @atomicfrijole7542 Жыл бұрын

    Yep. The 3080 12gb is a treasure.

  • @michaelwoods7770
    @michaelwoods7770 Жыл бұрын

    The fact that they are banking on tricks to sell cards is simply silly. They artificially limit RTX 30 cards so they can't use these tricks, because they would lose all credibility at that point.

  • @B_Rye402
    @B_Rye402 Жыл бұрын

    Can you test the Extreme preset in MW2, please? With a 144Hz 1440p panel, Extreme settings make more sense, given you have a beefy enough GPU.

  • @quukskev4970
    @quukskev4970 Жыл бұрын

    Well, the 3000 series drivers are more or less final, while the 4000 series drivers are still being tuned for each game, so this will likely change over time.

  • @davidord5228
    @davidord5228 Жыл бұрын

    I'll admit I caved in and bought the RTX 4070, but before everyone jumps on me, let me clarify why. I had roughly its price as my budget and I was well aware of the other GPU options; however, going older gen or AMD meant I would need to upgrade my PSU, which I didn't have the budget for, and without compromising on performance the 4070 runs perfectly on my 550W PSU. I also appreciate that Nvidia's cards and software are better supported for the creative work I do for uni (Blender, 3D, Photoshop), while also giving me 1440p gaming at decent frame rates, with the added benefit of DLSS 3 frame generation; as a user I noticed smoother gameplay without noticeable artifacting, which is just great for the experience. I understand it gets stick for the price and isn't a justifiable upgrade for everyone, but going from an iGPU to real workloads on this GPU was immense. I think the VRAM could become an issue, though Nvidia has just shown new texture compression that is more detailed with smaller file sizes, so I reckon that will help all of their GPUs and is why they have refrained from upgrading VRAM by large amounts. I may be a singular case where the 4070 made sense as an upgrade, but I think you would agree it was the best option for me. Great to see Daniel continue to dive into how it compares, and thanks to those who responded to my comment on one of the previous videos with upgrade suggestions.

  • @soapa4279
    @soapa4279 Жыл бұрын

    I don't think anyone is going to stone you for buying one. The 4070 is still a good product. It's just named and priced wrong.

  • @BleedForTheWorld
    @BleedForTheWorld Жыл бұрын

    The 4070 is actually a great option as a GPU upgrade from two generations prior. The problem is the price, which is still very much overpriced at $600. Others are right that it should be around $400, but since wealth is relative, that number doesn't seem like as much to some as it does to others.

  • @Dave-kh6tx
    @Dave-kh6tx Жыл бұрын

    Why admit you caved in? Needing it for Blender and Photoshop is a valid reason. Everything else you said, though, makes me believe you're either full of it or won't do very well in your major. "With the added benefit of DLSS 3 I noticed smoother gameplay without noticeable artifacting"? You're just regurgitating words without knowing what you're saying. Just how much do you think texture compression will help an overpriced, gimped GPU with low VRAM and a narrow bus? Nvidia already has texture compression tech way ahead of AMD, and using AI to "compress" isn't really compression; it's using AI to add extra details that weren't there. I'm not flogging you for choosing a 4070 because of your PSU, but that was just another reason you added that doesn't make a lot of sense if you needed it for your studies. Then again, add up all the other reasons aside from studies and something doesn't add up here.

  • @davidord5228
    @davidord5228 Жыл бұрын

    @@Dave-kh6tx All I meant by "caved in" was that I made the jump to get it, finally deciding on what I was getting; does that make it clearer? I thought needing to spend money on a higher-wattage PSU for less efficient cards was a fair variable to factor in (I've saved for a year and was trying to balance value against the performance I'd actually use). I'm not a computer science student or a super experienced builder, but I have followed the latest news, which has nothing to do with my degree, so I don't think that comment about my competency was necessary; still, given my lack of experience compared to some, I'll apologise if I've used a term or fact wrongly. I just wanted to share my experience and why I decided to go with the 4070. I do appreciate the points you made about price and compression, but I think my views as a user are justified when playing games and experiencing DLSS 3. Just my opinion, and you have the right to yours :)

  • @zxbc1
    @zxbc1 Жыл бұрын

    The 4070 right now is massively cheaper than a 3080, even the 10GB version. Combine that with DLSS 3 and massively lower power consumption and it's a no-brainer to choose the 4070. For me it's the same boat: with my old 550W PSU, I would end up paying at least $150 more for a 3080, and I can't even find a reasonably priced 3080 12GB anymore. The way the 3000 series cards were priced made them such poor value that the new 4070 ends up looking good despite also being poor value. Talking about performance per dollar based on MSRP just isn't useful at this point.

  • @kilosera
    @kilosera Жыл бұрын

    I don't know if frame gen is such a great feature if it adds ~20ms of lag. I'm not that fast; when you ask me something I sometimes answer after a few seconds, most likely with a "what!?", and yet I feel a massive improvement playing Forza on my gaming monitor at ~3ms versus my TV at ~40ms. It's nice that frame gen was added, but I wonder how "single player" a single-player title has to be to actually be enjoyable with that lag. Maybe Nvidia just wants to quietly acclimate new gamers to input latency so it can later move them smoothly onto its streaming platform ;)
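    A simplified model of why the FPS counter overstates responsiveness with frame generation: interpolation has to hold back the newest rendered frame in order to generate the in-between frame, so input latency tracks the base frame rate (plus that hold-back), not the doubled output rate. The numbers below are illustrative assumptions, not measurements:

        # Toy model: frame generation doubles displayed FPS but not responsiveness.
        # Assumes interpolation adds roughly one base frame time of presentation delay.
        def frame_gen_estimate(base_fps, frame_gen=False):
            base_ms = 1000 / base_fps
            displayed_fps = base_fps * 2 if frame_gen else base_fps
            hold_back_ms = base_ms if frame_gen else 0     # crude estimate
            return displayed_fps, base_ms + hold_back_ms

        for fg in (False, True):
            fps, latency_ms = frame_gen_estimate(60, frame_gen=fg)
            print(f"frame gen {'on' if fg else 'off'}: ~{fps} FPS shown, ~{latency_ms:.1f} ms frame-delivery latency")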

  • @UnRu1eD
    @UnRu1eD7 ай бұрын

    Still debating which of these cards to get, considering I do YouTube and Twitch streaming. I'm going to pair it with an i5-13600K.

  • @cptnsx
    @cptnsx Жыл бұрын

    So glad at least YOU are telling AND showing the truth about frame gen: it's motion smoothness, and it will NEVER have the latency of REAL frames at the indicated FPS counter.

  • @Revanse
    @Revanse Жыл бұрын

    There is only one GPU worth buying for 600 USD, and that's the 6950 XT: 20% quicker, 16GB of VRAM.

  • @TheMadYetti
    @TheMadYetti Жыл бұрын

    Energy usage is VERY impressive. It's 2x the FPS per watt used, so at least one thing Nvidia did well.

  • @HypnoticSuggestion
    @HypnoticSuggestion Жыл бұрын

    Interesting comparison. It's definitely strange how the cadence has changed; the new xx70 card used to handily beat the older xx80 card, and it had been that way for at least a decade. What I'm most curious about is what effect all that L2 cache is having: if Ada didn't have it, how different would the performance be? I bet the memory performance would be a much bigger problem on the 70 and 70 Ti. I wonder if there's a good benchmark out there that can test for the difference. Anyway, I think Nvidia has done a damn fine job of pissing off basically every human being on the planet, so credit to them.

  • @insomnia20422
    @insomnia20422 Жыл бұрын

    There are multiple benchmarks that do that (even if not by design). They are mostly productivity benchmarks that constantly have to load lots of new assets, which means the cache doesn't help because the data is always new. They show massive problems with the 40 series cards. TL;DR: the low-VRAM, low-bandwidth models are all shit for productivity.

  • @MegaLoquendo2000
    @MegaLoquendo2000 Жыл бұрын

    Nvidia pulled an Nvidia, as usual. RDNA 2 cards started including their so-called Infinity Cache while keeping VRAM bandwidth at a good level; Nvidia, on the other hand, chose to add that extra cache while gutting the memory bandwidth.

  • @farmerowga5890
    @farmerowga5890 Жыл бұрын

    When considering a used card, I believe the price difference should be at least 20-25% to cover the risks. If a used 3080 and a new 4070 cost the same, the 4070 could be up to 20% slower and still be seen as an equal buy. The performance deficit of the 4070 is smaller than that in most cases, but its price is about 20% higher ($600 new vs. $500 used). I am not sure the new features can balance out that gap. In this performance tier RT is less relevant, so I would go for the AMD Radeon 6950 XT: it is newer than the 3080, with more raw performance than both Nvidia alternatives. Frame generation is an option only:
    - when you have at least 60 base FPS
    - if you don't play competitive online games
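    Putting rough numbers on that rule of thumb (the prices, performance ratio, and risk margin below are assumptions for illustration, not quotes):

        # Risk-adjusted value comparison: used RTX 3080 12GB vs new RTX 4070.
        def perf_per_dollar(perf, price, risk_discount=0.0):
            # Treat a used card as if it cost more by the risk margin you want covered.
            effective_price = price / (1 - risk_discount)
            return perf / effective_price

        new_4070  = perf_per_dollar(perf=100, price=600)                      # baseline performance = 100
        used_3080 = perf_per_dollar(perf=110, price=500, risk_discount=0.20)  # assumed ~10% faster, 20% risk margin

        print(f"4070 (new):       {new_4070:.3f} perf per dollar")
        print(f"3080 12GB (used): {used_3080:.3f} perf per dollar")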

  • @BlackJesus8463
    @BlackJesus8463 Жыл бұрын

    A 25% discount off new isn't enough to entice me. People bitch about scalpers for years and then go buy a used 3070 for $500 on eBay. ¯\_(ツ)_/¯

  • @ChancySanah
    @ChancySanah Жыл бұрын

    To be fair to the 4070, which launched at MSRP, those 3080s were 2-3 grand if you could find them. Pretty sure the MSRP for the 12GB was something like 1,200 bucks anyway.

  • @alpha007org
    @alpha007org Жыл бұрын

    In the EU, during the shortages, I frequently saw the 3090 priced lower than the 3080 (Ti). Ridiculous, and too bad I didn't screenshot some: 3090 at 1800 EUR, 3080 Ti at 2000+ EUR. I don't get how the same store could sell a weaker GPU for more money with a straight face.

  • @zdspider6778
    @zdspider6778 Жыл бұрын

    Those were scalper prices. There's a Tom's Hardware article from June 2022: "Grab an RTX 3080 12GB at its Lowest Ever Price of $769".

  • @alpha007org
    @alpha007org Жыл бұрын

    @@zdspider6778 Sure, but in the same store, like ComputerUniverse in Germany, which is quite a reputable company with decent pricing for the EU? That's why I said it's too bad I didn't screenshot those listings.

  • @ChancySanah
    @ChancySanah Жыл бұрын

    In general, comparing prices with last-gen parts that came out in mid-2020 and into 2021 isn't really fair, because MSRP was a fantasy, and around 2021 the chip shortage started kicking in, so it probably cost way more to R&D the 40 series or the 7000 series. A lot of things went wrong to get us here.

  • @CoCo.-_-
    @CoCo.-_- Жыл бұрын

    @@alpha007org Me: got my 3090 for £670...

  • @GiddyGoat
    @GiddyGoat Жыл бұрын

    It's wild how power efficient the RTX 4070 is compared to the RTX 3080.

  • @CaptToilet
    @CaptToilet Жыл бұрын

    As I said before, Nvidia threw all its R&D into the 4090 and then said "screw it" to the other classes of cards. This 4070 should have been a 4060 Ti at best. Improvements to the Tensor cores and RT cores are one thing, but that improvement means jack if the memory bandwidth can't keep up.

  • @MaxIronsThird
    @MaxIronsThird Жыл бұрын

    Nvidia either thought the big L2 cache would be way more performant than it is, or they just want the 4070 to be a 1440p card and not a 4K card. Even with better RT cores and more optical flow acceleration, the 4070 isn't able to match the 3080 12GB in RT mode. That's ridiculous.

  • @pepoCD
    @pepoCD Жыл бұрын

    Man, such a weak -70 card. This is a 4060 Ti at best and $150 overpriced. All 4000 series cards but the 4090 are complete disappointments.

  • @christophermullins7163
    @christophermullins7163 Жыл бұрын

    This guy is going places. The gimped VRAM bus of the 40 series will go down in history as the worst change Nvidia ever made. We need more RAM AND the same number of RAM chips. The 384-bit bus will make the 3080 far better at demanding 4K raster.

  • @GewelReal
    @GewelReal Жыл бұрын

    And then you woke up and ran out of VRAM

  • @zipperman6045
    @zipperman6045 Жыл бұрын

    The thing is, you're right that the wider memory bus leads to it being faster at 4K. However, it's drawing 70% more power for 10-12% more performance, and that relatively small gap means that as DLSS 3 becomes more prolific it won't matter as much.

  • @lexiconprime7211
    @lexiconprime7211 Жыл бұрын

    I have some doubts that a lot of people are looking at a 4070 for 4K gaming. Some might, but I doubt it's a significant number of folks.

  • @christophermullins7163
    @christophermullins7163 Жыл бұрын

    @@GewelReal Yeah, 12GB isn't really enough anyway, but at least the 3080 12GB has a firm lead at 4K. I get the draw of Nvidia, but I'm going AMD next. Enough VRAM without selling body parts sounds good to me.

  • @lukasr1166
    @lukasr1166 Жыл бұрын

    The 4070 Ti and below do suck at 4K, but I wouldn't recommend them for 4K gaming even if the specs were the same as the 30 series. The 4080/7900 XTX and up would probably be the best buy for 4K. That's by design, of course.

  • @CrazieAsianBoi
    @CrazieAsianBoi8 ай бұрын

    Great video!

  • @ogcryptowealth
    @ogcryptowealth5 ай бұрын

    Nobody talks about the huge efficiency upgrade: the same performance as a card that was once selling for close to $1,500, at a new, cheaper price of $600. Not only is the power cost lower, which pays for itself over time compared to running an RTX 3080, but the lower temperatures of the newer architecture mean the 4070 will likely last longer than the power-hungry 3080. Just a thought I know isn't popular, because all people worry about is raw performance, but I value efficiency and endurance more than raw performance!
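    As a rough illustration of the running-cost argument (the wattage gap, hours, and electricity price below are assumptions, not figures from the video):

        # Back-of-the-envelope electricity savings, e.g. RTX 4070 (~200 W) vs RTX 3080 (~330 W) under load.
        def yearly_savings(watt_delta, hours_per_day, price_per_kwh):
            kwh_per_year = watt_delta / 1000 * hours_per_day * 365
            return kwh_per_year * price_per_kwh

        # Assumed: ~130 W gaming-load difference, 3 hours/day, $0.15 per kWh
        print(f"~${yearly_savings(130, 3, 0.15):.0f} saved per year")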
