
Hi everyone, this is Oliver, writing to you from the (very) early hours of the morning in Las Vegas! Alex and I are planning to record DF Direct 196 on-site on Wednesday, discussing the massive Nvidia keynote address, including the Blackwell GPU announcements, plus reportage on AMD, Intel, Razer, and ASUS.

We’ll also be discussing the variety of displays that caught our eye during the show, as well as other CES ejecta. But the show wouldn’t be the same without you, so please feel free to pose any questions - about CES, other gaming news items, or anything else that happens to be on your mind - in the box below.

Comments

Richard Leadbest

Hi DF, thanks for doing the on-site coverage! Many internet people took issue with NVIDIA treating interpolated frames as real frames in performance comparisons, going as far as equating the 5070 with a 4090. Regarding this practice, where do you stand on a scale of "they're not wrong" to "maliciously deceptive"? Cheers!

1040STF

Now that we know that Call of Duty Black Ops 6 cost a whopping $700 million (marketing included), do we know if it moved the needle on the number of Game Pass subscribers?

Nick

So two questions: Do you think any of the new DLSS or RTX features revealed at NVIDIA's keynote, and after, would be viable on Switch 2 with the obvious exception of frame gen? And have you stopped by Genki's booth to see their Switch 2 mockups?

(From Another) Richard (edited)

With DLSS 4 supporting older-generation Nvidia graphics chipsets, is it likely we will see it supported on the Switch 2, and what benefits could it provide a device of that power grade, given the hardware we expect to see?

Peter

Hi chaps, hope you are having a great time at CES! Nvidia's pricing really surprised me, and that 5070 has the potential to be a bit of a game changer depending on real-world performance. What are your thoughts on the 12GB of VRAM, albeit faster memory than the 40 series? Will this be enough to overcome the VRAM bottleneck we have seen on 8GB cards? Also, do you know if the upgraded "transformer" DLSS carries a greater frame-time cost than the current iteration?

goregutz

Hi DF crew! AMD didn't announce their 9000 series. They claim they didn't have time but I don't buy it. Do you think they cancelled it last minute to wait for a potential 5060 announcement or at least 50 series reviews to see how they should market and price their products?

Gati

Do you think Nintendo has been far too conservative in delaying the Switch 2 reveal? Do you think the early leaks hurt them, or have no impact in the big picture?

Perfect_Organism

I’ve seen some comments mention DLSS 4 costing more in terms of performance on older RTX GPUs. Is there any validity to these claims?

Perfect_Organism

Also, hope early morning Patreon posting means a fun night was had by all!

Dissecta

Have you been able to go hands-on with any 50 series cards running MFG? If so, how “natural” and flowing does game response seem?

HelenX

Hello. As you traipsed the CES show floor, did you see any presentations of devices or services that could have appeared in the old Innovations catalogue? Anything weird or stupid or, ideally, weird, stupid *and* intriguing?

lord.frosty

Hey DF Crew. Not going to lie, the RTX 5070 Ti looks like the most interesting card out of the bunch. I was wondering, for all of us still hanging on to the AM4 platform with the 3D V-Cache CPUs, would Blackwell be a viable upgrade? Do you think that the PCIe 3.0 of the AM4 platform will hold back the performance of the upcoming graphics cards, which are PCIe 5.0 compatible?

CABBAC

Of all the spectacular things that you saw at CES which was your favourite out of Half-Life 3, Switch 2, Beyond Good and Evil 2, Starcraft Ghost, or Dreamcast 2?

Jonny_5a

Hope you’re having fun stateside. What’s the most ‘out there but I love it’ thing you’ve seen on the floor so far?

Perfect_Organism

Idk if this is a regular DF or a CES direct, so this might not be applicable. I remember Rich mentioning DF Direct guests, and it seems like there is mutual respect between yourselves and Daniel Owen. Would love to see him as a guest on a DF Direct if strings could be pulled, if you’re still open to guests?

i_like_licorice

Hey DF! If all the Switch 2 leaks are to be believed, the console will have the same form factor as the previous one, and will be backwards compatible. If that's the case, what backwards compatibility features would you like to see, and which do you think are probable? A boost mode, an FPS boost, some kind of global auto upscaler, a smart delivery equivalent? Thanks!

andre surles

Do you think Mark Cerny is kicking himself with the AMD choice and having to fix the image quality themselves? How will AMD compete in the next generation?

gartenriese

Probably the GTA Trailer 2 that was shown exclusively to CES guests.

Samson

Hey DF gangstas!! Do you think it’s safe to say that apart from the halo beast that is the 5090, Nvidia’s pricing of the 50 series was significantly more competitive than most expected? I think Nvidia hit AMD on all fronts: pricing, halo product, software slate and, most of all, that sweet, sweet hype! Pure speculation, but do you think AMD deciding to only announce RDNA 4 in their press briefing with no details of price or FSR 4 made Nvidia go on the offensive? Get excited fellas!! New GPUs just dropped! Cheers DF gangstas!

Dissecta

I know, but second opinions and perhaps different games…

gartenriese

Do you think Half Life 3 will be a launch title for the new Steam machines?

Voln

Hello there, I just wanted to pick your brains regarding the fps metric and whether or not it is fair to use it in advertising these days. I feel like both fps and resolution don't mean as much these days as they used to. Wouldn't it make more sense to show something like smoothness, clarity, and responsiveness differences in press materials? I feel like they would be easier for the average consumer to understand and more factual for the enthusiasts than just fps and resolution since we do so much scaling and interpolation these days. Here's a little visualization I just came up with to better illustrate what I mean https://imgur.com/a/aoZ4TOm (the numbers are just off the top of my head), but each difference on the graph would incorporate both raw performance gains and advances in latency mitigation techniques / scaling / interpolation models.

Samson

Lord! How I wish one of the premium consoles ran on Nvidia hardware! Oh well! At least we got… *checks notes* …Tegra T239 on the Switch 2.

Cold Pancake

Hey gents, quick and simple question: is 16GB of VRAM enough to future-proof for the next 5-6 years? I'm concerned the RTX 5080 is too light on VRAM to really last for a whole generation, especially with more demanding titles like Indiana Jones being released. Thoughts?

Spoggi99

Hey DF, I hope you’re having a great time at CES! With the advancements of DLSS and Multi Frame Generation, do you think there’s a risk of developers becoming less focused on optimizing their games? I don’t want to beat a dead horse, but as someone who still finds DLSS and Frame Generation artifacts distracting in certain titles, I’m concerned that native resolution gameplay might become less viable as DLSS and FSR increasingly become the norm. Don’t get me wrong - I’m all for technical progress in this area, and I think DLSS looks fantastic in many games. However, there are definitely titles and engines where graphical issues in motion make me prefer the native, non-DLSS presentation. Thank you for all your amazing work, and I’m looking forward to your videos in 2025! :)

Samson

Nvidia only gave the press limited access, only using Cyberpunk. Linus also got hands-on access; he had the same restrictions.

jamesmck486

With Mark Cerny’s presentation just a few weeks ago talking about the benefits of using a CNN to do the scaling for PSSR, how do you think this looks now in light of Nvidia claiming that CNN technology is maxed out? Will Sony/AMD now have to scramble to R&D a transformer model, or will they, once again, just be stuck multiple years behind?

jamesmck486

I also assume that Amethyst was probably started to more quickly respond to developments like this, I just doubted the need would come so quickly.

Tamerator

Hello DF friends, it seems that DLSS 4 could be a massive game changer for the console market if it makes its way to the 10th console generation. Do you think there is any chance for it to happen? If not, do you think we will see a significant gap between the 10th generation of consoles and PCs due to AMD lagging so far behind Nvidia? I hope that you are having a fantastic CES!

Alexandru Stefanica

Hey DF! Hope you are enjoying CES! With the new features of DLSS and MFG, at what point can we say Nvidia is actually building the game and not a random studio? Do you think game development is moving to a more declarative form (studios focusing more on image quality and just describing to an ML-powered game engine what they expect, rather than coding it outright)? Then again, maybe the entire software development industry is moving that way with all the AI nonsense that's going on. Thanks and keep up the great work!

David Ruppelt

Do you know if the frame warping of Nvidia Reflex 2 is also applied to the MFG frames? Do you even have tools to measure this? Click latency would still be bad, but mouse movement latency would be improved.

Takeshino

Hi DF. A conversation about the push for realistic graphics made me realise that we don't really see devs mess around with the RT buffer for stylisation, right? Why is that? Surely adding a texture overlay or dithering pattern to shadows or ambient occlusion or a subtle distortion to reflections isn't that complicated, right? It doesn't have to be full on Spiderverse shenanigans but I do find it weird that we haven't really seen stylised RT.

Snorlax jobless

What is Jensen’s new prophecy, now that you have seen him in the flesh? He is talking robotics and supercomputing, but what do you think will come true soon? Also, I know Oliver loves AI, and Jensen said their omni model is the ChatGPT of real-world datasets… at least during his five-minute interview with Bloomberg.

Anxiously Chrono Triggered

Hey Alex and Oliver, Hope you guys have some time to rest while covering CES for all of us. Is DLSS4 the most exciting thing you’ve seen so far? Do you think there will be some performance hit when using DLSS4 on the 40 series and older GPUs? FG is supposed to gain some performance on the current gen, but what about DLSS4 without FG? Cheers!

Alex’sRayTracedBottom

Lads! I’m struggling to see the point of DLSS 4 Multi-Frame Generation. Who is asking for 300+ fps in Indiana Jones? The only extreme high-framerate gamers I know are competitive esports players, playing twitch shooters at 300+ fps where framegen is avoided due to input lag. I’ve always seen frame generation as a way to path trace single-player games, pushing graphics to the extreme while keeping them playable—boosting sub-60fps to 100+ fps. But why would I need three times that framerate for my single-player adventures? Yes, more framerate = good, but I would rather see NVIDIA focus on better raster and ray tracing performance over Multi-Frame Generation. Maybe I am just salty there is no 300hz TV. Thoughts? All the best!

Anxiously Chrono Triggered

Hey guys, have you had a chance to try out SteamOS on a non-Steam Deck device yet? Also, MS commented they were going to improve Windows on PC handhelds and bring it closer to the Xbox experience. Don’t you think it might be too late, since Valve has started "infecting" other handhelds with their amazing SteamOS virus? Cheers

EvilRacer329

Good morning DF gents! With framegen now capable of creating frames far beyond what my 4K 144Hz display can actually display, I really do think that framerate has stopped being a relevant performance metric. I like pushing framerates up into the 90s because of the extra, noticeable responsiveness in e.g. Doom Eternal, but doubling (or tripling, or quadrupling) those numbers would actually require my display to drop/skip frames, and would surely result in uneven frame pacing (which looks and feels awful). It seems like Nvidia really is doubling down on framegen for the 50 series, but it also seems like a useless feature unless the new GPU caps itself to e.g. 72fps? Related question: what happens when you set a framerate limit when using these new framegen tools? Rich talked about capping his output to 120 for the recent video; is the driver smart enough, then, to cap rasterisation to 60, 40, or 30fps when 2x, 3x, and 4x framegen are turned on?
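If the driver really does divide a framerate cap by the frame-gen multiplier, the arithmetic in the question above works out as in this minimal sketch (how the driver actually behaves is an open question; the helper name is illustrative):

```python
def base_render_rate(output_cap_fps: float, fg_multiplier: int) -> float:
    """Rendered (non-generated) frames per second implied by a capped output
    rate, assuming one rendered frame per group of fg_multiplier output frames."""
    return output_cap_fps / fg_multiplier

for mult in (2, 3, 4):
    print(f"{mult}x frame gen, 120fps cap -> "
          f"{base_render_rate(120, mult):.0f}fps rendered")
# 2x -> 60fps, 3x -> 40fps, 4x -> 30fps rendered
```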

EvilRacer329

Another related question: can you foresee a future where the game simply targets your monitor's maximum refresh rate, renders as many frames as it can, and then adjusts DLSS quality setting/internal resolution, to get input latency down, and use screen warp and framegen to get you your smooth output? Are we approaching a point where the raw performance of your videocard will dictate the image quality, rather than the smoothness of the output?

David Ruppelt

For me it's the opposite, I strongly dislike the idea of needing FG to get playable fps. Instead, I want native playable fps and then use FG to go to 1000fps. This is to get better motion clarity, similar to BFI, but without the flicker and brightness loss. While there currently are no such displays on the market, they do already exist in the labs. For FG to be actually good for sub-60fps, it would need to change from interpolation to extrapolation and more advanced forms of reprojection. Reflex 2 is a first step in that direction, but far from my goal.

Rakete

Frohes neues Jahr! (Happy New Year!) As someone who re-entered PC gaming a month ago after leaving it in the 90s for console gaming, I have a question: on my 4080 I find the DLSS artifacts in Indiana Jones (e.g. around the UI when holding the map and walking at the same time) very disturbing, and would rather play with 120fps but clean frames instead of 240-plus frames with DLSS smear. How much improvement will a 5090 be in this regard? How much stronger is it for 'real' frames compared to a 4090?

Rakete

Not a question but an additional comment: your great coverage made me jump back into PC gaming after a very long time. And I am happy with my decision, playing Indiana Jones in WQHD with over 100fps. Thanks gents!

CABBAC

Nah, they saw that months and months ago. That's old news to them.

Marc Reis

Most likely not of great concern, but is there any info on how the new GPU generation and its tech fare for VR? E.g. CNN DLSS-related issues can be quite prominent in VR (Luke's mod for Cyberpunk, or in DCS), especially in HMDs like the Pimax's. Does it look like the transformer models do better?

lil' ecto-1

Howdy digital cowboys! My favorite part of upgrading my GPU is diving into my backlog and playing older games at maxed-out settings with HFR. Nothing quite as exciting as brute-forcing a PS3-era game to display in 4K120 when it barely maintained 540p30 a few generations back. Should we be concerned that FG/MFG is becoming the future of GPU performance gains as raster improvements take a back seat? It’s not like many older, raster-based and non-AI-enhanced titles will be able to utilize the headline-grabbing benefits of these new RTX cards. Yee-haw! (in a British accent)

Mr Bespoke

Hi DF! Now that the latest LG TVs can output up to 165Hz, do you think the next-gen consoles will capitalise on the potential, bringing us smoother graphical performance, or do you think it would cause more issues with the likes of back-compat titles and incorrect frame pacing?

Anxiously Chrono Triggered

Hey guys, have you had a chance to look at the Micro-LED screens? I’ve seen some huge panels from Hisense on Vincent’s channel, but maybe smaller and cheaper ones have also been shown there? Cheers

GenerativeJake

Hey DF Crew! What have you seen at CES that has excited you most that you were not expecting? It could be a prototype technology you think has a lot of potential or something that is an actual product releasing which you are excited to checkout. Either way, would love to know what technology you didn’t know was releasing that has gotten you hyped for 2025 and beyond!

Container7

What was the vibe in the room when the audio for Jensen's 'shield gag' didn't work?

Skyrim Super Championship Turbo Edition II

Nvidia called out Text-to-Animation in their presentation as a new feature. Is the idea that this will help make on-the-fly adjustments to mo-cap easier, or eliminate mo-cap from the production process?

LosCV29

Hey, guys! Happy New Year! I can already see some segments of tech YouTubers rolling their eyes at the "fake frames"… With modern game engines requiring more horsepower, I don't see a way to brute-force our way to HFR territory. I understand some of the negatives of frame gen… but why do you think there is so much resistance to it?

Rich

Reflex would indeed get you closer to the goal of 1,000fps by decreasing input latency to what a 1,000fps update rate would provide (i.e. mouse movement perfectly in sync with a 1kHz polling rate). You're too focused on numbers instead of the actual goal of those framerates (low latency and motion fluidity). Also, you vastly underestimate how many PC gamers of all types play on high refresh rate monitors, a number far higher than PCs connected to 4K TVs. And there are indeed 240Hz 4K displays. Check the Steam Hardware Survey.

Richard

Hi folks! With the news that DLSS4 has moved away from a CNN and seen some great improvements, do you think Sony could follow suit with PSSR? Would this be technically possible considering their quote-unquote bastardisation of GPU hardware to run ML workloads on the Pro, and would they need to lean into the Amethyst partnership with AMD to make this financially possible? Should they even give up trying to fully fuse the existing PSSR CNN?

The Invisible Man

Hi gentlemen, with the apparent introduction of optical sensors embedded into the Switch 2 Joy-Cons, do you feel this feature is a gimmick or a possible game changer for Nintendo? Love the show, guys! P.S. When can we get more Audi on the channel?

Rich

CNNs are less expensive to run and especially less expensive to train. AMD has made strides in AI, but their consumer GPUs and APUs still have a lot of catching up to do to be as performant as Nvidia's Tensor cores.

Iuri Grangeiro

Hello DF crew! I think, like many of us out there, I was plenty satisfied with the quality DLSS had when using the CNN model and reasonable resolutions (think the resolutions DLSS chooses on auto). The improvements seem fairly large at resolutions that already looked pretty good, but what I'm more interested in is what happens to resolutions that didn't look that good before. Is 1080p Performance more convincing? Is Ultra Performance comparable to the old Performance? What does one gain if they were already satisfied with the previous look DLSS had?

Rich

The tech isn't finished yet, drivers probably won't be finalized until the cards actually drop.

Bulbous_Bow

If I can get my hands on a 5070 around launch day, would Alex trade me his tired, old 4090? Having watched the presentation I’m aware I’d be doing him a bit of a favour here, but such is my commitment as a patron of DF that I’d be willing to make the sacrifice 🙏

Rich

AMD themselves (Frank Azor, etc.) did a roundtable discussion with journalists after their presentation; I read the coverage from TechTechPotato, including a full transcript. Based on what AMD themselves said, RDNA4 isn't ready yet. It seems that regardless of what Nvidia did or didn't do, RDNA4 won't be ready for preview until later this year.

Rich

To be clear, Nvidia clearly labeled everything in all of their slides. Nvidia has never actually passed off frame-generated performance as native performance without clearly explaining it in writing on each document. Indeed, some of the slides Nvidia showed explicitly included games that don't support MFG, FG, or DLSS at all.

Bulbous_Bow

Hi crew. I understand from previous directs that a large proportion of your #content is watched on phones. However, I would be interested to know what the stats are on fellow DF-enjoyers consuming said content on their $10,000 custom liquid cooled entertainment centres? I’ll certainly be watching the next direct from mine, dressed to the nines in my crocodile skin jacket to mark the occasion

Samdenn

Hey founders! This is a two parter on the new framegen tech from nvidia: you guys spoke previously about frame gen tech hopefully evolving to the point where it could simply generate enough frames to fill the max refresh of the monitor. Well, my first question is - why didn’t they do this? It seems there will be some manual input required from the user, and it seems like possibly an easy win to implement such a feature which automatically determines how many frames to generate, especially given the new tech for frame pacing… Second question - do you think it’s possible to accidentally downgrade your experience if you choose the 4x multi frame gen when only 2 or 3 are required to get to your refresh? For example, if you’re running a game natively at say, 80fps on a 120hz monitor, and then enable 4x multi frame gen, wouldn’t you be effectively limiting your native fps to 30 and suffering the quality/latency penalty?
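The "accidental downgrade" scenario in the second question above can be sketched with some simple arithmetic. This assumes output is limited to the monitor's refresh and one rendered frame per group of N output frames; the helper names are illustrative only:

```python
def capped_base_fps(refresh_hz: float, multiplier: int) -> float:
    """Rendered base rate if N-x frame gen output is pinned to the refresh rate."""
    return refresh_hz / multiplier

def smallest_sufficient_multiplier(native_fps: float, refresh_hz: float,
                                   options=(2, 3, 4)) -> int:
    """Smallest frame-gen multiplier whose output would fill the refresh rate."""
    for m in options:
        if native_fps * m >= refresh_hz:
            return m
    return options[-1]

print(capped_base_fps(120, 4))                  # 30.0 -- the feared downgrade
print(smallest_sufficient_multiplier(80, 120))  # 2 -- 2x already reaches 160 > 120
```

In the commenter's example, a game running natively at 80fps on a 120Hz monitor only needs 2x to saturate the display; forcing 4x would pin the rendered rate at 30fps.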

Dave Brown

Hi guys! Do you think this 'fake frames' hogwash will eventually run its course at some point? What will it take for that to happen? Or is the 'angry gamer' contingent here to stay?

zephyr

Hi DF team! Given multi frame generation and the larger embrace of dynamic rez on PC, will VRR be obsolete in the near future? Seems like the dream of running every game at your monitor’s maximum refresh rate could actually come true

AGSMA

Hi guys. The morning after CES I opened Bluesky and found Alex's posts rather concerning. I hope he's making a speedy recovery from that night. Lots of tea and a match of Impossible Creatures should do the trick.

Someguyperson

A week ago I saw that the indie game Antonblast turned a profit in less than a month on sale. It seems like only indie games and companies who are relatively stubborn in their ways (Nintendo, FromSoftware, etc.) are the ones turning a profit. It's also not really about meeting AAA graphical standards, as Capcom and Insomniac are able to generate a good amount of sales while still having some of the best graphics in the business. Why do all these "failing" companies keep setting unreasonable standards and then failing to hit them?

Someguyperson

Questions for you guys about the new LG TV lineup: Are the panels bright enough for Oliver now? How often do you use the input button on the remote and would the loss of said button be a deal breaker for you? (I just have each input mapped to long pressing a number on the remote) Finally, is there too much AI on these TVs?

Someguyperson

What's with AMD "announcing" the 9070 & 9070 XT without a price, release date, or any real specifications? Did AMD release the concept of a name of a GPU? How useless of a "First!" comment is this announcement?

Someguyperson

The new transformer model version of DLSS Super Resolution looks substantially better than the old model, but they did mention that it takes 4x more compute than the old CNN model. How feasible is this on older 20 and 30 series cards if it's that much more expensive? Would you say this is as good looking and as performant as DLAA, or is this heavier?

Someguyperson

When Nvidia presented a game running with 3 generated frames, I thought it looked kinda bad with a bunch of artifacting all over the place. How do you guys feel about it? I think it might only be worth it if you are already rendering at 60 FPS before frame-gen. That way these artifacts will only be a couple frames out of ~240.

Someguyperson

Is it just me, or do Nvidia's Neural Faces and that text to animation feature look absolutely dreadful?

Someguyperson

What do you make of the fact that all 4 cards Nvidia announced have 12 GB+ of memory? Also, the press release called out "enhanced compression designed to reduce memory footprint", which is something I predicted would come to these cards. DLSS Frame Generation also has a reduced memory footprint now. With all these features, do you think memory capacity will still be an issue on these cards?

Someguyperson

It seems like Nvidia is trying to create their own branch of all key UE5 features with "RTX Kit Technologies". RTX Mega Geometry = Nanite, RTXGI = Lumen, RTX Character Rendering = Metahumans, the list goes on & on. Why would a developer use these features over the Unreal implementations? Would a developer just have these RTX APIs as options to replace the UE5 featureset, or would someone build a game centered around these technologies?

Someguyperson

If you listened carefully, Jensen also said that the 5080 had 4090 levels of performance, so one of those statements is a lie (it's the 5070 comparison).

Someguyperson

Right now PSSR takes ~2ms of time to upscale the image. I would assume that DLSS takes a bit more compute to calculate, so let's call it 3 ms to calculate DLSS on a PS5 Pro as a rough estimate (it would be much harder to get that working in reality). According to Nvidia, the Transformer version of DLSS takes 4x the compute time of the CNN version, so it would take a PS5 Pro 12+ ms to simply upscale the image, which isn't useful at all.
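The back-of-envelope numbers in the comment above, spelled out. All the costs here are the commenter's assumptions (the 2ms PSSR figure, the 3ms guess for DLSS on PS5 Pro-class hardware, and Nvidia's "4x compute" claim), not measurements:

```python
frame_budget_ms = 1000 / 60                       # ~16.67ms per frame at 60fps
pssr_cost_ms = 2.0                                # stated PSSR upscale time
dlss_cnn_guess_ms = 3.0                           # guessed CNN DLSS cost on PS5 Pro-class HW
dlss_transformer_guess_ms = dlss_cnn_guess_ms * 4 # Nvidia's "4x compute" claim

print(f"Transformer guess: {dlss_transformer_guess_ms:.0f}ms of a "
      f"{frame_budget_ms:.2f}ms 60fps frame budget")
# 12ms of a ~16.67ms budget, i.e. most of the frame spent on upscaling alone
```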

Someguyperson

There are some AM4 motherboards with PCIe 4 (I have one), but PCIe 3 might be a bridge too far for some of those higher-end cards.

Someguyperson

Nvidia said the Transformer model of Super Resolution was 4x more expensive in their own marketing materials. The only real way to check it out is by running it against older cards, which hasn't been done yet, but expect it to be heavy.

Someguyperson

Nintendo was never going to reveal their new console until after the holiday season, so as not to cannibalize sales in the biggest quarter of the year. They also don't want to compete with CES or any other big announcement, so I think they absolutely know what they're doing and they're just fine.

Richard Leadbest

But they’re throwing out the headline that a 5070 is as performant as a 4090 based on these metrics, which is definitely misleading.

Someguyperson (edited)

Well, Nvidia added that text in a very small font in grey against a grey background, while the bars on the chart that they wanted you to look at were in bright green. I think it's very deliberately trying to be misleading and that's 100% up to the marketing team. Jensen did mention how the 5080 actually had 4090 levels of performance on stage though, which is the actual estimation of performance. The 5070 should have 4070 Ti levels of performance.

Levander Davis

Consoles are more than just a GPU. AMD has shown that they can deliver a powerful CPU/GPU/bandwidth combo at an affordable price while also taking into account their partners' own hardware additions.

Richard Leadbest

Further, even if they’re labelling it, a lot of folks might not be aware that Multi Frame Generation comes with caveats. The graphs only include fps, not input lag.

Salman

Hello, how do you think multi frame gen is going to work in games with frame-pacing issues (FromSoft-like) or animation jitter issues (Silent Hill 2, Star Wars, etc.)?

Salman

Do you think Jensen's jacket was generated by AI, and if so, does that model require more training? Also, who does Jensen think he's hanging out with, people with $10k liquid nitrogen setups? The 5090 is going to cost more than the rest of my new high-end setup combined (including a new chair).

1040STF

There's a whole debate about the latency feel of DLSS 4, which I can understand: even upscaled to 240fps, a game that renders at 27fps internally will still feel like a 27fps game, which is not the best. My personal theory is that with these new Nvidia cards, we should be wise with AAA games and tweak their settings without DLSS until we reach a convincing 60fps (let's say it can fluctuate down to 50), and THEN use DLSS to upscale to whatever framerate we need. So we get both the fluidity and the responsiveness. What do you think?

kate

Nvidia claims that the 5070 has '4090 performance' but only when utilising the power of AI. Can we take this to mean only when using 4x framegen on the 5070 vs 2x on the 4090, and if so does that mean it's only roughly half the performance of the older card?
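The implied maths in the question above: at the same output framerate, 4x frame gen renders one frame in four while 2x renders one in two, so the rendered-frame throughput ratio would be 0.5. A trivial sketch of that reasoning (an interpretation of the marketing claim, not a measured comparison):

```python
def rendered_fraction(multiplier: int) -> float:
    """Fraction of output frames that are actually rendered under N-x frame gen."""
    return 1 / multiplier

# Same output fps on both cards; compare how many frames each actually renders.
ratio = rendered_fraction(4) / rendered_fraction(2)
print(ratio)  # 0.5 -- roughly half the rendered-frame throughput
```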

DaJaCo (Dan) (edited)

Happy Half Fortnight Lads! - I have a couple of questions relating to display technologies emerging at CES. 1- With HDMI 2.2 supporting 4K@480Hz - it must naturally also support the equally divisible (32, 40, 48, 60, 80, 96, 120, 160, 240Hz). Is this likely to mean that the PS6 etc. might treat 48Hz as "the new 40Hz" and 80Hz as "the new 60Hz" or will frame-gen be so ubiquitous by then that fixed targets will be pure folly? 2- We all know Oliver loves screen brightness above all else, but if he can tolerate brief eyeballing sessions on those dim & dusky self-emissive panels, can he tell us which has better colour volume in dark scenes - the latest Samsung QD-OLEDs - OR Panasonic / LG's TANDEM OLED..?
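The divisibility claim in question 1 above checks out: every rate the commenter lists divides 480 evenly (allowing integer frame repeats on a 480Hz panel), while familiar rates like 72 and 144 do not. A quick verification:

```python
PANEL_HZ = 480
candidates = [30, 32, 40, 48, 60, 72, 80, 96, 120, 144, 160, 240]

# Rates that divide the panel refresh evenly, i.e. allow integer frame repeats.
even_divisors = [hz for hz in candidates if PANEL_HZ % hz == 0]
print(even_divisors)  # [30, 32, 40, 48, 60, 80, 96, 120, 160, 240]
```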

yogi

How concerning is the RTX 5080's 16GB of VRAM? It feels a bit low to me, considering the 4070 Ti Super, 4080, and 4080 Super had 16GB as well.

VeryProfessionalDodo

Hey there, I come to you with a meta-question about artistic control and the progressive AI-fication of graphics. There are two sides to this. On one side you have Oliver's interview with Mark Cerny and Mike Fitzgerald, in which the latter states "I want to be able to control the PSSR version against what we test". On the other, we have Nvidia saying "for every one pixel the game renders, we can hallucinate up to 15 more, but it's a really good hallucination, we swear!", all while giving the user the option to forcefully override DLLs. This effectively eliminates any studio's ability to control image quality when using DLSS, since maybe two years from now a new technique will appear that changes the way DLSS looks. Before DLSS 4, I thought Nvidia's approach was better, but I'm now wondering whether we are straying too far from traditional rendering, and whether we're losing something in the process. What do you think?

VeryProfessionalDodo

We all know DLSS is excellent as it is, but during the DLSS 4 presentation there were several times where I wondered whether what I was seeing was real. Not in a "wow, this looks insane" way, but more of a "I'm not entirely sure if this is real detail, or just a really convincing approximation of what it should be". For this reason, there is one test I would really like you guys to do when reviewing DLSS 4. Could you take a 16K natively rendered screenshot and compare it against a 4K image using DLSS 4 Performance? And if the two images happen to be quite different, do you personally believe that artist intent matters more than getting a "good enough" approximation of detail at improved frame rates?
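For anyone curious what such a test would look like in practice, here is a minimal sketch of the comparison being proposed: box-downscale a 16K "ground truth" capture to 4K, then score the 4K DLSS capture against it with PSNR. Everything here is my own illustration (hypothetical helper names, grayscale nested lists standing in for real lossless screenshots), not an established DF methodology.

```python
import math

# Hypothetical sketch: compare a downscaled reference capture against an
# upscaled one. Images are nested lists of grayscale values here; a real
# test would load lossless screenshots instead.

def box_downscale(img: list[list[float]], factor: int) -> list[list[float]]:
    """Average factor x factor blocks (a simple box filter)."""
    h, w = len(img) // factor, len(img[0]) // factor
    return [[sum(img[y * factor + dy][x * factor + dx]
                 for dy in range(factor) for dx in range(factor)) / factor ** 2
             for x in range(w)] for y in range(h)]

def psnr(ref: list[list[float]], test: list[list[float]], peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    n = len(ref) * len(ref[0])
    mse = sum((r - t) ** 2 for row_r, row_t in zip(ref, test)
              for r, t in zip(row_r, row_t)) / n
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)
```

PSNR is a blunt instrument, though; a perceptual metric would be closer to what a reviewer would actually rely on, alongside eyeballing the two captures.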

VeryProfessionalDodo

Speaking of personal experience, I have not had a single good experience with frame gen. With a 4070 it hit VRAM limits too often for it to matter, and when it didn't, I really didn't like the look of the fake frames, especially when it craps its pants around UI elements. It got to the point where I preferred a good, well-paced 40fps over an artifacty 70 to 80fps. On top of that, a 5070 is probably going to be great for traditional workloads already, to the point where this halo feature would only be useful in path-traced games.

Sergio Martinez (edited)

Greetings gents! My question is for Oliver and Alex. What, if any, piece of tech did you see at CES that is flying under the radar and perhaps deserves a broader conversation?

VeryProfessionalDodo

You will get an improvement on your existing 4090, apparently; they improved the stability of frame gen in general.

SplitScream

Will DLSS Multi Frame Generation stay exclusive to the 5000 series, or will it trickle down to last-generation cards?

Ryan Luker

I think I saw that they used the ol' "requires special hardware to work properly" trick (similar to x1 framegen from 40xx series) so probably doubtful? (I could be wrong though!)

Ryan Luker

I wonder if Nvidia has test suites where they render a set scene multiple times (once with their new DLSS 4 and once without) and then do cross-comparisons to decide how accurate the end results were against the "ground truth" version?

Ryan Luker

I feel like consoles will lean towards the conservative path of no dynamic upscaling swap-outs (maybe by default, but let the users pick a PSSR version?) while PC will be its usual DIY hacker self and allow you to do whatever.

Ryan Luker

I think the native 27fps is first upscaled via DLSS 4, so maybe the 4x frame gen is based off of ~60fps?

Ryan Luker

I was a bit confused here as well and couldn't find anything clarifying whether the "4x more compute" was referencing the training requirement or the runtime requirement.

Ryan Luker

Most YouTubers released the material as-is, but I think they pulled the pricing bits after getting wind of Nvidia's pricing... Not a great look for sure.

Auro

Right now our top targets are 1440p/480Hz and 2160p/240Hz. After Reflex takes 6% for VRR bias, we are left with around 450Hz for 1440p and 225Hz for 2160p. Is a base framerate of 56fps high enough for responsive gameplay? I'm glad they are also offering 2x and 3x frame generation, because I think 3x could be the sweet spot for 2160p/240Hz!
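The arithmetic in the comment above can be sketched like this (the 6% VRR reserve and the multipliers are the commenter's numbers; the helper function is my own, and latency considerations are ignored):

```python
# Hypothetical sketch: reserve ~6% of the panel's max refresh for the VRR
# bias, then divide by the frame-generation multiplier to get the base
# (rendered) frame rate needed to saturate the display.

def base_frame_rate(panel_hz: float, reserve: float, fg_multiplier: int) -> float:
    """Rendered frame rate required after headroom and frame generation."""
    return panel_hz * (1.0 - reserve) / fg_multiplier

print(round(base_frame_rate(480, 0.06, 4), 1))  # 1440p/480Hz, 4x MFG -> ~112.8 fps base
print(round(base_frame_rate(240, 0.06, 4), 1))  # 2160p/240Hz, 4x MFG -> ~56.4 fps base
print(round(base_frame_rate(240, 0.06, 3), 1))  # 2160p/240Hz, 3x MFG -> ~75.2 fps base
```

The ~75fps base for 3x at 2160p/240Hz is what makes that mode look like the sweet spot here, since the base rate drives responsiveness.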

Someguyperson (edited)

To quote Edward Liu in this presentation: https://youtu.be/qQn3bsPNTyI?si=dYVm2lkHNnJMwfOJ&t=202 "DLSS 4 also introduces a more powerful Transformer based model for Super Resolution, Ray Reconstruction, and DLAA using 4 times more tensor core processing power to reconstruct images at even better image quality for all RTX owners." Also, at 4:18 in the same video: "Transformers scale much more effectively than CNNs, so our Transformer models ingest over 2 times more parameters and requires 4 times more compute during inference." The specific call-out to inference performance (not training) and the call-out to "4 times more tensor core processing power" make this pretty clear. Also, Transformer models are just generally extremely expensive to run, particularly compared to a CNN. Logically, there's no free lunch here, and people shouldn't expect an improvement for no cost.

Someguyperson

That 27 FPS figure is without any upscaling. I believe Nvidia's number they gave out had performance going from 27 FPS to ~71 FPS using DLSS Super Resolution only. That way, the game feels like it's running at 71 FPS, not 27.
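As a sanity check on those figures, the implied pipeline stacks multiplicatively: Super Resolution raises the rendered frame rate first, then Multi Frame Generation multiplies the presented rate on top. A hypothetical sketch (the 27 and ~71 fps figures come from the comments above; latency overheads and the cost of generating frames are ignored):

```python
# Hypothetical sketch: DLSS Super Resolution speeds up rendering, then frame
# generation multiplies the presented frame rate on top of that.

def presented_fps(native_fps: float, sr_speedup: float, fg_multiplier: int) -> float:
    return native_fps * sr_speedup * fg_multiplier

sr_speedup = 71 / 27  # ~2.6x from Super Resolution alone, per the figures above
print(round(presented_fps(27, sr_speedup, 4)))  # ~284 fps presented with 4x MFG
```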

The Knight Who Says Ni!

What’s been your favorite piece of non-gaming tech unveiled at CES so far? Anything particularly innovative or exciting caught your attention?

tod weitzel

If Microsoft wanted to stop the spread of SteamOS, what options do they have that could sabotage Proton and the Steam storefront?

David Ruppelt

For Reflex v1 that was a no-brainer, as it was purely beneficial. For Reflex v2 that is not necessarily the case, as you need some GPU horsepower to warp the frames. This also needs interaction with the game engine through the CPU, and the whole point of frame gen was to help in a CPU limit. If you have to warp all three generated frames, you are further lowering your base frame rate. Nvidia could have thought it was not a worthwhile tradeoff, or there could be other technical limitations I'm not aware of. So just because Reflex is active in the menu doesn't necessarily mean that the v2 aspect of Reflex is doing anything on the generated frames, or even at all. That's why I'd be interested in whether there is either official info from Nvidia or already a way to measure this.

Horselle

The Asus Flow Z13 looks awesome and seems to have decent performance. Is it a handheld killer, and does Oliver think it could be used on a 48-inch 4K TV for gaming?

Tony Escobar

Not a question, just a note of thanks. I took 6 months off from the supporter program (and from Patreon) to reassess the memberships that meant the most to me, and it became clear that DF Retro was among my favorite programs. DF is an important part of my life, and although we have never met, I genuinely care for and enjoy each member of the team and the content they produce. It is good to be back.

Aaron

Have you seen the 3D-printed Switch Pro mockups floating around CES? What do you make of the supposed optical sensor in the Joycon, which is rumored to provide cursor functionality?

Abhi

What do you think of the FF7 Rebirth PC specs?

Saffsanity

Hello Foundrymen! When reading Nvidia's press release about the new transformer-based Frame Generation, it stated they are abandoning the hardware optical flow accelerator, as it was slower than their sparkly new AI model. Rewinding to when the 40 series launched, they claimed the reason 30 and 20 series cards could not use the original Frame Generation was a lack of speed in their optical flow accelerators. What could be limiting the new Frame Generation to the 40 series and above? FP8 support, hardware segmentation, or something else?

Abhi

Is it interesting to you that CDPR continues to revisit the RED Engine with updates to Cyberpunk, such as RT Overdrive or now integrating DLSS 4 (which appears to require new inputs), especially given CDPR has said they're moving away from the RED Engine for future releases? Also, do you have any idea why they're doing this work with Nvidia but don't appear to be interested in doing a PS5 Pro update?

DaJaCo (Dan)

Agree, that's my interpretation - the 27fps is pure native render / no upscale.

DaJaCo (Dan)

If you focus on securing the new chair first, at least you can be sitting down while you hear any subsequent GPU prices.

DaJaCo (Dan)

"DLSS 4 also introduces the graphics industry’s first real-time application of the transformer model architecture. Transformer-based DLSS Ray Reconstruction and Super Resolution models use 2x more parameters and 4x more compute to provide greater stability, reduced ghosting, higher details and enhanced anti-aliasing in game scenes. DLSS 4 will be supported on GeForce RTX 50 Series GPUs in over 75 games and applications the day of launch." This is confusing... however, the subject of the sentence is the "models": the "models use 2x parameters and 4x the compute". It's not very clear whether this means during the training of the models or during their application in real time.

DaJaCo (Dan)

Maybe this is the "biggest leap ever in a generation" MS referred to. Multi-Frame Generation would certainly be an easy way to loosely fulfill that PR promise.

DaJaCo (Dan)

Given the compute required to shift to transformer models, it likely would not have been feasible for the PS5 Pro anyway, especially given that it's AMD hardware. For the PS6, perhaps they could follow this path. I'm very curious to see Nintendo's lightweight CNN solution, since it takes a very different approach by switching dynamically between multiple models depending on the render resolution and output resolution. I have a hunch that it could be surprisingly performant, since each model will be trained on specific resolution gaps. It loads the computational burden heavily onto the training phase and very lightly onto the real-time compute. Makes total sense for a portable.

DaJaCo (Dan)

I think MS is going to jump ship. They got nothing to lose.. green on green mean machine. Multi-Frame Gen would certainly allow "the biggest technical leap in a generation"..

DaJaCo (Dan)

I think we'll probably get a DLSS derived 720p -> 1080p upscale for the handheld screen, and fewer dropped frames on titles that struggled. I can't see Nintendo doing any more than this, but would love to be proven wrong.

DaJaCo (Dan)

Indeed, they'll wait for a quiet day. All the rabid folks going crazy over leaks will buy it day 1 anyway.. The unsuspecting masses are just that.

DarkRod99

Hi DF team. Exciting announcements coming from CES, especially from Nvidia, but something has me worried: the focus on generating frames in order to disguise a performance bump. I understand it's a cool feature and Nvidia is doing its best to improve quality and minimize latency, but what about raw performance? Does it not matter anymore? People estimate that the 5090's raw-performance improvements over the 4090 are low. What if your favorite game doesn't support DLSS 4 Multi Frame Generation? I fear we will get to the point where these companies fool everybody into thinking they have more performance when they don't. Anyway, thanks for reading.

Alan

Hi guys! People are hyped for the Switch 2, but not every Nintendo console is a success. What does Nintendo need to do to ensure it is one? Is just a more powerful Switch 1 enough? On the games front, is another Mario Kart, which would just be MK8 with some new tracks, plus third-party support by way of old PS4 ports, enough?

GimmeMoreFramerate

Hi gents, I know you guys use larger monitors for PC gaming. My question is: how far are you actually sitting from those huge panels (some of you use 42" OLEDs), and what do you consider too close for larger panels? I finally jumped on the OLED train and ordered a 27" QD-OLED (the MSI271QRX), which I'm about to receive in a couple of days. I was really torn between 32" and 27" and decided to stick with 27" (actually the new monitor is 26.5", so it's a slight size downgrade from my 27" TN panel). But now the 32" 4K FOMO got me, and I have sleepless nights over my monitor decision. Should I have upgraded to a 32" panel? I had no other 32" panel around and was not able to test it on my desk, but imagining it makes it feel almost too big. Even so, at times I think a size upgrade would be nice. But then again, I'm sitting only about 22 inches away from the monitor. What is your take on viewing distance for gaming monitors? I'm running a 4080 and I like its 1440p path tracing performance, which is another reason I stuck with that size.

BespokeExclamationPoint

But if your eyes can't perceive a difference and you can't feel the difference when playing, what does it matter? People infatuate themselves with numbers, but if an image looks 4K to you and feels like 240fps with no noticeable difference in input lag, does it really matter that the game is actually running at 1080p/30? Because it shouldn't. Nobody pauses their game to zoom in 200x to notice imperfections, or says "it feels like 240fps, but deep down I know it's 30, so I'm angry". DF does it because they are interested in the tech behind it, and if they can better understand it they can relay the information to us in layman's terms, not for fanboys to clip and take out of context.

Expansion Pak

Tom Henderson of Insider Gaming reported last year that the PS5 Pro's audio processor has 35% more performance than the base machine's, and that more convolution reverbs and FFTs can be processed. I have no idea what any of this means. Do you see any tangible benefits from this?

Salman

Sitting at ~1500 + tax, chair included (still outstanding: case, GPU, chair assembly)

Watershed

Should I upgrade from my 4090 to a 5090?

Esteban

Given the significant improvements in FSR4, which we would expect to see in next-gen consoles, and the rather noticeable issues with the current version of PSSR, which still needs to catch up to the current upscalers, let alone the new ones coming out: does this feel like somewhat of a waste of resources from Sony at this point? They will need to invest time and resources improving PSSR, which is only going to be used by a really small install base for now, only for FSR4 to be available on the base PS6 anyway (and that's assuming they can get PSSR to FSR4 levels by the time the PS6 comes out).

Colin Robinson

Based on the assumption that Nvidia will again insult consumers with a measly 8GB of VRAM on the 5060 (and MAYBE 12GB on the 5060 Ti if there is one), would it be better for someone on a budget to simply purchase a 40 series card farther up the chain? Or perhaps look at the new AMD offering if Nvidia's feature set isn't really that important to them? Of course there's always Battlemage as an option as well, assuming driver support is there. Should someone who already has a 40 series card just wait for the next generation? Thank you and have a great weekend!

Alan

Hi guys! What do you think the chances are of AMD's new AI upscaling tech being in the PS6? What are your thoughts on the new 8K TV/display tech at CES, for gaming specifically? Did you manage to see or hold any of the Switch 2 accessories/peripherals?

Jiruuino

Hey gents, given that all upscalers use spatiotemporal data, it is unclear to me how the image quality and smoothness of a game can best be optimized. Do you think the final quality will be improved more by a higher rendering resolution or by a higher base frame rate? For example (ignoring artifacts), do you expect a sharper/smoother image from 4K/30fps (doubled to 60fps using frame gen) compared to 1080p/60fps (upscaled to 4K using DLSS Performance)? Thanks and keep up the good work!
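One way to frame the trade-off in this question is to count the traditionally rendered pixels per second each option produces. A quick hypothetical sketch (my own helper; artifacts, motion clarity, and latency are all ignored):

```python
# Hypothetical sketch: compare rendered-pixel throughput for the two options
# in the question. DLSS Performance renders at quarter resolution, so the
# 1080p/60 option's rendered pixels are what the upscaler works from.

def rendered_pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

option_a = rendered_pixels_per_second(3840, 2160, 30)  # 4K/30 + 2x frame gen
option_b = rendered_pixels_per_second(1920, 1080, 60)  # 1080p/60 + DLSS Performance

print(option_a)  # 248832000
print(option_b)  # 124416000
```

Interestingly, option A renders exactly twice as many real pixels per second (4K is 4x the pixels of 1080p, at half the frame rate), but samples the world half as often in time, which is precisely the spatial-versus-temporal tension the question is asking about.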

BespokeExclamationPoint

FSR4 likely is PSSR; I mean, why else would they have formed Amethyst if they weren't going to split up the workload? It's the only way they stand a chance at catching Nvidia. The reason FSR4 looks a lot better is because the AMD GPU is not only running on RDNA4 and not RDNA2.x like the Pro, but it also has way more and much faster RAM, way way way more cache, and likely a crapload more TOPS. I mean, it is literally like 5 games out of over 100 that looked worse... PSSR is mostly fantastic.

Fake Plastic Tree

Dear DF Dudes, hope you are all doing well! How did Rockstar manage to make RDR2 work so well on a PS4/XB1? Is it a case of money/resources, or...? Rich recently answered that RDR2 was one of the best-looking last-gen games, and I think (blurriness aside) it is still one of the best-looking games of this gen.

Marc Reis

Just to give some more insight, Luke Ross has just posted some cool updates which relate to my technical interest/questions on stereoscopic DLSS: https://www.patreon.com/posts/118426847?utm_campaign=postshare_fan&utm_content=android_share

VeryProfessionalDodo

Probably not mate, unless the only game you play is Cyberpunk Path Traced, I think your 4090 could last a good long while. Heck people are still holding on to 2060s, so they can't even imagine upgrading from a 4090

VeryProfessionalDodo

I think a killer launch line up is the way to go, maybe they're just holding on until software gets there (a Mario Odyssey 2, or something new of the sorts)

VeryProfessionalDodo

I can agree with your sentiment up to a point. If all of it is fake, then "it looks good to me" might be good for some, but not all. If only 1 out of 16 pixels is "real", I have a hard time understanding how any of it is what it should be, because at that point we're just making really good guesses at what we should be seeing, pretty much all over the image. Again, might be good for some, but definitely not good for all. I would rather have devs (and Nvidia) spend more time on R&D of rendering solutions that reduce how few pixels need to be real, rather than just moving on and saying "screw it, 1 in 64 pixels is real".
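For what it's worth, the "1 in 16" figure follows directly from stacking DLSS Performance (a quarter of the output pixels rendered per frame) with 4x frame generation (one rendered frame in every four presented). A hypothetical sketch of that arithmetic:

```python
# Hypothetical sketch: fraction of presented pixels that were traditionally
# rendered, given an upscaling factor and a frame-generation multiplier.

def real_pixel_fraction(upscale_factor: float, fg_multiplier: int) -> float:
    """1/upscale_factor of each frame is rendered; 1/fg_multiplier of frames are rendered."""
    return (1.0 / upscale_factor) / fg_multiplier

print(real_pixel_fraction(4, 4))  # DLSS Performance + 4x MFG -> 0.0625 (1 in 16)
print(real_pixel_fraction(4, 1))  # DLSS Performance alone    -> 0.25   (1 in 4)
```

By the same arithmetic, the hypothetical "1 in 64" scenario would be, for example, 4x upscaling stacked with a future 16x frame-generation multiplier.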