
Hi everyone, this is Oliver, writing to you from the (very) early hours of the morning in Las Vegas! Alex and I are planning to record DF Direct 196 on-site on Wednesday, discussing the massive Nvidia keynote address, including the Blackwell GPU announcements, plus reportage on AMD, Intel, Razer, and ASUS.

We’ll also be discussing the variety of displays that caught our eye during the show, as well as other CES ejecta. But the show wouldn’t be the same without you, so please feel free to pose any questions - about CES, other gaming news items, or anything else that happens to be on your mind - in the box below.

Comments

Richard Leadbest

Hi DF, thanks for doing the on-site coverage! Many internet people took issue with NVIDIA treating interpolated frames as real frames in performance comparisons, going as far as equating the 5070 with a 4090. Regarding this practice, where do you stand on a scale of "they're not wrong" to "maliciously deceptive"? Cheers!

1040STF

Now that we know that Call of Duty Black Ops 6 cost a whopping 700 million dollars (marketing included), do we know if it moved the needle on the number of Game Pass subscribers?

Nick

So two questions: Do you think any of the new DLSS or RTX features revealed at NVIDIA's keynote, and after, would be viable on Switch 2 with the obvious exception of frame gen? And have you stopped by Genki's booth to see their Switch 2 mockups?

(From Another) Richard

With DLSS 4 supporting older-generation Nvidia graphics chipsets, is it likely we will see it supported on the Switch 2, and what benefits could it provide a device of that power grade given the hardware we expect to see?

Peter

Hi chaps, hope you are having a great time at CES! Nvidia's pricing really surprised me, and that 5070 has the potential to be a bit of a game changer depending on real-world performance. What are your thoughts on the 12GB of VRAM, albeit faster RAM than the 40 series? Will this be enough to overcome the RAM bottleneck we have seen on 8GB cards? Also, do you know if the upgraded "transformer" DLSS carries a greater frame-time cost than the current iteration?

goregutz

Hi DF crew! AMD didn't announce their 9000 series. They claim they didn't have time but I don't buy it. Do you think they cancelled it last minute to wait for a potential 5060 announcement or at least 50 series reviews to see how they should market and price their products?

Gati

Do you think Nintendo has been far too conservative in delaying the Switch 2 reveal? Do you think the early leaks hurt them or have no impact in the big picture?

Perfect_Organism

I’ve seen some comments mention DLSS 4 costing more in terms of performance on older RTX GPUs. Is there any validity to these claims?

Perfect_Organism

Also, hope early morning Patreon posting means a fun night was had by all!

Dissecta

Have you been able to go hands-on with any 50 series cards running MFG? If so, how “natural” and flowing does game response seem?

HelenX

Hello. As you traipsed the CES show floor, did you see any presentations of devices or services that could have appeared in the old Innovations catalogue? Anything weird or stupid or, ideally, weird, stupid *and* intriguing?

lord.frosty

Hey DF Crew. Not going to lie, the RTX 5070 Ti looks like the most interesting card out of the bunch. I was wondering, for all of us still hanging on to the AM4 platform with the 3D V-Cache CPUs, would Blackwell be a viable upgrade? Do you think the PCIe 3.0 of the AM4 platform will hold back the performance of the upcoming graphics cards, which are PCIe 5.0 compatible?

CABBAC

Of all the spectacular things that you saw at CES which was your favourite out of Half-Life 3, Switch 2, Beyond Good and Evil 2, Starcraft Ghost, or Dreamcast 2?

Jonny_5a

Hope you’re having fun stateside. What’s the most ‘out there but I love it’ thing you’ve seen on the floor so far?

Perfect_Organism

Idk if this is a regular DF or a CES direct so this might not be applicable. I remember Rich mentioning DF Direct guests, and it seems like there is mutual respect between yourselves and Daniel Owen. Would love to see him as a guest on a DF Direct if strings could be pulled, if you’re still open to guests?

i_like_licorice

Hey DF! If all the Switch 2 leaks are to be believed, the console will have the same form factor as the previous one, and will be backwards compatible. If that's the case, what backwards compatibility features would you like to see, and which do you think are probable? A boost mode, an FPS boost, some kind of global auto upscaler, a smart delivery equivalent? Thanks!

andre surles

Do you think Mark Cerny is kicking himself with the AMD choice and having to fix the image quality themselves? How will AMD compete in the next generation?

gartenriese

Probably the GTA Trailer 2 that was shown exclusively to CES guests.

Samson

Hey DF gangstas!! Do you think it’s safe to say that apart from the halo beast that is the 5090, Nvidia’s pricing of the 50 series was significantly more competitive than most expected? I think Nvidia hit AMD on all fronts: pricing, halo product, software slate, and most of all that sweet sweet hype! Pure speculation, but do you think AMD deciding to only announce RDNA 4 in their press briefing with no details of price or FSR 4 made Nvidia go on the offensive? Get excited fellas!! New GPUs just dropped! Cheers DF gangstas!

Dissecta

I know, but second opinions and perhaps different games…

gartenriese

Do you think Half Life 3 will be a launch title for the new Steam machines?

Voln

Hello there, I just wanted to pick your brains regarding the fps metric and whether or not it is fair to use it in advertising these days. I feel like both fps and resolution don't mean as much these days as they used to. Wouldn't it make more sense to show something like smoothness, clarity, and responsiveness differences in press materials? I feel like they would be easier for the average consumer to understand and more factual for the enthusiasts than just fps and resolution since we do so much scaling and interpolation these days. Here's a little visualization I just came up with to better illustrate what I mean https://imgur.com/a/aoZ4TOm (the numbers are just off the top of my head), but each difference on the graph would incorporate both raw performance gains and advances in latency mitigation techniques / scaling / interpolation models.

Samson

Lord! How I wish one of the premium consoles ran on Nvidia hardware! Oh well! At least we got… *checks notes*… Tegra T239 on the Switch 2

Cold Pancake

Hey gents, quick and simple question: Is 16GB VRAM enough to future proof for the next 5/6 years? I'm concerned the RTX 5080 is too light on VRAM to really last for a whole generation. Especially with more demanding titles like Indiana Jones being released. Thoughts?

Spoggi99

Hey DF, I hope you’re having a great time at CES! With the advancements of DLSS and Multi Frame Generation, do you think there’s a risk of developers becoming less focused on optimizing their games? I don’t want to beat a dead horse, but as someone who still finds DLSS and Frame Generation artifacts distracting in certain titles, I’m concerned that native resolution gameplay might become less viable as DLSS and FSR increasingly become the norm. Don’t get me wrong - I’m all for technical progress in this area, and I think DLSS looks fantastic in many games. However, there are definitely titles and engines where graphical issues in motion make me prefer the native, non-DLSS presentation. Thank you for all your amazing work, and I’m looking forward to your videos in 2025! :)

Samson

Nvidia only gave the press limited access, using only Cyberpunk. Linus also got hands-on access, and he had the same restrictions.

jamesmck486

With Mark Cerny’s presentation just a few weeks ago talking about the benefits of using a CNN to do the scaling for PSSR, how do you think this looks now in light of Nvidia claiming that CNN technology is maxed out? Will Sony/AMD now have to scramble to R&D a transformer model, or will they, once again, just be stuck multiple years behind?

jamesmck486

I also assume that Amethyst was probably started to more quickly respond to developments like this, I just doubted the need would come so quickly.

Tamerator

Hello DF friends, it seems that DLSS 4 could be a massive game changer for the console market if it makes its way to the 10th console generation. Do you think there is any chance for it to happen? If not, do you think we will see a significant gap between the 10th generation of consoles and PCs due to AMD lagging so far behind Nvidia? I hope that you are having a fantastic CES!

Alexandru Stefanica

Hey DF! Hope you are enjoying CES! With the new features of DLSS and MFG, at what point can we say Nvidia is actually building the game and not a random studio? Do you think game development is moving to a more declarative form (studios focusing more on image quality and just describing to an ML-powered game engine what they expect, rather than coding it outright)? Then again, maybe the entire software development industry is moving that way with all the AI nonsense that's going on. Thanks and keep up the great work!

David Ruppelt

Do you know if the frame warping of NVidia Reflex 2 is also applied on the MFG frames? Do you even have tools to measure this? Click latency would still be bad, but mouse movement latency would be improved.

Takeshino

Hi DF. A conversation about the push for realistic graphics made me realise that we don't really see devs mess around with the RT buffer for stylisation, right? Why is that? Surely adding a texture overlay or dithering pattern to shadows or ambient occlusion or a subtle distortion to reflections isn't that complicated, right? It doesn't have to be full on Spiderverse shenanigans but I do find it weird that we haven't really seen stylised RT.

Snorlax jobless

What is Jensen’s new prophecy now that you have seen him in the flesh? He is talking robotics and supercomputing, but what do you think will come true soon? Also, I know Oliver loves AI, and Jensen said their omni model is the ChatGPT of real-world datasets… at least during his 5-minute interview with Bloomberg

Anxiously Chrono Triggered

Hey Alex and Oliver, Hope you guys have some time to rest while covering CES for all of us. Is DLSS4 the most exciting thing you’ve seen so far? Do you think there will be some performance hit when using DLSS4 on the 40 series and older GPUs? FG is supposed to gain some performance on the current gen, but what about DLSS4 without FG? Cheers!

Alex’sRayTracedBottom

Lads! I’m struggling to see the point of DLSS 4 Multi-Frame Generation. Who is asking for 300+ fps in Indiana Jones? The only extreme high-framerate gamers I know are competitive esports players, playing twitch shooters at 300+ fps where framegen is avoided due to input lag. I’ve always seen frame generation as a way to path trace single-player games, pushing graphics to the extreme while keeping them playable—boosting sub-60fps to 100+ fps. But why would I need three times that framerate for my single-player adventures? Yes, more framerate = good, but I would rather see NVIDIA focus on better raster and ray tracing performance over Multi-Frame Generation. Maybe I am just salty there is no 300hz TV. Thoughts? All the best!

Anxiously Chrono Triggered

Hey Guys, Have you had a chance to try out SteamOS on a non-Steam Deck device yet? Also, MS commented they were going to improve Windows on PC handhelds and bring it closer to the Xbox experience. Don’t you think it might be too late since Valve has started "infecting" other handhelds with their amazing SteamOS virus? Cheers

EvilRacer329

Good Morning DF Gents! With framegen now capable of creating frames far beyond what my 4K 144Hz display can actually display, I really do think that framerate has stopped being a relevant performance metric. I like pushing framerates up into the 90s because of the extra, noticeable responsiveness in e.g. Doom Eternal, but doubling (or tripling, or quadrupling) those numbers would actually require my display to drop/skip frames, and would surely result in uneven frame pacing (which looks and feels awful). It seems like Nvidia really is doubling down on framegen for the 50 series, but it also seems like a useless feature unless the new GPU caps itself to e.g. 72fps? Related question: what happens when you set a framerate limit when using these new framegen tools? Rich talked about capping his output to 120 for the recent video; is the driver smart enough, then, to cap rasterisation to 60, 40, or 30fps when 2x, 3x, and 4x framegen are turned on?
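The question above boils down to simple division: for a fixed output cap, an N-x frame generation mode implies the game only needs to render cap/N frames per second. A minimal sketch of that arithmetic (the function name is just for illustration, not any driver API):

```python
# If the driver honours an output framerate cap while N-x frame generation
# is active, the implied base (rendered) framerate is cap / N.
def base_framerate(output_cap_fps: float, framegen_multiplier: int) -> float:
    """Rendered frames per second implied by an output cap and a framegen mode."""
    return output_cap_fps / framegen_multiplier

for mult in (2, 3, 4):
    print(f"{mult}x framegen at a 120fps cap -> {base_framerate(120, mult):.0f}fps rendered")
```

This matches the 60/40/30fps figures in the question, assuming the cap is applied after frame generation.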

EvilRacer329

Another related question: can you foresee a future where the game simply targets your monitor's maximum refresh rate, renders as many frames as it can, and then adjusts DLSS quality setting/internal resolution, to get input latency down, and use screen warp and framegen to get you your smooth output? Are we approaching a point where the raw performance of your videocard will dictate the image quality, rather than the smoothness of the output?

David Ruppelt

For me it's the opposite, I strongly dislike the idea of needing FG to get playable fps. Instead, I want native playable fps and then use FG to go to 1000fps. This is to get better motion clarity, similar to BFI, but without the flicker and brightness loss. While there currently are no such displays on the market, they do already exist in the labs. For FG to be actually good for sub-60fps, it would need to change from interpolation to extrapolation and more advanced forms of reprojection. Reflex 2 is a first step in that direction, but far from my goal.

Rakete

Frohes neues Jahr! (Happy New Year!) As someone who re-entered PC gaming a month ago after leaving it in the 90s for console gaming, I have a question: on my 4080 I find the DLSS artifacts in Indiana Jones (e.g. around the UI when holding the map and walking at the same time) very distracting, and I would rather play at 120fps with clean frames instead of 240-plus frames with DLSS smear. How much improvement will a 5090 be in this regard? How much stronger is it for ‘real’ frames compared to a 4090?

Rakete

Not a question but an additional comment: your great coverage made me jump back into PC gaming after a very long time. And I am happy with my decision, playing Indiana Jones in WQHD at over 100 FPS. Thanks gents!

CABBAC

Nah, they saw that months and months ago. That's old news to them.

Marc Reis

Most likely not of great concern, but is there info on how the new GPU generation and its tech fare for VR? E.g. CNN DLSS-related issues can be quite prominent in VR (Luke's mod for Cyberpunk, or in DCS), especially in HMDs like the Pimax's. Looks like the transformer models do better?

lil' ecto-1

Howdy digital cowboys! My favorite part of upgrading my GPU is diving into my backlog and playing older games at maxed-out settings with HFR. Nothing quite as exciting as brute-forcing a PS3-era game to display in 4K120 when it barely maintained 540p30 a few generations back. Should we be concerned that FG/MFG is becoming the future of GPU performance gains as raster improvements take a back seat? It’s not like many older, raster-based and non-AI-enhanced titles will be able to utilize the headline-grabbing benefits of these new RTX cards. Yee-haw! (in a British accent)

Mr Bespoke

Hi DF! Now that the latest LG TVs can output up to 165Hz, do you think the next-gen consoles will make use of that potential, bringing us "smooth graphical performance", or do you think it would cause more issues with the likes of back-compat titles and incorrect frame pacing?

Anxiously Chrono Triggered

Hey Guys, Have you had a chance to look at the Micro-LED screens? I’ve seen some huge panels from Hisense on Vincent’s channel, but maybe smaller and cheaper ones have also been shown there? Cheers

GenerativeJake

Hey DF Crew! What have you seen at CES that has excited you most that you were not expecting? It could be a prototype technology you think has a lot of potential or something that is an actual product releasing which you are excited to checkout. Either way, would love to know what technology you didn’t know was releasing that has gotten you hyped for 2025 and beyond!

Container7

What was the vibe in the room when the audio for Jensen's 'shield gag' didn't work?

Skyrim Super Championship Turbo Edition II

Nvidia called out Text-to-Animation in their presentation as a new feature. Is the idea that this will help make on-the-fly adjustments to mo-cap easier, or eliminate mo-cap from the production process?

LosCV29

Hey, guys! Happy new year! I can already see some segments of tech YouTubers rolling their eyes at the "fake frames"... With modern game engines requiring more horsepower, I don't see a way to brute-force our way to HFR territory. I understand some of the negatives of frame gen... But why do you think there is so much resistance to it?

Rich

Reflex would indeed get you closer to the goal of 1Kfps by decreasing input latency to what a 1Kfps update rate would provide (i.e. mouse movement perfectly in sync with a 1KHz polling rate). You're too focused on numbers instead of the actual goal of those framerates (low latency and motion fluidity). Also, you vastly underestimate how many PC gamers of all types play on high refresh rate monitors, a number that is far higher than PCs connected to 4K TVs. And there are indeed 240Hz 4K displays. Check the Steam Hardware Survey.

Richard

Hi folks! With the news that DLSS4 has moved away from a CNN and seen some great improvements, do you think Sony could follow suit with PSSR? Would this be technically possible considering their quote-unquote bastardisation of GPU hardware to run ML workloads on the Pro, and would they need to lean into the Amethyst partnership with AMD to make this financially possible? Should they even give up trying to fully fuse the existing PSSR CNN?

The Invisible Man

Hi Gentlemen, with the apparent introduction of optical sensors embedded into the Switch 2 Joy-Cons, do you feel this feature is a gimmick or a possible game changer for Nintendo? Love the show guys! P.S. When can we get more Audi on the channel?

Rich

CNNs are less expensive to run and especially less expensive to train. AMD has made strides in AI, but their consumer GPUs and APUs still have a lot of catching up to do to be as performant as Nvidia's Tensor cores.

Iuri Grangeiro

Hello DF crew! I think, like many of us out there, I was plenty satisfied with the quality DLSS had when using the CNN model at reasonable resolutions (think the resolutions DLSS chooses on auto). The improvements seem fairly large at resolutions that already looked pretty good, but what I'm more interested in is: what happens to resolutions that didn't look that good before? Is 1080p performance more convincing? Is ultra performance comparable to the old performance? What does one gain if they were already satisfied with the previous look DLSS had?

Rich

The tech isn't finished yet, drivers probably won't be finalized until the cards actually drop.

Bulbous_Bow

If I can get my hands on a 5070 around launch day would Alex trade me his tired, old 4090? Having watched the presentation I’m aware I’d be doing him a bit of a favour here, but such is my commitment as a Patreon of DF I’d be willing to make the sacrifice 🙏

Rich

AMD themselves (Frank Azor, etc.) did a roundtable discussion with journalists after their presentation; I read the coverage from TechTechPotato, including a full transcript. Based on what AMD themselves said, RDNA4 isn't ready yet. It seems that regardless of what Nvidia did or didn't do, RDNA4 won't be ready for preview until later this year.

Rich

To be clear, Nvidia clearly labeled everything in all of their slides. Nvidia has never actually passed off frame-generated performance as native performance without clearly explaining it in writing on each document. Indeed, some of the slides Nvidia showed explicitly included games that don't support MFG, FG, or DLSS at all.

Bulbous_Bow

Hi crew. I understand from previous directs that a large proportion of your #content is watched on phones. However, I would be interested to know what the stats are on fellow DF-enjoyers consuming said content on their $10,000 custom liquid cooled entertainment centres? I’ll certainly be watching the next direct from mine, dressed to the nines in my crocodile skin jacket to mark the occasion

Samdenn

Hey founders! This is a two parter on the new framegen tech from nvidia: you guys spoke previously about frame gen tech hopefully evolving to the point where it could simply generate enough frames to fill the max refresh of the monitor. Well, my first question is - why didn’t they do this? It seems there will be some manual input required from the user, and it seems like possibly an easy win to implement such a feature which automatically determines how many frames to generate, especially given the new tech for frame pacing… Second question - do you think it’s possible to accidentally downgrade your experience if you choose the 4x multi frame gen when only 2 or 3 are required to get to your refresh? For example, if you’re running a game natively at say, 80fps on a 120hz monitor, and then enable 4x multi frame gen, wouldn’t you be effectively limiting your native fps to 30 and suffering the quality/latency penalty?

Dave Brown

Hi guys! Do you think this 'fake frames' hogwash will eventually run its course at some point? What will it take for that to happen? Or is the 'angry gamer' contingent here to stay?

zephyr

Hi DF team! Given multi frame generation and the larger embrace of dynamic rez on PC, will VRR be obsolete in the near future? Seems like the dream of running every game at your monitor’s maximum refresh rate could actually come true

AGSMA

Hi guys. The morning after CES I opened Bluesky and found Alex's posts rather concerning. I hope he's making a speedy recovery from that night. Lots of tea and a match of Impossible Creatures should do the trick.

Someguyperson

A week ago I saw that the indie game Antonblast turned a profit in less than a month of sales. It seems like only indie games and companies that are relatively stubborn in their ways (Nintendo, FromSoftware, etc.) are the ones turning a profit. It's also not really about meeting AAA graphical standards, as Capcom and Insomniac are able to generate a good amount of sales while still having some of the best graphics in the business. Why do all these "failing" companies keep setting unreasonable standards and then fail to hit them?

Someguyperson

Questions for you guys about the new LG TV lineup: Are the panels bright enough for Oliver now? How often do you use the input button on the remote and would the loss of said button be a deal breaker for you? (I just have each input mapped to long pressing a number on the remote) Finally, is there too much AI on these TVs?

Someguyperson

What's with AMD "announcing" the 9070 & 9070 XT without a price, release date, or any real specifications? Did AMD release the concept of a name of a GPU? How useless of a "First!" comment is this announcement?

Someguyperson

The new transformer model version of DLSS Super Resolution looks substantially better than the old model, but they did mention that it takes 4x more compute than the old CNN model. How feasible is this on older 20 and 30 series cards if it's that much more expensive? Would you say this is as good looking and as performant as DLAA, or is this heavier?

Someguyperson

When Nvidia presented a game running with 3 generated frames, I thought it looked kinda bad with a bunch of artifacting all over the place. How do you guys feel about it? I think it might only be worth it if you are already rendering at 60 FPS before frame-gen. That way these artifacts will only be a couple frames out of ~240.

Someguyperson

Is it just me, or do Nvidia's Neural Faces and that text to animation feature look absolutely dreadful?

Someguyperson

What do you make of the fact that all 4 cards Nvidia announced have 12 GB+ of memory? Also, the press release called out "enhanced compression designed to reduce memory footprint", which is something I predicted would come to these cards. DLSS Frame Generation also has a reduced memory footprint now. With all these features, do you think memory capacity will still be an issue on these cards?

Someguyperson

It seems like Nvidia is trying to create their own branch of all key UE5 features with "RTX Kit Technologies". RTX Mega Geometry = Nanite, RTXGI = Lumen, RTX Character Rendering = Metahumans, the list goes on & on. Why would a developer use these features over the Unreal implementations? Would a developer just have these RTX APIs as options to replace the UE5 featureset, or would someone build a game centered around these technologies?

Someguyperson

If you listened carefully, Jensen also said that the 5080 had 4090 levels of performance, so one of those statements is a lie (it's the 5070 comparison).

Someguyperson

Right now PSSR takes ~2ms of time to upscale the image. I would assume that DLSS takes a bit more compute to calculate, so let's call it 3 ms to calculate DLSS on a PS5 Pro as a rough estimate (it would be much harder to get that working in reality). According to Nvidia, the Transformer version of DLSS takes 4x the compute time of the CNN version, so it would take a PS5 Pro 12+ ms to simply upscale the image, which isn't useful at all.
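The estimate in the comment above is simple multiplication, and it's worth being explicit that every number in it is an assumption: the ~2ms PSSR figure is reported, the 3ms CNN-DLSS-on-PS5-Pro cost is the commenter's guess, and the 4x multiplier is Nvidia's stated compute increase for the transformer model. A sketch:

```python
# Back-of-the-envelope cost estimate (all inputs are rough assumptions,
# not measured figures).
pssr_ms = 2.0               # reported PSSR upscale time on PS5 Pro
dlss_cnn_ms = 3.0           # assumed cost of CNN DLSS on PS5 Pro-class hardware
transformer_multiplier = 4  # Nvidia's stated compute increase for the transformer model

transformer_ms = dlss_cnn_ms * transformer_multiplier
print(f"PSSR: {pssr_ms} ms, assumed CNN DLSS: {dlss_cnn_ms} ms, "
      f"estimated transformer DLSS: {transformer_ms} ms")
```

At a 60fps target (a ~16.7ms frame budget), 12ms of upscaling alone would leave almost nothing for rendering, which is the commenter's point.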

Someguyperson

There are some AM4 motherboards with PCIe 4 (I have one), but PCIe 3 might be a bridge too far for some of those higher-end cards.

Someguyperson

Nvidia said the Transformer model of Super Resolution was 4x more expensive in their own marketing materials. The only real way to check it out is by running it against older cards, which hasn't been done yet, but expect it to be heavy.

Someguyperson

Nintendo was never going to reveal their new console until after the holiday season, to not cannibalize sales in the biggest quarter of the year. They also don't want to compete with CES or any other big announcement, so I think they absolutely know what they're doing and they're just fine.

Richard Leadbest

But they’re throwing out the headline that a 5070 is as performant as a 4090 based on these metrics, which is definitely misleading.

Someguyperson

Well, Nvidia added that text in a very small font in grey against a grey background, while the bars on the chart that they wanted you to look at were in bright green. I think it's very deliberately trying to be misleading, and that's 100% on the marketing team. Jensen did mention on stage how the 5080 actually has 4090 levels of performance though, which is the actual estimation of performance. The 5070 should have 4070 Ti levels of performance.

Levander Davis

Consoles are more than just a GPU. AMD has shown that they can deliver a powerful CPU/GPU/bandwidth combo at an affordable price while also taking into account their partners' own hardware additions.

Richard Leadbest

Further, even if they’re labelling it, a lot of folks might not be aware that Multi Frame Generation comes with caveats. The graphs only include fps, not input lag.

Salman

Hello, how do you think multi frame gen is going to work in games with frame-pacing issues (FromSoft-likes) or animation jitter issues (Silent Hill 2, Star Wars, etc.)?

Salman

Do you think Jensen's jacket was generated by AI, and if so, does that model require more training? Also, who do you think Jensen's hanging out with, 10k liquid nitrogen setups? The 5090 is going to cost more than the rest of my new high-end setup combined (including a new chair)

1040STF

There's a full debate about the latency feel of DLSS 4, which I can understand: even upscaled to 240fps, a game that renders at 27fps internally will still feel like a 27fps game, which is not the best. My personal theory is that with these new Nvidia cards, we should be wise with AAA games and tweak their settings without DLSS until we reach a convincing 60fps (let's say it can fluctuate down to 50) and THEN use DLSS to upscale to whatever framerate we need. So we get both the fluidity and the responsiveness. What do you think?

kate

Nvidia claims that the 5070 has '4090 performance' but only when utilising the power of AI. Can we take this to mean only when using 4x framegen on the 5070 vs 2x on the 4090, and if so does that mean it's only roughly half the performance of the older card?

DaJaCo (Dan)

Happy half Fortnight Lads! - I have a couple of questions relating to display technologies emerging at CES. 1- With HDMI 2.2 supporting 4K@480Hz - it must naturally also support the equally divisible (32, 40, 48, 60, 80, 96, 120, 160, 240Hz). Is this likely to mean that the PS6 etc. might treat 48Hz as "the new 40Hz" and 80Hz as "the new 60Hz" or will frame-gen be so ubiquitous by then that fixed targets will be pure folly? 2- We all know Oliver loves screen brightness above all else, but if he can tolerate brief eyeballing sessions on those dim & dusky self-emissive panels, can he tell us which has better colour volume in dark scenes - the latest Samsung QD-OLEDs - OR Panasonic / LG's TANDEM OLED..?
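The refresh rates listed in the first question are exactly the divisors of 480 in the 32-240Hz range: each divides 480Hz evenly, so frames can repeat a whole number of times per refresh without judder. A quick sketch to verify the list (the function name is illustrative only):

```python
# Refresh rates that divide a panel's maximum rate evenly, so each frame
# persists for a whole number of refreshes (no judder from uneven cadence).
def even_divisor_rates(max_hz: int, lo: int = 32, hi: int = 240) -> list[int]:
    return [hz for hz in range(lo, hi + 1) if max_hz % hz == 0]

print(even_divisor_rates(480))  # -> [32, 40, 48, 60, 80, 96, 120, 160, 240]
```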

yogi

How concerning is the RTX 5080's 16GB of VRAM? It feels a bit low to me considering the 4070 Ti Super, 4080 and 4080 Super had 16GB as well.

VeryProfessionalDodo

Hey there, I come to you with a meta-question about artistic control and the progressive AI-fication of graphics. We have two sides of this camp. On one side you have Oliver's interview with Mark Cerny and Mike Fitzgerald, in which the latter states "I want to be able to control the PSSR version against what we test". On the other, we have Nvidia saying "for every one pixel the game renders, we can hallucinate up to 15 more, but it's a really good hallucination, we swear!", all while giving the user the option to forcefully override DLLs. This effectively eliminates any studio's ability to control image quality when using DLSS, since maybe two years from now a new tech will randomly appear that changes the way DLSS looks. Before DLSS4, I thought Nvidia's approach was better, but I'm now wondering whether we are straying too far from traditional rendering, and whether we're losing something in the process. What do you think?

VeryProfessionalDodo

We all know DLSS is excellent as it is, but during the DLSS4 presentation there were several times where I wondered whether what I was seeing was real. Not in a "wow, this looks insane" way, but more of a "I'm not entirely sure if this is real detail, or just a really convincing approximation of what it should be". For this reason, there is one test I would really like you guys to do when reviewing DLSS4. Could you take a 16K natively rendered screenshot and then compare it against a 4K image using DLSS4 Performance? And if the two images happen to be quite different, do you personally believe that artist intent matters more than getting a "good enough" approximation of detail at improved frame rates?

VeryProfessionalDodo

Speaking of personal experience, I have not had a single good experience with frame gen. With a 4070 it hit VRAM limits too often for it to matter, and when it didn’t, I really didn’t like the look of the fake frames, especially when it craps its pants around UI elements. It got to the point where I preferred a good, well-paced 40fps over an artifacty 70 to 80 fps. On top of that, a 5070 is probably going to be great for traditional workloads already, to the point where this halo feature would only be useful in path-traced games.

Sergio Martinez

Greetings gents! My question is for Oliver and Alex. What piece of tech, if any, did you see at CES that is flying under the radar and perhaps deserves a broader conversation?

VeryProfessionalDodo

You will get an improvement to your existing 4090 apparently, they improved the stability of frame gen in general

SplitScream

Will DLSS Multi Frame Gen stay exclusive to the 5000 series, or will it trickle down to last-generation cards?

Ryan Luker

I think I saw that they used the ol' "requires special hardware to work properly" trick (similar to 1x framegen on the 40 series), so probably doubtful? (I could be wrong though!)

Ryan Luker

I wonder if Nvidia has test suites where they render a set scene multiple times (once with their new DLSS4 and once without) and then do cross comparisons to decide how close the end results were to the "ground truth" version?

Ryan Luker

I feel like consoles will lean towards the conservative path of no dynamic upscaler swap-outs (maybe by default, but let the users pick a PSSR version?) while PC will be its usual DIY hacker self and allow you to do whatever.

Ryan Luker

I think the native 27fps is first upscaled via DLSS4, so maybe the framegen 4x is based off of ~60fps?

Ryan Luker

I was a bit confused here as well and couldn't find anything clarifying if the "4x more compute" was referencing the training requirement or the runtime requirement?

Ryan Luker

Most YouTubers released the material as-is, but I think they pulled the pricing bits after getting wind of Nvidia's pricing... Not a great look for sure.

Auro

Right now our top targets are 1440p 480Hz and 2160p 240Hz. After Reflex takes 6% for VRR bias, we are left with around 450Hz for 1440p and 225Hz for 2160p. Is a base framerate of 56fps high enough for responsive gameplay? I'm glad they are also offering 2x and 3x frame generation, because I think 3x could be the sweet spot for 2160p 240Hz!
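The numbers in the comment above can be reconstructed with two steps: shave the assumed VRR headroom off the panel rate, then divide by the frame generation multiplier to get the base framerate. A sketch (the 6% figure is the commenter's assumption, not an official spec, and the function name is illustrative):

```python
# Usable output rate after VRR headroom, and the base framerate a given
# frame generation multiplier implies at that output rate.
def framegen_budget(panel_hz: float, vrr_bias: float = 0.06,
                    multiplier: int = 4) -> tuple[float, float]:
    """Return (usable output Hz, implied base fps) for a framegen multiplier."""
    usable = panel_hz * (1 - vrr_bias)
    return usable, usable / multiplier

for hz in (480, 240):
    usable, base = framegen_budget(hz)
    print(f"{hz}Hz panel -> ~{int(usable)}Hz usable, ~{int(base)}fps base at 4x")
```

This reproduces the ~450Hz/~225Hz usable rates and the ~56fps base framerate at 4x on the 240Hz panel.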

Someguyperson

To quote Edward Liu in this presentation: https://youtu.be/qQn3bsPNTyI?si=dYVm2lkHNnJMwfOJ&t=202