checks Team Fortress 2
500+ fps
checks PCSX2
60fps locked and stable
checks Elden Ring
…it works, we’ll put it that way
I’m good. What good is a $3k graphics card in a generation with no games?
His jacket annoys me more than it should.
Management always wearing the same thing in public appearances is fucking weird. Steve Jobs, Zuck, Todd Howard, probably more I just haven’t noticed. It’s like they’re trying to be cartoons.
Now that AMD is supposedly out of the high-end graphics card market, it seems like high-end GPUs are going to land in a fun price range.
What? When did they say that? I just bought a high-end AMD card and it’s wicked fast.
They said, about a month ago, that their 8000 series GPUs won’t have a high end option, only low and mid tier.
But rumours also say that RDNA 4 will be somewhat of a half-assed generation and that they’re putting resources into something maybe called “UDNA” instead, which will compete at the top end.
That’s crazy! But cards are so fast these days it doesn’t seem like you need a high-end model anyway. My machine with a 980 Ti can still run everything on Ultra, except for VR, which it runs on medium.
A lot of newer AAA games rely on ray tracing and AI upscaling to even be able to render properly at 1080p. This is their way of making up for developers’ shit hardware optimization and insistence on investing in diminishing returns for graphical realism.
I think a lot of games these days are keeping low-powered devices like the Steam Deck in mind. I think we’re heading for iGPU-only for gamers, and all the dGPU resources are going to go towards AI.
$3.3 billion seems like it should be more than fine, but I guess that’s why I’m not a greedy business CEO.
My Nvidia cards anytime anyone suggests I should “upgrade”.
I’m still rocking my Zotac 1060. I might as well ride that thing into the ground at this point.
I have a non-Ti 1080, and every subsequent generation seems to have been worse, with everything from firmware issues to actual fires.
I still have a computer with a 980 ti, and I haven’t played a single game that doesn’t run on ultra.
Unfortunately, screens have gotten larger lately.
Jensen can keep his GPUs; the way AAA gaming is going, I’ll be playing indies for the foreseeable future or playing on console. I have evolved into a patient gamer. It also has the advantage that if I want to see some theorycrafting, I won’t be swamped with YouTube suggestions about pronouns or diversity. I’m not starting to use an account; I’m not American, nor am I a conservative, Google, stop trying. I’m fine with different-looking people in my games telling different stories. Also, TLOUII was one of the greatest games of all time.
5060 will be $1999
That headline sounds expensive for us gamers.
this is the new normal… the only way to fight is to squeeze the most out of what you’ve got, don’t feed the troll
Not to worry, discrete GPUs will likely go away completely within a few generations.
With how well AMD APUs are doing in regular PCs and the Steam Deck, I see it as a way for developers to standardize their game optimization like they would on consoles.
That headline is not directed at us gamers.
I already have a 4090 and I’m going to buy a 5090 and there’s nothing any of you can do to stop me.
What if we put a bunch of Legos on the floor around your bed while you slept?
I would buy it from my bed using my phone and then summon my cats to bat the Legos away. I am unstoppable.
Random tangent: I’d rather buy a nicer HDR monitor with that money. HDR gaming is where it’s at. SDR looks like washed-out, dim dogshit to me now.
I offer an alternative to people who want nicer graphics: consider an OLED HDR monitor. HDR has like no performance impact but looks 10,000x better.
What’s considered a good “hdr” monitor?
Preferably an OLED monitor (or TV). Either wOLED or QD-OLED. Both have strengths and weaknesses. I have a wOLED.
If not that, a high end miniLED.
I’m using a 55 inch LG C4 as a monitor.
Then buy a bigger and newer TV. Do you have any screen burn-in? We have burn-in on our LG 65" OLED from playing Diablo 3.
I just replaced my old 48 inch C1 with this TV because of burn in. Was able to return it to Costco and get all the money back and put it towards this new one.
Damn, you got lucky! They told us to contact the manufacturer. They didn’t do anything. So we called the warranty company for the extended warranty that Costco sold us and they just refunded the warranty fee. Now we just deal with it. We paid $3300 for this thing, so we’re in no hurry to replace it. The same size is only $2k now, but that’s still a lot of cheddar. We have the 65" C6.
Anyone who has a 4090 more than likely already has an HDR monitor.
Otherwise, what are they doing?
I have a 4090 and my display supports HDR, and it’s disabled because fuck HDR. Do people actually like getting blasted with a floodlight because HDR automatically maxes your screen brightness? Like, fuck, I think my screen with HDR enabled is brighter than my lamp.
That’s not at all how HDR works.
You don’t have it configured correctly. That’s not what HDR is supposed to do. Or you have a fake HDR monitor.
That’s the default behaviour on Windows 11; I have an MSI Titan laptop that definitely supports it. I’ve done zero configuration. My brightness is typically at or near minimum, and enabling HDR immediately and consistently maxes brightness across the majority of the screen. It’s utterly unusable. It goes along with the specs people say matter, like 1000 nits of brightness or whatever, when I actually want like 300 so my eyes don’t start bleeding.
I’ll say it again. You have it misconfigured. That isn’t what HDR is supposed to do. You either have settings wrong in Windows, or in the games.
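For what it’s worth, the HDR10 signal itself encodes absolute brightness via the PQ (SMPTE ST 2084) transfer function, so a dark settings menu should decode to a fraction of a nit, not full panel brightness. Here’s a minimal Python sketch of the standard PQ EOTF, constants straight from the spec; it’s just to illustrate what the signal means, not a claim about how Windows actually tone-maps it:

```python
# PQ (SMPTE ST 2084) EOTF: maps a normalized HDR signal value in [0, 1]
# to absolute luminance in nits (cd/m^2).
# Constants are the standard ones from the ST 2084 spec.

M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69
PEAK_NITS = 10000.0      # PQ encodes brightness up to 10,000 nits

def pq_eotf(signal: float) -> float:
    """Decode a normalized PQ signal (0..1) to luminance in nits."""
    e = signal ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return PEAK_NITS * y ** (1 / M1)

if __name__ == "__main__":
    # A dark UI pixel (~10% signal) decodes to well under 1 nit,
    # while only signals near 1.0 approach the 10,000-nit ceiling.
    for s in (0.1, 0.5, 0.75, 1.0):
        print(f"signal {s:.2f} -> {pq_eotf(s):8.1f} nits")
```

Run it and you’ll see a 50% signal is only ~92 nits, and you don’t hit 1000 nits until roughly 75% signal. If a dark menu is maxing out your whole screen, something upstream (the OS’s SDR-to-HDR mapping or the panel’s tone mapping) is misbehaving, not HDR itself.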
That is default Windows; if it’s misconfigured, then it ships misconfigured. I’ve done nothing except click the slider that turns it on, and then five seconds later off again as the dark-mode settings menu blasts me with the full force of the screen.
If the default doesn’t work, I don’t really feel like repeatedly blasting my eyes with full brightness until I figure out how to make it not burn out my retinas.
I think the issue is that HDR requires the R part, the range, and I want it not to blast my eyes, which anything above 20% brightness does if I’m indoors. The default probably uses 100% of the brightness range, and that actually causes pain.
Edit: to be clear, I never made it to a game. Getting flashbanged by the settings menu was bad enough, thanks; I don’t need to deal with something intended to be a flashbang.
This is extremely based
Not just an AI company, then.
It is; don’t use Nvidia cards for gaming, the p2p isn’t worth it.