Well I am shocked, SHOCKED I say! Well, not that shocked.

  • sp3ctr4l@lemmy.dbzer0.com · ↑45 · edited · 6 days ago

    In the US, a new RTX 5090 currently costs $2899 at NewEgg, and has a max power draw of 575 watts.

    (Lowest price I can find)

    … That is a GPU, with roughly the cost and power usage of an entire, quite high end, gaming PC from 5 years ago… or even just a reasonably high end PC from right now.

    The entire move to the realtime raytracing paradigm has let AAA game devs get very sloppy with development, not really bothering to optimize lighting or textures… which in turn has necessitated the invention of intelligent temporal upscaling and frame generation… and the whole, originally advertised point of all this was to make high-fidelity 4K gaming an affordable reality.

    This reality is a farce.

    Meanwhile, if you jump down to 1440p, well, I’ve got a future build plan sitting in a NewEgg wishlist right now.

    RX 9070 (220 W) + Minisforum BD795i SE (mobo + non-removable, high-end AMD laptop CPU with performance comparable to a 9900X, but about half the wattage draw)… so far my pretax total for the whole build is under $1500, and, while I need to double- and triple-check this, I think the math on the power draw works out to a 650 W power supply being all you’d need… potentially with enough room to also add in some extra internal HDD storage drives, i.e. you’ve got leftover wattage headroom.
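
    Back-of-the-envelope sketch of how I’m sanity-checking that PSU number (every part wattage below is a rough estimate, not a measured figure):

    ```python
    # Rough PSU sizing; every wattage here is a ballpark guess, not a spec sheet.
    parts = {
        "RX 9070 (board power)": 220,
        "BD795i SE (mobile CPU + board)": 120,  # estimated load, not measured
        "RAM, fans, NVMe": 40,
        "a couple of extra 3.5in HDDs": 20,
    }

    load = sum(parts.values())
    headroom = 1.3  # ~30% margin so the PSU isn't sitting near its limit
    print(f"estimated load: {load} W -> sized PSU: {load * headroom:.0f} W")
    # estimated load: 400 W -> sized PSU: 520 W, so a 650 W unit leaves spare headroom
    ```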

    If you want to go a bit over the $1500 mark, you could fit this all in a console sized ITX case.

    That is almost half the cost of the RTX 5090 alone, and will get you over 90 fps in almost all modern games at ultra settings at 1440p, though you will have to futz around with intelligent upscaling and frame gen if you want realtime raytracing as well at similar framerates, and realistically, probably wait another quarter or two for AMD driver support and FSR 4 to become a bit more mature and properly implemented in said games.

    Or you could swap out for maybe a 5070 (non-Ti, the Ti is $1000 more) Nvidia card, but seeing as I’m building a Linux gaming PC (you know, for the performance boost from not running Windows), AMD Mesa drivers are where you wanna be.

    • CybranM@feddit.nu · ↑2 ↓3 · 5 days ago

      The entire move to the realtime raytracing paradigm has let AAA game devs get very sloppy with development, not really bothering to optimize lighting or textures

      You clearly don’t know what you’re talking about here. Ray tracing has nothing to do with textures, and very few games force you to use RT. What is “allowing” devs to skimp on optimization (which is also questionable; older games weren’t perfect either) is DLSS and other dynamic resolution + upscaling tech.

      • lordbritishbusiness@lemmy.world · ↑5 · 5 days ago

        Doom: The Dark Ages is possibly what they’re referring to. id skipped traditional baked lighting in favour of ray tracing doing it.

        Bethesda also has a tendency to use HD textures on features like grass and terrain which could safely be low-res.

        There is a fair bit of inefficient code floating around because optimisation is considered more expensive than throwing more hardware at a problem, and not just in games. (Bonus points if you outsource the optimisation to someone else’s hardware or the modding community.)

        • sp3ctr4l@lemmy.dbzer0.com · ↑4 ↓1 · 5 days ago

          That is a prominent example of forced RT… basically, as I described with the TAA example in my other reply…

          idTech 8 seems to be the first engine that just literally requires RT for its entire render pipeline to work.

          They could theoretically build another version of it off a Vulkan base to let you turn RT off… but that would likely be a massive amount of work.

          On the bright side… at least the idTech engines are well coded; they clearly put a lot of time into making the engine genuinely good.

          I didn’t follow the marketing ecosystem for Doom: The Dark Ages, but it would have been really shitty if they did not make clear that ‘you need a GPU with RT cores’.

          On the other end of the engine spectrum:

          Bethesda… yeah, they have entirely lost control of their engine; it is a mangled mess of nonsense. The latest Oblivion remaster just uses UE to render things, slapped on top of Gamebryo, because no one at Bethesda can actually code worth a damn.

          Compare that to oh I dunno, the Source engine.

          Go play Titanfall 2. It’s a 10-year-old game now, built on a modified version of the Portal 2 Source engine.

          Still looks great, runs very efficiently, can scale down to older hardware.

          Ok, now go play HL Alyx. If you don’t have VR, there are mods that do a decent job of converting it into M+K.

          Looks great, runs efficiently.

          None of them use RT.

          Because you don’t need to, if you take the time to actually optimize both your engine and game design.

      • sp3ctr4l@lemmy.dbzer0.com · ↑6 ↓1 · 5 days ago

        I meant they also just don’t bother to optimize texture sizes, didn’t mean to imply they are directly related to ray tracing issues.

        Also… more and more games are clearly being designed, and marketed, with ray tracing in mind.

        Sure, it’s not absolutely forced on in that many games… but TAA often is forced on, because no one can run raytracing without temporal intelligent upscaling and frame gen…

        …and a lot of games just feed the pixel motion vectors from their older TAA implementations into the DLSS / FSR implementations, and don’t bother to recode the TAA into an optional pass that just provides the motion vectors without actually doing AA…

        … and they often don’t do that because they designed their entire render pipeline to only work with TAA on, and half the game’s post-processing effects would have to be recoded to work without TAA.

        So if you summarize all that: the ‘design for raytracing support’ standard is why many games do not let you turn off TAA.
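
        To make that coupling concrete, here’s a purely illustrative toy sketch (made-up engine code, not any real API): the motion vectors only exist as a byproduct of the TAA pass, so the temporal upscaler falls over the moment you switch TAA off:

        ```python
        import numpy as np

        # Toy pipeline: motion vectors are produced *inside* the TAA pass, so a
        # DLSS/FSR-style upscaler has nothing to work with if TAA is disabled.

        def taa_pass(frame, history):
            motion = frame - history                # stand-in for per-pixel motion vectors
            blended = 0.9 * history + 0.1 * frame   # the temporal blend that *is* the AA
            return blended, motion

        def temporal_upscale(frame, motion):
            if motion is None:
                raise RuntimeError("no motion vectors -> temporal upscaling can't run")
            return np.repeat(frame, 2)              # stand-in for the upscale step

        def render_frame(frame, history, taa_enabled=True):
            if taa_enabled:
                frame, motion = taa_pass(frame, history)
            else:
                motion = None                       # nothing else in the pipeline makes them
            return temporal_upscale(frame, motion)

        frame, history = np.ones(4), np.zeros(4)
        print(render_frame(frame, history))         # fine with TAA on
        # render_frame(frame, history, taa_enabled=False) raises, because the engine
        # was never written to supply motion vectors independently of TAA.
        ```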

        That being said: in many (not all, but many) situations, ray tracing only really makes a significant visual difference… if you have very high-res textures.

        If you don’t, older light rendering methods work almost as well, and run much, much faster.

        Ray tracing involves… you know, light rays, bouncing off of models, with textures on them.

        Like… if you have a car with a glossy finish that is reflecting the entire scene around it in its paint… well, if the reflection being added on top of the base car texture is very low-res, because it is generated from a world of low-res textures… you might as well just use the old cube-map method, or other methods, and not bother turning every reflective surface into a ray-traced mirror.
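
        Toy illustration of that point (not engine code): the cube map and the traced reflection both start from the exact same mirror-reflection direction; the only difference is what you sample with it, which is why low-res source material erases most of the advantage:

        ```python
        import numpy as np

        def reflect(d, n):
            """Mirror-reflect incoming direction d about surface normal n: r = d - 2(d.n)n."""
            n = n / np.linalg.norm(n)
            return d - 2 * np.dot(d, n) * n

        view_dir = np.array([0.0, -1.0, 1.0]) / np.sqrt(2)  # looking down onto the car paint
        normal = np.array([0.0, 1.0, 0.0])                  # paint surface facing up

        r = reflect(view_dir, normal)
        print(r)
        # The same direction r is then used by either technique:
        #  - cube map: sample a pre-baked environment texture along r (cheap, fixed detail)
        #  - ray tracing: trace a ray along r into the scene and shade whatever it hits,
        #    which is only as detailed as the textures that ray lands on.
        ```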

        Or, if you’re doing accumulated lighting in a scene with different colors of lights… that effect is going to be more dramatic, more detailed, more noticeable in a scene with higher-res textures on everything being lit.

        I could write a 60 page report on this topic, but no one is paying me to, so I’m not going to bother.

    • CheeseNoodle@lemmy.world · ↑20 · 6 days ago

      Saved up for a couple of years and built the best (consumer-grade) non-Nvidia PC I could: 9070 XT, 9950X3D, 64 GB of RAM. Pretty much top-end everything that isn’t Nvidia, without just spamming redundant RAM for no reason. The whole thing still costs less than a single RTX 5090, and on average it draws less power too.

        • CheeseNoodle@lemmy.world · ↑2 · edited · 6 days ago

          I tried Mint and Ubuntu, but Linux dies a horrific death trying to run newly released hardware, so I ended up on Ghost Spectre.
          (I also assume you’re being sarcastic, but I’m still salty about wasting a week trying various pieces of advice to make Linux goddamn work.)

          • bitwolf@sh.itjust.works · ↑5 · 6 days ago

            Level1Techs had relevant guidance:

            Kernel 6.14 or greater, Mesa 25.1 or greater.

            I don’t think Ubuntu and Mint have those yet, hence your difficult time.

      • sp3ctr4l@lemmy.dbzer0.com · ↑9 · edited · 6 days ago

        Yep, that’s gonna be significantly more powerful than my planned build… and likely somewhere between $500 and $1000 more expensive… but yep, that is how absurd this is: all of that is still less expensive than an RTX 5090.

        I’m guessing you could get all of that to work with a 750 W PSU, or 850 W if you also want a bunch of storage drives or a lot of cooling, but yeah, you’d only need that full wattage for running raytracing at 4K.

        Does that sound about right?

        Either way… yeah… imagine an alternate timeline where marketing and industry direction aren’t bullshit, where people actually admit things like:

        Consoles cannot really do what they claim to do at 4K… at actual 4K.

        They use checkerboard upscaling, so basically they’re running at 2K and scaling up, and it’s actually less than 2K in demanding raytraced games because they’re using FSR or DLSS as well. Oh, and the base graphics settings are a mix of what PC gamers would call medium and high, but consoles don’t show gamers real graphics settings menus, so they don’t know that.
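
        Rough pixel math on what that means per frame (internal resolutions vary per game; these are just the standard figures):

        ```python
        # Pixels actually shaded per frame, ballpark.
        native_4k = 3840 * 2160                    # ~8.3M pixels
        checkerboard_4k = native_4k // 2           # checkerboarding shades roughly half per frame
        native_1440p = 2560 * 1440                 # ~3.7M pixels, i.e. "2K-ish"
        fsr_quality_4k = int(native_4k / 1.5**2)   # FSR/DLSS Quality renders ~1/1.5 per axis

        for name, px in [("native 4K", native_4k), ("checkerboard 4K", checkerboard_4k),
                         ("native 1440p", native_1440p), ("4K via FSR Quality", fsr_quality_4k)]:
            print(f"{name:20s} {px / 1e6:4.1f} M pixels/frame")
        # "4K" via checkerboarding or FSR Quality ends up shading roughly 1440p's worth of pixels.
        ```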

        Maybe, maybe we could have tried to focus on just perfecting frames-per-watt and frames-per-dollar efficiency at 2K, instead of being baffled with marketing BS claiming we can just leapfrog to 4K, and, more recently, being told 8K displays make any goddamned sense at all, when in 95% of home setups, of any kind, they offer no physically possible perceptible gains.

        • CheeseNoodle@lemmy.world · ↑6 · 6 days ago

          1000W PSU for theoretical maximum draw of all components at once with a good safety margin. But even when running a render I’ve never seen it break 500W.

  • ItsMrChristmas@lemmy.zip · ↑3 · 4 days ago

    GPU prices are what drove me back to consoles. It was time to overhaul my PC as it was getting painfully out of date. The video card alone was gonna be $700. Meanwhile a whole-ass PS5 that plays the same games was $500.

    It’s been 2 years since and I don’t regret it. I miss mods, but not nearly as much as I thought. It’s also SOOO nice to play multiplayer games without cheaters everywhere. I actually used to be one of those people who thought controllers gave an unfair advantage, but… you can use an M/KB on PS5, and guess what? I do just fine! Turns out the problem was never controllers, it was the cheaters.

    But then there is that. The controller. Oh my lord it’s so much more comfortable than even the best gaming mouse. I’ve done a complete 180 on this. So many game genres are just so terrible to play with M/KB that I now tell people whining about controller players this:

    Use gaming equipment for gaming and leave office equipment in the office.

  • Deflated0ne@lemmy.world · ↑8 · 5 days ago

    Ah capitalism…

    Endless infinite growth forever on a fragile and very much finite planet where wages are suppressed and most money is intentionally funneled into the coffers of a small handful of people who are already so wealthy that their descendants 5 generations down the line will still be some of the richest people on the planet.

  • smeg@infosec.pub · ↑15 · 6 days ago

    Increasingly across many markets, companies are not targeting average or median consumers. They’re only chasing whales, the people who can pay the premium. They’ve decided that more mid tier customers aren’t worth it – just chase the top. It also means a lower need for customer support.

  • Mongostein@lemmy.ca · ↑1 · 4 days ago

    There are so many games out there I’d like to play, but I’m an adult with responsibilities. I don’t need the newest game or gaming hardware, because no matter how hard I try to catch up I never will, so I don’t bother to try, and I always have something to play on my current hardware.

  • moktor@lemmy.world · ↑7 · 5 days ago

    I’m still surviving on my RX 580 4GB. It’s limping along these days, but there’s no way I can justify the price of a new GPU.

    • RvTV95XBeo@sh.itjust.works · ↑8 · 5 days ago

      But with our new system we can make up 10x as many fake frames to cram between your real ones, giving you 2500 FPS! Isn’t that awesome???

      • Blackmist@feddit.uk · ↑4 · 5 days ago

        Bullshitted pixels per second seem to be the new currency.

        It may look smooth in videos, but 30fps upframed(?) to 120fps will still feel like a 30fps game.
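
        Quick toy arithmetic on why (the extra real frame of delay is the usual interpolation cost; exact numbers vary by implementation): frame generation multiplies how often the screen updates, not how often the game samples your input.

        ```python
        # Toy numbers: generated frames change the display interval, not the input interval.
        base_fps = 30
        gen_factor = 4                                     # 30 fps "upframed" to 120 fps

        display_interval = 1000 / (base_fps * gen_factor)  # 8.3 ms between shown frames
        input_interval = 1000 / base_fps                   # input still only lands every 33.3 ms
        interp_delay = 1000 / base_fps                     # interpolation waits for the next real frame

        print(f"frames shown every {display_interval:.1f} ms")
        print(f"input reflected every {input_interval:.1f} ms, plus ~{interp_delay:.0f} ms interpolation delay")
        ```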

        Modern TVs do the same shit, and it both looks and feels like ass. And not good ass.

        • el_bhm@lemm.ee · ↑3 · 5 days ago

          Points with a finger and laughs

          Look at that loser not using AI

  • simple@piefed.social · ↑49 ↓1 · 6 days ago

    Unfortunately gamers aren’t the real target audience for new GPUs; it’s AI bros. Even if nobody buys a 4090/5090 for gaming, they’re always out of stock because LLM enthusiasts and small companies use them for AI.

    • imetators@lemmy.dbzer0.com · ↑8 · 5 days ago

      Ex-fucking-actly!

      Ahahaha, gamers are skipping. Yeah, they are. And yet the 5090 is still somehow out of stock, no matter the price or the state of gaming. We all know big tech went all-in on AI, disregarding whether the average Joe wants it or not. The prices are not for gamers. The prices are for whales, AI companies and enthusiasts.

    • brucethemoose@lemmy.world · ↑3 · edited · 5 days ago

      The 5090 is actually kinda terrible for AI. It’s too expensive. It only just got support in PyTorch, and if you look at ‘normie’ AI bros trying to use them online, shit doesn’t work.

      4090 is… mediocre because it’s expensive for 24GB. The 3090 is basically the best AI card Nvidia ever made, and tinkerers just opt for banks of them.

      Businesses tend to buy RTX Pro cards, rent cloud A100s/H100s or just use APIs.

      The server cards DO eat up TSMC capacity, but insane 4090/5090 prices are mostly Nvidia’s (and AMD’s) fault for literally being anticompetitive.

  • WereCat@lemmy.world · ↑6 · 5 days ago

    The progress is just not there.

    I got an RX 6800 XT for €400 in May 2023, when it was already an almost 3-year-old card. Fast-forward to today: the RX 9060 XT 16GB costs more and is still slower in raster. The only things going for it are FSR4, a better encoder and a bit better RT performance, which I couldn’t care less about.

    • DefederateLemmyMl@feddit.nl · ↑2 · 5 days ago

      a bit better RT performance, which I couldn’t care less about.

      Yeah raytracing is not really relevant on these cards, the performance hit is just too great.

      The RX 9070 XT is the first AMD GPU where you can consider turning it on.

      • WereCat@lemmy.world · ↑1 · edited · 5 days ago

        But I wouldn’t turn it on and actually play with it even if I could, because I will always take the better performance.

        I’ve actually tried path tracing in CP2077 at max settings, running at native Steam Deck resolution and streamed from my PC to my Steam Deck OLED, and it could hold a locked 30 FPS fairly well (overclocked by 20% though). But the game looks absolutely horrible in motion with its terrible LOD, so no amount of RT or PT can save it. It looks dope for screenshots though. But that’s PT; RT is basically almost indistinguishable. And PT is many, many years away from being viable for the majority of people.

        https://www.youtube.com/watch?v=yNcYZ5l_c48

        (The game reports W10, but it was actually Fedora 42)

        • DefederateLemmyMl@feddit.nl · ↑2 · 4 days ago

          But I wouldn’t turn it on and actually play with it even if I could, because I will always take the better performance.

          Depends. In Cyberpunk I can get 90-100 fps at 1440p on ultra with raytracing on and FSR4 Quality (via OptiScaler). That is a very good experience IMO, to the point that I forget about “framerate” while playing.

          That’s on Windows though; on Linux the raytracing performance is rather worse for some reason and it slips below the threshold of what I find acceptable, so I go for 1440p native.

  • Alphane Moon@lemmy.world · ↑14 · 6 days ago

    It seems like gamers have finally realized that the newest GPUs by NVIDIA and AMD are getting out of reach, as a new survey shows that many of them are skipping upgrades this year.

    Data on GPU shipments and/or POS sales showing a decline would be much more reliable than a survey.

    Surveys can at times suffer from showing what respondents want to say rather than what they actually do.

    • MudMan@fedia.io · ↑8 · 6 days ago

      I mean, as written the headline statement is always true.

      I am horrified by some of the other takeaways, though:

      Nearly 3 in 4 gamers (73%) would choose NVIDIA if all GPU brands performed equally.
      
      57% of gamers have been blocked from buying a GPU due to price hikes or scalping, and 43% have delayed or canceled purchases due to other life expenses like rent and bills.
      
      Over 1 in 4 gamers (25%) say $500 is their maximum budget for a GPU today.
      
      Nearly 2 in 3 gamers (62%) would switch to cloud gaming full-time if latency were eliminated, and 42% would skip future GPU upgrades entirely if AI upscaling or cloud services met their performance needs.
      
      • GrindingGears@lemmy.ca · ↑4 · 6 days ago

        That last one is especially horrifying. You don’t own games when you cloud game; you simply lease them. We all know what that’s done for the preservation of games. Not to mention encouraging the massive amounts of shovelware that we get flooded with.

        • MudMan@fedia.io · ↑3 · 6 days ago

          I don’t know that cloud gaming moves shovelware in either direction, but it really sucks to see the percentage of people that don’t factor ownership into the process at all, at least on paper.

        • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 @pawb.social · ↑3 · edited · 6 days ago

          You don’t own games when you cloud game, you simply lease them.

          That’s also how it is with a game you purchased to play on your own PC, though. Unless you have it on physical media, your access could be revoked at any time.

      • xep@fedia.io · ↑10 · 6 days ago

        if latency were eliminated

        I’m sure we’d all switch to room temperature fusion for power if we could, too, or use superconductors in our electronics.

        • MudMan@fedia.io · ↑3 · 6 days ago

          That’s the problem with surveys, isn’t it? What counts as “latency being eliminated”? In principle it’d mean your streamed game responds as quickly as a local game, which is entirely achievable if your comparison point is running a 30 fps client on a handheld device versus streaming 60 fps gameplay from a much more powerful server. We can do that now.
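
          As a toy latency budget (every number below is a made-up-but-plausible placeholder, not a measurement), that comparison looks something like this:

          ```python
          # Hypothetical input-to-photon budgets in milliseconds; the shape of the
          # comparison is the point, not the exact values.
          local_30fps_handheld = {"render (30 fps)": 33.3, "display": 8.0}

          streamed_60fps = {
              "render (60 fps)": 16.7,
              "encode": 5.0,
              "network round trip": 15.0,
              "decode": 3.0,
              "display": 8.0,
          }

          for name, budget in [("local 30 fps handheld", local_30fps_handheld),
                               ("streamed 60 fps", streamed_60fps)]:
              print(f"{name:22s} ~{sum(budget.values()):.0f} ms")
          # Both land in the same ~40-50 ms ballpark, which is why the 30 fps handheld
          # comparison is winnable today; a 240 Hz desktop comparison is not.
          ```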

          But is that “latency free” if you’re comparing it to running something at 240 Hz on your gaming PC? With or without frame generation and upscaling? 120 Hz raw? 60 Hz on console?

          The question isn’t whether you can get latency-free; the question is at what point in that chain the average survey-answering gamer starts believing the hype about “latency-free streaming”.

          Which is irrelevant to me, because the real problem with cloud gaming has zero to do with latency.

      • Alphane Moon@lemmy.world · ↑1 · 6 days ago

        That’s why it’s best to focus on absolute unit shipment numbers/POS.

        If total units increased compared to the previous generation launch, then people are still buying GPUs.

          • Alphane Moon@lemmy.world · ↑1 · 6 days ago

            Shipments/POS don’t tell you anything about unfulfilled demand or “unrealized supply”.

            They are just the number of units shipped into the channel and sold at retail, respectively.

            These are the best data points we have to understand demand dynamics.

            Gamers are also a notoriously dramatic demographic that often doesn’t follow through on what it says.

    • snoons@lemmy.ca · ↑1 · 6 days ago

      It really depends on whether they hired a professional cognitive psychologist to write the survey for them. I doubt they did…

  • excral@feddit.org · ↑6 · 4 days ago

    For me it’s the GPU prices, the stagnation of the technology (most performance gains come at the cost of stupid power draw) and, importantly, being fed up with AAA games. Most games I’ve played recently were a couple of years old, indie titles, or a couple-of-years-old indie titles, and I don’t need a high-powered graphics card for that. I’ve been playing far more on my Steam Deck than my desktop PC, despite the latter having significantly more powerful hardware. You can’t force fun through sheer hardware performance.

  • EndlessNightmare@reddthat.com · ↑5 · 4 days ago

    I don’t buy every generation; I skip 1 if not 2. I have a 40xx-series card and will probably wait until the 70xx (I’m assuming series naming here) before upgrading.

  • monogram@feddit.nl · ↑7 · 6 days ago

    I downgraded from a GTX 1060 to a Ryzen 5000G iGPU; Terraria & Factorio don’t need much.