Hard to believe it’s been 24 years since Y2K (2000). It feels like we’ve come such a long way, but this decade started off very poorly with one of the worst pandemics the modern world has ever seen, and technology in general is looking bleak in several ways.

I’m a PC gamer, and it looks like things are stagnating massively in our space. So many gaming companies are incapable of putting out a successful AAA title because people are either too broke or don’t want yet another live-service AAA disaster like nearly every one released lately: Call of Duty, Battlefield, and almost anything Electronic Arts or Ubisoft puts out either fails outright or undersells. So many gaming studios have been and are being shuttered, and Microsoft is basically one member of an oligopoly with Sony and a couple of other companies.

Hardware is stagnating. Nvidia is putting the brakes on developing its next line of GPUs; we’re not going to see huge gains in performance anymore, because AMD isn’t caught up yet and Nvidia has no reason to innovate. So they are just going to sell the top cards of their next line for $1,500 a pop, with 10% increase in performance rather than 50 or 60% like we really need. We still don’t have the capability to play games in full native 4K 144 Hertz. That’s at least a decade away.

Virtual reality is on the verge of collapse. Meta is basically the only real player in that space, holding a near-monopoly alongside the Valve Index; Pico from China is on the verge of developing something incredible as well, and Apple just revealed a mixed-reality headset, but the price is so extraordinary that barely anyone has it, so use isn’t very widespread. We’re again a decade away from seeing anything really substantial in terms of performance.

Artificial intelligence is really, really fucking things up in general, and the discussions about AI look almost as bad as the news about the latest election in the USA. It’s so clowny, ridiculous, and over-the-top hearing any news about AI. The latest is that OpenAI is going to go from a non-profit to a for-profit company, after they promised they were operating for the good of humanity and broke countless laws stealing copyrighted material, supposedly for the public good. Now they’re just going to snap their fingers and morph into a for-profit company. So they can basically steal anything copyrighted, claim it’s for the public good, and then swap to a for-profit model whenever they like. It doesn’t make any sense, and it just looks like they’re going to be a vessel for widespread economic poverty…

It just seems like there are a lot of bubbles about to burst all at the same time, and I don’t see how things could possibly get better for a while now.

  • frezik@midwest.social · +54/-1 · 2 days ago

    . . . with 10% increase in performance rather than 50 or 60% like we really need

    Why is this a need? The constant push for better and better has not been healthy for humanity or the planet. Exponential growth was always going to hit a ceiling. The limit on Moore’s Law has been more to the economic side than actually packing transistors in.

    We still don’t have the capability to play games in full native 4K 144 Hertz. That’s at least a decade away

    Sure you can, today.

    So many gaming companies are incapable of putting out a successful AAA title because . . .

    Regardless of the reasons, the AAA space is going to have to pull back. Which is perfectly fine by me, because their games are trash. Even the good ones are often filled with microtransaction nonsense. None of them have innovated anything in years; that’s all been done at the indie level. Which is where the real party is at.

    Would it be so bad if graphics were locked at the PS4 level? Comparable hardware can run some incredible games from 50 years of development. We’re not even close to innovating new types of games that can run on that. Planet X2 is a recent RTS game that runs on a Commodore 64. The genre didn’t really exist at the time, and the control scheme is a bit wonky, but it’s playable. If you can essentially backport a genre to the C64, what could we do with PS4 level hardware that we just haven’t thought of yet?

    Yeah, there will be worse graphics because of this. Meh. You’ll have native 4K/144Hz just by nature of pulling back on pushing GPUs. Even big games like Rocket League, LoL, and CS:GO have been doing this by not pushing graphics as far as they can go. Those games all look fine for what they’re trying to do.

    I want smaller games with worse graphics made by people who are paid more to work less, and I’m not kidding.

    • sugar_in_your_tea@sh.itjust.works · +15 · 2 days ago

      None of them have innovated anything in years

      Well, they’ve innovated new ways to take up disk space…

      There’s a reason I don’t play new release AAA games, and it’s because they’re simply not worth the price. They’re buggy at launch, take up tons of disk space (with lots of updates the first few months), and honestly aren’t even that fun even when the bugs are fixed. Indie games, on the other hand, seem to release in a better state, tend to be fairly small, and usually add something innovative to the gameplay.

      The only reason to go AAA IMO is for fancy graphics (I honestly don’t care) and big media franchises (i.e. if you want Spiderman, you have to go to the license holder), and for me, those lose their novelty pretty quickly. The only games I buy near release time anymore are Nintendo titles and indie games from devs I like. AAA just isn’t worth thinking about, except the one or two each year that are actually decent (i.e. Baldur’s Gate 3).

    • Dkarma@lemmy.world · +13 · edited · 2 days ago

      This post really nails my take on the issue. Give me original CS-level graphics or even AQ2 graphics, a decent story, more levels, and a few new little gimmicks (Rocket Arena grappling hook, anyone?!), and you don’t need 4K blah blah bullshit.

      The #1 game for kids is literally Minecraft or Roblox… 8-bit-level gfx outselling your horse-armor hi-res bullshit.

      The last game I bought was two days ago: MoHAA Airborne for PC, $5 at a pawn shop. Give me 100 games of this quality instead of anything the PS5 ever made.

      • solomon42069@lemmy.world · +13 · edited · 2 days ago

        Here are the number of hours I’ve spent on indie games VS AAA titles, according to my Steam library:

        • Indie - Valheim - 435 hours
        • Indie - Space Haven - 332 hours
        • Indie - Satisfactory - 215 hours
        • Indie - Dyson Sphere Program - 203 hours
        • AAA - Skyrim - 98 hours
        • AAA - Control - 47 hours
        • AAA - Far Cry 6 - 29 hours
        • AAA - Max Payne 3 - 43 minutes

        If we’re talking about value - the amount of playtime I’ve gotten out of games with simpler graphics and unique ideas blows the billions spent by the industry out of the water.
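        Spelling out the arithmetic on those numbers (a quick sketch; the titles and hours are just the figures listed above):

```python
# Playtime figures from the Steam library listed above (hours).
indie_hours = {
    "Valheim": 435,
    "Space Haven": 332,
    "Satisfactory": 215,
    "Dyson Sphere Program": 203,
}
aaa_hours = {
    "Skyrim": 98,
    "Control": 47,
    "Far Cry 6": 29,
    "Max Payne 3": 43 / 60,  # 43 minutes
}

indie_total = sum(indie_hours.values())
aaa_total = sum(aaa_hours.values())
print(f"indie: {indie_total} h, AAA: {aaa_total:.1f} h, "
      f"ratio: {indie_total / aaa_total:.1f}x")
# → indie: 1185 h, AAA: 174.7 h, ratio: 6.8x
```

        Roughly 1,185 hours of indie play versus about 175 of AAA: nearly a 7:1 ratio in this one library.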

        • sugar_in_your_tea@sh.itjust.works · +5 · 2 days ago

          Depending on where you draw the line, mine looks similar:

          1. EU4 - >800 hours
          2. Cities Skylines - ~180 hours
          3. Magic: Arena - >100 hours
          4. Crusader Kings 2 - ~100 hours

          After that it depends on the length of the game. I normally just play through the campaign on most games once (except the above, which have lots of replay value), so looking at playtime isn’t particularly interesting IMO. The ratio of games with interesting playtime (i.e. I probably rolled credits) between indie and AAA is easily 2:1, if not something way higher like 5:1 or even 10:1, but again, that really depends on where you draw the line. If we look at 100% completion, I have 22 indie games and zero AAA games, because I rarely find AAA games to be worth going after achievements in. If I sort by achievement completion, the top two AAA games are Yakuza games (I love that series), and that’s after scrolling through dozens of indies, many of which have a fair amount of achievements (i.e. you need to do more than just roll credits).

          So yeah, AAA games really don’t interest me. If you compare the amount I’ve spent on indie vs AAA games, it would be a huge difference since I pretty much only play older AAA games if I get them on sale, and that’s mostly so I can talk about them w/ friends…

    • gandalf_der_12te@lemmy.blahaj.zone · +2 · 2 days ago

      I want smaller games with worse graphics made by people who are paid more to work less, and I’m not kidding.

      I agree. Wholeheartedly. I think it’s just so obvious how quality dramatically takes off when the people creating it feel safe, sound, and economically stable. Financial Security (UBI) drives creativity probably more than anything else. It’s a huge win!

    • barsoap@lemm.ee · +2 · edited · 2 days ago

      The limit on Moore’s Law has been more to the economic side than actually packing transistors in.

      The reason those economic limits exist is that we’re reaching the limit of what’s physically possible. Fabs are still squeezing more transistors into less space, for now, but the cost per transistor hasn’t fallen for some time; IIRC somewhere around 10nm is still the most economical node. Things just get difficult and exponentially fickle the smaller you get, and at some point there’s going to be a wall. Of note, currently we’re talking more about things like backside power delivery and die-on-die packaging than actually shrinking anything.

      Long story short: node shrinks aren’t the low-hanging fruit any more. They haven’t been since the end of planar transistors (if it had been possible to just shrink back then, they wouldn’t have engineered FinFETs), but it’s really been picking up speed with the start of the EUV era. Finer and finer pitches don’t really matter if you need more and more lithography/etching/coating steps because the structures you’re building are getting more and more involved in the z axis; every additional step costs additional machine time. On the upside, newer production lines can spit out older nodes at pretty much printing-press speed.
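      The “most economical node” point can be sketched with a toy calculation. The numbers below are purely illustrative assumptions, not real fab pricing, but they show how a rising wafer cost can cancel out density gains:

```python
# Hypothetical relative figures per node: (transistor density, wafer cost),
# both normalized to 28nm = 1.0. Illustrative assumptions only.
nodes = {
    "28nm": (1.0, 1.0),
    "10nm": (6.0, 2.5),
    "5nm": (15.0, 6.5),
    "3nm": (20.0, 9.5),
}

# Relative cost per transistor = wafer cost / transistor density.
cost_per_transistor = {
    name: cost / density for name, (density, cost) in nodes.items()
}
for name, cpt in cost_per_transistor.items():
    print(f"{name}: {cpt:.3f}")

# With these made-up inputs, cost per transistor bottoms out mid-range
# and rises again at the leading edge.
cheapest = min(cost_per_transistor, key=cost_per_transistor.get)
print("most economical node:", cheapest)  # → 10nm
```

      Once density improvements no longer outpace wafer-cost growth, shrinking further makes each transistor more expensive, which is the economic wall being described.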

  • LordCrom@lemmy.world · +16 · 2 days ago

    I would love to have a VR headset that didn’t require a damn account with a third party just to use it. I don’t need an account for my monitor or my mouse. Plus, when I bought the thing it was just Oculus; then Meta bought it and promised nothing would change, before requiring a Meta account to use the fucking thing.

    • Buttflapper@lemmy.world (OP) · +2 · 2 days ago

      That, unfortunately, is the consequence of letting a company have a monopoly. The US govt should’ve opposed that and should’ve forced them to sell it off. They own such a huge share of the entire VR market right now it’s unbelievable, and Pico by ByteDance can’t legally be sold in the USA.

    • capital@lemmy.world · +1 · 2 days ago

      If I get back into it, I’ll probably try out Bigscreen. I haven’t dug deep enough into it to know if it requires an account but I wouldn’t expect this one to require it.

  • Jolteon@lemmy.zip · +9 · 2 days ago

    I agree with you on the GPU hardware and AI bubbles, but I’m not sure I would consider VR/AR to be a bubble right now. The hype has mostly died down by now, and I think it’s stabilized to the point where it will remain until we have new advances in hardware.

    • Buttflapper@lemmy.world (OP) · +2/-1 · 2 days ago

      VR is on the verge of collapse in the USA thanks to the US government banning ByteDance. We can’t even order the new Pico 4 Ultra, which is one of the most anticipated VR headsets in the world right now. Meta basically has a monopoly and just announced they’re cutting funding to VR.

      • tee9000@lemmy.world · +3 · 2 days ago

        Sorry, but a new Pico headset wouldn’t do much of anything. A new Meta headset or a new Valve headset would give a bump.

        It really needs better content. The hardware is almost there (in terms of cost and accessibility of the experience).

        It’s slowly getting there. But the current population of VR users is limited to people willing to play the same limited experiences consistently, with hardware that is often cumbersome and loading screens that aren’t super long but become your entire existence, and it’s annoying.

        Meta sucks, but they have been a boon for VR development.

  • Telorand@reddthat.com · +208/-4 · 3 days ago

    I’m a PC gamer, and it looks like things are stagnating massively in our space.

    I would like to introduce you to the indie game scene. Where AAA is faltering, indie has never been in a better place.

    Overall, I don’t see things the way you see them. I recommend taking a break from social media, go for a walk, play games you like, and fuck the trajectory of tech companies.

    Live your life, and take a break from the doomsaying.

    • Lvxferre@mander.xyz · +53 · 3 days ago

      I would like to introduce you to the indie game scene. Where AAA is faltering, indie has never been in a better place.

      Amen.

      Indie games might not be flashy, but they’re often made with love and a real concern for giving you a fun experience. They also lack all the abusive DRM and intrusive anti-cheat systems that A³ games often have.

      • Rob Bos@lemmy.ca · +26/-1 · 3 days ago

        They also tend to have linux support. Where the AAA companies want to eat the entire mammoth and scorn the scraps, small companies can thrive off of small prey and the offal. :)

          • Lvxferre@mander.xyz · +10 · 3 days ago

            It’s a great analogy though - Linux users aren’t deemed profitable by the A³ companies, just like offal is unjustly* deemed yucky by your typical person.

            *I do love offal, though. And writing this comment made me crave chicken livers with garlic and rosemary over sourdough bread. Damn.

            • sugar_in_your_tea@sh.itjust.works · +3 · 2 days ago

              Idk, I’ve spent way more on games since Valve came to Linux. I was a Linux user first, and mostly played games on console because I didn’t like rebooting into Windows or fiddling w/ WINE, so if I played games, it’s because it had Linux support (got a ton through Humble Bundle when they were small and scrappy). When Steam came to Linux, I created an account (didn’t have one before) and bought a bunch of games. I bought Rocket League when the Steam Controller and Steam Deck launched (was part of a bundle), and when Proton launched, I bought a ton of Windows games.

              So at least for me, I’ve easily spent 100x what I would’ve spent on video games due to Steam supporting Linux. That said, there are easily 50 other people spending more than me on Windows for every one of me, so I get that Linux isn’t a huge target market. But I will spend more on an indie game if it has native Linux support.

      • Telorand@reddthat.com · +16 · 3 days ago

        And I’ll add on to that, even if every GPU company stops innovating, we’ll still have older cards and hardware to choose from, and the games industry isn’t going to target hardware nobody is buying (effectively pricing themselves out of the market). Indie devs especially tend to have lower hardware requirements for their games, so it’s not like anyone will run out of games to play.

    • dinckel@lemmy.world · +39 · 3 days ago

      Genuinely wish more people understood this. I’ve mostly only been playing indie games for the past few years, and it’s by far the best fun I’ve had in gaming. There are a ton of unbelievably creative, unique games out there. Not to mention that 99% of them are a single-purchase experience instead of a cash treadmill.

    • GBU_28@lemm.ee · +19 · edited · 3 days ago

      Hello indie gamer, it’s me, you, from the future.

      I’d like to introduce you to PATIENT indie gaming.

      The only games I play are small-team, longer-running, and well documented; the developers are passionate, mods exist, and they can run on a potato or a Steam Deck.

      Because I’m patient, I don’t ever get preorder, Kickstarter, prealpha disappointed.

      I know exactly what I’m getting, I pay once, and boom, I own a great game for ever. (You can more often fully DL indie games)

        • GBU_28@lemm.ee · +6 · 3 days ago

          Bro I’m from the future you can’t ask me stuff like that, be patient, you’ll figure it out

    • Shadywack@lemmy.world · +2 · 2 days ago

      I love this, and I’ll even one-up it. Let the bubbles burst; this is just a transitional period, a predictable cycle in tech. The dot-com burst was like a holocaust compared to this shit. Everyone who was in the tech scene before Google has an easier time with this. We can comfortably watch FAANG recede, and even be grateful for it. Let it happen.

    • EnderMB@lemmy.world · +8 · edited · 3 days ago

      My only fear with the indie gaming industry is that many of them are starting to embrace the churn culture that has led AAA gaming down a dark path.

      I would love an app like Blind that allows developers on a game to anonymously call out the grind culture of game development, alongside practices like firing staff before launch and removing workers from the credits. Review games solely on how the devs treated their workers, and we might see some cool correlations between good games and good culture.

      • Telorand@reddthat.com · +8 · 3 days ago

        There’s certainly room to grow with regard to workers’ rights. I think you could probably solve at least a few of them if they were covered by a union, and publishers who hire them would have to bargain for good development contract terms.

    • scarabic@lemmy.world · +6/-2 · edited · 3 days ago

      Gaming now is more amazing than ever, in part because we have access to classic games too. If someone thinks gaming was amazing 10 years ago, cool. We still have those games! I’m playing a really old game right now myself and loving it.

      I think OP misreads this whole bubble-bursting thing. When a phenomenon passes out of its early explosive growth phase and settles into more of a steady state, that’s not the bubble bursting; that’s maturity.

      Tech as a whole is now a more mature industry. Companies are expected to make money, not revolutionize the world. OP would have us believe this means that tech is over. How does the saying go? It’s not the beginning of the end, but it is perhaps the end of the beginning.

      • frezik@midwest.social · +3 · 2 days ago

        Companies are expected to make money, not revolutionize the world

        I’d like to believe that, but I don’t think investors have caught on yet. That’s where the day of reckoning will come.

        AI is a field that’s gone through boom and bust cycles before. The 1960s were a boom era for the field, and it largely came from DoD money via DARPA. This was awkward for a lot of the university pre and post grads in AI at the time, as they were often part of the anti-war movement. Then the anti-war movement starts to win and the public turns against the Vietnam war. This, in turn, causes that DARPA money to dry up, and it’s not replaced with anything from elsewhere in the government. This leads to an AI winter.

        Just to be clear, I like AI as a field of research. I don’t at all like what capitalism is doing with it. But what did we get from that time of huge AI investment? Some things that can be traced directly back to it are optimizing compilers, virtual memory, Unix, and virtual environments. Computing today would look entirely different without it. We may have eventually invented those things otherwise, but it would have taken much, much longer.

      • sugar_in_your_tea@sh.itjust.works · +1 · 2 days ago

        I’m playing a really old game right now myself and loving it.

        Same. I’m slowly working my way through the Yakuza series (started w/ Yakuza 0), and I’m currently halfway through Yakuza 3, which was released in 2010. I play them about a year or two apart because I get kinda burned out near the end.

        I have way more games than I can reasonably play, and my wishlist of games I want to play is still unreasonably big. There’s no way I’m running out of interesting games to play anytime soon. And I haven’t really gotten into emulation either, so these are purely PC titles that I’m still trying to catch up on.

        Companies are expected to make money, not revolutionize the world

        Exactly. There’s a clear reason why Warren Buffett still owns a massive stake in Coca-Cola, and it’s not because they’re a hot young startup. Tech hardware is fantastic, and honestly, most people really don’t need big improvements year over year. I think game devs can do a lot more with the hardware we already have, so we should be looking at refining the HW we have (small improvements in performance, larger improvements in power efficiency and reduction in die size to improve margins). Likewise for desktop and cloud software, a round of optimizations would probably yield better gains than hardware revisions.

        I’m excited to see VR headsets get cheaper and more ubiquitous (i.e. I think something like the Valve Index could be done for half the price), handheld PCs like Steam Deck getting better battery life, etc.

    • RubberDuck@lemmy.world · +5 · 3 days ago

      Plenty of good games out there; even in early access I have found some real gems. Just recently Coffee Stain released Satisfactory, a labor of love, and it shows. I recently tried Bellwright; it’s impressive, and so is Manor Lords.

      And hardware stagnating also means that people get to learn what it’s all about and optimize for it. The last gen games on a console are usually also better optimized than the first series of games on a platform. So yeah…

  • tee9000@lemmy.world · +16/-4 · edited · 2 days ago

    I really, truly suggest diversifying to news feeds without comment sections, like Techmeme, for a bit.

    Increasing complexity is overwhelming, and there’s plenty of bad shit going on, but a lot of your post is overblown.

    Sorry for the long edit: I personally felt an improvement in my mental health when I did this for six months or so. Because seriously, whatever disinformation is happening in American news is so exhausting. We need to think whatever we want and then engage with each other once our thoughts are more individualized. Don’t be afraid to ask questions that might seem like you’re questioning some holy established Lemmy/Reddit consensus. If you’re being honest about your opinions and aren’t afraid to look dumb, you’re doing the internet a HUGE service. We need more dumb questions and vulnerability to combat the obsession with appearing as an expert. So thank you for making a genuine post of concern.

  • Blackmist@feddit.uk · +26 · 3 days ago

    COVID also inflated a lot of tech stock massively, as everybody suddenly had to rely a lot more on it to get anything done, and the only thing you could do for entertainment was gaming, streaming movies, or industrial quantities of drugs.

    Then that ended, and they all wanted to hold onto that “value”.

    It is a bubble, but whether it pops massively like in 2000, or just evens off to the point where everything else catches up, remains to be seen.

    “The markets can remain irrational longer than you can remain solvent” are wise words for anyone thinking of shorting this kind of thing.

    • Buttflapper@lemmy.world (OP) · +4/-2 · 2 days ago

      Shows that you are in the UK. Just want to clarify that I’m talking specifically about the USA, but I agree with everything you said. Tech stocks became so inflated! I don’t know if people are seeing it in Europe, but here in the USA there is this really toxic, cringe push from tech companies to get people back to the office. They can force people to return to office across the country; you basically have to relocate and upend your entire life, which could cost you $50,000 that they’re not paying for, and if you don’t, you get fired. It’s an easy way to start laying people off without paying them anything, because you can call it insubordination when they refuse to return to the office. Now they supposedly have cause to get rid of people or deny them promotions for more money. IBM, for example, is doing this right now, and Cisco, one of the biggest networking companies in the market, was doing it as well. Scumbag behavior.

  • madjo@feddit.nl · +41/-4 · 3 days ago

    We still don’t have the capability to play games in full native 4K 144 Hertz.

    And we really don’t need that. Gameplay is still more important than game resolution. Most gamers don’t even have hardware that would allow that type of resolution.

    • XIIIesq@lemmy.world · +23 · edited · 3 days ago

      I remember when running counter strike at 30fps on a 480p monitor meant you had a good computer.

      Modern graphics are amazing, but they’re simply not required to have a good gaming experience.

    • Buttflapper@lemmy.world (OP) · +4/-10 · 2 days ago

      Gameplay is still more important than game resolution

      In your opinion*. You forgot that part. For lots of people, graphics are way more important because they want a beautiful, immersive experience, and they are not wrong to want that. I respect that you feel the way you do, but I also respect people who care more about graphics. I’ll even go so far as to say I’m of the same mind as you; I don’t care much about graphics at all, but some games have truly wowed me with their graphics or visual effects. Two that come to mind are Ori and the Will of the Wisps and No Man’s Sky: very different games, but absolutely crazy visual effects and graphics on high-end computers. Another game I play a lot is World of Warcraft; the gameplay is so damn fun, but it’s hard to get any of my friends to play it because it’s so ugly, like a poorly rendered PS3 game. That horrible quality of graphics prevents people from even trying it.

      Most gamers don’t even have hardware that would allow that type of resolution.

      This is because they refuse to innovate. Think of the DVD player: you think a DVD player costs a lot today? Of course not; there are a million of them and no one wants them anymore. If GPU makers actually innovated and made drastic leaps in technology, older technology would get cheaper. It’s not expensive to go out and get an RTX 2080, which is the graphics card I currently have; it’s about $250-300 now, a pretty damn solid card. If they actually kept pushing the limits, technology would accelerate faster. Instead they want the inverse: growth as slow as feasibly possible, maximum time per innovation, maximum revenue, and maximum impact on the environment from all the carbon emissions and graphics cards being thrown out.

      • HobbitFoot@thelemmy.club · +4 · 2 days ago

        If graphics were worth it, people would pay for it.

        The fact of the matter is that exponential graphics capability requires an exponential input of developer and asset-creator budget. Given that there is a ceiling on game prices, it isn’t worth going for higher-fidelity games when the market isn’t going to pay for them.

      • madjo@feddit.nl · +4 · 2 days ago

        You can have the most realistic graphics in the world, pushing your AMViditel RTX 5095Ti Plus Platinum Ultra with 64TB of VRAM to its absolute maximum, but if the gameplay sucks, you won’t have as much fun as you would with a pixel-art indie game with lots of fun gameplay.

  • solomon42069@lemmy.world · +15 · edited · 2 days ago

    My biggest gripe with big tech is how governments of the world encourage their worst behaviours. Governments and businesses have failed to maintain their own level of expertise and understanding of technology.

    Today everything relies on tech, but all the solutions are outsourced and rely on “guidance” and free handouts from vendors like Microsoft. This has caused situations where billions are poured into digital transformation efforts with fuck-all to show for it but administrative headaches, ballooning costs, and security breaches.

    I’m so tired of Silicon Valley frat boys being the leaders of our industry. We need to go back to an engineer- and ideas-led industry, focused on solving problems and making lives better, not on building bullshit unsustainable monopolies atop a huge pile of money. Right now big tech is the embodiment of all of capitalism’s worst qualities.

    P.S. Apologies if my comment is a bit simplistic and vague; I didn’t want to write a 10-page rant but still wanted to say my 2c about the state of things.

  • BananaTrifleViolin@lemmy.world · +55/-2 · 3 days ago

    As others have said, gaming is thriving - AAA and bloated incumbents are not doing well, but the indie sector is thriving.

    VR is not on the verge of collapse, but it is growing slowly, as we still have not reached the right price point for a mobile, high-powered headset. Apple made a big play for the future of VR with its Apple Vision Pro, but that was not a short-term play; it was laying the groundwork for trying to control or shape a market that is still probably at least 5 if not 10 years away from something that will provide high-quality VR untethered from a PC.

    AI meanwhile is a bubble. We are not in an age of AI; we are in an age of algorithms - they are and will be useful, but they will not meet the hype or hyperbole being bandied about. Expect that market to pop, probably with spectacular damage to some companies.

    Other computing hardware is not really stagnating - we are going through a generational transition period. AMD is pushing Zen 5 and Intel its 14th gen, and all the chip makers are desperately trying to get on the AI bandwagon. People are not upgrading because they don’t see the need - there aren’t compelling software reasons to upgrade yet (AI is certainly not compelling consumers to buy new systems). They will emerge eventually.

    The lack of any landmark PC AAA games is likely holding back demand for consumer graphics cards, and we’re seeing similar issues with consoles. The games industry has certainly been here many times before. There is no Cyberpunk 2077 coming up - instead we’ve had flops like Star Wars Outlaws, or underperformers like Starfield. But look at the biggest game of last year - Baldur’s Gate 3 came from a small studio and was a megahit.

    I don’t see doom and gloom, just the usual ups and downs of the tech industry. We happen to be in a transition period, and also being distracted by the AI bubble and people realising it is a crock of shit. But technology continues to progress.

    • sugar_in_your_tea@sh.itjust.works
      13 points · 3 days ago

      VR

      Yeah, I think it’s ripe for an explosion, provided it gets more accessible. Right now, your options are:

      • pay out the nose for a great experience
      • buy into Meta’s ecosystem for a mediocre experience

      I’m unwilling to do either, so I’m sitting on the sidelines. If I can get a headset for <$500 that works well on my platform (Linux), I’ll get VR. In fact, I might buy 4 so I can play with my SO and kids. However, I’m not going to spend $2k just for myself. I’m guessing a lot of other people are the same way. If Microsoft or Sony makes VR accessible for console, we’ll probably see more interest on PC as well.

      People are not upgrading because they don’t see the need

      Exactly. I have a Ryzen 5600 and an RX 6650, and it basically plays anything I want to play. I also have a Steam Deck, and that’s still doing a great job. Yeah, I could upgrade things and get a little better everything, but I can play basically everything I care about (hint: not many recent AAA games in there) on reasonable settings on my 1440p display. My SO has basically the same setup, but with an RX 6700 XT.

      I’ll upgrade when either the hardware fails or I want to play a game that needs better hardware. But I don’t see that happening until the next round of consoles comes out.

      • realitista@lemm.ee
        5 points · 3 days ago

        Yeah, Sony was my hope here, but despite a few great experiences, they have dropped the ball overall. I’m bored of the cartoony Quest stuff, so I’ll probably not buy another headset for a good 5-10 years, until there’s something with a good library and something equivalent to a high-end PC experience today.

        • sugar_in_your_tea@sh.itjust.works
          3 points · 3 days ago

          Yup, but with good headsets costing way more than good monitors and generally needing even better GPUs, I’m just not interested. Yeah, the immersion is cool, but at current prices and with the current selection of games, the value proposition just isn’t there. Add to that the bulk, and it’ll probably stay on my wishlist for a while (then again, the Bigscreen VR headset looks cool, I just need a way to swap pads so my SO/kids can try it).

          So yeah, maybe in 5-10 years it’ll make more sense. It could also happen sooner if consoles really got behind it, because they’re great at bringing down entry costs.

          • realitista@lemm.ee
            3 points · 3 days ago

            Unfortunately Sony was our last hope for consoles, and they half-assed it. The very last hope is that Flat2VR ports tens of AAA titles in rapid succession to PS5.

  • magic_lobster_party@fedia.io
    15 points · 3 days ago

    What’s happening is that support from VC money is drying up. Tech companies have for a long time survived on the promise that they will eventually be much more profitable in the future. It doesn’t matter if it’s not profitable today. They will be in the future.

    Now we’re in a period where there’s more pressure on tech companies to be profitable today. That’s why they’re resorting to such anti-consumer behaviors. They want to make more with less.

    I’m not sure if there’s a bubble bursting. It could just be a plateau.

    • XIIIesq@lemmy.world
      7 up / 1 down · 3 days ago

      I agree. Smartphones, for example, have hardly changed at all over the last ten years, but you don’t see Apple and Samsung going out of business.

      • AA5B@lemmy.world
        3 points · 2 days ago

        I understand you don’t appreciate where we’ve come from and how fast, and can’t see the year-to-year changes, but the iPhone is just a little over ten years old. Do you really not see huge changes between an early iPhone and today’s?

        • XIIIesq@lemmy.world
          2 points · 2 days ago (edited)

          On the contrary, I absolutely appreciate it. I was about 15 when mobile phones first became a thing that everyone owned, so I’ve lived through the entire progression, from when they were something only a well-to-do businessman would have all the way through to today. The first iPhone was 2007, 17 years ago, btw.

          When mobile phones became popular, each new generation of phones saw HUGE improvements and innovation. However, the last ten years have pretty much just been slight improvements to screen/camera/memory/CPU. Form-wise and functionally, they’re very similar to the phones of ten years ago.

          I understand that some technophiles will always be able to justify why the new iPhone is worth £1600 and if that’s what they want to spend their money on then good for them, but I personally think that they are kidding themselves. Today you can get a brilliant phone for £300 or even less.

          • AA5B@lemmy.world
            2 points · 2 days ago (edited)

            I’d never justify the urge to spend ridiculous money updating every year to the latest and greatest, but people tend to under-appreciate the massive improvements that accumulate from incremental ones.

            The OLED screen on my iPhone X was revolutionary (and I’m sure Android had it first), as just one example, and now most phones have one. Personally I find ultra-wideband and “Find My” very innovative and well implemented. Or if that’s too small a change, how about the entire revolution of Apple designing its own SoC for every new model? There’s emergency satellite texting, fall/crash detection; even Apple mostly solving phone theft is innovative (even if you don’t like their approach).

            When we see steady improvements, humans tend to under-appreciate how it adds up

            • XIIIesq@lemmy.world
              1 point · 2 days ago (edited)

              I’m not going to argue that there has been no progress, just that it’s not on the same scale.

              Look at the difference between phones from 2004 to 2014, then from 2014 to 2024, and surely you’d have to agree. We’re looking at huge leaps in tech and innovation vs much smaller incremental improvements.

              And I’d once again like to state that this is not a complaint, just a point of view showing that astonishing amounts of technological innovation are not necessarily required to keep companies in business.

      • barsoap@lemm.ee
        1 point · 2 days ago

        And it would be so easy to make a big splash in the market by having a phone where the camera doesn’t protrude out of the back.

        • XIIIesq@lemmy.world
          1 point · 2 days ago (edited)

          To be fair, some phones already have that, but they have much lower-spec cameras/lenses, so it’s currently a trade-off.

          If a flagship phone were to find a way to implement a flush, top-spec camera, it would still only be an incremental improvement rather than a great new technology or a substantial innovation.

          • Buttflapper@lemmy.worldOP
            1 point · 2 days ago

            Damn, that’s wild. Any business with such drastic swings of profit and loss cannot possibly be sustainable. I can’t see how it could be. Look at the automobile giants in the USA: all it took was one major economic event to bankrupt them, and they got bailed out, which should’ve never happened. It’s bullshit.

            • barsoap@lemm.ee
              1 point · 2 days ago

              Memory chips have had an utterly fickle market ever since there have been memory chips; companies in that business are still in it because they learned how to deal with the swings. If Micron can survive (and they will), then so will Samsung, whose memory-chip business has the whole conglomerate to fall back on.

            • XIIIesq@lemmy.world
              1 point · 2 days ago (edited)

              Yeah, I’m not for bailing out companies that are “too big to fail”; I see it as socialism for the rich and capitalism for the poor, but that’s a separate debate.

              Tech stocks were an interesting case, as they bloated far beyond their actual value during COVID; what happened in 2023 was probably somewhat of a renormalization, and now they’re back to business as usual. There will always be peaks and valleys, but I’d be very surprised to see tech stocks fail in the long term.

  • j4k3@lemmy.world
    37 points · 3 days ago

    Mainstream is about to collapse. The exploitation nonsense is faltering. Open source is emerging as the only legitimate player.

    Nvidia is just playing it conservative because it was massively overvalued by the market. The GPU’s use for AI is a stopgap hack until hardware can be developed from scratch. The real life cycle of hardware is 10 years from initial idea to first consumer availability.

    The issue with the CPU in AI is quite simple. It will be solved in a future iteration, and this means the GPU will get relegated back to graphics, or it might even become redundant entirely. Once upon a time the CPU needed a math coprocessor to handle floating-point precision. That experiment failed; it proved that a general monolithic solution is far more successful. No data center operator wants two types of processors for dedicated workloads when one type can accomplish nearly the same task.

    The CPU must be restructured for a wider-bandwidth memory cache. This will likely require slower thread speeds overall, but it is the most likely solution in the long term. Solving this issue is likely to be accompanied by more threading parallelism, and therefore has the potential to render the GPU redundant in favor of a broader range of CPU scaling.

    Human persistence of vision is not capable of matching the higher speeds that are ultimately only marketing. The hardware will likely never support this stuff, because no billionaire is putting up the funding to back the marketing with tangible hardware investments. … IMO.

    Neo-feudalism is well worth abandoning. Most of us are entirely uninterested in this business model. I have zero faith in the present market. I have AAA-capable hardware for AI. I play and mod open source games. I could easily be a customer in this space, but there are no game manufacturers. I do not make compromises in ownership. If I buy a product, my terms of purchase are full ownership with no strings attached whatsoever. I don’t care about what everyone else does. I am not for sale, and I will not sell myself for anyone’s legalese nonsense or pay ownership costs to rent from some neo-feudal overlord.

    • Chocrates@lemmy.world
      19 points · 3 days ago

      Mainstream is about to collapse. The exploitation nonsense is faltering. Open source is emerging as the only legitimate player.

      I’m a die-hard open source fan, but that still feels like a stretch. I remember 10 years ago we were theorizing that Windows would get out of the OS business and just be a shell over a Unix kernel, and that never made it anywhere.

      • Rob Bos@lemmy.ca
        4 points · 3 days ago

        I don’t think that is necessarily out of the running yet. OS development is expensive and low-profit; commoditization may be inevitable. Control of the shell and GUI, where they can push advertisements and shovelware and telemetry on you - that is profitable.

        So in 20 years, 50? I predict proprietary OSes will die out eventually, balance of probability.

        • Chocrates@lemmy.world
          2 points · 2 days ago

          I’m with you in the long term.

          I am curious what kernel is backing the computers on the stuff SpaceX is doing. I’ve never seen their consoles, but I am guessing we are closer to modern reusable hardware and software than we were before. When niche applications like that keep getting more diverse, I bet we will get more open specifications so everything can work together.
          But again, I am more pessimistic and think 50 years would be relatively early for something like that.

      • solomon42069@lemmy.world
        1 point · 2 days ago

        I think the games industry will start to use open source tools like Blender and Godot more and more. These options have really matured over the years and compete on features and productivity with commercial options.

        From a business POV - open source makes a lot of sense when you need a guarantee your investment won’t evaporate because a vendor has cancelled a feature or API your game uses. With open source, if you don’t like a path the upstream code is taking you can fork off and make your own!

        Part of the dynamic is also how people are inspired and learning skills. You can learn how to do very advanced stuff in Blender for free on Youtube - why would you pay some private college thousands of dollars to learn an expensive program like Maya to do the same thing?

      • rottingleaf@lemmy.world
        1 point · 3 days ago

        It remained in the OS business to the extent that is required for the malware business.

        Also, NT is not a bad OS (except for being closed, proprietary and probably messy by now). The Windows subsystem on top of it would suck just as badly if it ran on something Unix.

        • Chocrates@lemmy.world
          1 point · 2 days ago

          Yeah, I guess in my fantasy I was assuming that Windows would do a full rewrite and adopt the Unix ABI, but I know that wouldn’t happen.

          • rottingleaf@lemmy.world
            1 point · 2 days ago

            They have a few legacy things working in their favor. Hardware compatibility is one, but that seems to be a thing of the past now that people don’t care. Application compatibility is another, and that is with Windows, not with NT.

            And they don’t have to change the core parts, because NT is fine. Windows is not, it’s a heap of legacy, but it’s not realistically replaceable.

            Unless they develop a new subsystem from scratch, like Embrasures or Walls or Bars, and gradually deprecate Windows. That doesn’t seem very realistic either, but if they were still a software company and not a malware company, they’d probably start doing this sometime about now.

      • cakeistheanswer@lemmy.dbzer0.com
        1 point · 3 days ago

        That’s probably closer today than it was then. The added complication is that the client is probably not thin enough for them to return to the mainframe model, which would be vastly easier to monetize.

        Besides, we got WSL out of the bargain, so at least interop isn’t a reverse-engineering job. It’s, poetically, the reason Linux ended up killing the last few Windows Server shops I knew. Why bother running Windows Server just to run Apache under Linux? Why bother with Hyper-V when you can pull a whole Docker image?

        If the Fortune 500 execs are sold on Microsoft, it’s mostly as a complicated contractual absolution of cybersecurity blame.

    • tias@discuss.tchncs.de
      6 points · 3 days ago

      AI still needs a lot of parallelism but has lax latency requirements. That makes it ideal for a large expansion card instead of putting it directly on the CPU die.

      • j4k3@lemmy.world
        8 points · 3 days ago

        Multithreading is parallelism and is poised to scale by a similar factor; the primary issue is simply getting tensors in and out of the ALU. Good enough is the engineering game. Having massive chunks of silicon lying around unused is a much more serious problem. At present, the choke point is not the parallelism of the math but the L2-to-L1 bus width and cycle timing. The ALU can handle the issue. The AVX-512 instruction set is capable of loading 512-bit-wide words in a single instruction; the problem is just getting these in and out in larger volume.

        I speculate that the only reason this has not been done already is the marketability of single-thread speeds. Present thread speeds are insane, well into the radio realm of black-magic bearded-nude-virgins wizardry. I don’t think it is possible to make these bus widths wider and maintain the thread speeds, because it has too many LCR consequences. I mean, at around 5 GHz the concept of wire connections and gaps as insulators is a fallacy, when capacitive coupling can make connections across all small gaps.

        Personally, I think this is a problem that will take a whole new architectural solution. It is anyone’s game, unlike any other time since the late 1970s. It will likely be the beginning of the real RISC-V age and the death of x86. We are presently in the age of the 20+ thread CPU. If a redesign can make a 50-500 logical-core CPU that is slower in single-thread speed but capable of all workloads, I think it will dominate easily. Choosing the appropriate CPU model will become much more relevant.
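        A minimal Python sketch of that width-over-speed bet, where the worker count, the workload, and the use of threads are all illustrative assumptions: a reduction split across many independent workers gives the same answer as one fast serial pass, which is what a 50-500 core part would exploit in hardware.

```python
# Toy model of width-over-speed: split a reduction across many
# independent workers, the way a many-core CPU would spread tensor
# work. Worker count and workload are arbitrary choices.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # each "core" reduces its own slice independently
    return sum(x * x for x in chunk)

data = range(1_000_000)
n_workers = 8
# strided slices so every worker gets an equal share
chunks = [data[i::n_workers] for i in range(n_workers)]

with ThreadPoolExecutor(max_workers=n_workers) as pool:
    parallel = sum(pool.map(partial_sum, chunks))

serial = sum(x * x for x in data)
# same answer whether one fast worker or many slow ones did the work
assert parallel == serial
print(parallel)
```

        (CPython threads share one interpreter lock, so this shows the decomposition rather than a real speedup; processes or genuinely wide hardware provide the actual parallelism.)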

    • sunzu2@thebrainbin.org
      6 up / 3 down · 3 days ago

      I do not make compromises in ownership.

      preach!

      At the end of the day, though, proper change will only come once a critical mass aligns on these issues, along with a few others.

      The political process is too captured for peasants to effect any change; we have more power voting with our money as customers, at least for now.

  • tibi@lemmy.world
    10 points · 3 days ago

    Also, the movie industry is struggling for many reasons: movies are getting too expensive, the safe formulas big studios relied on aren’t working anymore, and customer habits are changing, with people going to movie theaters less.

    At the same time, just like with video games, the indie world is in a golden age. You can get amazing cameras and equipment for quite a small budget. What free software like Blender can achieve is amazing. And learning is easier than ever, there are so many free educational resources online.

    • linearchaos@lemmy.world
      4 points · 2 days ago

      The entire entertainment industry is floundering. With wages lagging inflation in many sectors and people paying significantly more to eat, they’re going to cut back on streaming services and on going out to the movies. We’re right at a crossroads where the only thing that makes sense is to give people a little more value for the money; instead, the industry is going to pull every fast trick it can to make more from advertising and gambling.

      • HobbitFoot @thelemmy.club
        2 points · 2 days ago

        Or you had several companies trying to start their own streaming services from scratch, thinking they needed a ton of new shows to fill them. Disney+ could have easily gotten away with archived Disney Channel shows, all the animated Disney cartoons, the old Star Wars & Marvel movies, and the Simpsons. It didn’t need a lot of the new shows, no matter how cool they looked.

    • Buttflapper@lemmy.worldOP
      5 up / 1 down · 2 days ago

      I wouldn’t say the movie industry is struggling; I would say that people who work for a living are struggling. Actors are still getting paid huge sums of money, and so are directors and producers. They are getting their pound of flesh one way or another. They are just not producing anything that people want to watch. For example, all this post-Infinity War Marvel bullshit: no one wants to see that. No one cares about Marvel or Disney anything right now; it’s low-quality drivel. But Beetlejuice, Barbie, Oppenheimer… these are proof that people do still want to see movies; the studios just don’t want to produce anything meaningful.

      The people struggling that I’m talking about, however, are in the supporting roles: people doing the filming, set dressing, makeup, special effects. Lots of these lower-level supporting roles get paid almost nothing compared to their cost of living in California, while some of the main actors can get tens of millions.

      • tibi@lemmy.world
        3 points · 2 days ago

        Just like AAA game studios, movie studios don’t want to take risks, so they go with productions they consider “safe”: aim for the lowest common denominator, play into nostalgia, don’t upset anyone by touching subjects like politics or religion. And you end up with the garbage they’re making right now.

      • CheeseNoodle@lemmy.world
        2 points · 2 days ago

        I’ve just been watching older movies. There’s this amazing sweet spot, when CGI had just become a thing, where the visual effects are passable but not so prevalent that the entire plot gets replaced with pointless explosions.

  • Resol van Lemmy@lemmy.world
    7 points · 2 days ago

    Guys, I’m actually getting nostalgic over the messy-but-still-kinda-fun 2010s. Everything was just so much more exciting back then, and if it was absolute garbage, it was still fun to make fun of it (cough cough 2013 Mac Pro, garbage quite literally).

    Yeah, it was no “sunshine and lollipops” timeline, but still, compared to the literal boring hell of the 2020s, it was LEAGUES better.

  • szczuroarturo@programming.dev
    6 points · 2 days ago

    OK, so first of all: while Nvidia is absolutely a scummy company, the reason they are able to be this scummy is that they do generally deliver unreasonable performance improvements (at an unreasonable price, though), and this time is unlikely to be any different; the 50xx series is expected to be monstrous as far as performance goes. So far they haven’t made the same mistake Intel did with CPUs.

    Second: you can’t collapse something that hasn’t risen. Virtual reality never gained enough traction for it to collapse. I personally blame PlayStation for this. If there is anyone who could make a difference, it would be them.

    Third: if that’s true, that’s actually fucked up. Although, to be fair, OpenAI is a very strange company, and a very closed one for something supposedly called “OpenAI”. Also, I don’t think going from non-profit to for-profit changes much, since it requires a thing they don’t have: profit.

    • sugar_in_your_tea@sh.itjust.works
      5 points · 2 days ago

      So far they didint do the same mistake intel did with cpus.

      Exactly. Think of where Intel would be if they hadn’t sat on their hands. AMD completely ate their lunch, and they’re scrambling to retain some of their core business while expanding into others. If they had kept their CPUs solid, they would be able to devote more resources to the GPU division and probably be eating AMD’s lunch and cutting a bit into Nvidia’s market share.