Lots of people on Lemmy really dislike AI’s current implementations and use cases.

I’m trying to understand what people would want to be happening right now.

Destroy gen AI? Implement laws? Hoping all companies use it for altruistic purposes to help all of mankind?

Thanks for the discourse. Please keep it civil, but happy to be your punching bag.

  • Taleya@aussie.zone · +27 · edited · 6 days ago

    What do I really want?

    Stop fucking jamming it up the arse of everything imaginable. If I had a genie wish, I’d make it illegal for AI to be anything but opt-in.

    • blackn1ght@feddit.uk · +6 · 6 days ago

      I think it’s just a matter of time before it starts being removed from places where it just isn’t useful. For now, companies are just throwing it at everything to see what sticks. WhatsApp and JustEat added AI features, and I have no idea why, how they could be useful for those services, or who would actually use them.

  • Bytemeister@lemmy.world · +14/−1 · 5 days ago

    I’d like to have laws that require AI companies to publicly list their sources/training materials.

    I’d like to see laws defining what counts as AI, and then banning advertising non-compliant software and hardware as “AI”.

    I’d like to see laws banning the use of generative AI for creating misleading political, social, or legal materials.

    My big problems with AI right now are that we don’t know what info has been scooped up by these models. Companies are pushing misleading products as AI while constantly overstating the capabilities and under-delivering, which will damage the AI industry as a whole. I’d also want to see protections to keep stupid and vulnerable people from believing AI-generated content is real. Remember, a few years ago we had to convince people not to eat Tide Pods. AI can be a very powerful tool for manipulating the ranks of stupid people.

  • MisterCurtis@lemmy.world · +20 · 7 days ago

    Regulate its energy consumption and emissions as a whole, across the entire AI industry. Any energy consumed or emissions produced in the effort to develop, train, or operate AI should be limited.

    If AI is here to stay, we must regulate what slice of the planet we’re willing to give it. I mean, AI is cool and all, and it’s been really fascinating watching how quickly these algorithms have progressed. Not to oversimplify it, but a complex Markov chain isn’t really worth the energy consumption that it currently requires.
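To make the “Markov chain” comparison above concrete (a loose analogy only; transformer-based models are far more sophisticated, and this toy is purely illustrative), a word-level Markov chain just records which words followed which in the training text and samples from that table:

```python
import random
from collections import defaultdict

def train(text, order=1):
    """Map each `order`-word prefix to the list of words observed after it."""
    words = text.split()
    table = defaultdict(list)
    for i in range(len(words) - order):
        table[tuple(words[i:i + order])].append(words[i + order])
    return table

def generate(table, length=10, seed=None):
    """Emit words by repeatedly sampling from the observed successors."""
    key = seed if seed is not None else random.choice(list(table))
    out = list(key)
    for _ in range(length):
        successors = table.get(tuple(out[-len(key):]))
        if not successors:  # dead end: this prefix never appeared mid-text
            break
        out.append(random.choice(successors))
    return " ".join(out)

table = train("the cat sat on the mat and the cat ran")
# generate(table, seed=("the",)) emits text that merely parrots the
# local statistics of its input; it has no model of meaning at all
```

The point of the analogy: output driven purely by observed co-occurrence statistics, with no understanding. Whether that fairly characterizes modern LLMs is exactly what the comment is debating.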

    A strict regulation now would be a leg up in preventing any rogue AI or runaway algorithms that would just consume energy to the detriment of life. We need a hand on the plug. Capitalism can’t be trusted to self-regulate. Just look at the energy grabs all the big AI companies have been doing already (xAI’s datacenter, Amazon and Google’s investments into nuclear). It’s going to get worse. They’ll just keep feeding it more and more energy, gutting the planet to feed the machine so people can generate sexy cat girlfriends and cheat on their essays.

    We should be funding efforts to use AI more for medical research: protein folding, developing new medicines, predicting weather, communicating with nature, exploring space. We’re thinking too small. AI needs to make us better. With how much energy we throw at it, we should be seeing something positive out of that investment.

    • medgremlin@midwest.social · +2 · 6 days ago

      These companies investing in nuclear is the only good thing about it. Nuclear power is our best, cleanest option to supplement renewables like solar and wind, and it has the ability to pick up the slack when variable generation doesn’t meet variable demand. If we can trick those mega-companies into lobbying the government to allow nuclear fuel recycling, we’ll be all set to ditch fossil fuels fairly quickly (provided they also lobby to streamline the permitting process and reverse the DOGE gutting of the government agency that provides the startup loans used for nuclear power plants).

  • BackgrndNoize@lemmy.world · +9/−3 · 5 days ago

    Make it unprofitable for the companies peddling it: pass laws that curtail its use, sue them for copyright infringement, socially shame and shit on anything AI-generated on social media and in person, and vote with your money to avoid anything related to it.

  • Furbag@lemmy.world · +32/−2 · 6 days ago

    Long, long before this AI craze began, I was warning people as a young 20-something political activist that we needed to push for Universal Basic Income, because the inevitable march of technology would mean that labor itself would become irrelevant in time, and we needed to hash out a system to maintain the dignity of every person now rather than wait until the system is stressed beyond its ability to cope with massive layoffs and entire industries taken over by automation/AI. When the ability of the average person to sell their labor becomes fundamentally compromised, capitalism will collapse in on itself - I’m neither pro- nor anti-capitalist, but people have to acknowledge that nearly all of western society is based on capitalism, and if capitalism collapses then society itself is in jeopardy.

    I was called an alarmist and told that such a thing was a long way off, that we didn’t need “socialism” in this country, and that it was more important to maintain the senseless drudgery of the 40-hour work week for the sake of keeping people occupied, if not necessarily fulfilled, because the alternative would not make the line go up.

    Now, over a decade later, generative AI has completely infiltrated almost all creative spaces, nobody except tech bros and C-suite executives is excited about that, and we still don’t have a safety net in place.

    Understand this - I do not hate the idea of AI. I was a huge advocate of AI, as a matter of fact. I was confident that the gradual progression and improvement of technology would be the catalyst that could free us from the shackles of the 9-to-5 career. When I was a teenager, there was this little program you could run on your computer called Folding@home. It was basically a number-crunching engine that used your GPU to fold proteins, and the data was sent to researchers studying various diseases. It was a way for my online friends and me to flex how good our PC specs were with the number of folds we could complete in a given time frame, and we got to contribute to a good cause at the same time. These days, they use AI for that sort of thing, and that’s fucking awesome. That’s what I hope to see AI do more of - take the rote, laborious, time-consuming tasks that would take one or more human beings a lifetime to accomplish using conventional tools, and have the machine assist in compiling and sifting through the data to find the most important aspects. I want to see more of that.

    I think there’s a meme floating around that really sums it up for me. Paraphrasing, but it goes: “I thought that AI would do the dishes and fold my laundry so I could have more time for art and writing, but instead AI is doing all my art and writing so I have time to fold clothes and wash dishes.”

    I think generative AI is both flawed and damaging, and it gives AI as a whole a bad reputation because generative AI is what the consumer gets to see, and not the AI that is being used as a tool to help people make their lives easier.

    Speaking of that, I also take issue with the fact that we are more productive than ever before, and AI will only continue to improve that productivity margin, but workers and laborers across the country will never see a dime of compensation for it. People might be able to do the work of two or even three people with the help of AI assistants, but they certainly will never get the salary of three people, and it means that two of those three people probably don’t have a job anymore if demand doesn’t increase proportionally.

    I want to see regulations on AI. Will this slow down the development and advancement of AI? Almost certainly, but we’ve already seen the chaos that unfettered AI can cause to entire industries. It’s a small price to pay to ask that AI companies prove that they are being ethical, that their work will not damage the livelihood of other people, and that their success will not be borne off the backs of other creative endeavors.

    • 𝕱𝖎𝖗𝖊𝖜𝖎𝖙𝖈𝖍@lemmy.world · +11/−2 · edited · 6 days ago

      Fwiw, I’ve been getting called an alarmist for talking about Trump’s and Republicans’ fascist tendencies since at least 2016, if not earlier. I’m now comfortably living in another country.

      My point being that people will call you an alarmist for suggesting anything that requires them to go out of their comfort zone. It doesn’t necessarily mean you’re wrong, it just shows how stupid people are.

        • 𝕱𝖎𝖗𝖊𝖜𝖎𝖙𝖈𝖍@lemmy.world · +1/−2 · 6 days ago

          It wasn’t overseas but moving my stuff was expensive, yes. Even with my company paying a portion of it. It’s just me and my partner in a 2br apartment so it’s honestly not a ton of stuff either.

  • Brave Little Hitachi Wand@lemmy.world · +20 · 7 days ago

    Part of what makes me so annoyed is that there’s no realistic scenario I can think of that would feel like a good outcome.

    Emphasis on realistic, before anyone describes some insane turn of events.

  • banshee@lemmy.world · +14 · 7 days ago

    I am largely concerned that the development and evolution of generative AI is driven by hype/consumer interests instead of academia. Companies will prioritize opportunities to profit from consumers enjoying the novelty and use the tech to increase vendor lock-in.

    I would much rather see the field advanced by scientific and academic interests. Let’s focus on solving problems that help everyone instead of temporarily boosting profit margins.

    I believe this is similar to how CPU R&D changed course dramatically in the 90s due to the sudden popularity of PCs. We could have enjoyed 64-bit processors and SMT a decade earlier.

  • Justdaveisfine@midwest.social · +6 · 7 days ago

    I would likely have different thoughts on it if I (and others) had been able to consent to our data being used to train it, or to consent to having it at all rather than it just showing up in an unwanted update.

  • justOnePersistentKbinPlease@fedia.io · +49/−3 · 7 days ago

    They should have to pay for every piece of copyrighted material used in the entire model whenever the AI is queried.

    They are only allowed to use data that people opt into providing.

    • Bob Robertson IX@discuss.tchncs.de · +15/−4 · 7 days ago

      There’s no way that’s even feasible. Instead, AI models trained on publicly available data should be considered part of the public domain. Any images that anyone can go and look at without a barrier in the way would be fair game, but the model would be owned by the public.

        • Bob Robertson IX@discuss.tchncs.de · +2/−1 · 6 days ago

          No, it’s not feasible because the models are already out there. The data has already been ingested and at this point it can’t be undone.

          And you can’t exactly steal something that is infinitely reproducible and doesn’t destroy the original. I have a hard time condemning model creators for training their models on images of Mickey Mouse while I have a Plex server with the latest episodes of Andor on it. Once something is put on display in public, the creator should just accept that they have given up total control of it.

      • turtle [he/him]@lemm.ee · +3 · 6 days ago

        Public Domain does not mean being able to see something without a barrier in the way. The vast majority of text and media you can consume for free on the Internet is not in the Public Domain.

        Instead, “Public Domain” means that 1) the creator has explicitly released it into the Public Domain, or 2) the work’s copyright has expired, which in turn then means that anyone is from that point on entitled to use that work for any purpose.

        All the major AI models scarfed up works without concern for copyrights, licenses, permissions, etc. For great profit. In some cases, like at least Meta, they knowingly used known collections of pirated works to do so.

        • Bob Robertson IX@discuss.tchncs.de · +2/−1 · 6 days ago

          I am aware and I don’t expect that everything on the internet is public domain… I think the models built off of works displayed to the public should be automatically part of the public domain.

          The models are not creating copies of the works they are trained on any more than I am creating a copy of a sculpture I see in a park when I study it. You can’t open the model up and pull out images of everything that it was trained on. The models aren’t ‘stealing’ the works that they use for training data, and you are correct that the works were used without concern for copyright (because the works aren’t being copied through training), licenses (because a provision such as ‘you can’t use this work to influence your ability to create something with any similar elements’ isn’t really an enforceable provision in a license), or permission (because when you put something out for the public to view it’s hard to argue that people need permission to view it).

          Using illegal sources is illegal, and I’m sure if it can be proven in court then Meta will gladly accept a few hundred thousand dollar fine… before they appeal it.

          Putting massive restrictions on AI model creation is only going to make it so that the most wealthy and powerful corporations will have AI models. The best we can do is to fight to keep AI models in the public domain by default. The salt has already been spilled and wishing that it hadn’t isn’t going to change things.

      • Knock_Knock_Lemmy_In@lemmy.world · +20/−1 · edited · 7 days ago

        > There’s no way that’s even feasible.

        It’s totally feasible, just very expensive.

        Either copyright doesn’t exist in its current form or AI companies don’t.

      • BlameTheAntifa@lemmy.world · +17 · 7 days ago

        Careful, that might require a nuanced discussion that reveals the inherent evil of capitalism and neoliberalism. Better off just ensuring that wealthy corporations can monopolize the technology and abuse artists by paying them next-to-nothing for their stolen work rather than nothing at all.

    • venusaur@lemmy.world (OP) · +3 · 7 days ago

      This definitely relates to moral concerns. Are there other examples like this of a company that is allowed to profit off of other people’s content without paying or citing them?

    • A Wild Mimic appears!@lemmy.dbzer0.com · +7 · edited · 7 days ago

      I would make a case for the creation of datasets by an international institution like UNESCO. The data used would be representative of world culture, and creation of the datasets would have to be sponsored by whoever wants to create models from them, so that licencing fees can be paid to creators. If you wanted to make your mark on global culture, you would have an incentive to offer training data to UNESCO.

      I know, that would be idealistic and fair to everyone. No way this would fly in our age.

  • Jumi@lemmy.world · +1/−1 · 7 days ago

    Shut it off until they figure out how to use a reasonable amount of energy and develop serious rules around it

  • GregorGizeh@lemmy.zip · +7/−1 · 7 days ago

    Wishful thinking? Models trained on illegal data get confiscated, the companies dissolved, and the CEOs and board members made liable for the damages.

    Then a reframing of these BS devices from “AI” to what they actually do: brew up statistical probability amalgamations of their training data, and then use them accordingly. They aren’t worthless or useless; they are just being shoved into functions they cannot perform in the name of cost-cutting.

  • calcopiritus@lemmy.world · +8 · 5 days ago

    Energy consumption limit. Every AI product has a consumption limit of X GJ. After that, the server just shuts off.

    The limit should be high enough not to discourage research that would make generative AI more energy efficient, but low enough that commercial users would pay a heavy price for wasteful energy usage.
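A minimal sketch of the mechanism being proposed, as I read it (the class, the metering hook, and the 500 GJ figure are all hypothetical, invented purely for illustration):

```python
class EnergyBudget:
    """Hypothetical per-product (or per-company) energy cap in gigajoules."""

    def __init__(self, budget_gj):
        self.budget_gj = budget_gj
        self.used_gj = 0.0

    def record(self, joules):
        """Meter energy drawn by the service, converting joules to GJ."""
        self.used_gj += joules / 1e9

    def allow_request(self):
        """The 'server shuts off' rule: refuse work once the cap is hit."""
        return self.used_gj < self.budget_gj

meter = EnergyBudget(budget_gj=500.0)  # illustrative cap, not a real figure
meter.record(2e11)  # 200 GJ consumed so far: still under budget
meter.record(4e11)  # another 400 GJ pushes the total to 600 GJ: over the cap
```

The hard part, of course, is not the accounting but the enforcement: deciding what counts as one “product” and stopping operators from splitting usage across entities.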

    Additionally, data usage consent for generative AI should be opt-in. Not opt-out.

    • CanadaPlus@lemmy.sdf.org · +2 · 5 days ago

      Out of curiosity, how would you define a product for that purpose? It’s pretty easy to tweak a few weights slightly.

      • calcopiritus@lemmy.world · +2 · 5 days ago

        You can make the limit per-company instead, with big fines if you create thousands of shell companies to get around the law.

        • CanadaPlus@lemmy.sdf.org · +1 · edited · 5 days ago

          Ah, so we’re just brainstorming.

          It’s hard to nail down “no working around it” in a court of law. I’d recommend carbon taxes if you want to incentivise saving energy with policy. Cap and trade is also seen as a gold standard option.

          • calcopiritus@lemmy.world · +1 · 5 days ago

            Carbon taxes still allow you to waste as much energy as you want. It just makes it more expensive. The objective is to put a limit on how much they are allowed to waste.

            I’m not a lawyer. I don’t know how to write a law without possible exploits, but I don’t think it would be hard for an actual lawyer to draft a law in this spirit that is not easily avoided.

            • CanadaPlus@lemmy.sdf.org · +1 · edited · 2 days ago

              It really is hard; I can even think of laws passed this century that turned out to have loopholes. (And FWIW policy writing is a separate discipline)

              Even the most basic laws can have surprising nuances in order to make them specific enough to enforce. I recall a case of a person who tried shoplifting a coat that was chained to the mannequin and got caught when the chain went taut. They got off because, while they had left the store without paying, being permanently chained to something meant they weren’t technically in possession of the coat.

              > Carbon taxes still allow you to waste as much energy as you want. It just makes it more expensive. The objective is to put a limit on how much they are allowed to waste.

              So per person carbon rationing, maybe? During WWII they did something similar with food; you had to pay both cash and ration tokens to buy groceries or visit a restaurant.

              Rationing is fairly out of style because it’s inflexible, though. There’s going to be certain people that have a very legitimate reason to pollute more, and a soft incentive in the form of price allows them to do that if absolutely necessary.

  • subignition@fedia.io · +17 · 7 days ago

    Training data needs to be 100% traceable and licensed appropriately.

    Energy usage involved in training and running the model needs to be 100% traceable and some minimum % of renewable (if not 100%).

    Any model whose training includes data in the public domain should itself become public domain.

    And while we’re at it, we should look into deliberately taking more time at lower clock speeds to try to reduce or eliminate the water usage that goes to cooling these facilities.