European Union lawmakers are set to give final approval to the 27-nation bloc’s artificial intelligence law Wednesday, putting the world-leading rules on track to take effect later this year.

Lawmakers in the European Parliament are poised to vote in favor of the Artificial Intelligence Act five years after the rules were first proposed. The AI Act is expected to serve as a global signpost for other governments grappling with how to regulate the fast-developing technology.

“The AI Act has nudged the future of AI in a human-centric direction, in a direction where humans are in control of the technology and where it — the technology — helps us leverage new discoveries, economic growth, societal progress and unlock human potential,” said Dragos Tudorache, a Romanian lawmaker who was a co-leader of the Parliament negotiations on the draft law.

Big tech companies generally have supported the need to regulate AI while lobbying to ensure any rules work in their favor. OpenAI CEO Sam Altman caused a minor stir last year when he suggested the ChatGPT maker could pull out of Europe if it can’t comply with the AI Act — before backtracking to say there were no plans to leave.

  • General_Effort@lemmy.world

    At the very least, there is an obligation to provide a summary of what copyrighted materials were used for training. This is so that copyright owners can go after people who use “warez” for training. I can’t guarantee that that is all.

    Many people, hobbyists or companies, won’t bother with the extra work (or use “warez”). Those models then won’t be in compliance with EU regulations. That may be a problem for people who want to use them “officially”, say students or independent devs. I would not care to guess how bad it would be. Maybe it won’t be enforced.

    • JustinA

      Humans are required to respect copyright, too. Why should generative AI be given a legal advantage over human labor?

      • General_Effort@lemmy.world

        I don’t think we are on the same page here. I don’t know what you are trying to say. An AI model is a software tool. It is used by humans. Humans have to follow the laws.

        Humans who create an open-source AI model have to do extra labor that people who create, e.g., a text editor do not have to do.

        This may help some capital owners extract more money, but it will not help the average European. The average European is where the money is extracted from. It’s also bad for European cultures: it makes it harder and more expensive to get “Europeanness” into genAI, just so that a few property owners can extract money.

        This is a bad law.