- Nick Clegg, former Meta executive and UK Deputy Prime Minister, has reiterated a familiar line when it comes to AI and artist consent.
- He said that any push for consent would “basically kill” the AI industry.
- Clegg added that the sheer volume of data that AI is trained on makes it “implausible” to ask for consent.
Well, the AI companies and investors should have understood that building an industry on doing something questionable was risky, and risks don't always work out.
If being denied consent is going to kill your industry, then maybe your industry deserved to die.
Fucking rapist mentality right there.
My thought exactly. If consent isn’t needed, what other actions do they deem justified without consent?
This is not an IP issue, this is about human rights.
oh noes
Look, these goddamn assholes have got in their head that they have a right to profit.
NOBODY HAS A RIGHT TO PROFIT.
You have a right to try to create a profit, and there are rules to that. You're gonna lose your billions in investment if you can't plagiarize content? Fuck you, your loss, and you shoulda fucking known better when the idea was presented to you.
Assholes
He admits it!
I have a proposition. Raid them with police and search their computers for stolen data like you would do with your citizens.
Using the same logic, it is “implausible” that we would not take money from those who have it and give it to the sheer volume of people who need it.
Oh. Suddenly it doesn’t work that way. Huh. Funny how that is.
It depends on how rich you are. CEOs have their own, reduced edition of the law.
Fuck Nick Clegg. Fuck that guy into the fucking sun.
Back in 2010 he managed to surf a wave of genuine optimism from young British voters who wanted something less shit, and found himself in a position where he could have brought about some genuine change for the better.
Instead that cunt hitched his wagon to the fucking Tories, who straight away announced an increase to university tuition fees. And who then went on to spend 15 years raping and pillaging the country like only fucking Tories can.
So yeah, fuck Nick Clegg.
Good, I think it should be killed.
AI or the music industry?
AI. Rich people trying to change the law to get richer via shady means is a huge no from me every time.
Yes
Kill the AI industry? Sweet. As an artist I do not consent.
So I can steal all their shit too, right? It would be “implausible” for me to ask first.
If abiding by the law destroys your business, then you are a criminal. Simple as.
But the law is largely the reverse: it only denies the use of copyrighted works in certain ways. Using things “without permission” forms the bedrock on which artistic expression and free speech are built.
AI training isn’t only for mega-corporations. Setting up barriers like these only benefits the ultra-wealthy and will end with corporations gaining a monopoly on a public technology by making it prohibitively expensive and cumbersome for regular folks. What the people writing this article want would mean the end of open access to competitive, corporate-independent tools and would jeopardize research, reviews, reverse engineering, and even indexing information. They want you to believe that analyzing things without permission somehow goes against copyright, when in reality, fair use is a part of copyright law, and the reason our discourse isn’t wholly controlled by mega-corporations and the rich.
I recommend reading this article by Kit Walsh and this one by Tory Noble, staff attorneys at the EFF; this one by Katherine Klosek, the director of information policy and federal relations at the Association of Research Libraries; and these two by Cory Doctorow.
They want you to believe that analyzing things without permission somehow goes against copyright, when in reality, fair use is a part of copyright law, and the reason our discourse isn’t wholly controlled by mega-corporations and the rich.
Ok, but is training an AI so it can plagiarize, often verbatim or with extreme visual accuracy, fair use? I see the first two articles argue that it is, but they don’t mention the many cases where the crawlers and scrapers ignored rules set up to tell them to piss off. That would certainly invalidate several cases of fair use.
Instead of charging for everything they scrape, the law should force them to release all their data and training sets for free. “But they spent money and time and resources!” So did everyone who created the stuff they’re using for their training, so they can fuck off.
The article by Tory also says these things:
This facilitates the creation of art that simply would not have existed and allows people to express themselves in ways they couldn’t without AI. (…) Generative AI has the power to democratize speech and content creation, much like the internet has.
I’d wager 99.9% of the art and content created by AI could go straight to the trashcan and nobody would miss it. Comparing AI to the internet is like comparing writing to doing drugs.
Ok, but is training an AI so it can plagiarize, often verbatim or with extreme visual accuracy, fair use? I see the first two articles argue that it is, but they don’t mention the many cases where the crawlers and scrapers ignored rules set up to tell them to piss off. That would certainly invalidate several cases of fair use.
You can plagiarize with a computer with copy & paste too. That doesn’t change the fact that computers have legitimate non-infringing use cases.
Instead of charging for everything they scrape, the law should force them to release all their data and training sets for free.
I agree
I’d wager 99.9% of the art and content created by AI could go straight to the trashcan and nobody would miss it. Comparing AI to the internet is like comparing writing to doing drugs.
But 99.9% of the internet is stuff that no one would miss. Things don’t have to have value to you to be worth having around. That trash could serve as inspiration for the 0.1%, or garner feedback that helps people improve.
I don’t really disagree with your other two points, but
You can plagiarize with a computer with copy & paste too. That doesn’t change the fact that computers have legitimate non-infringing use cases.
They sure do, of which that is not one. That’s de facto copyright infringement or plagiarism. Especially if you then turn around and sell that product.
The key point being made is that if you are committing de facto copyright infringement or plagiarism by creating a copy, it shouldn’t matter whether that copy was made through copy-paste, by re-compressing the same image, or by using an AI model. The product here is the copy-paste operation, the image editor, or the AI model, not the (copyrighted) image itself. You can still sell computers with copy-paste (despite some attempts from large copyright holders with DRM), and you can still sell image editors.
However, unlike copy-paste and the image editor, the AI model could memorize and emit training data without the input implying the copyrighted work. (Excluding the case where the image itself was provided, or a highly detailed description of the work; in that case it would clearly be the user who is at fault and intending this to happen.)
At the same time, it should be noted that exact replication of training data isn’t exactly desirable in any case, and online services for image generation could include an image similarity check against training data; many probably do this already.
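To make the suggestion concrete: such a check could, at its simplest, compare perceptual hashes of the generated image against hashes of training images. This is only a minimal sketch of the idea, with tiny grayscale images represented as lists of pixel rows and hypothetical function names; a real service would use far more robust methods (embeddings, full-resolution perceptual hashes) and an indexed lookup rather than pairwise comparison.

```python
# Sketch of an average-hash ("aHash") near-duplicate check, the kind of
# similarity test an image-generation service could run against training data.
# All names and thresholds here are illustrative, not from any real service.

def average_hash(pixels):
    """Hash a small grayscale image (list of rows of 0-255 ints) to a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # Each pixel contributes one bit: 1 if brighter than the mean, else 0.
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

def too_similar(img_a, img_b, max_distance=2):
    """Flag a pair as near-duplicates if their hashes differ in few bits."""
    return hamming(average_hash(img_a), average_hash(img_b)) <= max_distance

# A generated image nearly identical to a training image gets flagged;
# a structurally different one does not.
train = [[10, 200], [200, 10]]
near_copy = [[12, 198], [201, 11]]
different = [[200, 10], [10, 200]]
```

Because the hash is tolerant of small pixel-level changes, this catches slightly perturbed copies, not just byte-identical ones, which is the point of a memorization filter.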
The apparent main use for AI thus far is spam and scams, which is what I was thinking about when dismissing most content made with it. While the internet was already chock-full of that before AI, its availability is increasing those problems tenfold.
Yes, people use it for other things, like “art”, but most people using it for “art” are trying to make a quick buck ASAP before customers get too smart to fall for it. Writers already had a hard time getting by; now they have to deal with a never-ending deluge of AI books, plus the risk of a legally-distinct-enough copy of their work showing up the next day.
Put another way, the major use of AI thus far is “I want to make money without effort”.
It definitely seems that way depending on what media you choose to consume. You should try to balance the doomscrolling with actual research and open-source news.
I’m basing it mostly on personal and family experience. My mom often ends up watching AI-made videos (stuff that’s just an AI narrator over an AI image slideshow), my RPG group has poked fun at the amount of AI books that Amazon keeps suggesting to them, and anyone using Instagram will, sooner or later, see adverts of famous people endorsing bogus products or sites via the magic of AI.
So you don’t interact with AI stuff outside of that? Have you seen any cool research papers or messed with any local models recently? Getting a bit of experience with the stuff can help you better inform people and see through the more bogus headlines.
The fact that this is upvoted is so funny but unsurprising given the types who visit this site
Removed by mod
Yeah, anyone who thinks stealing content explicitly for financial gain is fair use needs their head checked.
Insanely weak they deleted this
deleted by creator
You can’t have a better law. Copyright laws are one-sided in favor of billion-dollar companies. They would never agree to give more power to small creators or (worse) open-source projects, who rely on such laws without making money.
Yes you can. Raise awareness, vote, contact representatives, organise and sign a large petition. That’s EU-only, though; if you’re in the US, use the 2nd Amendment as intended in order to get your democracy back.
deleted by creator
Let’s hope it does.
Oh, so it’d be OK for us to get movies, pictures, books, etc. without asking the rights owners too? GREAT.
If you’re giving me the choice of killing the AI industry or artists it doesn’t seem like a hard decision. Am I missing something?
A lot of AI fanboys secretly think that artists who rely on public funding to make a living deserve to be raped by gen AI companies.
Bet you’re not worth $100+ million. Then you’d get it.
If someone wants to make me worth 100 million I wouldn’t complain. Can’t guarantee I’ll understand though.
The bit you’re missing is that the choice isn’t between killing AI and killing the music industry; it’s between killing AI in the UK or pissing off IP holders somewhat. Do you think China gives a fuck whose IP they use in training models, or that they will stop if the UK passes a law letting artists opt out of having their work used as training data by default?
What are you talking about? This has nothing to do with UK policy decisions. The current UK government doesn’t have any interest in restricting AI usage; I don’t know where you’re getting that idea from.
Nick Clegg never really had much to do with UK politics; he was Deputy Prime Minister, but he wasn’t exactly in charge of anything, and he’s long since left politics entirely. His previous employment has no bearing on his current statements.
Because he’s speaking to a British newspaper about British policies. I’m assuming the second part, as I don’t subscribe to The Times so can’t read the article, but there are currently plans in the UK to introduce an opt-out framework for people to withdraw permission for training on their work, with pushback from big names that want to charge rent on their old works, so I assume that is the subject.
Even if he wasn’t talking about the UK at all (though I think it’s clear from context that he is), my larger point still stands: the choice isn’t between stopping AI and allowing AI, it’s between allowing AI companies to operate in your jurisdiction or AI being trained elsewhere, out of your control. There is no option to “stop this entirely” unless you can persuade at least the USA and China to sign up to it. Which they won’t.
Especially when you realize that AI is for more than music, literature, and other forms of illegal data processing. It can be used in a huge number of other ways. One way, for example, would be to replace our president with a combination of 4 magistrates and 1 AI: the Republicans get 2, the Democrats get 2. The AI gets to propose actions to take but has zero authority to do anything itself. Once a proposal has been made, the 4 people get to discuss the action and implement it. If the decision ends in a tie, the AI can ask the people to vote by phone and break the tie via the popular vote. And no more Electoral College; just use AI to pick the president based on the popular vote.
Nice, I want to be the company owning that “AI” algorithm!
The person who offered it probably didn’t notice that this is how IRL con artists too often work: by “just offering an idea”.
Doesn’t really matter, though, does it? As long as the weights are open-sourced, everyone can confirm that the AI is unbiased.
That model’s name? AI Gore.