They frame it as though it’s for user content, more likely it’s to train AI, but in fact it gives them the right to do almost anything they want - up to (but not including) stealing the content outright.
This will be an unpopular opinion here.
I’m not against AI, but the rules have to be in laws and regulations. First, AI can’t use copyrighted material without paying for it. Nor can it use material without asking each rights holder individually.
The second point is that AI can’t create copyrighted material. Whatever an AI creates is free of copyright, and everyone can use it.
Third, an AI can’t be a black box. It has to be comprehensible how it works and what the AI is doing. A solution would be to have source-available code.
Fourth, AI can’t violate laws, create and push misinformation, or produce material used for misinforming.
And, of course, anything created using AI has to be identified as such.
The money is in what the AI can do, the quality of the result, and the quality of the code. None of the other things are valuable.
Your third point is an active research topic; we can’t explain exactly what generative (and other) models do beyond their generic operation.
If we could explain it, it would just be another rules engine 😅
deleted by creator
1 & 2 are… #3 is impossible, though…
Are you kidding? #3 is the second most feasible one of that set, it’s just a matter of setting up Reproducible / Deterministic Builds.
If you can’t replicate a result with control of the software version + the art inputs + the randomness seed, then “something else is going on”.
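A minimal sketch of the claim above, in Python with a toy `generate` function standing in for a real model (the function name and shape are illustrative, not any actual API): if you pin the code, the inputs, and the randomness seed, two runs produce identical output.

```python
import random

def generate(seed: int, inputs: list[float]) -> list[float]:
    """Toy stand-in for a generative model: output depends only on
    the inputs and a seeded RNG, so it is fully deterministic."""
    rng = random.Random(seed)  # fixed seed -> same random stream every run
    return [x + rng.random() for x in inputs]

run_a = generate(seed=42, inputs=[1.0, 2.0, 3.0])
run_b = generate(seed=42, inputs=[1.0, 2.0, 3.0])
assert run_a == run_b  # same version + inputs + seed -> same result
```

Real training pipelines add complications this sketch ignores (GPU nondeterminism, parallelism, data-loading order), which is part of what the replies below push back on.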
deterministic builds?
the “builds” in ai are thousands of hours of supercomputers randomly mutating and evolving a gigantic neural network…
the inner workings of such are very much a black box.
to try to save that in a perfectly reproducible way is completely unreasonable, and simply will never happen.
you could require all of the art inputs to be documented and saved, but people would lie, and you’re talking about a very large amount of data being saved for however long… also not really reasonable…
and you also have to understand that there’s a lot of countries in the world, computers are all connected on the internet, and ai will just run in other countries. illegal systems would run in whatever country is dumb enough to try to put completely unreasonable and expensive extra requirements like that on it.
there’s a whole field of study trying to reverse engineer neural networks after they’re created… i.e. it’s a black box to the people that make it
deleted by creator
so, your first paragraph isn’t true. but i’ll point out that bitcoin is mined with ASIC chips entirely now, which only hash bitcoin transactions… they can’t compute anything else so it’s not really comparable…
second part i do agree with except for self-modifying… although that doesn’t seem too far away…
deleted by creator
what does that have to do with anything?
I imagine that if AI devs didn’t sneak around copying people’s works in bulk but instead asked for permission or paid for a license, artists wouldn’t hate it like they do now.
My gut feeling says that’s not entirely true. Generative AI has so many qualities that could make it offensive to so many people, I think we were going to see a pushback from artists regardless. The devs’ shitty training practices just happened to give the artists a particularly strong case for grievances.
Yeah, artists were fine with publishing companies doing this since the dawn of literacy, but this time it is completely different.