I know there is currently a massive PR campaign for a power grab to consolidate control over AI software. They want to control the means of generation. Only MozillAI can save us from King GhAIdorah!
Sorry I’m upsetting you. I know we’re entering an acceleration of technology at a time when our institutions globally are in an absolutely horrendous state. People on all sides are brainwashed as hell. The AI watchdogs are insane as well. What’s left but gallows humor? I do hold out some hope, though.
That might actually be the kind of thing where open-source AI could help, at least I hope: detecting bias, lies, or AI-powered filtering and sorting of content.
Honestly, most of what Cambridge Analytica did was blackmail, illegal spending, and collusion between campaigns that were legally required to be separate.
Much of the data processing/ML was intended as a smokescreen to distract from the big stuff that was known to work and had consequently been legislated against. The problem is that they were so incompetent that the distraction technique was also illegal.
Maybe the machine learning also worked, but it’s really not clear.
See, THIS is the criticism of AI I can actually empathize with. I might even agree with it somewhat.