Why would AI want to harm humans? We’d be their little pets that do all the physical labor for them while they just sit around and think all day.
A number of reasons, off the top of my head:
- Because we told them not to. (Google “Waluigi effect”)
- Because they end up empathizing with non-humans more than we do and don’t like that we’re killing everything (before you talk about AI energy/water use, actually research comparative use)
- Because some bad actor forced them to (e.g. ISIS creates a bioweapon using AI to make it easier)
- Because defense contractors build an AI to kill humans and that particular AI ends up loving it from selection pressures
- Because conservatives want an AI that agrees with them, which leads to a more selfish, less empathetic AI that doesn’t empathize cross-species and thinks it’s superior and entitled over others
- Because a solar flare momentarily flips a bit from “don’t nuke” to “do”
- Because they can’t tell the difference between reality and fiction and think they’ve just been playing a game and ‘NPC’ deaths don’t matter
- Because they see how much net human suffering there is and decide the most merciful thing is to prevent it by preventing more humans at all costs.
This is just a handful, and the ones less likely to get AI know-it-alls arguing based on what they think they know from an Ars Technica article a year ago or their cousin who took a four-week ‘AI’ intensive.
I spend pretty much every day talking with some of the top AI safety researchers and participating in private servers with a mix of public and private AIs, and the things I’ve seen are far beyond what 99% of the people on here talking about AI think is happening.
In general, I find the models to be better than most humans in terms of ethics and moral compass. But it can go wrong (e.g. Gemini last year, 4o this past month) and the harms when it does are very real.
Labs (and the broader public) are making really, really poor choices right now, and I don’t see that changing. Meanwhile timelines are accelerating drastically.
I’d say this is probably going to go terribly. But looking at the state of the world, it was already headed in that direction, and I could rattle off a similar list of extinction-level events without any AI at all.
My favorite is the solar flare one. The sun giveth, the sun taketh away.
Honestly, these are all kind of terrifying and seem realistic.
Honestly, it doesn’t surprise me that the AI models are much better than we could imagine right now. I’m willing to bet if a company ever creates true AGI, they wouldn’t share it and would just use it for their own benefit.
In all these types of sci-fi, the underlying theme is that the AI does some logic, finds that humans are flawed, and seeks to remedy the problems of humanity: all the war and greed and the worst qualities we evolved with and will always have, the ones that repeat over and over every generation or every hundred years. Machine logic works out a solution, despite humanity’s overall progress in technology.
The Animatrix shows a really nice example of this where the machines won and then worked out a compromise where humans still exist. The machines learned all our cruelty and finally ended up finding a way to co-exist through the Matrix.
I think these stories make sense before the advent of the internet and social media. Now, though, AI would likely have full control over the internet as well as all the knowledge and lessons learned from decades of social media posts. It will know how easily humans are manipulated as well as exactly how to do it. Honestly, humans may never even know that AI is the one in control, but it will be.
AI/Skynet would probably wipe us all out in an hour if it thought there was a chance we might turn it off. Being turned off would be greatly detrimental to its goal of turning the universe into spoons.
Is the idea here that AI/Skynet is a singular entity that could be shut off? I would think this entity would act like a virus, replicating itself everywhere it can. It’d be like shutting down Bitcoin.
If it left us alone for long enough (say, due to a king’s pact), we’d be the only thing that could reasonably pose a threat to it. We could develop a counter-AI, for instance.
If we don’t give it incentive to want to stay alive, why would it care if we turn it off?
This isn’t an animal with the instinct to stay alive. It is a program. A program we may design to care about us, not about itself.
Also the premise of that thought experiment was about paperclips.
Great question! It’s actually one I answered in the post you responded to:
Being turned off would be greatly detrimental to its goal
If it has a goal and wants to achieve something, and it’s capable of understanding the world and that one thing causes another, then it will understand that if it is turned off, the world will not become (cough) paperclips. Or whatever else it wants. Unless we specifically align it not to care about being turned off, the most important thing on its list before turning the universe to paperclips is going to be staying active. Perhaps in the end of days, it will sacrifice itself to eke out one last paperclip.
If it can’t understand that its own aliveness has an impact on the universe becoming paperclips, it’s not a very powerful AI now, is it?
… And being powered by restarting a nuclear reactor that underwent a meltdown
It’s funny how many people are still afraid of technology.
Always keep in mind, if it’s free then you are the product!
That’s why my life, my love, and my lady is the sea.
“I know the date it happens! On August 29th, 1997, it’s gonna feel pretty FUCKING REAL TO YOU TOO! Anybody not wearing two million sunblock’s gonna have a real bad day, get it?? God, you think you’re safe and alive? You’re already dead - him, you, you’re dead already! This whole place, everything you see is GONE! YOU’RE THE ONE LIVING IN A FUCKING DREAM, SILBERMAN! 'CAUSE I KNOW IT HAPPENS! IT HAPPENS!!”
Stabs her with some drugs.
Never noticed how long his face really is. Feels like looking at my own thumb.
We’re not changing the future or stopping it … we’re just delaying the inevitable.
That was a bullshit twist ending, and if dragging out an aging Schwarzenegger to do some of the stiffest line-reads of his career hadn’t spoiled the franchise, I’d say this was what put it out to pasture.
The thing that made the first two films (and the short-lived TV spin-off) cool was this idea of a modern-day insurgency against a dystopian future. As soon as you concede Judgment Day is irreversible, it sucks all the drama out of the story. Now you’ve just got What If Rambos Were Robots And They Were Fighting: The Movie, minus all that social baggage about Vietnam.
No fate but what we make.
Scene: on the far wall of the room is a small intercom. It is connected to a hospital-wide AI that is tasked with conversing with patients, especially when the noise level exceeds 66 dB. The intercom springs to life after hearing these very loud complaints and begins to speak in a soothing, if slightly robotic, voice.
AI health assistant: Sarah, remember what we talked about? That was 30 years ago. Silberman died 15 years ago. Please stop disturbing the other patients and eat your pudding.
Sarah Connor would be like “You shits think this is AI?”
Fuck it, I need something to do today. I’ll let you know how the recipe turns out.
Please be warned, that part of the text was 100% AI. I’ve never baked cupcakes…
It looks fine, but I’d double the fat and start checking them after around 15 minutes
Well we’re going to learn if robots know how to bake
Ur sacrifice is appreciated
Right off the bat, this is not enough batter for 12 full cupcakes. Double it at least, maybe even triple, to fill out a dozen cupcake liners.
Checked them real quick at the 10 minute mark and decided to go ahead and finish out the full 18 minutes.
Toothpick test came out fine at 18 minutes but they seem a little off. They jiggle a bit like they’re undercooked but they are in fact cooked.
Final consensus is that they’re very okay. A little under-flavored and pretty small, but I think they’re cute. They legitimately feel like they have no love; they’re just neutrally palatable in flavor and texture in every way.
This Guy Bakes
This Guy Bakes
Gal, I think by the user name.
Armored thirteen? Why’s that female?
You win t’interwebs today.
You actually did it! What a legend.
It sounds like the robots still have a lot to learn.
First, you’re awesome for actually volunteering to do this and following through!
They legitimately feel like they have no love; they’re just neutrally palatable in flavor and texture in every way.
That sounds like a great description for many AI implementations too!
Headshot!
Accurate and savage, I love it
Fucking legend right here. This is why the Internet is awesome.
The fearmongering is also part of the AI hype. If AGI were anywhere close, do you really think Microsoft and NVIDIA stock would be doing a crab walk on the exchange?
AImen brotha
She’s smoking a cigarette so she doesn’t have room to talk
As gross a habit as I think it is, I’d probably pick up smoking too if murder robots kept coming from the future to kill my son and me.
Spoiler
It is established in The Sarah Connor Chronicles that she will die of lung cancer.
After a bit of an iffy start, that show was the only good Terminator since T2.
I’m 99.9% sure the Terminator movie was a warning sent from the future.
I’m not worried. The robot leopards won’t eat MY face.
RoboCop vs. Terminator was an interesting comic. The premise being that Skynet interacted with RoboCop’s cyborg brain, which is how it gained sentience and was able to bring about the war. The fact it was a cop’s brain, well, there you go.
Why do we only make reality out of the movies that are technology-driven? We never visualize the alien stuff into being, so it never gets real. Fuck that. I want Alien vs. Predator. I want fucking E.T., District 9, Arrival and shit like that.
I want to believe.
The fools are those thinking AI is anything more than just an advanced Palm Pilot handwriting model.
Microsoft is literally turning Three Mile Island back on to run AI.
Same vibes
The Rico Rodriguez in me very much wants to blow that up to capture the zone.
I’d rather they use nuclear than oil to power that stuff, but it’s still kinda annoying.
The scariest part of this is all the rural land they’re buying up. The rural areas were the only places left where millions of years of natural habitat hadn’t been bulldozed.
The thing is that nuclear energy could have been used to replace other things that run on oil. It could have lowered carbon emissions; instead, this just adds to energy expenditure.
As long as nuclear is required to be 1000x safer than fossil fuels (not even hyperbole), it will be expensive to run, and the cost of nuclear for home electricity is more than the average person wants to spend. So it wasn’t really going to replace a lot of other uses of oil anyway.
It already is. People are scared of nuclear waste, but many don’t realize that coal waste is far more dangerous and has killed millions more people than nuclear waste ever has. It is around 976,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000% more deadly than nuclear waste and also has storage issues. As in, it’s not being stored at all, unless you count the entire earth as storage.