By Mike Wehner
It’s the year 2097. There you are enjoying the 84th season of House of Cards on your Wi-Fi enabled neural implant when your significant other walks into the room. Sure, she isn’t human, but she sure looks like it, and she has that look in her eye that tells you that she wants to get a little freaky. You float to the bedroom on your hoverboard and are beginning to disrobe when — BOOM — she strangles you to death with her cold robot hands. Your sex robot was hacked and now you’re dead. Welcome to the future.
There’s been a lot of talk recently about the dangers of AI, with folks like Elon Musk sounding the alarm and suggesting the very real possibility that we may be on the brink of engineering our own demise. Cybersecurity researcher Dr. Nick Patterson of Deakin University in Australia has now jumped into the conversation, warning that it isn’t just artificially intelligent military systems or infrastructure that could pose a threat, but sex robots as well.
Speaking with the Daily Star, Patterson notes that hackers targeting robots designed for intimacy could put users at physical risk.
“Hackers can hack into a robot or a robotic device and have full control of the connections, arms, legs and other attached tools like in some cases knives or welding devices,” Patterson says. “Often these robots can be upwards of 200 pounds, and very strong. Once a robot is hacked, the hacker has full control and can issue instructions to the robot. The last thing you want is for a hacker to have control over one of these robots. Once hacked they could absolutely be used to perform physical actions for an advantageous scenario or to cause damage.”
At the moment, the technology behind sex cyborgs is rather primitive, but there’s no telling what the future could hold. If we reach a point where such robots are capable of lifelike movements — and their “brains” can be tweaked via software updates — it’s certainly not out of the realm of possibility that “death by sex robot” ends up on some unfortunate soul’s tombstone.