I was Fooled by an Artificial Intelligence Assistant

It started with a simple email: “My assistant Amy can find a time that works next week.” Then Amy emailed me, said she’d be happy to find a time for us, and suggested a day and time. Amy was an AI personal assistant. It said so right in her signature: “Amy Ingram (AI) – Artificial intelligence for scheduling meetings.”

But I didn’t notice. I agreed to the first day and time she recommended and accepted the calendar invite. Then I got sick (stomach flu sick!) and needed to reschedule. I emailed Amy and told her I had the stomach flu and needed to cancel. She said “ok” and offered a time the very next day. I responded that I would need more time to recover — the following week might work. She offered a day the following week. She even followed up after I didn’t respond. I accepted and told her to send my apologies to my colleague. No response other than the calendar invite.

That’s when I realized I’d been fooled. She didn’t acknowledge I was sick, didn’t tell me to feel better, and kept trying to reschedule the meeting while I was still deathly ill. It was odd and kind of rude, but people can be thoughtless over email, so I didn’t think much of it. When I finally noticed that line in her signature, after the fifth email exchange, I was impressed and a bit embarrassed.

This incident was a perfect example of something I’ve been following closely: automated work. Robots get most of the attention in this area, but AI-enabled automated decision makers could take over duties in a wide range of fields, from nutrition, fitness coaching, and medicine through accounting and financial planning to reporting, law, and logistics.

Amy is just a foretaste of things to come. Check her out: https://x.ai/


Michael Vidikan, michael@futureinfocus.com

CEO, Future in Focus

We help our clients understand how emerging issues and technologies will impact their business with a 5-10 year time horizon. You’ll find some free content on our site, but premium subscribers (and consulting clients) get access to so much more.


Would You Consider Sex with a Robot to Be Adultery?

My wife and I recently watched Ex Machina, a film about a seductive female robot with artificial intelligence. After the movie I asked playfully, “Would it be cheating to have sex with a robot like that?”

Her answer was a resounding yes. In her opinion, the machine was designed to look and act authentically like a real person, so having sex with a robot capable of a romantic and intimate relationship would cross the line into adultery.

“Does it matter that it’s not actually a human?” Her answer was no, it wouldn’t matter. It would simply be taking the place of another woman. While this is a theoretical question right now, I suspect more people will be asking it as we get closer to a future where humanoids are not just movie magic. If a humanoid possesses the kind of emotional intelligence displayed in Ex Machina, it might be difficult for humans not to see it as a fellow human.

Advances in materials science and robotics are already allowing companies like Japan’s Toshiba Corp. to create humanoids (pictured left) that look and move like real people. This robot was on display at a department store in Tokyo and actually fooled some customers into thinking it was just a regular person.

After Ex Machina was released, Matt McMullen, the founder of a sex doll company, announced he would be developing AI to add an emotional and intellectual dimension to his dolls. He even floated the idea that a virtual reality headset like the Oculus Rift could be used to enhance the experience.

One possibility is that we’ll see a robotic sex industry take shape, similar to the one depicted in Steven Spielberg’s futuristic film A.I. Artificial Intelligence, in which Jude Law plays a robot gigolo. If something like that occurs, which societies will embrace it? Which will attach a cultural and social stigma? And will couples need to discuss such a thing before getting married?

What do you think?