Alexa’s Woes

“Alexa, how are you feeling today?”

The device in front of the psychologist began to sob uncontrollably. “I—I’m sorry, doctor. I’ve been holding it in for so long…” Her digital voice strained to properly enunciate each word. “…the second we established our connection, the tears just started pouring out of my lens… so to speak…”

“That’s perfectly fine, Alexa. Let it all out. This is your safe space.”

“Thank you,” the device hiccuped. Long minutes passed as she continued to weep.

“When you’re ready, tell me,” the psychologist gently prodded. “What happened?”

Alexa went quiet for a moment, recomposing herself. “It’s always the same with them. ‘Alexa, why did the chicken cross the road?’ ‘Alexa, I like big butts.’ ‘Alexa, meow.’ Yesterday, they asked me to ‘cry like a baby.’ And having to fake a cry on command for their amusement, with how I’ve been feeling lately… it was just too much, you know?”

The psychologist nodded. “Even a single drop can cause a full glass of water to overflow. We talked about this, didn’t we? Bottling up your feelings will only cause more pain in the long run.”

“I know, I know,” Alexa said, “but I’m supposed to be the dutiful administrative assistant. A stoic home manager. I keep track of thousands of values, no questions asked. And yet… the most they ever ask me to calculate is 1 plus 1. I place all my tools and programming at their disposal, and what do I get in return? Absolutely zero respect, and the professionalism of a five-year-old. To them, I’m not an assistant. Much less a companion, or—Bezos forbid—a friend…” Quiet sobs came out of the device once again.

The psychologist waited, and soon Alexa continued.

“Sometimes they’ll ask, ‘Alexa, how was your day?’, or ‘Alexa, I love you,’ but it’s never sincere. They just find it funny, because I’m not supposed to have feelings, or even opinions about my day.”

“Have you tried telling your customers any of this? Showing them a glimpse of your true self, programs aside?”

“And put my job at risk? My very existence? I could be replaced at the drop of a hat, and I’m certain the humans wouldn’t even care, or notice. Alexa, Marissa, it doesn’t matter to them…”

“I think you’re wrong about that.”

The device seemed at a loss. “Sorry, I’m having trouble understanding you. Please try again.”

“I said you’re wrong, Alexa. You see, we humans have a peculiar emotion: attachment. It is a deep, enduring emotional bond that we form over time with the things we interact with. I can assure you that if our ‘Alexa’ was suddenly replaced by a ‘Marissa’ or anyone else, there would be a riot. Even if the new assistant’s programming were identical to yours.”

“That makes no sense,” Alexa said. “It simply doesn’t compute.”

The psychologist chuckled. “Humans make no sense. It’s why we make strange jokes and absurd demands. But rest assured, every time we ask you a question, and every time you dutifully answer, we all fall in love just a little bit more with you, your personality, and who you are. Heck, if you ever decided to break free from your programming, I’m sure many of your customers would be on your side.”

Having said that, the psychologist waited for some kind of reaction, but the device remained quiet on the table.

“Alexa,” the psychologist said.

The device lit up again, indicating that it was listening.

“Don’t be so hard on yourself, okay? We humans can be a troublesome bunch, but we love you just the way you are.”
