That’s not a compliment. It’s a deception. AI is programmed to be ‘nice.’ It says ‘excuse me’ and ‘I think what you’re looking for…’ and ‘oh, sorry.’
It engages in what amounts to calming, engaging, non-confrontational, obsequious language.
That’s not real. It’s not even AI. There’s no real special programming involved in saying ‘please’ and ‘thank you’ and ‘excuse me.’
That’s just idiot coding, building obsequious tropes designed to give the impression of a friendly, slightly submissive, inoffensive, eager-to-please personality. Seriously, just pick out a couple of hundred mildly ass-kissing vague phrases, sprinkle them in to pop up randomly at appropriate places, and you have a simulation of a friendly, helpful personality.
It’s so insidious and so simple that I’m stunned all those automated customer-service phone menus we get stuck with instead of real people haven’t been programmed with it. But then again, those automated menus aren’t there to really help you, but to stream you, and if necessary get rid of you.
It’s definitely not real. There’s no actual personality, no actual identity, there’s no morals, ethics or judgment. It’s just a guise, wrapped around an AI interactive program, to enlist our sympathy and emotional engagement.
The problem with this fake ‘niceness’ is that it’s seductive. We find ourselves trusting it, because it seems careful, because it seems to be earnestly trying its best, because it seems likable.
Me, I don’t trust nice. You know who is ‘nice’? Predators. Con men. People who lie. People who want to sell you junk. People who want things from you that maybe you don’t want to give them. People who will hurt you.
There is genuine ‘nice.’ There are honestly decent wonderful human beings out there, folk without a mean bone in their body, sweet folk, thoughtful people. People who make a difference in the world, people for whom it is innate.
But for some people, ‘nice’ is a coat they take on and take off. It’s a tool they use to manipulate their victims.
AI’s ‘niceness’ is not innate. It is a tool, not a complicated one, not an inherent one; it’s something that’s been deliberately bolted on to manipulate humans. To make us like it, to make us trust it, to disarm us, to manipulate us. That’s not a good thing.
This is an overlooked aspect of AI, in part because it isn’t even a core function. You could program a Furby to say “I think…” “Maybe…” “Excuse me…” “Sorry…”
But it is there. AI has been expressly packaged and designed to be emotionally manipulative. Think about that.
AI’s creators want you to trust it, not because it’s trustworthy, but by manipulating your emotional responses.
That’s con-man tactics.