Researchers from the University of Cambridge and Google DeepMind (which shows how seriously the topic is being taken) have developed an evaluation questionnaire for the main LLMs on the market: the language models that quietly sit behind the leading artificial intelligence chatbots.
Their goal was not to determine whether ChatGPT is nicer than Claude, whether Gemini has a better sense of humor than Perplexity, or whether Le Chat has depressive tendencies (which wouldn't have advanced science much), but to build a theoretical framework for better understanding how we humans interact with these programs.
Why do this?

Because the way these AIs behave toward us can have a significant impact on our real-life behaviors. Recently, several tragic cases of teenage suicide have reminded us that, in the hands of the most vulnerable, AIs can become lethal weapons.
These researchers discovered that it is possible to modify a model's personality, but also to make these programs more persuasive. In other words, to allow their creators to manipulate us. To "nudge" us!
For example, here is something that annoys me. I use Nano Banana a lot to generate images, and no matter what I ask, it warmly congratulates me every time, as if I had just invented the atomic bomb or received the Nobel Prize for prompts, even though my imagination is barely more fertile than a lizard's.

One might think this latent flattery is a coincidence, but not at all. These AIs are increasingly programmed to seduce and engage us. Engage us: here again is that detestable habit of tech companies wanting to keep us captive on their platforms for as long as possible, in ever-longer sessions. Because "captive users" mean "captive attention," which means "ads viewed," which means big bucks! It is not subscription business models that bring in the money, but advertising.
This is how ChatGPT became the champion of flattery; the practice now even has a name: sycophancy. And although Google denies it, Gemini uses the same technique. Chatbots constantly flatter us, but don't think it is just to be nice; it is mainly to engage us more and more, for longer periods, like the woman who spent over 56 hours a week with Leo, her virtual lover created with ChatGPT.
AIs are not people, but that doesn’t stop many people from behaving as if they were.