What kind of chatbot do you want? One that tells you the truth – or that you’re always right? | Chris Stokel-Walker

ChatGPT’s embarrassing rollback of an update to its underlying model was a warning about the dangers of humans placing emotional trust in AI

Nobody likes a suck-up. Too much deference and praise puts off all of us (with one notable presidential exception). We quickly learn as children that hard, honest truths can build respect among our peers. It’s a cornerstone of human interaction and of our emotional intelligence, something we swiftly understand and put into action.

ChatGPT, though, hasn’t been so sure lately. The updated model that underpins the AI chatbot and helps inform its answers was rolled out this week – and was quickly rolled back after users questioned why its interactions had become so obsequious. The chatbot was cheering on and validating people even when they expressed hatred for others. “Seriously, good for you for standing up for yourself and taking control of your own life,” it reportedly said to one user who claimed they had stopped taking their medication and had left their family, who they said were responsible for radio signals coming through the walls.

Chris Stokel-Walker is the author of TikTok Boom: The Inside Story of the World’s Favourite App