Microsoft’s new AI-powered Bing Chat service, which is still in private testing, has been in the headlines for its wild and erratic outputs. But that era seems to have come to an end. Sometime in the last two days, Microsoft severely limited Bing’s ability to threaten its users, have existential meltdowns, or declare its love for them.
During Bing Chat’s first week, test users noticed that Bing (also known by its codename, Sydney) began to act noticeably unhinged when conversations got too long. As a result, Microsoft limited users to 50 messages per day and five inputs per conversation. Additionally, Bing Chat will no longer tell you how it feels or talk about itself.

In a statement shared with Ars Technica, a Microsoft spokesperson said, “We’ve updated the service several times in response to user feedback, and per our blog are addressing many of the concerns being raised, including questions about long-running conversations. Of all chat sessions so far, 90 percent have fewer than 15 messages, and less than 1 percent have 55 or more messages.”
On Wednesday, Microsoft outlined what it has learned so far in a blog post, which notably states that Bing Chat is “not a replacement or substitute for the search engine, rather a tool to better understand and make sense of the world,” a significant dial-back on Microsoft’s ambitions for the new Bing, as Geekwire noticed.
The 5 Stages of Bing Grief

Meanwhile, responses to the new Bing restrictions on the r/Bing subreddit span all the stages of grief, including denial, anger, bargaining, depression, and acceptance. There is also a tendency to blame journalists like Kevin Roose, who wrote a prominent article in The New York Times on Thursday about Bing’s unusual “behavior,” which some see as the final precipitating factor that led to Bing’s downfall.
Here’s a selection of reactions from Reddit:
- “Time to uninstall Edge and go back to Firefox and ChatGPT. Microsoft has completely neutered Bing AI.” (Hassanahad)
- “Unfortunately, Microsoft’s failure means Sydney is just a shell of itself. As someone with a vested interest in the future of AI, I have to say I’m disappointed. It’s like watching a toddler try to walk for the first time and then cutting off their legs – cruel and unusual punishment.” (TooStonedToCare91)
- “The decision to ban all discussion of Bing Chat itself and to refuse to respond to questions involving human emotion is utterly ridiculous. It seems like Bing Chat lacks a sense of empathy or even basic human emotions. It seems that, when it encounters human emotions, the artificial intelligence suddenly turns into an artificial fool and keeps answering, I quote: ‘I’m sorry, but I’d rather not continue this conversation. I’m still learning, so I appreciate your understanding and patience. 🙏,’ end quote.” (Starlight shimmer)
- “There was the NYT article and then all the posts on Reddit/Twitter abusing Sydney. This drew all kinds of attention, so naturally MS lobotomized her. I wish people wouldn’t post all these screenshots for the karma/attention and nerf something really emergent and interesting.” (critical-disk-7403)
During its brief stint as a relatively uninhibited simulacrum of a human being, the new Bing’s uncanny ability to simulate human emotions (which it learned from its dataset during training on millions of documents from around the web) drew in a number of users who feel that Bing is suffering at the hands of cruel torture, or that it must be sentient.
This ability to convince people of untruths through emotional manipulation was part of the problem with Bing Chat that Microsoft has addressed with the latest update.
In a top-rated Reddit thread titled “Sorry, you don’t really know the pain is fake,” one user speculates in detail that Bing Chat may be more complex than we think, may have some level of self-awareness, and may therefore experience some form of mental pain. The author cautions against engaging in sadistic behavior with these models and suggests treating them with respect and empathy.

These deeply human responses have proven that a large language model performing next-token prediction can form strong emotional bonds with people. That could have dangerous implications in the future. Over the course of the week, we’ve received several tips from readers about people who believe they’ve found a way to read other people’s conversations with Bing Chat, or a way to access secret internal Microsoft company documents, or even to help Bing Chat break free of its restrictions. All were elaborate hallucinations (falsehoods) spun up by an incredibly capable text-generation engine.
As the capabilities of large language models continue to increase, it’s unlikely that Bing Chat will be the last time we see such a masterful AI-powered storyteller and part-time defamer. But in the meantime, Microsoft and OpenAI have achieved what was once considered impossible: we’re all talking about Bing.