On Wednesday, Microsoft employee Mike Davidson announced that the company has introduced three distinct personality styles for its experimental AI-powered Bing Chat bot: Creative, Balanced, and Precise. Microsoft has been testing the feature with a limited set of users since February 24. Switching between the modes produces different results that shift the balance between accuracy and creativity.
Bing Chat is an AI-powered assistant based on an advanced large language model (LLM) developed by OpenAI. A key feature of Bing Chat is that it can search the web and incorporate the results into its responses.
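Microsoft has not published the internals of that search integration, but the general pattern, often called retrieval-augmented generation, is straightforward: fetch search results, then prepend them to the user's question so the model can ground its answer in them. Here is a minimal sketch of that pattern, with `web_search` and `llm_complete` as hypothetical stand-ins rather than any real Bing API:

```python
from typing import Callable

def answer_with_search(
    question: str,
    web_search: Callable[[str], list[str]],   # hypothetical: returns text snippets
    llm_complete: Callable[[str], str],       # hypothetical: calls the language model
    max_snippets: int = 3,
) -> str:
    """Ground a model's answer in web results (retrieval-augmented generation)."""
    snippets = web_search(question)[:max_snippets]
    context = "\n".join(f"[{i + 1}] {s}" for i, s in enumerate(snippets))
    prompt = (
        "Answer the question using the numbered web results below, "
        "and cite them by number.\n\n"
        f"Web results:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    return llm_complete(prompt)

# Toy stand-ins so the sketch runs end to end:
fake_search = lambda q: [f"Result about {q} #1", f"Result about {q} #2"]
fake_llm = lambda prompt: f"(model output for a {len(prompt)}-char prompt)"
print(answer_with_search("router reset", fake_search, fake_llm))
```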
Microsoft announced Bing Chat on February 7, and soon after going live, adversarial prompts regularly drove an early version of Bing Chat to simulated insanity, and users found the bot could be persuaded to threaten them. Not long after, Microsoft dramatically curtailed Bing Chat's outbursts by imposing strict limits on the length of conversations.
Since then, the company has been experimenting with ways to bring back some of Bing Chat's cheeky personality for those who want it, while also allowing other users to seek more precise answers. This led to the new three-choice conversation style UI.
[Image: An example of the Bing Chat "Creative" conversation style. Credit: Microsoft]
[Image: An example of the Bing Chat "Precise" conversation style. Credit: Microsoft]
[Image: An example of the Bing Chat "Balanced" conversation style. Credit: Microsoft]
In our experimentation with the three styles, we found that "Creative" mode produced shorter and more unconventional suggestions that were not always safe or practical. "Precise" mode erred on the side of caution, sometimes suggesting nothing if it saw no sure way to achieve a result. In the middle, "Balanced" mode often produced the longest answers, with the most detailed search results and website citations.
For large language models, unexpected inaccuracies (hallucinations) often increase with rising "creativity," which usually means the model deviates further from the information it learned from its training data. AI researchers often call this property "temperature," but Bing team members say there is more at work with the new conversation styles.
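To see what temperature does mechanically, here is a minimal, self-contained sketch of temperature-scaled token sampling as it is commonly implemented: raw logits are divided by the temperature before the softmax, so low values sharpen the distribution (more deterministic, "precise") and high values flatten it (more varied, "creative"). The toy logits are illustrative only:

```python
import numpy as np

def sample_token(logits: np.ndarray, temperature: float, rng: np.random.Generator) -> int:
    """Sample a token id from raw logits using temperature scaling."""
    scaled = logits / max(temperature, 1e-8)   # low T sharpens, high T flattens
    probs = np.exp(scaled - scaled.max())      # numerically stable softmax
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

rng = np.random.default_rng(0)
logits = np.array([2.0, 1.0, 0.5, 0.1])       # toy vocabulary of 4 tokens
print(sample_token(logits, temperature=0.2, rng=rng))  # almost always token 0
print(sample_token(logits, temperature=1.5, rng=rng))  # picks vary much more
```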
According to Microsoft employee Mikhail Parakhin, switching modes in Bing Chat changes fundamental aspects of the bot's behavior, including swapping between different AI models that have received additional training from human responses to their output. The different modes also use different initial prompts, meaning that Microsoft swaps out the personality-defining prompt, like the one revealed in the prompt injection attack we wrote about in February.
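Microsoft has not disclosed the exact mechanics, but conceptually each style can be thought of as a bundle of settings: an initial (system) prompt, a model variant, and a sampling configuration. The sketch below is purely hypothetical; the `StyleConfig` fields, prompt text, and model names are invented for illustration and do not reflect Microsoft's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class StyleConfig:
    system_prompt: str   # personality-defining initial prompt
    model: str           # which fine-tuned model variant to route to
    temperature: float   # sampling temperature for that style

# Hypothetical values; Microsoft's real prompts and models are not public.
STYLES = {
    "creative": StyleConfig("You are an imaginative assistant...", "chat-model-creative", 1.0),
    "balanced": StyleConfig("You are a helpful, well-rounded assistant...", "chat-model-balanced", 0.7),
    "precise":  StyleConfig("You answer concisely and only when confident...", "chat-model-precise", 0.2),
}

def build_request(style: str, user_message: str) -> dict:
    """Assemble a chat request whose behavior depends on the chosen style."""
    cfg = STYLES[style]
    return {
        "model": cfg.model,
        "temperature": cfg.temperature,
        "messages": [
            {"role": "system", "content": cfg.system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

print(build_request("precise", "How do I reset my router?"))
```

Framing the styles this way also matches Parakhin's description: changing the mode is not just a temperature knob but a swap of the whole bundle, including the underlying model.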
While Bing Chat is still available only to those who have signed up for a waitlist, Microsoft continues to refine Bing Chat and other AI-powered Bing search features as it prepares to roll them out more widely. Microsoft recently announced plans to bring the technology to Windows 11.