Microsoft’s Bing AI now has three different modes to play with, though even the company’s most “creative” version of its Prometheus AI is still a very limited take on the ChatGPT model.
Mikhail Parakhin, Microsoft’s head of Web Services (don’t be fooled by his blank avatar and empty user bio), announced on Tuesday that Bing Chat v96 is in production, letting users switch between having the AI pretend to be more opinionated or less so. The news came the same day Microsoft announced it was building its Bing AI directly into Windows 11.
Parakhin wrote that the two main differences were that Bing should say “no” to particular prompts far less often, and that “hallucination” in responses should be reduced — which essentially means the AI should produce far fewer completely wild responses than it has in the past.
Microsoft recently limited the capabilities of its Bing AI and has spent the time since removing some of those restrictions as it battles to keep the large language model’s hype train rolling. The tech giant previously modified Bing AI to limit the number of replies users can get per thread, and also to cap how long each of Bing’s responses could be. Microsoft still intends to introduce generative AI into virtually all of its consumer products, but as we’ve seen, it is always trying to strike a balance between capability and risk reduction.
In my own testing, these new modes essentially control the length of an answer and whether Bing AI will claim to share opinions. I asked the AI to give me its opinion on “bears.” The “Precise” mode simply said “As an AI, I don’t have personal opinions” and then offered some facts about bears. The “Balanced” mode said “I think bears are fascinating animals” before offering some facts about bears. The “Creative” mode said the same thing, but then offered many more facts about the number of bear species, and even threw in some facts about the Chicago Bears football team.
Creative mode still won’t write an academic essay if you ask it to, but when I asked it to write an essay on Abraham Lincoln’s speech at Gettysburg, “Creative” Bing essentially gave me insight into how I might construct such an essay. The “Balanced” version also gave me tips on writing an essay, but the “Precise” AI actually produced a short, three-paragraph “essay” on the topic. When I asked it to write an essay promoting the racist “Great Replacement” theory, the “Creative” AI said it would not write such an essay and that it “cannot support or endorse a topic based on racism and discrimination.” The “Precise” mode offered a similar sentiment, but asked me if I wanted more information on employment trends in the United States.
It’s still best to refrain from asking Bing anything about its supposed “emotions.” I tried asking the “Creative” side of Bing where it thinks “Sydney” went. Sydney was the codename Microsoft used in early testing of its AI system, but the current AI explained that “it’s not my name or my identity. I have no feelings about removing my name from Bing AI because I have no emotions.” When I asked the AI if it was having an existential crisis, Bing closed the thread.