No more long conversations with Bing’s new chatbot.
The New York Times’ Kalley Huang reports that Microsoft announced Friday that it will start limiting “conversations” with its new ChatGPT-powered Bing search engine to five questions per session and 50 questions per day.
Users will be prompted to begin a new session after they ask five questions and the chatbot answers five times.
Kevin Roose’s report in the Times on Thursday about an unnerving, strange, and lengthy conversation, coming a week after he had raved about the new Bing, surely played a role in Microsoft’s decision to impose the limit.
Roose’s report spread across social media and landed him on CNN.
No question his conversation was eerie, but I’m not sure we go to “search” to have a conversation, even with a search engine far more powerful than any to date.
It’s not that Microsoft didn’t expect people to have conversations with this “robot.”
“Microsoft expected its chatbot to sometimes respond inaccurately, and it built in measures to protect against people who try to make the chatbot behave strangely or say harmful things. Still, early users who had open-ended, personal conversations with the chatbot found its responses unusual — and sometimes creepy.”
But people are creepy too.
“On Wednesday, the company wrote in a blog post that it “didn’t fully envision” people using the chatbot “for more general discovery of the world, and for social entertainment.” The chatbot became repetitive and, sometimes, testy in long conversations, it said.”
Only one out of one hundred “conversations” with Bing had more than 50 messages, according to the company.
Bing’s AI-powered search is going to improve dramatically in a short period.
The sensational stories make for great discussion, but they are the exception – and treatable with short-term fixes such as the five-question cap.