It's hard to believe that Microsoft released the ChatGPT-enhanced Bing only a week ago.
A small set of testers were given early access to the updated Bing and Edge browsers, which are now integrated with OpenAI's conversational AI engine. Since then, the internet has been swamped with chats with the chatbot, ranging from it proclaiming its love for New York Times journalist Kevin Roose to staunchly insisting that the year is 2022 and refusing to back down. Tim Marcin's compilation of Bing's meltdowns is highly recommended.
When beta testers got their hands on the new Bing, they were determined to find flaws in its intelligence and identify its limitations. And boy, did they ever succeed. While this may appear to be a bad look for Microsoft, it is all part of the strategy. Giving a large language model as much exposure and experience as feasible is a vital part of its development. This lets developers feed in new input and data, improving the system over time, much like a mythical entity absorbing the might of its defeated adversaries.
In its blog post on Wednesday, Microsoft did not use those precise terms. But it did emphasize that Bing's hectic week of testing was completely planned. "The only way to improve a product like this, where the user experience is so different from anything anyone has seen before, is for people like you to use the product and do precisely what you all are doing," the Bing blog stated.
Still, the majority of the announcement was spent addressing Bing's bizarre conduct this week and proposing measures to remedy it. Microsoft came up with the following:
Enhancing searches that need accuracy and timeliness
Microsoft stated that supplying accurate citations and references has typically been successful. But the system still needs work when it comes to checking live sports scores, delivering information and data clearly, or, ahem, determining what year we're currently living in. Microsoft is quadrupling Bing's grounding data and is considering "providing a slider that allows you greater choice on the precision versus originality of the answer to tailor to your query."
Bing's conversation abilities are being fine-tuned.
This week's mayhem has mostly taken place in the conversation section. According to Bing, this is primarily attributable to two factors:
1. Extended conversation sessions
Chat sessions with more than 15 questions can confuse the model. It's unclear whether this is what sparks dark musings from its wicked alter-ego Sydney, but Microsoft promises to "provide a tool so you may more easily refresh the context or start from scratch."
2. Echoing the tone of the user
This could explain why Bing chat has become hostile when confronted with challenging topics. "At times, the model tries to answer or reflect in the tone in which it is being asked to provide responses, which can lead to a style we didn't want," according to the post. Microsoft is investigating a solution that will give the user "greater fine-tuned control."
Bug fixes and new features are being implemented.
Microsoft said it will continue to resolve bugs and technical difficulties while also considering new features based on user feedback. These could include things like booking flights or sending emails, as well as the ability to share great searches and answers.