Microsoft’s Bing Chat is in a much better place than it was at its February launch, but it’s hard to overlook the issues the GPT-4-powered chatbot had in those early days. It told us it wanted to be human, after all, and often broke down into unhinged responses. And according to a new report, Microsoft was warned about these types of responses and decided to launch Bing Chat anyway.
According to the Wall Street Journal, OpenAI, the company behind ChatGPT and the GPT-4 model powering Bing Chat, warned Microsoft about the risks of integrating its early AI model into Bing Chat. Specifically, OpenAI flagged the potential for “inaccurate or bizarre” responses, a warning Microsoft seems to have ignored.
The report describes a unique tension between OpenAI and Microsoft, which have entered into something of an open-ended partnership over the last few years. OpenAI’s models are trained and run on Microsoft hardware (including thousands of Nvidia GPUs), and Microsoft leverages the company’s tech across Bing, Microsoft Office, and Windows itself. In early 2023, Microsoft even invested $10 billion in OpenAI, coming just short of purchasing the company outright.
Despite this, the report alleges that Microsoft employees have had issues with restricted access to OpenAI’s models, and that they were worried about ChatGPT overshadowing the AI-powered Bing Chat. To make matters worse, the Wall Street Journal reports that OpenAI and Microsoft both sell OpenAI’s technology, leading to situations where vendors are dealing with contacts at both companies.
The biggest issue, according to the report, is that Microsoft and OpenAI are trying to make money with a similar product. Because Microsoft backs OpenAI but doesn’t control it, the ChatGPT developer is free to form partnerships with other companies, some of which directly compete with Microsoft’s products.
Based on what we’ve seen, OpenAI’s reported warnings held water. Shortly after releasing Bing Chat, Microsoft limited the number of responses users could receive in a single session. Since then, it has slowly lifted that restriction as the GPT-4 model in Bing Chat has been refined. Reports suggest some Microsoft employees still reference “Sydney,” poking fun at the early days of Bing Chat (code-named Sydney) and its responses.