Microsoft Adds Plugins for OpenAI GPT Models on Azure – ChatGPT, Bing, 365, and Third-Party Services
Microsoft today announced plugins for GPT-based large language models offered on Azure. The company says it is adopting the same plugin approach that OpenAI recently rolled out in beta for ChatGPT. According to Microsoft:
Microsoft is announcing that it will adopt the same open plugin standard that OpenAI introduced for ChatGPT, enabling interoperability across ChatGPT and the breadth of Microsoft’s copilot offerings.
Developers can now use one platform to build plugins that work across both consumer and business surfaces, including ChatGPT, Bing, Dynamics 365 Copilot (in preview) and Microsoft 365 Copilot (in preview). If users want to develop and use their own plugins with their AI app built on Azure OpenAI Service it will, by default, be interoperable with this same plugin standard. This means developers can build experiences that enable people to interact with their apps using the most natural user interface: the human language.
What Are Microsoft OpenAI Services GPT Model Plugins?
Plugins are services that are external to the main large language model (LLM) used by an enterprise. The result is a primary model that is supplemented by other models and services to better fulfill specific needs of users.
Plugins are a way to quickly transform a general-purpose LLM into a specialty solution with domain proficiency. This can be done without retraining or fine-tuning the foundation model. Instead, you add models and services that work with the foundation model to serve specific needs by providing access to web services, knowledge bases, and programs from internal sources or third parties. This will make LLMs far more flexible for enterprises.
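To make this concrete, the ChatGPT plugin standard that Microsoft is adopting describes each plugin with a small manifest file (ai-plugin.json) that points the model at an external API. Below is a minimal sketch of such a manifest, expressed as a Python dictionary; the plugin name, descriptions, and URL are illustrative placeholders, not a real service, and the exact field set is based on OpenAI's published plugin format at the time of writing.

```python
import json

# Hypothetical ai-plugin.json manifest for an illustrative "order lookup"
# plugin. The model reads description_for_model to decide when to call the
# plugin, then uses the OpenAPI spec at api.url to learn the endpoints.
manifest = {
    "schema_version": "v1",
    "name_for_human": "Order Lookup",          # shown to end users
    "name_for_model": "order_lookup",          # identifier the LLM uses
    "description_for_human": "Look up the status of customer orders.",
    "description_for_model": (
        "Use this plugin when the user asks about the status, shipping, "
        "or contents of an existing order."
    ),
    "auth": {"type": "none"},                  # no authentication, for brevity
    "api": {
        "type": "openapi",
        "url": "https://example.com/openapi.yaml",  # placeholder URL
    },
}

# Serialize to the JSON the plugin host would actually serve.
print(json.dumps(manifest, indent=2))
```

The key design point is that the foundation model itself is untouched: the manifest's natural-language descriptions tell the model when the plugin is relevant, and the referenced OpenAPI spec tells it how to call the service.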
However, if the models are consumer-facing services, there may be some important limitations as outlined below.
See How Plugins Work on ChatGPT
Since Microsoft is following OpenAI’s ChatGPT plugin approach, you may want to review how that model works. Below are two videos that demonstrate the plugin model and how it differs from typical ChatGPT usage.
OpenAI’s approach to ChatGPT plugins is interesting and functional, but it is not ideal from a UX perspective. The user must choose the plugin they want to use before entering a prompt. If you are using one plugin over and over again, that may be fine. If you don’t know which plugin will offer the best results for your query, it can take a lot of trial and error. And if you need to alternate between prompts that are best answered by different plugins, you have to continually switch back and forth.
With that said, this approach may be less of a problem for enterprise use cases where users are trained on their jobs and have context. It is problematic for consumers who don’t always have those benefits.
Editor’s Note: This is a breaking story. We will update this article with more information after the Microsoft BUILD presentations today.