Large Language Models, Open API, View Models and the Backend for Frontend Pattern

Submitted by Style Pass, 2024-05-06 06:30:05

Of late, I've been involved in work to integrate APIs into LLM interactions, using Semantic Kernel. This post is something of a brain dump on the topic. Given how fast this space is moving, I expect what is written here to be out of date, possibly even before I hit publish. But nevertheless, I hope it's useful.

APIs are awesome. Imagine if LLMs could interact with APIs, allowing us to chat directly with our data. This is exactly what function calling provides: it lets us take an API and integrate it with our LLM. It's a powerful concept, but it's not without its challenges.
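To make the idea concrete, here is a minimal sketch of the function-calling loop. The response shape and the `get_order_status` function are hypothetical, but real providers return something very similar: a function name plus JSON-encoded arguments that our code must dispatch to a local function.

```python
import json

# Functions we expose to the model, keyed by name.
# get_order_status is a made-up example function.
TOOLS = {
    "get_order_status": lambda order_id: {"order_id": order_id, "status": "shipped"},
}

def dispatch(tool_call: dict) -> dict:
    """Route a model-requested function call to the matching local function."""
    func = TOOLS[tool_call["name"]]
    args = json.loads(tool_call["arguments"])  # arguments arrive as a JSON string
    return func(**args)

# Simulated model output asking us to call get_order_status.
model_request = {"name": "get_order_status", "arguments": '{"order_id": "A123"}'}
result = dispatch(model_request)
print(result["status"])  # → shipped
```

The result would then be fed back to the model as a tool message, so it can phrase an answer for the user; that second half of the loop is omitted here.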

APIs are often documented with Swagger / OpenAPI. This is a great way to document APIs, but it's not always the best way for an LLM to interact with them. We'll go into more detail on the problems it can present in a moment, but first let's look at how we can use Semantic Kernel to integrate with APIs.
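One way to see the tension: a full OpenAPI document carries far more detail (schemas, headers, status codes) than a model needs to decide which operation to call. A sketch of condensing a spec down to one line per operation, using a made-up spec fragment:

```python
# A tiny, made-up OpenAPI fragment; real documents are much larger.
spec = {
    "paths": {
        "/orders/{id}": {
            "get": {"operationId": "GetOrder", "summary": "Fetch a single order by id."},
            "delete": {"operationId": "DeleteOrder", "summary": "Cancel an order."},
        }
    }
}

def summarise_operations(spec: dict) -> list:
    """Flatten each path/verb pair into one human-readable line for the LLM."""
    lines = []
    for path, verbs in spec["paths"].items():
        for verb, op in verbs.items():
            lines.append(f"{verb.upper()} {path}: {op['operationId']} - {op['summary']}")
    return lines

for line in summarise_operations(spec):
    print(line)
```

This is only a toy reduction, but it illustrates the general shape of the problem: deciding how much of the spec to surface to the model.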

It's completely possible to plug an LLM into an API described by an OpenAPI / Swagger spec using Semantic Kernel; the Semantic Kernel GitHub repository includes examples of exactly this.
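As a sketch of what that looks like with the Semantic Kernel Python SDK (the plugin name and spec URL below are placeholders, and exact module paths and parameter names may vary between SK versions):

```python
from semantic_kernel import Kernel
from semantic_kernel.connectors.openapi_plugin import OpenAPIFunctionExecutionParameters

kernel = Kernel()

# Import every operation described by the OpenAPI document as kernel
# functions that the LLM can invoke via function calling.
plugin = kernel.add_plugin_from_openapi(
    plugin_name="orders",  # hypothetical plugin name
    openapi_document_path="https://example.com/swagger.json",  # placeholder URL
    execution_settings=OpenAPIFunctionExecutionParameters(
        enable_payload_namespacing=True,
    ),
)
```

With a chat completion service registered on the kernel and automatic function calling enabled, the model can then choose and invoke these operations in response to user prompts.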
