


Lightweight backend service for grammar assistant app. Can serve as an inspiration for LLM token streaming with OpenAI SDK and FastAPI.

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.

The project requires Python and pip installed on your system. The required Python packages are listed in the requirements.txt file.
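A typical local setup might look like the following (a virtual environment is optional but recommended; exact steps may differ on your system):

```sh
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```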

To configure the application, especially the LLM prompts, copy the config.example.yaml file to config.yaml and fill in the required values.
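The service will then read its settings from that file at startup. As a rough illustration only, loading a YAML config in Python usually looks like the sketch below; the helper name and the PyYAML dependency are assumptions, not necessarily how this project does it:

```python
import yaml  # assumes PyYAML is available (e.g. via requirements.txt)

def load_config(path: str = "config.yaml") -> dict:
    # Read the YAML configuration (LLM prompts, model settings, etc.) into a dict.
    with open(path, "r", encoding="utf-8") as f:
        return yaml.safe_load(f)

config = load_config()
```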

The project is structured into several modules and services. For readers interested only in the LLM integration, the most interesting parts will be the modules that handle the OpenAI calls and token streaming; a rough sketch of that pattern follows.
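The sketch below shows the general shape of token streaming with the OpenAI SDK and FastAPI. It is not the project's actual code: the endpoint path, request model, model name, and system prompt are all assumptions made for illustration.

```python
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from openai import AsyncOpenAI
from pydantic import BaseModel

app = FastAPI()
client = AsyncOpenAI()  # picks up OPENAI_API_KEY from the environment

class CorrectionRequest(BaseModel):
    text: str

async def stream_tokens(text: str):
    # Request a streamed chat completion and yield token deltas as they arrive.
    stream = await client.chat.completions.create(
        model="gpt-3.5-turbo",  # hypothetical; the real model would come from config.yaml
        messages=[
            {"role": "system", "content": "You are a grammar assistant."},  # hypothetical prompt
            {"role": "user", "content": text},
        ],
        stream=True,
    )
    async for chunk in stream:
        if chunk.choices and chunk.choices[0].delta.content:
            yield chunk.choices[0].delta.content

@app.post("/correct")  # hypothetical route name
async def correct(req: CorrectionRequest):
    # StreamingResponse forwards tokens to the client as soon as they are generated.
    return StreamingResponse(stream_tokens(req.text), media_type="text/plain")
```

Run with an ASGI server such as `uvicorn` and the endpoint returns plain text token by token instead of waiting for the full completion.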
