

Submitted by
Style Pass
2024-10-20 20:30:07

Copy the .env.template file to .env.local and specify which model (local or external) you want to use by setting the NEXT_PUBLIC_USE_LOCAL_MODELS environment variable to true or false.
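Concretely, after copying the template, the relevant line in .env.local would look like this (any other keys in the template are omitted here):

```
# .env.local — "true" selects a local Ollama-served model,
# "false" selects the external (OpenAI) model
NEXT_PUBLIC_USE_LOCAL_MODELS=true
```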

Inside the route.ts file, you must specify the model you are running with Ollama. By default, it uses the llama3.1 model for local models and the gpt-3.5-turbo model for external models.
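As a rough sketch (an assumption about the repository's structure, not its exact code), route.ts plausibly branches on the environment flag and falls back to the defaults named above:

```typescript
// Hypothetical helper mirroring what route.ts is described as doing:
// pick the model name from the NEXT_PUBLIC_USE_LOCAL_MODELS flag.
function pickModel(env: Record<string, string | undefined>): string {
  const useLocal = env.NEXT_PUBLIC_USE_LOCAL_MODELS === "true";
  // Defaults stated in the README: llama3.1 via Ollama, gpt-3.5-turbo externally.
  return useLocal ? "llama3.1" : "gpt-3.5-turbo";
}
```

To run a different local model, you would change the local default here to any model name that `ollama list` reports as installed.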

The prompts used to generate the mind map data are defined in the defaultLocalPrompt and defaultExternalPrompt variables in the prompts.ts file.
