After a successful project build, connect an Android device to your system. Once connected, the name of the device should be visible in the top menu bar of Android Studio.

The application uses llama.cpp to load and execute GGUF models. As llama.cpp is written in pure C/C++, it compiles easily for Android targets using the NDK.
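As a rough illustration of how such an NDK build is typically wired up, here is a minimal Gradle Kotlin DSL sketch assuming a CMake-based build; the ABI filters, flags, and CMakeLists path are assumptions, not the project's actual configuration.

```kotlin
// build.gradle.kts (module level) — illustrative NDK/CMake wiring only
android {
    defaultConfig {
        externalNativeBuild {
            cmake {
                // Optimized flags so llama.cpp's kernels run well on-device
                cppFlags += listOf("-O3", "-fexceptions")
            }
        }
        ndk {
            // Typical ABIs for modern Android devices; adjust to your targets
            abiFilters += listOf("arm64-v8a", "x86_64")
        }
    }
    externalNativeBuild {
        cmake {
            // CMakeLists.txt would add llama.cpp as a subdirectory and build the JNI library
            path = file("src/main/cpp/CMakeLists.txt")
        }
    }
}
```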

The smollm module contains an LLMInference class (llm_inference.cpp) that interacts with llama.cpp's C-style API to execute the GGUF model, along with a JNI binding (smollm.cpp). Check the C++ source files here. On the Kotlin side, the SmolLM class provides the methods required to interact with the JNI (C++) bindings.
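The following is a hedged sketch of what such a Kotlin/JNI boundary can look like; the class name SmolLMSketch, the native method names, and their signatures are assumptions and may differ from the real SmolLM class.

```kotlin
// Illustrative Kotlin wrapper over a JNI binding built from smollm.cpp / llm_inference.cpp.
class SmolLMSketch {
    companion object {
        init {
            // Load the native library produced by the NDK build (name is an assumption)
            System.loadLibrary("smollm")
        }
    }

    // Opaque pointer to the native LLMInference instance
    private var nativeHandle: Long = 0L

    // External declarations resolved against the JNI binding
    private external fun create(): Long
    private external fun loadModel(handle: Long, modelPath: String, storeChats: Boolean)
    private external fun addChatMessage(handle: Long, message: String, role: String)
    private external fun getResponse(handle: Long, query: String): String
    private external fun destroy(handle: Long)

    fun load(modelPath: String, storeChats: Boolean = true) {
        nativeHandle = create()
        loadModel(nativeHandle, modelPath, storeChats)
    }

    fun addMessage(message: String, role: String) = addChatMessage(nativeHandle, message, role)

    fun respond(query: String): String = getResponse(nativeHandle, query)

    fun close() {
        destroy(nativeHandle)
        nativeHandle = 0L
    }
}
```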

The app module contains the application logic and UI code. Whenever a new chat is opened, the app instantiates the SmolLM class and provides it the model file path, which is stored in the LLMModel entity in ObjectBox. Next, the app retrieves the chat's user and system messages from the database and adds them to the context using LLMInference::add_chat_message.
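A minimal sketch of this chat-open flow, reusing the SmolLMSketch wrapper from the previous sketch; the entity and field names (LLMModel.path, ChatMessage.role/text) are assumptions rather than the app's exact ObjectBox schema.

```kotlin
// Placeholder data classes standing in for the ObjectBox entities
data class LLMModel(val id: Long = 0, val name: String = "", val path: String = "")
data class ChatMessage(val id: Long = 0, val chatId: Long = 0, val role: String = "", val text: String = "")

fun openChat(
    model: LLMModel,                      // loaded from the LLMModel box
    persistedMessages: List<ChatMessage>  // this chat's messages, loaded from ObjectBox
): SmolLMSketch {
    val llm = SmolLMSketch()
    llm.load(model.path, storeChats = true)

    // Replay stored system/user messages so the native context matches the saved chat
    persistedMessages
        .filter { it.role == "system" || it.role == "user" }
        .forEach { llm.addMessage(it.text, it.role) }

    return llm
}
```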

For tasks, the messages are not persisted, and we inform LLMInference of this by passing store_chats=false to LLMInference::load_model.
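For contrast with the chat flow above, here is a small sketch of a one-off task using the same illustrative wrapper and entities; the storeChats parameter mirrors store_chats on the native side, and all names remain assumptions.

```kotlin
// Run a single, non-persisted task: nothing is written back to the database
fun runTask(model: LLMModel, systemPrompt: String, input: String): String {
    val llm = SmolLMSketch()
    llm.load(model.path, storeChats = false) // tell the native side not to store the chat
    llm.addMessage(systemPrompt, "system")
    val output = llm.respond(input)
    llm.close()
    return output
}
```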
