
The open-source code of Queryable, an iOS app, leverages Apple's MobileCLIP model (which replaced the original OpenAI CLIP model) to conduct offline searches in the 'Photos' album. Unlike the category-based search model built into the iOS Photos app, Queryable lets you search your album with natural language statements, such as "a brown dog sitting on a bench". Since it runs entirely offline, your album privacy won't be compromised by any company, including Apple or Google.
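
For intuition, here is a minimal sketch of the retrieval idea behind this kind of search (the names are illustrative, not Queryable's actual API): the text encoder maps the query to a vector, each photo is mapped to a vector once by the image encoder, and photos are ranked by cosine similarity.

```swift
import Foundation

/// Cosine similarity between two embedding vectors of equal length.
func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    var dot: Float = 0, normA: Float = 0, normB: Float = 0
    for i in 0..<a.count {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return dot / (sqrt(normA) * sqrt(normB) + 1e-8)
}

/// Rank precomputed photo embeddings against a query embedding,
/// returning photo identifiers sorted from best to worst match.
func rankPhotos(query: [Float], photoEmbeddings: [String: [Float]]) -> [String] {
    photoEmbeddings
        .map { (id: $0.key, score: cosineSimilarity(query, $0.value)) }
        .sorted { $0.score > $1.score }
        .map { $0.id }
}
```

Because the image embeddings are computed once and cached, only the short text query needs to pass through a model at search time, which is what makes on-device search fast enough to be practical.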

You can download the exported MobileCLIP_TextEncoder.mlmodelc and MobileCLIP_ImgEncoder.mlmodelc from Google Drive. We currently use the s2 model as the default, which balances efficiency and precision.

Download the ImageEncoder_float32.mlmodelc and TextEncoder_float32.mlmodelc from Google Drive. Clone this repo, place the downloaded models under the CoreMLModels/ path, and run Xcode; it should work.
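
At runtime, the app has to open the two compiled encoders with Core ML. The sketch below is an assumption about how that loading could look, not Queryable's exact code: it presumes the .mlmodelc folders are copied into the app bundle and uses the float32 file names from the download step above.

```swift
import CoreML

/// Load the compiled text and image encoders from the app bundle.
/// Adjust the resource lookup if your Xcode target places the
/// .mlmodelc folders somewhere other than the bundle root.
func loadEncoders() throws -> (text: MLModel, image: MLModel) {
    let config = MLModelConfiguration()
    config.computeUnits = .all  // allow CPU, GPU, and the Neural Engine

    guard
        let textURL = Bundle.main.url(forResource: "TextEncoder_float32",
                                      withExtension: "mlmodelc"),
        let imageURL = Bundle.main.url(forResource: "ImageEncoder_float32",
                                       withExtension: "mlmodelc")
    else {
        throw CocoaError(.fileNoSuchFile)
    }
    return (try MLModel(contentsOf: textURL, configuration: config),
            try MLModel(contentsOf: imageURL, configuration: config))
}
```

Inspecting `model.modelDescription` after loading shows the actual input and output feature names, which is worth doing before wiring the encoders into a search pipeline.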

If you only want to run Queryable, you can skip this step and directly use the exported models from Google Drive. If you wish to build a version of Queryable that supports your own native language, or to do model quantization/acceleration work, here are some guidelines.
