LunarStudio is a fast, privacy-focused offline AI assistant built with C++, Qt/QML, FAISS, SQLite, and llama.cpp. It performs embedding generation, vector search, and LLM inference entirely on your machine.
- Fast local inference using GGUF models
- Semantic search powered by FAISS
- Sentence embeddings with MiniLM
- Local database storage using SQLite (see the sketch after this list)
- 100% offline & private
- Lightweight compared to Electron or WebView-based frameworks
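As a point of reference for the SQLite piece, here is a minimal, self-contained sketch of local storage with the standard SQLite C API. This is not LunarStudio's actual code: the `notes` table, its columns, and the `lunar.db` filename are invented for illustration.

```cpp
// Minimal local-storage sketch using the SQLite C API.
// The schema and filename are hypothetical, not LunarStudio's real ones.
#include <sqlite3.h>
#include <cstdio>

int main() {
    sqlite3* db = nullptr;
    if (sqlite3_open("lunar.db", &db) != SQLITE_OK) {
        std::fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
        sqlite3_close(db);
        return 1;
    }

    // Create a table for stored text entries (hypothetical schema).
    sqlite3_exec(db,
        "CREATE TABLE IF NOT EXISTS notes("
        "id INTEGER PRIMARY KEY, body TEXT NOT NULL);",
        nullptr, nullptr, nullptr);

    // Insert a row with a prepared statement (avoids SQL injection).
    sqlite3_stmt* stmt = nullptr;
    sqlite3_prepare_v2(db, "INSERT INTO notes(body) VALUES(?);", -1, &stmt, nullptr);
    sqlite3_bind_text(stmt, 1, "hello from LunarStudio", -1, SQLITE_TRANSIENT);
    sqlite3_step(stmt);
    sqlite3_finalize(stmt);

    // Read the rows back.
    sqlite3_prepare_v2(db, "SELECT id, body FROM notes;", -1, &stmt, nullptr);
    while (sqlite3_step(stmt) == SQLITE_ROW)
        std::printf("%lld: %s\n",
                    (long long)sqlite3_column_int64(stmt, 0),
                    (const char*)sqlite3_column_text(stmt, 1));
    sqlite3_finalize(stmt);
    sqlite3_close(db);
}
```

Everything here lives in a single local file, which is what keeps the app 100% offline.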
Before running LunarStudio, download the following models into the `models/` folder: the MiniLM embedding model and the Qwen2.5-0.5B-Instruct chat model.
```bash
cd models/
wget https://huggingface.co/leliuga/all-MiniLM-L6-v2-GGUF/resolve/main/all-MiniLM-L6-v2.F16.gguf
wget https://huggingface.co/Qwen/Qwen2.5-0.5B-Instruct-GGUF/resolve/main/qwen2.5-0.5b-instruct-q8_0.gguf
```
Install the build dependencies (Arch Linux / pacman):

```bash
sudo pacman -S qt6-base qt6-declarative qt6-tools cmake make gcc faiss sqlite openblas lapack nlohmann-json
```
Then build with CMake and launch the app:

```bash
cd lunar-studio
mkdir build && cd build
cmake ..
make -j$(nproc)
./LunarStudio
```
When you submit a query, the pipeline is:

- The query is embedded locally with MiniLM
- FAISS runs a vector search over the stored embeddings
- The Qwen2.5 LLM generates the response (see the sketch below)
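A compilable sketch of how these three steps compose is below. The FAISS calls are the real C++ API; `embedWithMiniLM()` and `generateWithQwen()` are invented placeholder names, stubbed out so the file builds, standing in for the llama.cpp-backed embedding and generation steps rather than LunarStudio's actual functions.

```cpp
// Hypothetical end-to-end sketch of the query pipeline above.
// Only the FAISS calls are a real API; the two helpers are stubs
// standing in for llama.cpp inference on the downloaded GGUF models.
#include <faiss/IndexFlat.h>
#include <cstdio>
#include <string>
#include <vector>

constexpr faiss::idx_t kDim = 384; // all-MiniLM-L6-v2 embedding size

// Stub: in the real app this would run MiniLM inference via llama.cpp.
std::vector<float> embedWithMiniLM(const std::string& text) {
    std::vector<float> v(kDim, 0.f);
    v[0] = static_cast<float>(text.size()); // dummy deterministic value
    return v;
}

// Stub: in the real app this would run Qwen2.5 generation via llama.cpp.
std::string generateWithQwen(const std::string& prompt) {
    return "[generated answer for prompt of " +
           std::to_string(prompt.size()) + " chars]";
}

std::string answer(faiss::IndexFlatL2& index,
                   const std::vector<std::string>& documents,
                   const std::string& question) {
    // 1. Embed the query with MiniLM.
    std::vector<float> q = embedWithMiniLM(question);

    // 2. FAISS vector search: top-3 nearest stored entries.
    const faiss::idx_t k = 3;
    std::vector<float> dist(k);
    std::vector<faiss::idx_t> ids(k);
    index.search(1, q.data(), k, dist.data(), ids.data());

    // 3. Assemble retrieved context into a prompt and generate with Qwen2.5.
    std::string prompt = "Answer using this context:\n";
    for (faiss::idx_t i = 0; i < k; ++i)
        if (ids[i] >= 0 && ids[i] < (faiss::idx_t)documents.size())
            prompt += documents[ids[i]] + "\n";
    prompt += "Question: " + question;
    return generateWithQwen(prompt);
}

int main() {
    std::vector<std::string> docs = {"note one", "note two", "note three"};
    faiss::IndexFlatL2 index(kDim); // exact L2 search, no training needed
    for (const auto& d : docs) {
        std::vector<float> e = embedWithMiniLM(d);
        index.add(1, e.data());
    }
    std::printf("%s\n", answer(index, docs, "what is in my notes?").c_str());
}
```

An exact `IndexFlatL2` needs no training step and is a reasonable default at the scale of a personal, on-device corpus.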
Source layout (C++): `src/*.cpp`, `include/*.hpp`
PRs and issues are welcome!