English | Polski
Automatic installer for llama.cpp with hardware-specific optimizations. The project detects your hardware type (Raspberry Pi 5, Raspberry Pi 4, Termux Android, Linux x86_64) and automatically selects optimal compilation flags for maximum performance.
Author: Fibogacci (https://fibogacci.pl)
License: MIT
Technologies: Python, Textual (GUI), Typer (CLI), Rich, psutil
- Automatic hardware detection: Raspberry Pi, Android Termux, Linux x86_64
- Optimized compilation: Hardware-specific CMake flags for maximum performance
- Dual interface: Interactive GUI (Textual) and CLI (Typer)
- Multi-language: Polish and English interface
- Real-time progress: Installation progress with detailed logging
- Custom configs: Support for user-defined optimization files
- Dynamic detection: Automatic CPU capability testing (NEW); see the sketch below
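Dynamic detection is implemented in the project's dynamic_config.py. The snippet below is only a minimal, hypothetical sketch of how CPU capability testing can work on Linux by reading the `flags` line of `/proc/cpuinfo`; the function names and flag-to-option mapping are illustrative, not the installer's actual code:

```python
# Minimal sketch of CPU capability probing on Linux (illustrative only;
# the installer's real logic in dynamic_config.py may differ).
def detect_cpu_flags(cpuinfo_path="/proc/cpuinfo"):
    """Return the set of CPU feature flags reported by the kernel."""
    with open(cpuinfo_path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

def suggest_cmake_flags(cpu_flags):
    """Map detected features to plausible GGML CMake options."""
    options = ["-DGGML_NATIVE=ON"]
    if "avx2" in cpu_flags:
        options.append("-DGGML_AVX2=ON")
    elif "avx" in cpu_flags:
        options.append("-DGGML_AVX=ON")
    if "fma" in cpu_flags:
        options.append("-DGGML_FMA=ON")
    return options

if __name__ == "__main__":
    print(suggest_cmake_flags(detect_cpu_flags()))
```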
- Raspberry Pi 5 (8GB/16GB) - Maximum ARM64 Cortex-A76 optimizations with OpenBLAS and RPC
- Raspberry Pi 5 (4GB) - Balanced ARM64 optimizations with OpenBLAS
- Raspberry Pi 4 - Cortex-A72 optimizations with OpenBLAS
- Dynamic - Automatic detection and optimization (RECOMMENDED)
  ⚠️ Warning: `dynamic` works only on x86_64. For ARM/Raspberry Pi use the dedicated configurations (`rpi5_8gb`, `rpi5_16gb`, `rpi4`).
- Standard - Full AVX2 optimizations with OpenBLAS (newer CPUs)
- Legacy - AVX optimizations without AVX2 (2010-2013 CPUs)
- Minimal - Basic optimizations (very old hardware)
- No optimization - Widest compatibility (works everywhere)
- Termux Android - Minimal optimizations without BLAS
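Each entry above corresponds to a predefined set of CMake flags. As a purely illustrative example, written in the same format as the custom config files shown later in this README, a Cortex-A76 (Raspberry Pi 5) profile might look like the block below; these are not the project's actual flags:

```
# hypothetical Cortex-A76 (Raspberry Pi 5) profile - illustrative only
-DCMAKE_C_FLAGS=-mcpu=cortex-a76
-DCMAKE_CXX_FLAGS=-mcpu=cortex-a76
-DGGML_OPENMP=ON
-DGGML_OPENBLAS=ON
```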
Installation:

```bash
# Clone repository
git clone https://github.com/fibogacci/llamacpp-installer.git
cd llamacpp-installer

# Install dependencies globally
pip install -r requirements.txt
```

If you have issues with the global installation or want to isolate dependencies:
```bash
# Clone repository
git clone https://github.com/fibogacci/llamacpp-installer.git
cd llamacpp-installer

# Create virtual environment
python -m venv venv-llamacpp-installer

# Activate virtual environment
source venv-llamacpp-installer/bin/activate

# Install dependencies in virtual environment
pip install -r requirements.txt

# Deactivate the virtual environment when you are finished
deactivate
```

GUI (Textual):

```bash
# Polish (default)
python main.py

# English
python main.py --lang en
```

CLI (Typer):

```bash
# Hardware detection
python cli.py detect
python cli.py detect --lang en
# Automatic installation (recommended)
python cli.py install --hardware dynamic --dir /path/to/install
# Manual hardware selection
python cli.py install --hardware rpi5_8gb --dir /home/user/llama
python cli.py install --hardware x86_linux --dir /opt/llama --lang en
# Custom configuration
python cli.py install --config example_configs/x86_avx512.txt --dir /path/to/install
```

| Hardware | Type | Description |
|---|---|---|
| Raspberry Pi 5 8/16GB | `rpi5_8gb` | Maximum performance with RPC |
| Raspberry Pi 5 4GB | `rpi5_4gb` | Balanced optimization |
| Raspberry Pi 4 | `rpi4` | Cortex-A72 optimized |
| Linux x86_64 Auto | `dynamic` | Automatic detection (recommended) |
| Linux x86_64 New | `x86_linux` | AVX2 + OpenBLAS |
| Linux x86_64 Legacy | `x86_linux_old` | AVX without AVX2 |
| Linux x86_64 Minimal | `x86_linux_minimal` | Basic optimizations |
| No optimization | `no_optimization` | Maximum compatibility |
| Termux Android | `termux` | Mobile-optimized |
Create your own `.txt` files with CMake flags:

```
# example_configs/my_config.txt
-DGGML_NATIVE=ON
-DGGML_AVX2=ON
-DGGML_OPENMP=ON
-DGGML_OPENBLAS=ON
```

Usage:

```bash
python cli.py install --config my_config.txt --dir /path/to/install
```
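A custom configuration is essentially a list of extra CMake options. As a rough, hypothetical illustration (the installer's actual parsing and build steps may differ), the flags could be read from the file and appended to a standard llama.cpp CMake build like this:

```python
# Hypothetical sketch: read a flags file and build llama.cpp with CMake.
# Illustrative only; the installer's real build pipeline may differ.
import subprocess
from pathlib import Path

def read_flags(config_file):
    """Return non-empty, non-comment lines from a custom config file."""
    lines = Path(config_file).read_text().splitlines()
    return [ln.strip() for ln in lines if ln.strip() and not ln.lstrip().startswith("#")]

def build_llamacpp(source_dir, config_file):
    """Configure and compile llama.cpp with the user's extra CMake flags."""
    flags = read_flags(config_file)
    subprocess.run(["cmake", "-B", "build", *flags], cwd=source_dir, check=True)
    subprocess.run(["cmake", "--build", "build", "--config", "Release", "-j"],
                   cwd=source_dir, check=True)

if __name__ == "__main__":
    build_llamacpp("llama.cpp", "example_configs/my_config.txt")
```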
After installation, the directory structure looks like this:

```
/your/chosen/path/
├── llama.cpp/
│   └── build/
│       └── bin/
│           ├── llama-cli
│           ├── llama-server
│           └── ...
├── logs/
│   └── llamacpp_installer_*.log
├── llama-cli.sh       # Wrapper script
├── llama-server.sh    # Wrapper script
└── llama-simple.sh    # Wrapper script
```
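The wrapper scripts let you run the compiled binaries without typing the full path to `build/bin`. As a hypothetical sketch (the exact script contents and paths are assumptions, not the installer's verified output), generating such a wrapper could look like this:

```python
# Hypothetical sketch of generating a wrapper script (illustrative only;
# the installer's real wrappers may set different variables or paths).
from pathlib import Path

def write_wrapper(install_dir, binary):
    """Write <install_dir>/<binary>.sh that execs the compiled binary."""
    install_dir = Path(install_dir)
    bin_path = install_dir / "llama.cpp" / "build" / "bin" / binary
    wrapper = install_dir / f"{binary}.sh"
    wrapper.write_text(
        "#!/bin/bash\n"
        f'exec "{bin_path}" "$@"\n'
    )
    wrapper.chmod(0o755)
    return wrapper

if __name__ == "__main__":
    write_wrapper("/your/chosen/path", "llama-cli")
```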
Language selection in the CLI:

```bash
# Polish (default)
python cli.py detect
python cli.py install --hardware dynamic --dir /path

# English
python cli.py detect --lang en
python cli.py install --hardware dynamic --dir /path --lang en
```

Language selection in the GUI:

```bash
# Polish (default)
python main.py

# English
python main.py --lang en
python main.py --en
```

Via environment variable:

```bash
export LLAMACPP_INSTALLER_LANG=en
python main.py  # Will use English
```

Debugging:

```bash
# Enable detailed logging
python cli.py detect --debug
python cli.py install --hardware dynamic --dir /path --debug
```

Testing individual modules:

```bash
# Test hardware detection
python hardware_detector.py

# Test dynamic CPU detection
python dynamic_config.py

# Test optimization configs
python optimization_configs.py
```
- Compilation errors on older CPUs
  - Use `--hardware x86_linux_old` or `--hardware dynamic`
  - Check the logs in `{install_dir}/logs/`
- Missing dependencies
  - Install build essentials: `sudo apt install build-essential cmake git`
  - For Ubuntu/Debian with OpenBLAS: `sudo apt install libopenblas-dev`
- Permission errors
  - Ensure write permissions to the installation directory
  - Use `sudo` only if installing to system directories
- Termux-specific issues
  - Install required packages: `pkg install python cmake git`
  - Use `--hardware termux` for mobile optimization
Installation logs are saved in `{installation_directory}/logs/` using the filename format:
`llamacpp_installer_YYYYMMDD_HHMMSS.log`
- Python 3.7+
- Build tools (gcc, cmake, git)
- Internet connection for downloading llama.cpp
- Sufficient disk space (2-3 GB for full compilation)
- Fork the repository
- Create feature branch
- Make changes
- Test on different hardware if possible
- Submit pull request
MIT License - see LICENSE file for details.
Fibogacci
- Website: https://fibogacci.pl
- GitHub: https://github.com/fibogacci
For Polish documentation, see README_PL.md