Hi,
Using the latest version (0.2.2), downloading and loading a model gives me only a generic error, with no details.
I tried both a Pixel 3a and a Pixel 7 — same result on both.
I tried smollm, qwen, and gemma models — none of them work.
Is there a way to get a more detailed error out of the lib to understand what is happening?
```
I/flutter ( 7345): ┌───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
I/flutter ( 7345): │ LlamaException: Error loading model: LlamaException: Failed to initialize Llama (LlamaException: Could not load model at /data/user/0/com.example.chat/app_flutter/models/SmolLM2-135M-Instruct-Q4_K_M.gguf)
I/flutter ( 7345): ├┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄
I/flutter ( 7345): │ 15:52:51.431 (+0:01:17.102702)
I/flutter ( 7345): ├┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄
I/flutter ( 7345): │ ⛔ [LlamaCppBackend] Failed to load model
I/flutter ( 7345): └───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
I/flutter ( 7345): ┌───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
I/flutter ( 7345): │ LlamaException: Error loading model: LlamaException: Failed to initialize Llama (LlamaException: Could not load model at /data/user/0/com.example.chat/app_flutter/models/SmolLM2-135M-Instruct-Q4_K_M.gguf)
I/flutter ( 7345): ├┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄
I/flutter ( 7345): │ 15:52:51.488 (+0:01:17.160001)
I/flutter ( 7345): ├┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄
I/flutter ( 7345): │ ⛔ [LlmProvider] Failed to load model
I/flutter ( 7345): └───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
```
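For anyone hitting the same wall: llama.cpp's "could not load model" typically fires when the file is missing, truncated, or not a valid GGUF at all, so one cheap check before blaming the native build is to verify the downloaded file itself (e.g. after pulling it off the device with adb). A minimal sketch — the path and version range are assumptions, not anything this package documents:

```python
import struct

def check_gguf(path: str) -> bool:
    """Return True if the file looks like a valid GGUF model.

    A GGUF file starts with the 4-byte magic b"GGUF" followed by a
    little-endian uint32 format version (versions 1-3 exist as of
    this writing). This only validates the header, not the tensors.
    """
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != b"GGUF":
            return False
        raw = f.read(4)
        if len(raw) < 4:
            return False  # truncated before the version field
        version = struct.unpack("<I", raw)[0]
        return 1 <= version <= 3

# Example (hypothetical path, e.g. after `adb pull`):
# check_gguf("SmolLM2-135M-Instruct-Q4_K_M.gguf")
```

If the header check passes, the file itself is probably fine and the failure is in the native loader, which matches the analysis below.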
Claude Code gave me this resolution:
The investigation found the root cause: it's not a configuration issue on our side. llama_cpp_dart v0.2.2 is fundamentally broken on Android:
1. Missing llama.cpp source: the pub package ships without the git submodule. The src/llama.cpp/ directory is empty, so the CMake build can't compile from source.
2. Hardcoded developer paths: android/llamalib/build.gradle references /Users/adel/Workspace/llama_cpp_dart/... (the package author's machine).
3. Mandatory OpenCL/Adreno dependency: the native lib is hard-linked to an Adreno-specific libOpenCL.so. Devices without Qualcomm GPUs (Samsung Exynos, MediaTek, Google Tensor) will fail.
4. No CPU fallback: ggml_backend_load_all() is skipped on Android, and there's no fallback if OpenCL fails, so no compute backend is available at all.
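Point 3 above can be sanity-checked without a Qualcomm device: a DT_NEEDED dependency appears as the literal string of the library name in the ELF's dynamic string table, so a raw substring scan of the packaged libllama.so is a quick (if imprecise) indicator of a hard OpenCL link. This is a crude sketch, not a proper ELF parser, and the library filename is an assumption:

```python
def references_opencl(lib_path: str) -> bool:
    """Crude check for a hard OpenCL link in a native library.

    A DT_NEEDED entry shows up as the literal bytes b"libOpenCL.so"
    in the ELF dynamic string table, so a substring scan over the
    whole file will catch it (along with any other mention, hence
    "imprecise"). For an authoritative answer, parse the dynamic
    section or run `readelf -d` from an NDK toolchain instead.
    """
    with open(lib_path, "rb") as f:
        return b"libOpenCL.so" in f.read()

# Example (hypothetical path inside an extracted APK):
# references_opencl("lib/arm64-v8a/libllama.so")
```

If this returns True for the shipped binary, any device whose linker can't resolve libOpenCL.so will fail at dlopen time, which would explain the opaque "Could not load model" error.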
What is the latest stable version that does not have these issues?