I'm following the examples from the README.
When I run:
python download-repo.py BAAI/bge-base-en-v1.5 # or any other model
sh run_conversions.sh bge-base-en-v1.5
It fails at the quantization step:
[...]
Done. Output file: bge-base-en-v1.5/ggml-model-f16.bin
Illegal instruction (core dumped)
Illegal instruction (core dumped)
Is there something I can do to make it work?