How to fix: ValueError: Failed to detect model architecture #16114
VenominousX asked this question in Q&A (unanswered)
Hi,
I'm currently learning the basics of building my own AI models. I've gotten as far as producing the .pth and .safetensors files, but I'm stuck converting .safetensors to GGUF.
When I run the convert_hf_to_gguf.py script, I get the following error:
ValueError: Failed to detect model architecture
Apparently `config.json` must contain a `model_type` key, but I don't know which values are allowed for a custom model.
I'm using a GPT-2 tiktoken tokenizer, but when I set `model_type` to `gpt2`, I still get the same error.
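For context, here is a minimal sketch of what I believe the converter looks for in `config.json`. The field names are taken from the standard Hugging Face GPT-2 config, and my assumption (please correct me if wrong) is that `convert_hf_to_gguf.py` dispatches on the `architectures` list rather than `model_type` alone, so I include both:

```python
import json
import os
import tempfile

# Hypothetical minimal config.json for a GPT-2-style model.
# Field names follow the standard Hugging Face GPT-2 config;
# "architectures" is assumed to be what the converter keys on.
config = {
    "model_type": "gpt2",
    "architectures": ["GPT2LMHeadModel"],  # assumed dispatch key
    "vocab_size": 50257,
    "n_positions": 1024,
    "n_embd": 768,
    "n_layer": 12,
    "n_head": 12,
}

# Write the config next to the model weights, then sanity-check it
# before running the converter.
model_dir = tempfile.mkdtemp()
config_path = os.path.join(model_dir, "config.json")
with open(config_path, "w") as f:
    json.dump(config, f, indent=2)

with open(config_path) as f:
    loaded = json.load(f)
assert "model_type" in loaded and "architectures" in loaded
print("config.json ready in", model_dir)
```

Is a config like this enough for the converter to detect the architecture, or does a self-built model need more than that?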
What can I do to prepare a self-built model so that I can use it in Ollama?
It's probably a stupid question, but I'm stuck here^^.
Best regards