Hello UI-TARS Team,
I am encountering an issue while attempting to load the UI-TARS-7B-SFT model using the Hugging Face Transformers library. My goal is to run the model on CPU, and I have included the parameter trust_remote_code=True to allow for loading custom configurations. Below is the code I am using:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer with remote code trust enabled
tokenizer = AutoTokenizer.from_pretrained("bytedance-research/UI-TARS-7B-SFT", trust_remote_code=True)

# Load the model with device_map set to "cpu" and trust_remote_code enabled
model = AutoModelForCausalLM.from_pretrained("bytedance-research/UI-TARS-7B-SFT", device_map="cpu", trust_remote_code=True)

prompt = (
    "You are a GUI agent. You are given a task: change the header on the page. "
    "Please perform the following steps: find the element with the text 'Old Header' and replace it with 'New Header'."
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
```
However, when I run this script, I get the following error:
```
ValueError: Unrecognized configuration class <class 'transformers.models.qwen2_vl.configuration_qwen2_vl.Qwen2VLConfig'> for this kind of AutoModel: AutoModelForCausalLM.
Model type should be one of AriaTextConfig, BambaConfig, BartConfig, BertConfig, … (etc.)
```
I have updated my Transformers library to the latest version and cleared the Hugging Face cache, but the error persists. Judging by the module path in the error (transformers.models.qwen2_vl), Qwen2VLConfig ships with Transformers itself, yet it is not among the configuration classes that AutoModelForCausalLM accepts.
Could you please advise on the correct method to load the UI-TARS-7B-SFT model with its custom configuration? Is there an alternative initialization method that I should use?
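In the meantime, since the error suggests UI-TARS-7B-SFT is built on the Qwen2-VL architecture, one workaround I am considering is loading it through the dedicated Qwen2-VL class instead of AutoModelForCausalLM. This is only a minimal sketch, assuming the checkpoint is compatible with Qwen2VLForConditionalGeneration and that my Transformers version is recent enough to include it:

```python
from transformers import AutoProcessor, Qwen2VLForConditionalGeneration

# Assumption: UI-TARS-7B-SFT uses the Qwen2-VL architecture, so the
# dedicated Qwen2-VL class should accept its Qwen2VLConfig where
# AutoModelForCausalLM does not.
model = Qwen2VLForConditionalGeneration.from_pretrained(
    "bytedance-research/UI-TARS-7B-SFT",
    device_map="cpu",
)

# A vision-language model typically needs a processor (tokenizer + image
# preprocessor) rather than a plain tokenizer.
processor = AutoProcessor.from_pretrained("bytedance-research/UI-TARS-7B-SFT")
```

If this is indeed the right class, then trust_remote_code may not be necessary at all, since Qwen2-VL is a native Transformers architecture. Please correct me if the intended loading path is different.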
Thank you for your assistance.
Best regards,
Andrew