I get this error when running the Hugging Face example code (torch version '2.5.1+cu124'):
File ~/.conda/envs/torch_p10/lib/python3.10/site-packages/transformers/models/moonshine/modeling_moonshine.py:827, in MoonshineDecoder.forward(self, input_ids, attention_mask, position_ids, past_key_values, inputs_embeds, use_cache, output_attentions, output_hidden_states, return_dict, cache_position, encoder_hidden_states, **flash_attn_kwargs)
824 return_dict = return_dict if return_dict is not None else self.config.use_return_dict
826 if (input_ids is None) ^ (inputs_embeds is not None):
--> 827 raise ValueError("You must specify exactly one of input_ids or inputs_embeds")
829 if self.gradient_checkpointing and self.training and use_cache:
830 logger.warning_once(
831 "`use_cache=True` is incompatible with gradient checkpointing. Setting `use_cache=False`."
832 )
ValueError: You must specify exactly one of input_ids or inputs_embeds
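This ValueError is raised when the decoder receives neither `input_ids` nor `inputs_embeds`, which can happen if the model's forward pass is invoked on the audio alone instead of going through `model.generate()`. Below is a minimal sketch of the transcription path I would expect to work; the checkpoint name, sampling rate, and dummy waveform are assumptions on my part, not taken from the original example.

```python
import numpy as np
from transformers import AutoProcessor, MoonshineForConditionalGeneration

# Checkpoint name is an assumption for illustration; use the one from the example you ran.
model_id = "UsefulSensors/moonshine-tiny"
processor = AutoProcessor.from_pretrained(model_id)
model = MoonshineForConditionalGeneration.from_pretrained(model_id)

# One second of silence standing in for real audio, assumed to be 16 kHz mono.
audio = np.zeros(16000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

# generate() builds the decoder prompt itself, so the decoder always gets input_ids;
# calling model(**inputs) directly would leave them unset and raise the ValueError above.
generated_ids = model.generate(**inputs, max_new_tokens=100)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```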
I get this warning with the new Hugging Face code: "Setting pad_token_id to eos_token_id:2 for open-end generation."
These values need to be set explicitly when calling generate: max_new_tokens=100, pad_token_id=2 (see the sketch below).
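My reading of that suggestion (an assumption, not a confirmed fix) is that both values are passed directly to `generate()`, which also silences the open-end-generation warning:

```python
# max_new_tokens and pad_token_id are the values suggested above; pad_token_id=2
# mirrors the eos_token_id the warning falls back to. Neither is verified against
# the model config here.
generated_ids = model.generate(**inputs, max_new_tokens=100, pad_token_id=2)
```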
Here is the code I ran:
In [40]: segment.shape
Out[40]: torch.Size([1, 56320])
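For completeness, this is how I would feed a segment of that shape into the pipeline; squeezing it to a 1-D waveform and the 16 kHz sampling rate are assumptions on my part:

```python
# segment has shape [1, 56320]; the feature extractor expects a plain waveform,
# so drop the batch dimension before calling the processor.
waveform = segment.squeeze(0).numpy()
inputs = processor(waveform, sampling_rate=16000, return_tensors="pt")
generated_ids = model.generate(**inputs, max_new_tokens=100, pad_token_id=2)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```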