max_length parameter of TrainingArguments not applied #561

Open
cjuracek-tess opened this issue Sep 27, 2024 · 0 comments

cjuracek-tess commented Sep 27, 2024

Description:
I don't believe the max_length parameter of TrainingArguments is actually being applied during training.

Minimal working example (borrowed from quickstart):

from datasets import load_dataset
from setfit import SetFitModel, Trainer, TrainingArguments, sample_dataset

model = SetFitModel.from_pretrained("BAAI/bge-small-en-v1.5")
dataset = load_dataset("SetFit/sst2")
# Simulate the few-shot setting by sampling 8 examples per class
train_dataset = sample_dataset(dataset["train"], label_column="label", num_samples=8)
model.labels = ["negative", "positive"]

# max_length=5 should truncate every example to 5 tokens during training
args = TrainingArguments(
    max_length=5,
    batch_size=32,
    num_epochs=10,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
)
trainer.train()

Expected Behavior:

  • Batches truncated to a max_length of 5 tokens in the Transformers training loop

Actual Behavior:

  • Batches padded to the length of the longest sequence in the batch, with no truncation to max_length (see the quick check below)
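For reference, a rough way to observe this (assuming the underlying Hugging Face tokenizer is reachable via model.model_body.tokenizer) is to tokenize one batch of training texts the way a padding collator would and inspect the resulting shape:

# Rough check of the padding behavior: tokenize one batch of training texts
# with padding enabled and no truncation (assumes model.model_body.tokenizer
# exposes the underlying Hugging Face tokenizer).
texts = train_dataset["text"][:32]
encoded = model.model_body.tokenizer(texts, padding=True, truncation=False, return_tensors="pt")
print(encoded["input_ids"].shape)  # second dimension matches the longest sequence, not 5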

Possible Culprits:

  • This function is responsible for propagating the SetFit training arguments to the SentenceTransformer trainer. args.max_length is not referenced in that method, and it does not appear to be supported as a trainer parameter either. A possible interim workaround is sketched below.
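As a stopgap (a workaround sketch, not the intended fix), truncation can apparently be forced by capping max_seq_length on the SentenceTransformer body directly before training, since model_body is the underlying SentenceTransformer instance:

# Workaround sketch: cap the sequence length on the SentenceTransformer body
# directly, because args.max_length is not being propagated by the trainer.
model.model_body.max_seq_length = 5

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()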

Environment Info:

{'python': '3.11.7',
 'sentence_transformers': '3.1.1',
 'transformers': '4.44.2',
 'torch': '2.4.1',
 'accelerate': '0.34.2',
 'datasets': '3.0.0',
 'tokenizers': '0.19.1'}

Happy to look into / propose a fix if appropriate!
