
Different generation parameters in the same batch #1209

Open
juliensalinas opened this issue May 5, 2023 · 0 comments
Labels
enhancement New feature or request

Comments


juliensalinas commented May 5, 2023

Hello team,

Today, batch generation works like the HF generate() function: it accepts several input texts, but the generation parameters (temperature, top-k, etc.) apply to the whole batch, so it is not possible to use different parameters for different inputs within the same batch.

Is it because using different parameters in the same batch would degrade performance so much that it would defeat the purpose of batch generation?

Ideally, one could do something like this:

generator.generate_batch([
    {"input":input_1, "max_length":30, "sampling_topk":10},
    {"input":input_2, "max_length":150, "sampling_topk":50},
    {"input":input_3, "max_length":10, "sampling_topk":50},
    ...
])

For example, this is something that can be achieved with NVIDIA FasterTransformer: https://github.com/NVIDIA/FasterTransformer/blob/main/examples/pytorch/gpt/gpt_example.py
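Until per-request parameters are supported, one workaround is to group requests that share identical generation parameters and issue one generate_batch() call per group. The sketch below is a hypothetical illustration of that grouping step only (the request dicts and the parameter keys mirror the proposal above; no actual generator call is made):

```python
def group_by_params(requests):
    """Group request inputs by their (max_length, sampling_topk) pair.

    Each resulting group can then be passed to a single generate_batch()
    call with one set of parameters. This is a workaround sketch, not
    part of the actual API.
    """
    grouped = {}
    for r in requests:
        key = (r["max_length"], r["sampling_topk"])
        grouped.setdefault(key, []).append(r["input"])
    return grouped

# Hypothetical per-request specs, as in the proposed API above.
requests = [
    {"input": "input_1", "max_length": 30, "sampling_topk": 10},
    {"input": "input_2", "max_length": 150, "sampling_topk": 50},
    {"input": "input_3", "max_length": 150, "sampling_topk": 50},
]

groups = group_by_params(requests)
# groups maps (max_length, sampling_topk) -> list of inputs sharing
# those parameters, e.g. inputs 2 and 3 end up in the same group.
```

This loses some of the throughput benefit when parameters are highly varied (each group becomes its own, smaller batch), which is exactly why native support for per-request parameters would be preferable.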

Thank you!

@guillaumekln guillaumekln added the enhancement New feature or request label May 25, 2023