Hi, I am currently using the Unsloth framework for fine-tuning large models, and I have a question about support for custom attention masks.
As I understand it, the framework currently supports the causal mask. I would like to know whether a bidirectional attention mask is supported out of the box, or whether it is possible to implement one myself.
Could you please clarify whether this functionality is available, and if not, provide guidance on how to implement a bidirectional mask for my use case?
Thank you!
I'm pretty sure you can; I saw a user mention that they managed to do it, but it will require some custom code. Feel free to join our Discord and ask there if you'd like: https://discord.com/invite/unsloth
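For anyone exploring the custom-code route, here is a minimal sketch of the difference between the two mask types, assuming the additive-mask convention used by most transformer implementations (0.0 where attention is allowed, -inf where it is blocked). The helper name is my own, and this is a generic illustration, not Unsloth's internal API; Unsloth patches attention kernels internally, so wiring such a mask into an actual fine-tuning run would still need framework-specific work.

```python
import torch

def build_attention_mask(seq_len: int, bidirectional: bool) -> torch.Tensor:
    """Return an additive attention mask of shape (seq_len, seq_len).

    Causal: each position attends only to itself and earlier tokens
    (entries above the diagonal are set to -inf).
    Bidirectional: every position attends to every other position
    (all zeros), as in encoder-style attention.
    """
    mask = torch.zeros(seq_len, seq_len)
    if not bidirectional:
        # Block the strict upper triangle (future positions).
        future = torch.triu(
            torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1
        )
        mask = mask.masked_fill(future, float("-inf"))
    return mask

causal = build_attention_mask(4, bidirectional=False)
bidi = build_attention_mask(4, bidirectional=True)
```

The mask is added to the raw attention scores before the softmax, so a -inf entry drives that position's attention weight to zero while a 0.0 entry leaves it unchanged.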