
Added Meta llama-3 support! #1856

Merged
merged 5 commits into xtekky:main on Apr 19, 2024
Conversation

pratham-darooka (Contributor) commented Apr 18, 2024

Renamed llama2.py to llama.py so that one module holds all the llama-family models.

Closes #1855

Roadmap:

  • Add compatible providers (like DeepInfra, etc.)
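The consolidation into a single llama.py might look roughly like this. This is a hypothetical sketch only: the `Model` class, its fields, and the provider names are illustrative and do not reflect the actual g4f registry API.

```python
# Hypothetical sketch of a consolidated llama.py holding the whole
# llama model family in one module (names are illustrative, not the
# actual g4f internals).
from dataclasses import dataclass, field

@dataclass
class Model:
    name: str            # model identifier exposed to users
    base_provider: str   # organisation behind the model
    providers: list = field(default_factory=list)  # backends able to serve it

# llama-2 and llama-3 entries now live side by side in the same file
llama2_70b = Model("llama2-70b", "meta", providers=["Llama2"])
llama3_8b = Model("llama3-8b", "meta", providers=["Replicate"])
llama3_70b = Model("llama3-70b", "meta", providers=["HuggingChat"])

# Single lookup table for the whole family
LLAMA_FAMILY = {m.name: m for m in (llama2_70b, llama3_8b, llama3_70b)}

if __name__ == "__main__":
    print(sorted(LLAMA_FAMILY))
```

Keeping all entries in one dict makes it easy to add further providers (such as DeepInfra, per the roadmap) by appending to a model's `providers` list rather than creating a new module per model generation.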

hlohaus (Collaborator) commented Apr 18, 2024

Do DeepInfra and HuggingChat also support llama3?

Edit: HuggingChat supports only 70b?

pratham-darooka (Contributor, Author) commented Apr 19, 2024

@hlohaus: You are right! HuggingChat only supports the 70B model, and DeepInfra does not support llama3 yet.

Updated with new commits.

hlohaus (Collaborator) commented Apr 19, 2024

You are the best. Thank you!

@hlohaus hlohaus merged commit 5fd118f into xtekky:main Apr 19, 2024
pratham-darooka (Contributor, Author) commented
@hlohaus actually I am not the best: it seems HuggingChat changed their model id to meta-llama/Meta-Llama-3-70B-Instruct instead.

Also, I am stuck on the Llama2.ai provider: it doesn't seem to work with llama3 for some reason. Maybe they are using the instruct variant too?

Successfully merging this pull request may close these issues:

  • [Roadmap] Add llama-3 support