Hugging Face 🤗 Transformers provides thousands of pretrained models and easy-to-use APIs for state-of-the-art natural language processing (NLP), computer vision, and audio tasks. Easily download models like BERT and GPT-2, or fine-tune them on your own data.
🔎 Thousands of Pretrained Models Access a vast model zoo containing over 10,000 pretrained models for text, image, audio and multimodal tasks.
💬 NLP Models Choose from popular models like BERT, GPT-2, T5, and BART for text tasks such as (example sketch after this list):
- Text classification
- Question answering
- Summarization
- Translation
- Text generation
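For instance, a minimal sketch of running such tasks through the `pipeline` API; the task names are standard, and the GPT-2 checkpoint named below is one illustrative choice among many:

```python
# Minimal sketch: common NLP tasks via the pipeline API.
from transformers import pipeline

# Text classification (sentiment analysis) with the default checkpoint
classifier = pipeline("sentiment-analysis")
print(classifier("Transformers makes NLP easy!"))

# Text generation with GPT-2 (illustrative checkpoint choice)
generator = pipeline("text-generation", model="gpt2")
print(generator("Hugging Face Transformers can", max_new_tokens=20))
```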
🖼️ Computer Vision Models Select vision models like ViT, DETR, and CLIP for (example sketch after this list):
- Image classification
- Object detection
- Image segmentation
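A minimal sketch of image classification through the same `pipeline` API, assuming a ViT checkpoint as one illustrative choice:

```python
# Minimal sketch: image classification with a ViT checkpoint via the pipeline API.
# "google/vit-base-patch16-224" is a public checkpoint, used here for illustration.
from transformers import pipeline

image_classifier = pipeline("image-classification", model="google/vit-base-patch16-224")
predictions = image_classifier("path/to/your_image.jpg")  # accepts a local path or URL
print(predictions[:3])  # top labels with confidence scores
```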
🎧 Audio Models Use audio models like Wav2Vec2, HuBERT, and Whisper for (example sketch after this list):
- Speech recognition
- Audio classification
- Speech translation
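A minimal speech-recognition sketch with Whisper via the `pipeline` API; the small checkpoint named below is an illustrative choice, and larger variants trade speed for accuracy:

```python
# Minimal sketch: speech recognition with Whisper via the pipeline API.
# "openai/whisper-tiny" is a small public checkpoint, used here for illustration.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="openai/whisper-tiny")
result = asr("path/to/your_audio.wav")  # accepts a local audio file or URL
print(result["text"])
```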
🤝 Simple APIs Easily download and integrate models into projects with just a line of code.
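For example, a minimal sketch of the one-line download pattern with the Auto classes, using the public `bert-base-uncased` checkpoint for illustration:

```python
# Minimal sketch: download a pretrained tokenizer and model in one line each.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Hello, Transformers!", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch size, sequence length, hidden size)
```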
✏️ Easy Fine-tuning Customize pretrained models on your own data so they learn the patterns specific to your use case, saving substantial work compared to training models from scratch.
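A minimal fine-tuning sketch with the Trainer API; the dataset, checkpoint, and hyperparameters below are illustrative assumptions to adapt to your own data:

```python
# Minimal sketch: fine-tune a text classifier with the Trainer API.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

dataset = load_dataset("imdb")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length")

tokenized = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned-model", num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=tokenized["train"].shuffle(seed=42).select(range(1000)),  # small subset for a quick run
)
trainer.train()
```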
🧑‍🔧 Flexible Frameworks Use models across TensorFlow, PyTorch, and JAX, so you can pick the right framework for training, evaluation, and production.
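A minimal sketch of loading the same checkpoint in all three frameworks via the framework-specific Auto classes, assuming the corresponding framework is installed for each:

```python
# Minimal sketch: one checkpoint, three frameworks.
from transformers import AutoModel, TFAutoModel, FlaxAutoModel

pt_model = AutoModel.from_pretrained("bert-base-uncased")        # PyTorch
tf_model = TFAutoModel.from_pretrained("bert-base-uncased")      # TensorFlow
flax_model = FlaxAutoModel.from_pretrained("bert-base-uncased")  # JAX / Flax
```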
🔌 Active Community Benefit from an open source community building exciting projects with Transformers and sharing model contributions.
- 💡 Extensive library of thousands of pretrained models for a vast array of natural language processing, computer vision, speech, and audio tasks. Engineers can leverage this huge catalogue of existing models instead of building their own from scratch.
- 🤝 Easy-to-use APIs let you download popular models like BERT, GPT-2, and ViT with just a single line of code. You can quickly integrate state-of-the-art models into your projects and products without needing to train them yourself.
- 🧑‍💻 Easy fine-tuning allows you to customize models on your own datasets so they learn the patterns and information relevant to your use case, saving resources compared to building custom models.
- 🌟 A vibrant open-source community constantly shares new model contributions and exciting projects built with Transformers. It is a great way to get model ideas, project inspiration, and technical support.
- 🛠️ A unified interface works across TensorFlow, PyTorch, and JAX, so you can switch between frameworks without rewriting entire model codebases and use the right tool for each job.
- 👷🏽‍♀️ Builders: Lysandre Debut, Sylvain Gugger, Thomas Wolf
- 👩🏽‍💼 Builders on LinkedIn: https://www.linkedin.com/in/lysandredebut/, https://www.linkedin.com/in/sylvain-gugger-74218b144/, https://www.linkedin.com/in/thomas-wolf-a056857/, https://www.linkedin.com/in/yih-dar-shieh/
- 👩🏽‍🏭 Builders on X: https://twitter.com/LysandreJik, https://twitter.com/guggersylvain, https://twitter.com/Thom_wolf
- 💾 Used in 107k repositories
- 👩🏽‍💻 Contributors: 2,248
- 💫 GitHub Stars: 118k
- 🍴 Forks: 23.7k
- 👁️ Watch: 1.1k
- 🪪 License: Apache-2.0
- 🔗 Links: Below 👇🏽
- GitHub Repository: https://github.com/huggingface/transformers
- Official Website: https://huggingface.co/docs/transformers/index
- LinkedIn Page: https://www.linkedin.com/company/huggingface/
- X Page: https://twitter.com/huggingface
- Profile in The AI Engineer: https://github.com/theaiengineer/awesome-opensource-ai-engineering/blob/main/libraries/transformers/README.md
🧙🏽 Follow The AI Engineer for more about Transformers and daily insights tailored to AI engineers. Subscribe to our newsletter. We are the AI community for hackers!
♻️ Repost this to help Transformers become more popular. Support AI Open-Source Libraries!