Instructions for using hazyresearch/based-1b-50b with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use hazyresearch/based-1b-50b with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("hazyresearch/based-1b-50b", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
Add missing metadata: library_name, pipeline_tag, and license
#2
Opened by nielsr (HF Staff)
This PR adds missing metadata to the model card:
- `library_name: transformers`: specifies the model's compatibility with the Hugging Face Transformers library.
- `pipeline_tag: text-generation`: correctly identifies the model's function as text generation.
- `license: mit`: adds the license information from the GitHub README.
- Moved the citation block to the top, under the metadata.
- Added example code snippets for loading and using the model.
This improves discoverability and usability of the model.
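The fields this PR adds live in the YAML front matter at the top of the model card's README. A minimal sketch of the resulting header, using only the values named in the PR description, might look like:

```yaml
---
library_name: transformers
pipeline_tag: text-generation
license: mit
---
```

With `pipeline_tag` set, the Hub lists the model under text-generation search filters, and `library_name` enables the "Use this model" Transformers snippet shown above.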