Instructions for using Maxx0/testing-sexting with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- PEFT
How to use Maxx0/testing-sexting with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base checkpoint, then apply the adapter weights on top of it.
base_model = AutoModelForCausalLM.from_pretrained("georgesung/llama2_7b_chat_uncensored")
model = PeftModel.from_pretrained(base_model, "Maxx0/testing-sexting")
```
- Notebooks
- Google Colab
- Kaggle
Librarian Bot: Add base_model information to model
This pull request aims to enrich the metadata of your model by adding georgesung/llama2_7b_chat_uncensored as a base_model field, situated in the YAML block of your model's README.md.
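For illustration, the added field would sit in the README's YAML front matter roughly as sketched below (a minimal example; the exact surrounding fields vary per model card):

```yaml
---
base_model: georgesung/llama2_7b_chat_uncensored
---
```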
How did we find this information? We extracted this information from the adapter_config.json file of your model.
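The lookup described above can be sketched as follows. This is a minimal stand-in, not the bot's actual code; the `base_model_name_or_path` key is the standard field PEFT writes when an adapter is saved:

```python
import json
import pathlib
import tempfile

def read_base_model(adapter_config_path):
    """Return the base model recorded in a PEFT adapter_config.json, if any."""
    config = json.loads(pathlib.Path(adapter_config_path).read_text())
    # PEFT stores the base checkpoint ID under this key.
    return config.get("base_model_name_or_path")

# Demo with a minimal stand-in config file (real files hold many more keys).
tmp = pathlib.Path(tempfile.mkdtemp()) / "adapter_config.json"
tmp.write_text(json.dumps({"base_model_name_or_path": "georgesung/llama2_7b_chat_uncensored"}))
base = read_base_model(tmp)
print(base)  # georgesung/llama2_7b_chat_uncensored
```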
Why add this? Enhancing your model's metadata in this way:
- Boosts Discoverability - It becomes straightforward to trace the relationships between various models on the Hugging Face Hub.
- Highlights Impact - It showcases the contributions and influence different models have within the community.
For a hands-on example of how such metadata can play a pivotal role in mapping model connections, take a look at librarian-bots/base_model_explorer.
This PR comes courtesy of Librarian Bot. If you have any feedback, queries, or need assistance, please don't hesitate to reach out to @davanstrien.
If you want to automatically add base_model metadata to more of your models, you can use the Librarian Bot Metadata Request Service!