---
title: Mmmm
emoji: ๐
colorFrom: red
colorTo: indigo
sdk: docker
sdk_version: 5.4.0
app_file: app.py
pinned: false
license: bigscience-openrail-m
duplicated_from: ysharma/ChatGPT4
disable_embedding: true
datasets:
- allenai/WildChat-1M
- allenai/WildChat-1M-Full
- allenai/WildChat
models:
- allenai/WildLlama-7b-user-assistant
- allenai/WildLlama-7b-assistant-only
short_description: nbb
---

- https://arxiv.org/abs/2405.01470
- https://arxiv.org/abs/2409.03753
Use a pipeline as a high-level helper:

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="allenai/WildLlama-7b-assistant-only")
```

Or load the model directly:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("allenai/WildLlama-7b-assistant-only")
model = AutoModelForCausalLM.from_pretrained("allenai/WildLlama-7b-assistant-only")
```