# My Local LLM

## Model Details
- Architecture: BERT / GPT / Custom
- Hidden Size: 256
- Layers: 6
- Heads: 4
- Vocabulary Size: 30,522
- Tokenizer: Word-level / BPE
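A quick back-of-the-envelope parameter count follows from the numbers above. This is a rough sketch that assumes a standard transformer block with a 4x feed-forward expansion and ignores biases and layer norms; the exact total depends on which architecture is used.

```python
# Rough parameter estimate for a standard transformer with these dimensions.
# Assumes a 4x feed-forward expansion; biases and layer norms are ignored.
hidden = 256
layers = 6
vocab = 30_522

embedding = vocab * hidden           # token embedding matrix
attention = 4 * hidden * hidden      # Q, K, V and output projections
ffn = 2 * hidden * (4 * hidden)      # up- and down-projections
per_layer = attention + ffn
total = embedding + layers * per_layer

print(f"~{total / 1e6:.1f}M parameters")  # ~12.5M parameters
```

At this scale the embedding table (~7.8M parameters) dominates the transformer stack (~4.7M), which is typical for small models with a full-size vocabulary.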
## Training Data
- Description of dataset (e.g., cricket commentary, Wikipedia subset)
- Preprocessing steps
## Intended Use
- Text generation
- Question answering
- Classification
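For the classification use case, a model head emits raw logits per label; a softmax turns them into probabilities. A minimal pure-Python sketch (the logit values below are illustrative, not from the model):

```python
import math

def softmax(logits):
    """Convert raw classifier logits into probabilities."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for three labels
probs = softmax([2.0, 0.5, -1.0])
print(probs)  # probabilities sum to 1.0; highest logit -> highest probability
```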
## Limitations

- May hallucinate: outputs can be fluent but factually incorrect
- Biases present in the training data may be reflected in outputs
## How to use

```python
from transformers import AutoModel, AutoTokenizer

# Load the tokenizer and model weights from the Hub
tok = AutoTokenizer.from_pretrained("username/my-llm")
model = AutoModel.from_pretrained("username/my-llm")

# Tokenize a prompt and run a forward pass
inputs = tok("Hello world!", return_tensors="pt")
outputs = model(**inputs)
```
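`AutoModel` returns per-token hidden states rather than a single vector, so a common next step is masked mean pooling to get one sentence embedding. A sketch with NumPy and a dummy batch (in practice you would pass `outputs.last_hidden_state` and the attention mask from `inputs`, converted to arrays):

```python
import numpy as np

def mean_pool(hidden_states, attention_mask):
    """Average token vectors, ignoring padding positions."""
    # hidden_states: (batch, seq_len, hidden); attention_mask: (batch, seq_len)
    mask = attention_mask[:, :, None].astype(hidden_states.dtype)
    summed = (hidden_states * mask).sum(axis=1)
    counts = mask.sum(axis=1)  # number of real (non-padding) tokens per example
    return summed / counts

# Dummy batch: 1 example, 4 tokens (last one is padding), hidden size 3
hs = np.arange(12, dtype=np.float64).reshape(1, 4, 3)
mask = np.array([[1, 1, 1, 0]])
print(mean_pool(hs, mask))  # averages only the first three token vectors
```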