# Saba-Ethiopia

A fine-tuned LLaMA-3 4-bit model trained for [specific purpose].
## Model Details

- **Base Model**: LLaMA-3 3B
- **Quantization**: 4-bit
- **Use Case**: [Describe what the model is fine-tuned for]
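To illustrate what 4-bit quantization means numerically, here is a minimal, self-contained sketch of absmax quantization: weights are scaled so the largest magnitude maps to 7, rounded to signed 4-bit integers, then rescaled on the way back. This is purely illustrative of the idea; the model's actual quantization scheme is handled by the loading library, not by code like this.

```python
# Illustrative absmax 4-bit quantization (educational sketch only;
# NOT the scheme used by the model's quantized weights).
def quantize_4bit(weights):
    # Scale so the largest-magnitude weight maps to +/-7 (signed 4-bit).
    scale = max(abs(w) for w in weights) / 7 or 1.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize_4bit(q, scale):
    # Recover approximate float weights from the 4-bit integers.
    return [v * scale for v in q]

weights = [0.12, -0.5, 0.33, 0.7, -0.01]
q, scale = quantize_4bit(weights)
restored = dequantize_4bit(q, scale)
```

Each restored weight differs from the original by at most half a quantization step (`scale / 2`), which is the precision-for-memory trade-off 4-bit models make.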
## Usage

To use this model in your code:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hub.
model = AutoModelForCausalLM.from_pretrained(
    "modeltrainer1/Saba-Ethiopia",
    torch_dtype="auto",
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("modeltrainer1/Saba-Ethiopia")

# Tokenize the prompt and move it to the model's device.
inputs = tokenizer("Your input text here", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```