After pushing, this README will be displayed as your model card on the Hugging Face Hub.

---

### **Step 3: Use the Model**

Once the model is pushed to Hugging Face, you can load and use it in any Python environment:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model from the Hugging Face Hub
model_name = "modeltrainer1/Saba-Ethiopia"
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Generate text
inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0]))
```