How to use bkhmsi/micro-llama-3b with Transformers:
# Load model directly.
# MiCRoLlama is a custom architecture defined in the model repository rather
# than in the transformers library, so it cannot be imported from transformers;
# load it through the Auto classes with trust_remote_code=True instead.
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("bkhmsi/micro-llama-3b")
model = AutoModelForCausalLM.from_pretrained(
    "bkhmsi/micro-llama-3b", trust_remote_code=True
)
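Once the tokenizer and model are loaded, text generation follows the standard Transformers pattern: tokenize a prompt, call `model.generate`, and decode the new tokens. The sketch below illustrates this; `generate_reply` is a hypothetical helper name (not part of the model repo), and running it downloads the model weights on first use.

```python
def generate_reply(prompt: str, model_id: str = "bkhmsi/micro-llama-3b") -> str:
    """Greedy single-turn generation with the model (sketch, not the repo's API)."""
    # Imports kept inside the function so the sketch reads standalone.
    from transformers import AutoTokenizer, AutoModelForCausalLM

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=64)

    # Decode only the newly generated tokens, skipping the prompt itself.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate_reply("The capital of France is"))
```

Swap `max_new_tokens` or pass `do_sample=True` / `temperature=...` to `generate` for sampled rather than greedy output.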