## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("weijie210/Llama-3.2-1B_cladder_6")
tokenizer = AutoTokenizer.from_pretrained("weijie210/Llama-3.2-1B_cladder_6")
```
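Once loaded, the model can be used for text generation in the standard `transformers` way. A minimal sketch follows; the prompt text and generation parameters are illustrative (the exact input format this fine-tune expects is an assumption):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "weijie210/Llama-3.2-1B_cladder_6"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Illustrative prompt; adjust to match the format used during fine-tuning.
prompt = "Question: If the sprinkler is on, does the grass get wet? Answer:"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding with a small token budget, run without gradient tracking.
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)

output_text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(output_text)
```

For faster inference on supported hardware, `from_pretrained` also accepts options such as `torch_dtype` and `device_map`.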
Model size: 1B parameters · tensor type F32 · Safetensors format.