How to use `voidful/Llama-Typhoon-8B-R1` with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline(
    "feature-extraction",
    model="voidful/Llama-Typhoon-8B-R1",
    trust_remote_code=True,
)
```
```python
# Load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "voidful/Llama-Typhoon-8B-R1",
    trust_remote_code=True,
    dtype="auto",
)
```
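The feature-extraction pipeline returns one hidden-state vector per input token. A common way to turn that into a single fixed-size embedding is mean pooling over the token axis. A minimal sketch of that step, using small mock vectors instead of the model's real hidden states (the pipeline call itself is omitted because the 8B model is too large to run inline):

```python
import numpy as np

def mean_pool(token_vectors):
    """Average per-token hidden states into one embedding vector."""
    arr = np.asarray(token_vectors, dtype=np.float64)
    return arr.mean(axis=0)

# pipe("some text") returns a nested list: [tokens x hidden_size].
# Here we mock a 3-token output with hidden size 4 for illustration.
mock_tokens = [
    [1.0, 2.0, 3.0, 4.0],
    [3.0, 2.0, 1.0, 0.0],
    [2.0, 2.0, 2.0, 2.0],
]
embedding = mean_pool(mock_tokens)
print(embedding)  # [2. 2. 2. 2.]
```

With the real pipeline, the same pooling would apply to `pipe(text)[0]`.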