## How to use with the Transformers library
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="aboros98/lilo3")
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("aboros98/lilo3")
model = AutoModelForCausalLM.from_pretrained("aboros98/lilo3")
```
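Once the tokenizer and model are loaded, generation follows the usual Transformers flow: tokenize the prompt, call `generate()`, and decode only the newly produced tokens. The sketch below shows that flow; the helper name, prompt, and sampling parameters (`do_sample`, `temperature`) are illustrative choices, not settings recommended by the model author.

```python
def generate(model, tokenizer, prompt, max_new_tokens=64):
    """Tokenize a prompt, run generate(), and decode only the completion."""
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,      # illustrative sampling settings,
        temperature=0.7,     # not the author's recommended values
    )
    # Slice off the prompt tokens so only the new text is returned.
    new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

if __name__ == "__main__":
    from transformers import AutoTokenizer, AutoModelForCausalLM

    tokenizer = AutoTokenizer.from_pretrained("aboros98/lilo3")
    model = AutoModelForCausalLM.from_pretrained("aboros98/lilo3")
    print(generate(model, tokenizer, "The capital of France is"))
```

The same `generate` helper works with the pipeline's underlying `pipe.model` and `pipe.tokenizer` as well, if you prefer the pipeline for loading.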
| Metric     | Value |
|------------|-------|
| Average    | -     |
| ARC        | 35.84 |
| ARC Easy   | 55.80 |
| BoolQ      | 72.23 |
| HellaSwag  | 64.00 |
| OpenBookQA | 32.40 |
| PiQA       | 72.14 |
| Winogrande | 56.27 |
| MMLU       | 38.53 |
| GSM8K      | -     |
| TruthfulQA | 44.46 |