## Use with the Transformers library

Use a pipeline as a high-level helper:

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="aboros98/kepler1")
```

Or load the tokenizer and model directly:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("aboros98/kepler1")
model = AutoModelForCausalLM.from_pretrained("aboros98/kepler1")
```
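Once the tokenizer and model are loaded, text can be generated with the standard `generate` API. A minimal sketch (the prompt text and decoding parameters below are illustrative, not taken from this model card):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("aboros98/kepler1")
model = AutoModelForCausalLM.from_pretrained("aboros98/kepler1")

# Tokenize an example prompt (the prompt itself is an illustrative choice)
inputs = tokenizer("The three laws of planetary motion are", return_tensors="pt")

# Generate up to 50 new tokens with greedy decoding
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)

# Decode the generated token ids back into text
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

`do_sample=False` makes the output deterministic; switch to sampling (e.g. `do_sample=True` with a `temperature`) for more varied completions.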
| Metric     | Value |
|------------|-------|
| Average    | -     |
| ARC        | 36.01 |
| ARC Easy   | 59.60 |
| BoolQ      | 71.87 |
| HellaSwag  | 58.07 |
| OpenBookQA | 33.80 |
| PiQA       | 75.24 |
| Winogrande | 56.20 |
| MMLU       | 38.63 |
| GSM8K      |       |
| TruthfulQA | 45.76 |