## How to use with the Transformers library
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="pankajmathur/model_420_preview")
```
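As a minimal sketch of how the pipeline can be used once loaded (the prompt and generation settings below are illustrative assumptions, not part of this model card):

```python
# Illustrative call: prompt and max_new_tokens are assumptions for demonstration only.
output = pipe("Explain the difference between a list and a tuple in Python.", max_new_tokens=128)
print(output[0]["generated_text"])
```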
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("pankajmathur/model_420_preview")
model = AutoModelForCausalLM.from_pretrained("pankajmathur/model_420_preview")
```
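With the tokenizer and model loaded directly, generation follows the standard `generate` workflow. The sketch below is an assumed usage example; the prompt and sampling parameters are not specified by the model card:

```python
import torch

# Illustrative generation call: prompt and sampling settings are assumptions.
prompt = "What is the capital of France?"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```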
## Quick Links

This model is released under the LLaMA-2 license; more details coming soon...

## Open LLM Leaderboard Evaluation Results

Detailed results can be found here.

| Metric              | Value |
|---------------------|------:|
| Avg.                | 55.99 |
| ARC (25-shot)       | 67.06 |
| HellaSwag (10-shot) | 87.26 |
| MMLU (5-shot)       | 69.85 |
| TruthfulQA (0-shot) | 44.57 |
| Winogrande (5-shot) | 83.35 |
| GSM8K (5-shot)      | 33.21 |
| DROP (3-shot)       |  6.60 |

## Open LLM Leaderboard Evaluation Results

Detailed results can be found here.

| Metric                            | Value |
|-----------------------------------|------:|
| Avg.                              | 64.22 |
| AI2 Reasoning Challenge (25-Shot) | 67.06 |
| HellaSwag (10-Shot)               | 87.26 |
| MMLU (5-Shot)                     | 69.85 |
| TruthfulQA (0-shot)               | 44.57 |
| Winogrande (5-shot)               | 83.35 |
| GSM8k (5-shot)                    | 33.21 |