```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("itsliupeng/llama2_7b_code")
model = AutoModelForCausalLM.from_pretrained("itsliupeng/llama2_7b_code")
```
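With the tokenizer and model loaded as above, a minimal generation call might look like the sketch below. The prompt and the `max_new_tokens` value are illustrative choices, not part of the model card.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("itsliupeng/llama2_7b_code")
model = AutoModelForCausalLM.from_pretrained("itsliupeng/llama2_7b_code")

# Illustrative prompt; any code prefix works
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")

# max_new_tokens is an illustrative cap on the completion length
outputs = model.generate(**inputs, max_new_tokens=64)
completion = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(completion)
```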
## Open LLM Leaderboard Evaluation Results

Detailed results can be found on the Open LLM Leaderboard.
| Metric | Value |
|---|---|
| Avg. | 42.81 |
| ARC (25-shot) | 52.13 |
| HellaSwag (10-shot) | 75.71 |
| MMLU (5-shot) | 48.05 |
| TruthfulQA (0-shot) | 38.76 |
| Winogrande (5-shot) | 71.51 |
| GSM8K (5-shot) | 8.11 |
| DROP (3-shot) | 5.39 |
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="itsliupeng/llama2_7b_code")
```
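Once the pipeline is constructed, calling it on a code prefix returns a list of generation dicts. The prompt and `max_new_tokens` below are illustrative, not taken from the model card.

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="itsliupeng/llama2_7b_code")

# Illustrative prompt and length cap
result = pipe("def add(a, b):", max_new_tokens=32)
print(result[0]["generated_text"])
```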