How to use with the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Ont/Marcoroni-13B")

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Ont/Marcoroni-13B")
model = AutoModelForCausalLM.from_pretrained("Ont/Marcoroni-13B")
Quick Links

Marcoroni-13B - Safetensors

A conversion of the original model [AIDC-ai-business/Marcoroni-13B] to the safetensors format.

Marcoroni-13B

Model Details

  • Trained by: AIDC AI-Business.
  • Model type: Marcoroni-13B is an auto-regressive language model based on the Llama 2 transformer architecture.
  • Language(s): English
  • License for Marcoroni-13B base weights: Non-commercial Creative Commons license (CC BY-NC 4.0)

Prompting

Prompt template (Alpaca style):

### Instruction:

<prompt> (without the <>)

### Response:
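The template above can be filled programmatically before passing the text to the tokenizer or pipeline; a minimal sketch, where `build_prompt` is a hypothetical helper and not part of the model's API:

```python
# Hypothetical helper that wraps an instruction in the Alpaca-style
# template shown above; the model completes the text after "### Response:".
def build_prompt(instruction: str) -> str:
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

prompt = build_prompt("List three uses of a paperclip.")
print(prompt)
```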

Evaluation Results (Open LLM Leaderboard)

| Metric              | Value |
|---------------------|-------|
| Avg.                | 65.76 |
| ARC (25-shot)       | 62.46 |
| HellaSwag (10-shot) | 83.27 |
| MMLU (5-shot)       | 59.63 |
| TruthfulQA (0-shot) | 57.7  |
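As a quick sanity check, the Avg. row is simply the mean of the four benchmark scores above; a minimal sketch (the `scores` dict is illustrative, not part of the leaderboard API):

```python
# Mean of the four Open LLM Leaderboard benchmark scores listed above.
scores = {
    "ARC (25-shot)": 62.46,
    "HellaSwag (10-shot)": 83.27,
    "MMLU (5-shot)": 59.63,
    "TruthfulQA (0-shot)": 57.7,
}
avg = sum(scores.values()) / len(scores)
print(round(avg, 2))  # matches the reported 65.76 to within rounding
```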
Format: Safetensors · Model size: 13B params · Tensor type: F16
