How to use from the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="maicomputer/alpaca-native")
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("maicomputer/alpaca-native")
model = AutoModelForCausalLM.from_pretrained("maicomputer/alpaca-native")
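Since these weights were trained on Stanford Alpaca's instruction data, prompts generally work best wrapped in the Alpaca template before generation. A minimal sketch (the template text is an assumption based on the Stanford Alpaca release, not stated in this card):

```python
# Alpaca prompt template (assumed from the Stanford Alpaca release).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

def format_prompt(instruction: str) -> str:
    """Wrap a raw instruction in the Alpaca prompt template."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

prompt = format_prompt("List three uses for a paperclip.")

# With the tokenizer and model loaded as above, generation would look like:
# inputs = tokenizer(prompt, return_tensors="pt")
# output_ids = model.generate(**inputs, max_new_tokens=256)
# print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```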

Stanford Alpaca

This is a replica of Stanford Alpaca by tatsu-lab.

It was trained using the original training instructions, with a minor modification to run in FSDP mode.
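For context, the original Stanford Alpaca repository launches fine-tuning via torchrun using the Hugging Face Trainer's FSDP flags, roughly as sketched below. Paths, port, and hyperparameter values here are illustrative, not taken from this card:

```shell
# Sketch of a Stanford Alpaca-style FSDP fine-tuning launch (illustrative values).
torchrun --nproc_per_node=4 --master_port=29500 train.py \
    --model_name_or_path /path/to/llama-7b \
    --data_path ./alpaca_data.json \
    --bf16 True \
    --output_dir ./alpaca-native \
    --num_train_epochs 3 \
    --per_device_train_batch_size 4 \
    --gradient_accumulation_steps 8 \
    --learning_rate 2e-5 \
    --fsdp "full_shard auto_wrap" \
    --fsdp_transformer_layer_cls_to_wrap 'LlamaDecoderLayer'
```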

Open LLM Leaderboard Evaluation Results

| Metric | Value |
|---|---|
| Avg. | 41.96 |
| ARC (25-shot) | 52.3 |
| HellaSwag (10-shot) | 77.09 |
| MMLU (5-shot) | 41.6 |
| TruthfulQA (0-shot) | 37.58 |
| Winogrande (5-shot) | 69.46 |
| GSM8K (5-shot) | 1.44 |
| DROP (3-shot) | 14.23 |
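The reported average can be sanity-checked as the plain mean of the seven benchmark scores (an assumption about how the leaderboard computes "Avg.", though the numbers bear it out):

```python
# Scores from the table above; "Avg." is assumed to be their unweighted mean.
scores = {
    "ARC (25-shot)": 52.3,
    "HellaSwag (10-shot)": 77.09,
    "MMLU (5-shot)": 41.6,
    "TruthfulQA (0-shot)": 37.58,
    "Winogrande (5-shot)": 69.46,
    "GSM8K (5-shot)": 1.44,
    "DROP (3-shot)": 14.23,
}
avg = round(sum(scores.values()) / len(scores), 2)
print(avg)  # 41.96, matching the reported "Avg."
```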