---
license: mit
datasets:
  - tatsu-lab/alpaca
language:
  - en
metrics:
  - accuracy
library_name: transformers
pipeline_tag: text-generation
---

# GPT-2 model finetuned on the Alpaca dataset

This is a fine-tuned version of OpenAI's GPT-2, trained on the Alpaca instruction-following dataset for 15 epochs.

In my tests, the best results were obtained with `temperature=0.7`, `top_p=0.92`, `top_k=0`.
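A minimal usage sketch with the `transformers` text-generation pipeline, using the sampling settings above. The model id `Arjun-G-Ravi/GPT2-Alpaca` is an assumption based on this repository's name, and the prompt template is the standard Alpaca instruction format; adjust either if your setup differs.

```python
# Standard Alpaca prompt template (instruction-only variant).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

# Sampling settings reported above as working best for this model.
GEN_KWARGS = {"do_sample": True, "temperature": 0.7, "top_p": 0.92, "top_k": 0}


def format_prompt(instruction: str) -> str:
    """Wrap a raw instruction in the Alpaca prompt template."""
    return ALPACA_TEMPLATE.format(instruction=instruction)


def generate(instruction: str, max_new_tokens: int = 100) -> str:
    """Generate a response for one instruction (downloads the model on first call)."""
    # Imported lazily so the prompt helpers work without transformers installed.
    from transformers import pipeline

    # Model id assumed from this repo's name; replace with your own checkpoint if needed.
    generator = pipeline("text-generation", model="Arjun-G-Ravi/GPT2-Alpaca")
    out = generator(format_prompt(instruction), max_new_tokens=max_new_tokens, **GEN_KWARGS)
    return out[0]["generated_text"]
```

Because GPT-2 was finetuned on prompts in this exact layout, keeping the `### Instruction:` / `### Response:` markers generally matters more than the sampling settings.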