---
license: mit
task_categories:
  - text-generation
language:
  - en
size_categories:
  - 100K<n<1M
---

# DACTYL Pretraining Corpus

This corpus contains human-written texts that passed the quality checks provided by the `textdescriptives` library.

These texts were used to further train Llama 3.2 1B Instruct models, one per domain/split combination. For a given domain and split, the corresponding model is available on the Hub as:

`ShantanuT01/fine-tuned-Llama-3.2-1B-Instruct-apollo-mini-{domain}-{split}`
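As a minimal sketch, the naming pattern above can be filled in and passed to `transformers` to load one of these models. The specific domain and split values (`"news"`, `"train"`) used here are illustrative assumptions; check the Hub for the actual combinations that exist.

```python
# Hypothetical helper: build the Hub model id for a domain/split pair.
def model_id(domain: str, split: str) -> str:
    return f"ShantanuT01/fine-tuned-Llama-3.2-1B-Instruct-apollo-mini-{domain}-{split}"

# Loading the model (commented out to avoid a download at import time);
# "news"/"train" are assumed example values, not confirmed by this card:
# from transformers import AutoModelForCausalLM, AutoTokenizer
# mid = model_id("news", "train")
# tokenizer = AutoTokenizer.from_pretrained(mid)
# model = AutoModelForCausalLM.from_pretrained(mid)
```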

## Citation

```bibtex
@misc{thorat2025dactyldiverseadversarialcorpus,
      title={DACTYL: Diverse Adversarial Corpus of Texts Yielded from Large Language Models},
      author={Shantanu Thorat and Andrew Caines},
      year={2025},
      eprint={2508.00619},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2508.00619},
}
```