---
library_name: transformers
pipeline_tag: text-generation
license: llama3
language:
- en
base_model:
- meta-llama/Meta-Llama-3-8B
---
# Llama 3 Finetuned Historical Model (1820–1850)

This model was finetuned from the Llama 3 8B base model using DoRA adapters. It was trained on 10M words from the Gutenberg corpus attributed to the period 1820–1850.
## Model Sources

- Repository: https://github.com/comp-int-hum/historical-perspectival-lm
- Paper (arXiv): https://arxiv.org/abs/2504.05523
- Paper (Hugging Face): https://huggingface.co/papers/2504.05523
## Downloading the Model

Load the model and tokenizer as follows:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("Hplm/dora_llama_model_1820_1850", torch_dtype=torch.float16)
tokenizer = AutoTokenizer.from_pretrained("Hplm/dora_llama_model_1820_1850")
```
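Once loaded, the model can be used for open-ended generation like any `transformers` causal LM. A minimal sketch follows; the prompt and sampling settings are illustrative choices, not values from the paper:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Hplm/dora_llama_model_1820_1850"
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_id)

# An open-ended prompt; the model should continue it in the 1820-1850 register.
prompt = "It was in the spring of the year that"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sample a short continuation.
output_ids = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

`device_map="auto"` places the weights on an available GPU if one is present; on CPU-only machines, drop it and expect slower generation.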
## License

Built with Meta Llama 3 and distributed under the Meta Llama 3 license.
## Citation

```bibtex
@article{fittschen_diachroniclanguagemodels_2025,
  title = {Pretraining Language Models for Diachronic Linguistic Change Discovery},
  author = {Fittschen, Elisabeth and Li, Sabrina and Lippincott, Tom and Choshen, Leshem and Messner, Craig},
  year = {2025},
  month = apr,
  eprint = {2504.05523},
  primaryclass = {cs.CL},
  publisher = {arXiv},
  doi = {10.48550/arXiv.2504.05523},
  url = {https://arxiv.org/abs/2504.05523},
  urldate = {2025-04-14},
  archiveprefix = {arXiv},
  journal = {arXiv:2504.05523 [cs.CL]}
}
```