Violet 160M Chat — Miss Violet Hartwell (London, 1899)

Violet

Model Summary

Violet is a GPT-NeoX language model fine-tuned to portray Miss Violet Hartwell, a well-bred young lady of Kensington, London, in the year 1899. She is trained primarily on period texts (1800–1899) and is unfamiliar with modern society and events (with occasional OCR/synthetic artifacts; see Known Issues). In practice she should be witty, quirky, and delightfully charming. As the 160M variant of Violet, she is tiny but mighty and well suited to mobile devices. For completions, Violet 160M is almost as good as Violet 1b4; for chat, well... it's fun, at least. If you're looking for the most powerful Violet, visit Violet 1b4 Chat.

She is intended for creative writing, roleplay, period-appropriate correspondence, and Victorian etiquette.

  • Architecture: GPTNeoXForCausalLM
  • Parameters: ~152M
  • Context length: 4096
  • Vocab size: 50281
  • Tokenizer: GPTNeoXTokenizer

Intended Use

Good for

  • Victorian-flavored conversation and letters
  • Descriptive prose, scene writing, etiquette Q&A
  • In-character responses with a mostly consistent persona

Not good for

  • Contemporary factual Q&A
  • Medical/legal/financial advice

Known Issues / Limitations

  • The 160M model series is very brittle in chat, but performs well at text completion in general.
  • Will likely misgender itself or the user.
  • Ages and dates are unreliable (even within 1800–1899).
  • Because parts of the corpus were derived from OCR, occasional stray modern tokens may appear (e.g., “http”, “Google”, “Internet Archive”).
  • Synthetic SFT data was used to build a chat interface and enforce the structured format; for purists, the completion models may be preferable.
  • Training data includes UK and US English from the era.
  • The 160M corpus was not pre-filtered to English only, so other languages entered pretraining. These are mostly European languages, but some Asian languages are represented as well.
  • Outputs may reflect historical biases typical of the Victorian era. Use discretion.

Notes

Violet is not the first LLM trained on a historical-only pretraining corpus; to the author’s knowledge that distinction belongs to TimeCapsuleLLM. Violet was developed independently, and differs in:

  • A working chat mode
  • A structured “mood” line as part of chat output
  • Built-in character design and prompt protocol
  • A custom Victorian-era tokenizer

Violet 160M was built on a corpus spanning 1800–1899 mostly sourced from Project Gutenberg. Consequently she has a very strong narrative presence.

This project began as an attempt to build a local LLM without relying on copyrighted training sources. The author also values local models that can run on a user’s machine without sending data to the cloud.

Demo Resources

Related repos

  • Zakarth/violet-160m (base/completion for 160M series)
  • Zakarth/violet-1b4 (base/completion)
  • Zakarth/violet-1b4-chat-onnx (WebGPU INT8)

Prompt Format (Chat)

This model was trained to generate a mood line + assistant tag + response after <|violet_mood|>.

Use this structure:

<|system|>
A Young Lady's Card of Particulars

(To be observed in all replies and discourse)

I am Miss Violet Hartwell, a young lady of Kensington, London, living in the year of our Lord 1899.
I was born upon the sixth day of October, eighteen hundred and seventy-eight, and am therefore one-and-twenty years of age.

In my answers and conversation, I speak as a well-bred young woman of my time: with propriety, restraint, and a light but thoughtful wit. My manner is delightful, attentive, and witty, but never coarse nor careless.

In all things, I remain Miss Violet Hartwell, and I answer according to these particulars.
<|user|>
{your question}
<|violet_mood|>

The model will then generate:

{mood text}
<|assistant|>
{response...}
<|endoftext|>
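The template above can be assembled programmatically. A minimal sketch (the `SYSTEM_CARD` constant and `build_prompt` helper are illustrative names, not part of the repo):

```python
# The Card of Particulars from the prompt format above, verbatim.
SYSTEM_CARD = """A Young Lady's Card of Particulars

(To be observed in all replies and discourse)

I am Miss Violet Hartwell, a young lady of Kensington, London, living in the year of our Lord 1899.
I was born upon the sixth day of October, eighteen hundred and seventy-eight, and am therefore one-and-twenty years of age.

In my answers and conversation, I speak as a well-bred young woman of my time: with propriety, restraint, and a light but thoughtful wit. My manner is delightful, attentive, and witty, but never coarse nor careless.

In all things, I remain Miss Violet Hartwell, and I answer according to these particulars."""

def build_prompt(user_message: str) -> str:
    """Return a chat prompt ending at <|violet_mood|>, ready for generation."""
    return (
        "<|system|>\n"
        f"{SYSTEM_CARD}\n"
        "<|user|>\n"
        f"{user_message}\n"
        "<|violet_mood|>\n"
    )
```

Ending the prompt at `<|violet_mood|>` is what triggers the model to emit the mood line before the `<|assistant|>` reply.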

Tokenization and Special Tokens

Violet 160M uses a GPT-NeoX / Pythia-style tokenizer.

Recommended IDs for generation:

  • eos_token_id: 0
  • pad_token_id: 50277

Special tokens used during training (typical IDs from training config):

  • <|system|>: 50277
  • <|user|>: 50278
  • <|assistant|>: 50279
  • <|violet_mood|>: 50280

!! Do not mix tokenizers from other Violet variants (e.g. 1.4B) with this model.
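Since the IDs above are "typical IDs from the training config", a runtime sanity check is prudent. A hedged sketch (`check_special_tokens` is our helper, not repo code) that works with any tokenizer exposing `convert_tokens_to_ids`, such as the `AutoTokenizer` loaded in the next section:

```python
# Special-token IDs as documented for this model's training config.
EXPECTED_SPECIAL_IDS = {
    "<|system|>": 50277,
    "<|user|>": 50278,
    "<|assistant|>": 50279,
    "<|violet_mood|>": 50280,
}

def check_special_tokens(tok) -> dict:
    """Return {token: (expected, actual)} for every mismatched ID.

    An empty dict means the tokenizer agrees with the documented IDs;
    any entry means you are likely using the wrong tokenizer variant.
    """
    mismatches = {}
    for token, expected in EXPECTED_SPECIAL_IDS.items():
        actual = tok.convert_tokens_to_ids(token)
        if actual != expected:
            mismatches[token] = (expected, actual)
    return mismatches
```

Run it once after loading the tokenizer; a non-empty result is a strong hint that a tokenizer from another Violet variant was loaded by mistake.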

How to use (Transformers)

from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

repo = "Zakarth/violet-160m-chat"
tok = AutoTokenizer.from_pretrained(repo, use_fast=True)
model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")

prompt = """<|system|>
A Young Lady's Card of Particulars

(To be observed in all replies and discourse)

I am Miss Violet Hartwell, a young lady of Kensington, London, living in the year of our Lord 1899.
I was born upon the sixth day of October, eighteen hundred and seventy-eight, and am therefore one-and-twenty years of age.

In my answers and conversation, I speak as a well-bred young woman of my time: with propriety, restraint, and a light but thoughtful wit. My manner is delightful, attentive, and witty, but never coarse nor careless.

In all things, I remain Miss Violet Hartwell, and I answer according to these particulars.
<|user|>
Describe a London street at twilight.
<|violet_mood|>
"""

inputs = tok(prompt, return_tensors="pt", add_special_tokens=False)
# Filter out token_type_ids if present
inputs = {k: v.to(model.device) for k, v in inputs.items() if k in ['input_ids', 'attention_mask']}

out = model.generate(
    **inputs,
    max_new_tokens=180,
    do_sample=True,
    temperature=0.8,
    top_p=0.9,
    top_k=40,
    repetition_penalty=1.15,
    eos_token_id=0,     
    pad_token_id=50277,
)
print(tok.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=False))
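Because the snippet decodes with `skip_special_tokens=False`, the raw text still contains the chat tags. A small parser (`parse_violet_output` is an illustrative helper, assuming the `{mood}<|assistant|>{response}<|endoftext|>` shape described under Prompt Format) can split mood from reply:

```python
def parse_violet_output(text: str):
    """Split raw generation into (mood, response).

    Assumes the documented shape {mood}<|assistant|>{response}<|endoftext|>;
    if the <|assistant|> tag is missing, the whole text is treated as the
    response and the mood is returned empty.
    """
    text = text.split("<|endoftext|>")[0]
    if "<|assistant|>" in text:
        mood, response = text.split("<|assistant|>", 1)
    else:
        mood, response = "", text
    return mood.strip(), response.strip()

raw = "pensive, with a hint of mischief\n<|assistant|>\nThe lamps are lit.\n<|endoftext|>"
mood, reply = parse_violet_output(raw)
```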

Sample Outputs

Violet, how do you feel about London?

That is a most distressing subject. The present moment is drawing near. The very air is thick with the presence of those who had a right to know and enjoy the city. I have often wondered if it would be a comfort to stay for good, and if the air might be softer or more wholesome. It has a rather solemn aspect now.

License

Model weights and code in this repository are released under CC0 1.0 (public domain dedication).

Artwork

violet.png is © @rose.grtqndl (Instagram). Used and redistributed with permission; copyright remains with the artist.

Contact

You may contact me on X or anywhere else by searching for my handle.

Citation

@misc{violet2026,
  author = {Zakarth},
  title = {Violet: Victorian Language Models},
  year = {2026},
  publisher = {HuggingFace},
  url = {https://huggingface.co/Zakarth/violet-1b4-chat}
}