---
tags:
  - autotrain
  - text-generation
  - text-generation-inference
  - peft
  - lora
library_name: transformers
base_model: google/gemma-2-2b-it
datasets:
  - TristanBehrens/lovecraftcorpus
license: other
widget:
  - text: Write a short horror passage in the style of H. P. Lovecraft.
---

# 🐙 theoracle/hplovecraft

Gemma-2-2B-IT fine-tuned on Lovecraft's cosmic-horror corpus.

## Overview

`theoracle/hplovecraft` is a LoRA fine-tune of `google/gemma-2-2b-it`, trained on the `TristanBehrens/lovecraftcorpus` dataset using AutoTrain Advanced.

The objective of this model is to reproduce the literary tone and thematic patterns typical of H. P. Lovecraft, including:

- dense atmospheric descriptions
- archaic vocabulary and formal cadence
- cosmic dread and metaphysical terror
- first-person "confessional" narration
- references to forbidden knowledge, ancient cults, and non-Euclidean horrors

This model is intended for creative writing, fiction generation, and experimentation with stylistic conditioning.

## Usage

Minimal working example:

```python
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="theoracle/hplovecraft",
    max_new_tokens=300,
    temperature=0.9,
    top_p=0.9,
)

prompt = "At dusk, I heard the distant cry of something not meant for human ears..."
print(pipe(prompt)[0]["generated_text"])
```
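The `temperature` and `top_p` settings above shape how the next token is sampled: temperature rescales the logits before the softmax, and top-p (nucleus) sampling keeps only the smallest set of tokens whose cumulative probability reaches the threshold. A minimal, self-contained sketch of that mechanism (the logits below are made-up illustrative numbers, not actual model output):

```python
import math

# Illustrative next-token logits (invented for this sketch, not from the model).
logits = {"shadow": 2.0, "abyss": 1.5, "cat": 0.5, "the": 0.1}

def sample_distribution(logits, temperature=0.9, top_p=0.9):
    """Apply temperature scaling, then nucleus (top-p) filtering."""
    # Temperature < 1 sharpens the distribution; > 1 flattens it.
    scaled = {tok: v / temperature for tok, v in logits.items()}
    total = sum(math.exp(v) for v in scaled.values())
    probs = {tok: math.exp(v) / total for tok, v in scaled.items()}

    # Keep the smallest set of tokens whose cumulative probability >= top_p.
    kept, cum = {}, 0.0
    for tok, p in sorted(probs.items(), key=lambda kv: -kv[1]):
        kept[tok] = p
        cum += p
        if cum >= top_p:
            break

    # Renormalise over the surviving tokens.
    z = sum(kept.values())
    return {tok: p / z for tok, p in kept.items()}

dist = sample_distribution(logits)
```

With these numbers the lowest-probability token ("the") falls outside the nucleus and is never sampled, which is why moderate `top_p` values tend to trim incoherent tails while preserving stylistic variety.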

## Training Details

- Base model: `google/gemma-2-2b-it`
- Method: LoRA (PEFT)
- Trainer: AutoTrain Advanced
- Dataset: `TristanBehrens/lovecraftcorpus`
- Task: Supervised fine-tuning for causal LM
- Block size: 1024
- Epochs: 2
- Precision: FP16
- Quantization: INT4 during training (bitsandbytes)
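LoRA, as used above, freezes the base weights and trains only a low-rank update: the adapted layer computes `W x + (alpha / r) * B A x`, where `B` and `A` are small trainable matrices. A toy numeric sketch of that forward pass (dimensions, rank, and scaling factor are illustrative, not this model's actual configuration):

```python
import random

random.seed(0)

d, k, r = 4, 4, 2  # output dim, input dim, LoRA rank (illustrative values)
alpha = 4          # LoRA scaling factor (illustrative)

# Frozen base weight W (d x k); trainable low-rank factors B (d x r), A (r x k).
W = [[random.gauss(0, 1) for _ in range(k)] for _ in range(d)]
B = [[0.0] * r for _ in range(d)]  # B starts at zero, so training begins at the base model
A = [[random.gauss(0, 0.1) for _ in range(k)] for _ in range(r)]

def matvec(M, x):
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

def lora_forward(x):
    # y = W x + (alpha / r) * B (A x): frozen base output plus scaled low-rank update.
    base = matvec(W, x)
    delta = matvec(B, matvec(A, x))
    return [b + (alpha / r) * dlt for b, dlt in zip(base, delta)]

x = [1.0, 2.0, 3.0, 4.0]
# With B initialised to zero, the adapted layer reproduces the frozen base exactly;
# only B and A (2 * d * r values here) would be updated during fine-tuning.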

## Strengths

- Strong stylistic fidelity to Lovecraft's prose
- Produces long, immersive horror passages
- Good at evoking dread, ancient mythos, and cosmic insignificance
- Maintains an archaic tone without collapsing into incoherence

## Limitations

- May generate dark or disturbing content (intended for horror writing)
- Not tuned for factual or instructional tasks
- May overuse specific Lovecraft tropes when prompted repeatedly

## Acknowledgements

- Google for the Gemma model family
- Tristan Behrens for the Lovecraft corpus dataset
- Hugging Face AutoTrain for the training framework