# TinyTimV1: Fine-tuning TinyLlama on Finnegans Wake

A project exploring the fine-tuning of TinyLlama-1.1B on James Joyce's *Finnegans Wake* to generate Joyce-inspired text.

## Overview

This project fine-tunes the TinyLlama-1.1B-Chat model on the complete text of James Joyce's *Finnegans Wake*, creating a language model that generates text in Joyce's distinctive experimental style. The model learns to imitate the complex wordplay, neologisms, and stream-of-consciousness narrative techniques characteristic of Joyce's final work.

## Files

- `process_wake.py` - Preprocesses the raw text, removes page numbers, and splits into manageable chunks
- `fine_tune_joyce.py` - Main training script using HuggingFace Transformers
- `text_gen.py` - Text generation script for the fine-tuned model
- `finn_wake.txt` - Complete text of *Finnegans Wake* (1.51 MB)
- `finn_wake.csv` - Processed dataset in CSV format
- `finn_wake_dataset/` - Tokenized dataset directory

## Usage

### 1. Data Preprocessing
```bash
python process_wake.py
```
This removes page numbers and splits the text into 100-word chunks for training.
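The preprocessing step can be sketched roughly as follows. This is an illustration, not the actual contents of `process_wake.py`; in particular, the page-number pattern (a line containing only digits) and the function name are assumptions:

```python
import re

def chunk_text(raw, chunk_words=100):
    """Strip page-number lines and split the text into fixed-size word chunks."""
    # Assumed page-number format: a line consisting only of digits.
    cleaned = re.sub(r"^\s*\d+\s*$", "", raw, flags=re.MULTILINE)
    words = cleaned.split()
    # Group the words into consecutive chunks of `chunk_words` words each;
    # the final chunk may be shorter.
    return [" ".join(words[i:i + chunk_words])
            for i in range(0, len(words), chunk_words)]

# 300 words of input yield three 100-word chunks.
chunks = chunk_text("riverrun, past Eve and Adam's " * 60)
```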

### 2. Fine-tuning

```bash
python fine_tune_joyce.py
```
Fine-tunes TinyLlama on the processed dataset for 3 epochs on CPU.
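The training setup described here maps onto Hugging Face `TrainingArguments` roughly like this. It is a configuration sketch, not the exact contents of `fine_tune_joyce.py`; `output_dir` is an assumption, and `use_cpu` is the name used in recent `transformers` releases (older versions use `no_cuda`):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="tinytim-checkpoints",   # assumed checkpoint directory
    num_train_epochs=3,                 # 3 passes over the chunked text
    per_device_train_batch_size=1,      # batch size 1, as noted below
    save_steps=500,                     # checkpoint every 500 steps
    use_cpu=True,                       # CPU-only training
)
```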

### 3. Text Generation
```bash
python text_gen.py
```
Generates Joyce-inspired text using the fine-tuned model.

## Model Details

- **Base Model:** TinyLlama-1.1B-Chat-v1.0
- **Training Data:** *Finnegans Wake* (~1.5 MB of text)
- **Training Parameters:**
  - 3 epochs
  - Batch size: 1
  - Max sequence length: 128 tokens
- **Generation Parameters:**
  - Temperature: 0.7
  - Top-k: 50, Top-p: 0.95

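The generation parameters above control how the next token is sampled. The following self-contained sketch (not taken from `text_gen.py`) shows conceptually how temperature scaling, top-k, and top-p (nucleus) filtering combine on a toy distribution:

```python
import math

def sample_filter(logits, temperature=0.7, top_k=50, top_p=0.95):
    """Apply temperature scaling, then top-k and top-p filtering, and return
    the renormalized probabilities over the surviving tokens."""
    # Temperature < 1 sharpens the distribution toward the most likely tokens.
    scaled = [l / temperature for l in logits]
    # Numerically stable softmax over the scaled logits.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Rank token indices by probability, descending.
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    # Top-k: keep only the k most likely tokens.
    ranked = ranked[:top_k]
    # Top-p: keep the smallest prefix whose cumulative mass reaches top_p.
    kept, cum = [], 0.0
    for i in ranked:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    # Renormalize over the surviving tokens.
    mass = sum(probs[i] for i in kept)
    return {i: probs[i] / mass for i in kept}

# Toy 5-token vocabulary: token 0 dominates, so top-p trims the tail.
dist = sample_filter([4.0, 2.0, 1.0, 0.5, 0.1], temperature=0.7, top_k=3, top_p=0.95)
```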
## Example Output

Input: `"ae left to go to ireland and found a fairy"`

The model generates text continuing in Joyce's experimental style, with invented words, Irish references, and complex linguistic play.
## Requirements

- transformers
- datasets
- pandas
- torch

## Installation

```bash
pip install transformers datasets pandas torch
```
## Notes

- Training was performed on CPU due to resource constraints
- Model checkpoints are saved every 500 steps
- Resuming training from a checkpoint is supported