---
license: mit
language:
  - en
pipeline_tag: text-generation
library_name: transformers
base_model: mistralai/Mistral-7B-v0.1
---

# Project-Frankenstein

## Model Overview

- **Model Name:** Project-Frankenstein
- **Model Type:** Text Generation
- **Base Model:** Mistral-7B-v0.1
- **Fine-tuned by:** Jack Mander

**Description:**
Project-Frankenstein is a text-generation model fine-tuned to write fan fiction in the style of Mary Shelley's *Frankenstein*. It uses the complete text of the novel as its training data to produce coherent, stylistically consistent fan fiction.

## Model Details

**Model Architecture:**

- Base Model: Mistral-7B-v0.1
- Tokenizer: `AutoTokenizer` from Hugging Face Transformers
- Training Framework: Transformers, PEFT, and Accelerate (a loading sketch follows below)
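
This card ships no loading code, so the following is a minimal, hypothetical sketch of how the components above might be wired together. The base model ID and tokenizer come from this card; the 4-bit quantization (to fit a 7B model on a Tesla T4) and every LoRA value are illustrative assumptions, not documented settings.

```python
# Hypothetical loading sketch: base model and tokenizer are from this card;
# 4-bit quantization and all LoRA values are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

base_id = "mistralai/Mistral-7B-v0.1"

tokenizer = AutoTokenizer.from_pretrained(base_id)
tokenizer.pad_token = tokenizer.eos_token  # Mistral ships without a pad token

# Assumption: load in 4-bit so a 7B model fits in a Tesla T4's 16 GB of VRAM.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)
model = AutoModelForCausalLM.from_pretrained(
    base_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Hypothetical LoRA adapter; r, alpha, and target modules are not documented.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```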

**Training Data:**

- The model was fine-tuned on the full text of *Frankenstein* by Mary Shelley.
- The text was split into training and test datasets using an 80/20 split.
- Both splits were converted from Pandas DataFrames to Hugging Face Datasets (a data-prep sketch follows below).
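
A sketch of the data preparation described above. Chunking the novel by paragraph and the input file name are assumptions; the card does not specify how the text was segmented.

```python
# Data-prep sketch following the card: chunk the novel, split 80/20, and
# convert Pandas DataFrames to Hugging Face Datasets. Paragraph chunking
# and the file name are assumptions.
import pandas as pd
from datasets import Dataset

with open("frankenstein.txt", encoding="utf-8") as f:  # hypothetical file name
    paragraphs = [p.strip() for p in f.read().split("\n\n") if p.strip()]

df = pd.DataFrame({"text": paragraphs})
cut = int(len(df) * 0.8)  # 80/20 train/test split, as described above
train_ds = Dataset.from_pandas(df.iloc[:cut].reset_index(drop=True))
test_ds = Dataset.from_pandas(df.iloc[cut:].reset_index(drop=True))
```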

**Hyperparameters:**

- Learning Rate: 2e-5
- Epochs: 2
- Optimizer: Paged AdamW 8-bit (expressed as `TrainingArguments` in the sketch below)
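
Expressed as `transformers.TrainingArguments`, the documented hyperparameters might look like this. Only the learning rate, epoch count, and optimizer come from this card; batch size, precision, and the output directory are assumptions.

```python
# The documented hyperparameters as transformers.TrainingArguments.
# Only learning_rate, num_train_epochs, and optim come from this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="project-frankenstein",  # hypothetical
    learning_rate=2e-5,                 # from the card
    num_train_epochs=2,                 # from the card
    optim="paged_adamw_8bit",           # Paged AdamW 8-bit, from the card
    per_device_train_batch_size=1,      # assumption
    gradient_accumulation_steps=4,      # assumption
    fp16=True,                          # assumption: mixed precision on a T4
)
```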

## Training Procedure

The model was trained on a Tesla T4 GPU in Google Colab. Training involved the following steps:

1. **Data Preparation:** The text of *Frankenstein* was preprocessed and split into training and test datasets.
2. **Model Training:** The model was trained for 2 epochs with a learning rate of 2e-5 using the Paged AdamW 8-bit optimizer (a sketch of the full loop follows below).
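
As referenced in step 2, an end-to-end sketch tying the earlier snippets together with a standard `Trainer` loop. It assumes `model`, `tokenizer`, `train_ds`, `test_ds`, and `training_args` from the snippets above; the sequence length and collator choice are assumptions.

```python
# End-to-end training sketch reusing the earlier snippets. The collator and
# max_length are assumptions; mlm=False yields causal-LM labels.
from transformers import DataCollatorForLanguageModeling, Trainer

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

train_tok = train_ds.map(tokenize, batched=True, remove_columns=["text"])
test_tok = test_ds.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_tok,
    eval_dataset=test_tok,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```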

## Example Generations

**Base Model Generation:**