---
license: mit
language:
  - en
tags:
  - text-generation-inference
pipeline_tag: text-generation
---


# GPT-Fem-Micro

A 6.8-million-parameter LLM using the GPT-2 BPE tokenizer. Trained on 16 GB of text written by and about women, plus 1 GB of multilingual text (5.2 billion tokens in total).

This model should be fine-tuned before use.


## Languages

English, Turkish, Swedish, Serbian, Portuguese, Norwegian, Welsh, Thai, Polish, French, Finnish, Dutch, Arabic, Korean, Japanese, Danish, Croatian, Spanish, Russian, Chinese

## Technical Information

| Parameter      | Value       |
| -------------- | ----------- |
| Layers         | 2           |
| Heads          | 2           |
| Embedding dim  | 128         |
| Context window | 4096 tokens |
| Tokenizer      | GPT-2 BPE   |
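As a sanity check, the ~6.8M figure can be reproduced from the table under standard GPT-2 assumptions: 4× MLP width, fused QKV projection with biases, two LayerNorms per block, and input embeddings tied to the LM head. The function name is illustrative, and excluding the learned position embeddings (4096 × 128 ≈ 0.5M extra parameters) is an assumption, not something the model card states. Head count does not affect the total, since the heads split the embedding dimension.

```python
def gpt2_style_param_count(n_layer: int, d_model: int, vocab_size: int = 50257) -> int:
    """Estimate parameters of a GPT-2-style decoder (assumptions noted above)."""
    token_emb = vocab_size * d_model            # token embeddings, tied with the LM head
    per_layer = (
        d_model * 3 * d_model + 3 * d_model     # fused QKV projection (weights + bias)
        + d_model * d_model + d_model           # attention output projection
        + d_model * 4 * d_model + 4 * d_model   # MLP up-projection (4x width, as in GPT-2)
        + 4 * d_model * d_model + d_model       # MLP down-projection
        + 2 * (2 * d_model)                     # two LayerNorms (gain + bias each)
    )
    final_ln = 2 * d_model                      # final LayerNorm
    return token_emb + n_layer * per_layer + final_ln

print(gpt2_style_param_count(n_layer=2, d_model=128))  # prints 6829696, i.e. ~6.8M
```

Under these assumptions the token embeddings dominate (50257 × 128 ≈ 6.4M), with the two transformer blocks contributing only ~0.4M.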