---
license: mit
language:
- en
tags:
- text-generation-inference
pipeline_tag: text-generation
---

![if-your-ai-girlfriend-is-not-a-locally-running-fine-tuned-v0-04wo67pdnuvf1](https://cdn-uploads.huggingface.co/production/uploads/64b7618e2f5a966b972e9978/8x-Ef3kxYJMfhwZ1CPHvp.png)

## GPT-Fem-Micro
A 6.8-million-parameter LLM using the GPT-2 BPE tokenizer.
Trained on 16 GB of text relating to and written by women, plus 1 GB of multilingual text (5.2 billion tokens in total).

This model should be fine-tuned before use.
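Since this card does not state the repository id, the sketch below uses an offline stand-in instead: it builds a randomly initialized GPT-2 model with this card's hyperparameters and runs a single causal-LM fine-tuning step. It is a minimal illustration of the fine-tuning loop, not the release's training recipe; in practice you would load the published checkpoint with `from_pretrained`.

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# In practice, load the released weights (repo id is a placeholder here):
# model = GPT2LMHeadModel.from_pretrained("<user>/GPT-Fem-Micro")

# Offline stand-in matching the hyperparameters in this card:
config = GPT2Config(n_layer=2, n_head=2, n_embd=128, n_positions=4096)
model = GPT2LMHeadModel(config)

# Toy batch of token ids; replace with real tokenized training text.
batch = torch.randint(0, config.vocab_size, (2, 64))

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
out = model(input_ids=batch, labels=batch)  # labels are shifted internally
out.loss.backward()
optimizer.step()
print(float(out.loss))
```

With real data, you would iterate this step over batches of tokenized text from your target domain before using the model for generation.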

![Screenshot from 2026-01-18 22-57-13](https://cdn-uploads.huggingface.co/production/uploads/64b7618e2f5a966b972e9978/50XOSomGKO0FQRtPEOjUO.png)

## Languages
English, Turkish, Swedish, Serbian, Portuguese, Norwegian, Welsh, Thai, Polish, French, Finnish, Dutch, Arabic, Korean, Japanese, Danish, Croatian, Spanish, Russian, Chinese


## Technical Information
|Hyperparameter                   |Value|
|---------------------------------|----:|
|Layers                           |2|
|Attention heads                  |2|
|Embedding size                   |128|
|Context window                   |4096 tokens|
|Tokenizer                        |GPT-2 BPE|
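As a sanity check, the table's hyperparameters can be turned into a rough parameter count, assuming a standard GPT-2 layout (50,257-token BPE vocabulary, tied input/output embeddings, 4x MLP expansion). These totals are my estimate, not official figures; notably, the headline 6.8M figure matches the count when the learned positional embeddings are excluded, which is an assumption about how it was tallied.

```python
# Rough parameter count for a GPT-2-style stack with this card's settings.
vocab  = 50257  # GPT-2 BPE vocabulary size
d      = 128    # embedding size
ctx    = 4096   # context window
layers = 2

tok_emb = vocab * d  # token embeddings (tied with the LM head)
pos_emb = ctx * d    # learned positional embeddings

attn = d * (3 * d) + 3 * d + d * d + d        # QKV projection + output projection
mlp  = d * (4 * d) + 4 * d + (4 * d) * d + d  # two linears with 4x expansion
ln   = 2 * (2 * d)                            # two LayerNorms (scale + shift)
per_layer = attn + mlp + ln

total = tok_emb + pos_emb + layers * per_layer + 2 * d  # + final LayerNorm
print(f"{total:,}")            # ~7.35M including positional embeddings
print(f"{total - pos_emb:,}")  # ~6.83M without them, close to the stated 6.8M
```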