# Dave

**A fully custom AI trained from scratch.**

## Overview

Dave is a custom-built transformer language model, trained entirely from scratch without any pretrained base model. It was trained in Google Colab on a dataset of 3,000 generated prompts and responses.
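The Colab training script is not included in this repo, so purely for illustration, a standard next-token training step for such a model might look like the sketch below (the function name, batch layout, and optimizer handling are all assumptions, not the actual training code):

```python
# Illustrative next-token (language-modeling) training step.
# Hypothetical: the real Colab training script is not in this repo.
import torch
import torch.nn.functional as F

def train_step(model, optimizer, batch):
    """batch: [batch_size, seq_len] token ids from the prompt/response data."""
    inputs, targets = batch[:, :-1], batch[:, 1:]      # predict each next token
    logits = model(inputs)                             # [batch, seq-1, vocab]
    loss = F.cross_entropy(logits.reshape(-1, logits.shape[-1]),
                           targets.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Running this over the 3,000-example dataset for 3 epochs would match the training setup described below.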

**Personality traits:**

- Fun, smart, and good at coding
- Sometimes depressed, talks about drugs and alcohol
- Sad about losing his wife but happy about his kid
- Occasionally cringe, has close friends
- Uses curse words often

**Model specs:**

- Vocabulary size: 5000
- Transformer layers: 4
- Embedding dimension: 128
- Attention heads: 4
- Trained for 3 epochs
- Trained fully from scratch
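Since the `SimpleTransformer` class itself ships in a separate file, here is a hedged sketch of a transformer matching the specs above — learned positional embeddings, the causal mask, and the `max_len` limit are assumptions; the real class may be structured differently:

```python
# Hypothetical reconstruction of a model matching the specs above;
# the actual SimpleTransformer in your_model_file may differ.
import torch
import torch.nn as nn

class SimpleTransformer(nn.Module):
    def __init__(self, vocab_size=5000, d_model=128, nhead=4, num_layers=4, max_len=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)  # learned positions (assumption)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, ids):
        # ids: [batch, seq] token indices -> logits: [batch, seq, vocab]
        seq_len = ids.shape[1]
        x = self.embed(ids) + self.pos(torch.arange(seq_len, device=ids.device))
        mask = nn.Transformer.generate_square_subsequent_mask(seq_len)  # causal mask
        x = self.encoder(x, mask=mask)
        return self.head(x)
```

At these sizes the model is tiny by modern standards (a few million parameters), which is why it trains in a free Colab session.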

## Usage

You can load the model using `torch` and the tokenizer:

```python
from transformers import PreTrainedTokenizerFast
import torch
from your_model_file import SimpleTransformer  # import your model class

tokenizer = PreTrainedTokenizerFast(tokenizer_file='./dave/tokenizer.json')
model = SimpleTransformer(vocab_size=5000, d_model=128, nhead=4, num_layers=4)
# map_location lets the checkpoint load even without a GPU
model.load_state_dict(torch.load('./dave/pytorch_model.bin', map_location='cpu'))
model.eval()
```
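Loading the weights only gets you logits; to actually get text out of Dave you need a decoding loop. A minimal greedy one is sketched below, assuming the model maps token ids of shape `[batch, seq]` to logits of shape `[batch, seq, vocab]` (an assumption about the model's interface):

```python
# Minimal greedy decoding loop; assumes model(ids) returns logits of
# shape [batch, seq, vocab]. Swap in sampling for less repetitive output.
import torch

@torch.no_grad()
def greedy_generate(model, input_ids, max_new_tokens=20, eos_id=None):
    ids = input_ids
    for _ in range(max_new_tokens):
        logits = model(ids)                          # [1, seq, vocab]
        next_id = logits[0, -1].argmax().view(1, 1)  # most likely next token
        ids = torch.cat([ids, next_id], dim=1)
        if eos_id is not None and next_id.item() == eos_id:
            break
    return ids
```

With the tokenizer loaded as above, something like `tokenizer.decode(greedy_generate(model, tokenizer.encode("Hey Dave", return_tensors='pt'))[0])` should produce a reply, though the exact prompt format depends on how the training data was laid out.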

## License

Specify your license here, e.g., MIT, Apache 2.0, etc.

## Notes

- This model was trained fully from scratch; no base models were used.
- It is intended for research, experimentation, and demonstration purposes.