---
license: cc0-1.0
datasets:
- Navanjana/Gutenberg_books
- aisuko/simple_english_wikipedia
- stas/openwebtext-10k
- RaiBP/openwebtext2-first-30-chunks-lang-detect-raw-output
- lucadiliello/bookcorpusopen
- deepmind/pg19
language:
- en
pipeline_tag: text-generation
library_name: transformers
tags:
- Self
model-index:
- name: AaI
  results:
  - task:
      type: text-classification
      name: Multiple Choice
    dataset:
      name: ai2_arc
      type: ai2_arc
      config: ARC-Easy
      split: test
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.8
---

## Safety Concerns

This model has not undergone any safety tuning, and we are not responsible for any damage it may cause. The weights have been converted from the original .pth checkpoint to .safetensors.

## AaI Introduction

AaI is a model built entirely from scratch by 16dvnk on an NVIDIA GeForce RTX 4080 Laptop GPU. He trained it for 11 hours straight and, after some tuning, produced this release. He says the process was a pain and took a great deal of effort. He chose the name AaI over AAI and other variations because he finds the alternatives an “eyesore”.

## Architecture

The model uses a generative pre-trained transformer (GPT) architecture.
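The defining component of a GPT-style architecture is causal (masked) self-attention: each token may only attend to itself and earlier tokens. A minimal single-head sketch in NumPy (a generic illustration with made-up dimensions, not AaI's actual implementation):

```python
import numpy as np

def causal_self_attention(x, w_q, w_k, w_v):
    """Single-head causal self-attention, the core of a GPT-style
    decoder block. x has shape (seq_len, d_model)."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)               # (seq_len, seq_len)
    # Causal mask: position i may only attend to positions <= i.
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores[mask] = -np.inf
    # Row-wise softmax over the unmasked scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                            # (seq_len, d_k)

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = causal_self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8)
```

A full GPT block wraps this core with multi-head projections, a feed-forward network, residual connections, and layer normalization.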

## Technical Specifications

| AaI Specs             | Details                                |
|-----------------------|----------------------------------------|
| Creator               | 16dvnk                                 |
| Hardware              | NVIDIA GeForce RTX 4080 Laptop GPU     |
| Training Duration     | 11 hours                               |
| Framework             | PyTorch                                |
| Parameter Count       | 14 million                             |
| Model Type            | Generative pre-trained transformer     |
| Initial Training Year | 2025                                   |
| Stable Release Status | No stable release as of September 2025 |
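AaI's exact hyperparameters are not published, but a back-of-the-envelope count shows how a GPT-style model lands in the ~14M-parameter range. All dimensions below are hypothetical, chosen only so the total comes out near 14 million; they are not AaI's actual configuration:

```python
def gpt_param_count(vocab_size, d_model, n_layers, d_ff=None, max_seq=1024):
    """Rough weight count for a GPT-style model (ignores biases and
    layer-norm gains, which contribute comparatively little)."""
    if d_ff is None:
        d_ff = 4 * d_model  # conventional feed-forward expansion factor
    embed = vocab_size * d_model + max_seq * d_model      # token + position embeddings
    per_block = 4 * d_model * d_model + 2 * d_model * d_ff  # attention + MLP weights
    return embed + n_layers * per_block

# Hypothetical configuration (NOT AaI's published dims):
print(gpt_param_count(vocab_size=8000, d_model=384, n_layers=6))  # 14082048, ~14.1M
```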
## Evaluation Results

The model was evaluated on the **ARC-Easy** benchmark (test split).

| Dataset  | Split | Metric   | Value   |
|----------|-------|----------|---------|
| ARC-Easy | test  | Accuracy | 0.80    |
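Accuracy here is simply the fraction of multiple-choice questions answered correctly. A minimal sketch with toy data (not the actual ARC-Easy evaluation harness):

```python
def accuracy(predictions, gold):
    """Fraction of multiple-choice questions answered correctly."""
    assert len(predictions) == len(gold)
    correct = sum(p == g for p, g in zip(predictions, gold))
    return correct / len(gold)

# Toy illustration with made-up answer keys (A-D choices, as in ARC-Easy):
preds = ["A", "C", "B", "D", "A"]
gold  = ["A", "C", "B", "A", "A"]
print(accuracy(preds, gold))  # 0.8
```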

## Notes

- All current releases have 14M parameters, which is considered small.
- The model was trained using PyTorch.
- As of September 2025, there is no stable release of AaI.