Hugodonotexit committed on
Commit
017d0aa
·
1 Parent(s): 9daa6f0
Files changed (2)
  1. README.md +40 -1
  2. readme.md +0 -42
README.md CHANGED
@@ -3,4 +3,43 @@ license: mit
  language:
  - en
  pipeline_tag: text-classification
- ---
+ ---
+ # BogoAI Model
+
+ BogoAI is a conceptual model inspired by Bogo Sort and the infinite monkey theorem. It generates random outputs and is not intended for practical use. It has a time complexity of O(n!), where n is the length of the output text.
+
+ ## Model Details
+
+ - **Vocabulary Size**: 152064
+ - **Tokenizer**: Qwen/Qwen2.5-72B-Instruct
+
+ ## Installation
+
+ To use this model, install the required libraries:
+
+ ```bash
+ pip install transformers huggingface_hub
+ ```
+
+ ## Usage
+
+ Here's how to load and use the BogoAI model:
+
+ ```python
+ import torch
+ from transformers import AutoTokenizer, AutoModel
+
+ # Load tokenizer and model
+ tokenizer = AutoTokenizer.from_pretrained("Hugo0123/BogoAI")
+ model = AutoModel.from_pretrained("Hugo0123/BogoAI")
+
+ # Example input
+ input_text = "Example input text"
+ input_ids = tokenizer.encode(input_text, return_tensors='pt')
+
+ # Run the model; a bare AutoModel returns hidden states, not token ids
+ with torch.no_grad():
+     outputs = model(input_ids=input_ids)
+
+ # Stand in for a language-modeling head by taking the argmax over the hidden
+ # dimension, which yields effectively arbitrary token ids, in keeping with
+ # BogoAI's random-output design
+ output_ids = outputs.last_hidden_state.argmax(dim=-1)
+ output_text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
+ print("Output:", output_text)
+ ```
+
+ ## License
+
+ MIT
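The bogo-style procedure the README describes (emit uniformly random text until an acceptable output turns up, in the spirit of Bogo Sort and the infinite monkey theorem) can be sketched as follows. This is a minimal illustration, not code from the BogoAI repository; the function name `bogo_generate`, the lowercase alphabet, and the retry cap are assumptions introduced here.

```python
import random
import string


def bogo_generate(target: str, rng: random.Random, max_tries: int = 1_000_000) -> int:
    """Emit uniformly random strings until one matches `target`.

    Returns the number of attempts used, or -1 if the cap is hit first.
    """
    alphabet = string.ascii_lowercase + " "
    for attempt in range(1, max_tries + 1):
        # Each attempt is an independent, uniformly random string of the
        # same length as the target (the "monkey at a typewriter" step)
        guess = "".join(rng.choice(alphabet) for _ in range(len(target)))
        if guess == target:
            return attempt
    return -1


tries = bogo_generate("hi", random.Random(0))
print("matched after", tries, "attempts")
```

The expected number of attempts grows as |alphabet|^n in the target length n, which is why a model built this way is, as the README says, not intended for practical use.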
readme.md DELETED
@@ -1,42 +0,0 @@