12sciencejnv committed
Commit e05c291 · verified · 1 parent: 21ff16a

Update README.md

Files changed (1):
  1. README.md +26 -31
README.md CHANGED
@@ -1,40 +1,38 @@
-
  ---
  license: apache-2.0
  datasets:
- - minhalvp/islamqa
+ - minhalvp/islamqa
  language:
- - en
+ - en
  base_model:
- - openai-community/gpt2
+ - openai-community/gpt2
  pipeline_tag: text-generation
  library_name: transformers
  tags:
- - GPT-2
- - Islamic
- - QA
- - Fine-tuned
- - Text Generation
+ - GPT-2
+ - Islamic
+ - QA
+ - Fine-tuned
+ - Text Generation
  metrics:
- - name: Perplexity
-   value: 170.31
-   type: Perplexity
- new_version: openai-community/gpt2
+ - name: Perplexity
+   value: 170.31
+   type: Perplexity
  model-index:
- - name: ChadGPT
-   results:
-   - task:
-       type: text-generation
-     dataset:
-       name: minhalvp/islamqa
-       type: text # Added type for dataset
-     metrics:
-     - name: Perplexity
-       value: 170.31
-       type: Perplexity # Added type for metrics
-     source:
-       name: Self-evaluated
-       url: "https://huggingface.co/12sciencejnv/ChadGPT" # Added source URL
+ - name: ChadGPT
+   results:
+   - task:
+       type: text-generation
+     dataset:
+       name: minhalvp/islamqa
+       type: text
+     metrics:
+     - name: Perplexity
+       value: 170.31
+       type: Perplexity
+     source:
+       name: Self-evaluated
+       url: https://huggingface.co/12sciencejnv/ChadGPT
  ---
  
  # ChadGPT
@@ -56,9 +54,6 @@ English
  ## Base Model
  The model is based on the `gpt2` architecture.
  
- ## New Version
- gpt2 (Base model, fine-tuned version)
-
  ## Pipeline Tag
  text-generation
  
@@ -69,4 +64,4 @@ transformers
  GPT-2, Islamic, QA, Fine-tuned, Text Generation
  
  ## Eval Results
- The model achieved a training loss of approximately 2.3 after 3 epochs of fine-tuning.
+ The model achieved a training loss of approximately 2.3 after 3 epochs of fine-tuning.
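
The two evaluation figures in the README above can be cross-checked against each other: perplexity is conventionally exp(mean cross-entropy loss), so a training loss of about 2.3 corresponds to a training perplexity near 10, while the reported eval perplexity of 170.31 implies an eval cross-entropy of roughly 5.14. A minimal sketch of this relation (assuming natural-log loss, as in standard language-model training; the card does not state how its perplexity was computed):

```python
import math

def perplexity(mean_ce_loss: float) -> float:
    # Perplexity is by convention the exponential of the mean
    # cross-entropy (natural log) per token.
    return math.exp(mean_ce_loss)

def cross_entropy(ppl: float) -> float:
    # Inverse mapping: recover the mean loss implied by a perplexity.
    return math.log(ppl)

print(round(perplexity(2.3), 2))       # training loss -> ~9.97
print(round(cross_entropy(170.31), 2)) # reported perplexity -> ~5.14
```

The gap between the two numbers is not necessarily an error: the 2.3 figure is a training loss, while the 170.31 perplexity was presumably measured on held-out data.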