Update README.md
README.md
@@ -1,17 +1,12 @@
 ---
-
-license: apache-2.0
-base_model: google-t5/t5-base
-tags:
-- generated_from_trainer
-metrics:
-- rouge
-model-index:
-- name: PreDA_t5-base
-  results: []
+license: cc0-1.0
 language:
 - en
-
+metrics:
+- rouge
+base_model:
+- google-t5/t5-base
+library_name: transformers
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -146,8 +141,8 @@ for decode_dream in output:
 ```
 
 # Dual-Use Implication
-Upon evaluation we identified no dual-use implication for the present model
+Upon evaluation we identified no dual-use implication for the present model. The model parameters, including the weights, are available under the CC0 1.0 Public Domain Dedication.
 
 # Cite
 Please note that the paper referring to this model, titled PreDA: Prefix-Based Dream Reports Annotation
-with Generative Language Models, has been accepted for publication at LOD 2025 conference and will appear in the conference proceedings.
+with Generative Language Models, has been accepted for publication at LOD 2025 conference and will appear in the conference proceedings.
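
For reference, applying the front-matter hunk above leaves the README with the following YAML metadata block (assembled from the context and added lines of the diff):

```yaml
---
license: cc0-1.0
language:
- en
metrics:
- rouge
base_model:
- google-t5/t5-base
library_name: transformers
---
```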