Update README.md
README.md (CHANGED)

````diff
@@ -1,6 +1,6 @@
 ---
 library_name: transformers
-license:
+license: cc0-1.0
 base_model: google-t5/t5-small
 tags:
 - generated_from_trainer
@@ -146,7 +146,7 @@ for decode_dream in output:
 ```
 
 # Dual-Use Implication
-Upon evaluation we identified no dual-use implication for the present model
+Upon evaluation we identified no dual-use implication for the present model. The model parameters, including the weights, are available under the CC0 1.0 Public Domain Dedication.
 
 # Cite
 Please note that the paper referring to this model, titled PreDA: Prefix-Based Dream Reports Annotation
````