Add pipeline tag and project page URL
#1 by nielsr (HF Staff) - opened

README.md CHANGED
````diff
@@ -1,12 +1,13 @@
 ---
-library_name: transformers
-license: apache-2.0
-language:
-- en
 datasets:
 - Skylion007/openwebtext
+language:
+- en
+library_name: transformers
+license: apache-2.0
 metrics:
 - perplexity
+pipeline_tag: text-generation
 ---
 
 ## Using DUO
@@ -19,7 +20,7 @@ tokenizer = transformers.AutoTokenizer.from_pretrained('gpt2')
 model = AutoModelForMaskedLM.from_pretrained('s-sahoo/duo')
 ```
 For a hands-on example, check out this [Colab notebook](https://colab.research.google.com/drive/1Sf7R-dqdR6gq-H8nyZ9E3ZkyvqMTqcwq?usp=sharing).
-For more information and implementation details, visit our github repository: [DUO](https://github.com/s-sahoo/duo)
+For more information and implementation details, visit our github repository: [DUO](https://github.com/s-sahoo/duo) and project page: [Project Page](https://s-sahoo.com/duo)
 
 ## Model Details
 The model, which has a context length of `1024` and is similar in size to GPT2-medium with approximately `130 million` non-embedding parameters,
````