Update README.md

README.md CHANGED

@@ -30,7 +30,7 @@ DistilGPT2 (short for Distilled-GPT2) is an English-language model pre-trained w
 
 ## Model Details
 
-- **Developed by:** Hugging Face
+- **Developed by:** Hugging Face
 - **Model type:** Transformer-based Language Model
 - **Language:** English
 - **License:** Apache 2.0