Update README.md

README.md CHANGED
@@ -22,6 +22,7 @@ Useful HF resources and fantastic contributors for Dutch NLP are
 * [Maarten Grootendorst](https://huggingface.co/MaartenGr), [homepage](https://www.maartengrootendorst.com/) and [git](https://github.com/MaartenGr)
 * [Piek Vossen](https://vossen.info/)
 * [Eva Rombouts](https://huggingface.co/ekrombouts)
+* [Joeran Bosma](https://huggingface.co/joeranbosma/)
 
 ## Organisations
 * [University Medical Center Utrecht](https://github.com/umcu)

@@ -42,16 +43,24 @@ Useful HF resources and fantastic contributors for Dutch NLP are
 * [Clinlp](https://github.com/umcu/clinlp)
 
 ## Encoder models
-* [RobBERT 2023](https://huggingface.co/DTAI-KULeuven/robbert-2023-dutch-base)
-* [BERTje](https://huggingface.co/GroNLP/bert-base-dutch-cased)
-* [BelabBERT](https://huggingface.co/jwouts/belabBERT_115k)
-* [MedRoBERTa.nl](https://huggingface.co/CLTL/MedRoBERTa.nl)
-
+* [*RobBERT 2023*](https://huggingface.co/DTAI-KULeuven/robbert-2023-dutch-base)
+* [*BERTje*](https://huggingface.co/GroNLP/bert-base-dutch-cased)
+* [*BelabBERT*](https://huggingface.co/jwouts/belabBERT_115k)
+* [**MedRoBERTa.nl**](https://huggingface.co/CLTL/MedRoBERTa.nl)
+* [**CardioBERTa.nl**](https://huggingface.co/UMCU/CardioBERTa.nl_clinical)
+* [**CardioDeBERTa.nl**](https://huggingface.co/UMCU/CardioDeBERTa.nl)
+* [**DRAGON-longformer-large-domain-specific**](https://huggingface.co/joeranbosma/dragon-longformer-large-domain-specific)
+* [**DRAGON-longformer-base-domain-specific**](https://huggingface.co/joeranbosma/dragon-longformer-base-domain-specific)
+* [**DRAGON-roberta-large-domain-specific**](https://huggingface.co/joeranbosma/dragon-roberta-large-domain-specific)
+* [**DRAGON-roberta-base-domain-specific**](https://huggingface.co/joeranbosma/dragon-roberta-base-domain-specific)
+* [**DRAGON-bert-base-domain-specific**](https://huggingface.co/joeranbosma/dragon-bert-base-domain-specific)
+
 ## Decoder models
-* [GPT-2 on mC4](https://huggingface.co/yhavinga/gpt2-large-dutch), [GPT-2 finetuned on ](https://huggingface.co/GroNLP/gpt2-medium-dutch-embeddings)
-* [GPT-neo on mC4](https://huggingface.co/yhavinga/gpt-neo-1.3B-dutch)
-* [GEITje (based on Mistral)](https://github.com/Rijgersberg/GEITje)
-* [Fietje (based on Phi-2)](https://huggingface.co/BramVanroy/fietje-2)
+* [*GPT-2 on mC4*](https://huggingface.co/yhavinga/gpt2-large-dutch), [GPT-2 finetuned on ](https://huggingface.co/GroNLP/gpt2-medium-dutch-embeddings)
+* [*GPT-neo on mC4*](https://huggingface.co/yhavinga/gpt-neo-1.3B-dutch)
+* [*GEITje (based on Mistral)*](https://github.com/Rijgersberg/GEITje)
+* [*Fietje (based on Phi-2)*](https://huggingface.co/BramVanroy/fietje-2)
+* [**J1**](https://huggingface.co/Juvoly/J1-Llama-8B-exp)
 
 ## NTMs
 * [NLLB200](https://huggingface.co/facebook/nllb-200-3.3B)
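All of the encoder models listed in this diff are ordinary Hugging Face Hub checkpoints, so each one is addressable by its Hub ID. As a minimal sketch (assuming the `transformers` library is installed), the mapping below copies a few IDs verbatim from the links above; the actual `pipeline` call is left commented out because it downloads model weights.

```python
# Dutch encoder checkpoints from the list above, keyed by display name.
# The Hub IDs are copied directly from the links in this README diff.
ENCODER_MODELS = {
    "RobBERT 2023": "DTAI-KULeuven/robbert-2023-dutch-base",
    "BERTje": "GroNLP/bert-base-dutch-cased",
    "BelabBERT": "jwouts/belabBERT_115k",
    "MedRoBERTa.nl": "CLTL/MedRoBERTa.nl",
    "CardioBERTa.nl": "UMCU/CardioBERTa.nl_clinical",
}


def hub_url(model_id: str) -> str:
    """Return the Hub page for a model ID (every model lives at
    https://huggingface.co/<org>/<name>)."""
    return f"https://huggingface.co/{model_id}"


# Loading any of these masked-language models is one call
# (not executed here, since it fetches the weights):
# from transformers import pipeline
# fill_mask = pipeline("fill-mask", model=ENCODER_MODELS["MedRoBERTa.nl"])

for name, model_id in ENCODER_MODELS.items():
    print(f"{name}: {hub_url(model_id)}")
```

The same pattern applies to the decoder models, with `"text-generation"` as the pipeline task instead of `"fill-mask"`.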