---
language:
- multilingual
license: apache-2.0
tags:
- trankit
---

# trankit pretrained weights

This repository hosts the decompressed model weights from the open-source [`nlp-uoregon/trankit`](https://github.com/nlp-uoregon/trankit) project. All archives in the original Hugging Face model `uonlp/trankit` have been extracted into their respective directories, so the parameters can be consumed without an additional unzip step.

The upstream project distributes these assets under the [Apache License 2.0](https://www.apache.org/licenses/LICENSE-2.0). The same license applies to this copy of the weights; make sure every downstream consumer is aware of the original terms.

The directory layout mirrors the upstream release:

```
models/
  v1.0.0/
    xlm-roberta-base/
      <language-or-corpus>/
        *.mdl, *.pt, *.json, …
    xlm-roberta-large/
      <language-or-corpus>/
```

No code changes were made; if you need the training or inference library, please refer to the upstream GitHub repository.
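As a minimal sketch of consuming these files, the snippet below builds the path to one language's weight directory following the layout above and, if it exists, loads it with trankit's `Pipeline`. The clone location `./trankit-weights` is a placeholder, and the exact directory level that `cache_dir` should point at may differ between trankit versions, so verify it against your installation.

```python
import os

# Hypothetical local clone location of this repository -- adjust as needed.
WEIGHTS_ROOT = "./trankit-weights"

def weights_path(embedding, lang, version="v1.0.0"):
    """Build the path to one language's extracted weight directory."""
    return os.path.join(WEIGHTS_ROOT, "models", version, embedding, lang)

if os.path.isdir(weights_path("xlm-roberta-base", "english")):
    from trankit import Pipeline  # pip install trankit

    # Assumption: cache_dir points at the directory containing the
    # embedding subfolders; check this against your trankit version.
    nlp = Pipeline("english", embedding="xlm-roberta-base",
                   cache_dir=os.path.join(WEIGHTS_ROOT, "models", "v1.0.0"))
    print(nlp.tokenize("Hello world."))
```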