temsa/OpenMed-mLiteClinical-IrishCorePII-135M-v2-rc5
Pipeline: Token Classification
Libraries: Transformers, ONNX, Safetensors
Datasets: 4 datasets
Languages: English, Irish
Model type: distilbert
Tags: pii, de-identification, ireland, irish, gaelic, ppsn, eircode, passport, phone-number, iban, int8
License: apache-2.0
Repository size: 1.5 GB, 1 contributor, 4 commits
Latest commit: 805f769 (verified) by temsa, "Fix remote tokenizer loading for ONNX inference", about 10 hours ago
File                        Size        Last commit                                               Committed
eval/                       -           Add files using upload-large-folder tool                  about 10 hours ago
onnx/                       -           Add files using upload-large-folder tool                  about 10 hours ago
.gitattributes              218 Bytes   Add files using upload-large-folder tool                  about 10 hours ago
.gitignore                  26 Bytes    Add files using upload-large-folder tool                  about 10 hours ago
LICENSE                     10.3 kB     Add files using upload-large-folder tool                  about 10 hours ago
NOTICE                      423 Bytes   Add files using upload-large-folder tool                  about 10 hours ago
README.md                   4.85 kB     Add files using upload-large-folder tool                  about 10 hours ago
config.json                 6.25 kB     Add files using upload-large-folder tool                  about 10 hours ago
inference_mask.py           2.46 kB     Fix remote tokenizer loading for transformers inference   about 10 hours ago
inference_mask_onnx.py      1.75 kB     Add files using upload-large-folder tool                  about 10 hours ago
irish_core_decoder.py       12.8 kB     Add files using upload-large-folder tool                  about 10 hours ago
label_meta.json             2.68 kB     Add files using upload-large-folder tool                  about 10 hours ago
model.safetensors           539 MB      Add files using upload-large-folder tool                  about 10 hours ago
onnx_token_classifier.py    10 kB       Fix remote tokenizer loading for ONNX inference           about 10 hours ago
pyproject.toml              391 Bytes   Add files using upload-large-folder tool                  about 10 hours ago
qa_config.json              1.3 kB      Add files using upload-large-folder tool                  about 10 hours ago
raw_word_aligned.py         2.15 kB     Add files using upload-large-folder tool                  about 10 hours ago
special_tokens_map.json     695 Bytes   Add files using upload-large-folder tool                  about 10 hours ago
tokenizer.json              2.92 MB     Add files using upload-large-folder tool                  about 10 hours ago
tokenizer_config.json       1.37 kB     Add files using upload-large-folder tool                  about 10 hours ago
training_sources.json       2.49 kB     Add files using upload-large-folder tool                  about 10 hours ago
vocab.txt                   996 kB      Add files using upload-large-folder tool                  about 10 hours ago