loolootech/no-name-ner-th

Tags: Token Classification · Transformers · Safetensors · Thai · camembert · medical · NER

The following instructions show how to use loolootech/no-name-ner-th with libraries, inference providers, notebooks, and local apps.

  • Libraries
  • Transformers

    How to use loolootech/no-name-ner-th with Transformers:

    # Use a pipeline as a high-level helper
    from transformers import pipeline

    pipe = pipeline("token-classification", model="loolootech/no-name-ner-th")

    # Or load the tokenizer and model directly
    from transformers import AutoTokenizer, AutoModelForTokenClassification

    tokenizer = AutoTokenizer.from_pretrained("loolootech/no-name-ner-th")
    model = AutoModelForTokenClassification.from_pretrained("loolootech/no-name-ner-th")
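    The pipeline above returns per-token predictions, which for Thai text (written without spaces) are easier to read once merged into entity spans. A minimal sketch of such post-processing, assuming the model emits BIO-style labels; the `merge_bio` helper and the label names like `B-SYMPTOM` are hypothetical illustrations, not this repository's actual label set (that lives in its config.json):

    ```python
    def merge_bio(predictions):
        """Merge {'entity': 'B-X'/'I-X'/'O', 'word': str} dicts into
        (entity_type, text) spans."""
        spans = []
        for pred in predictions:
            label = pred["entity"]
            if label.startswith("B-"):
                spans.append([label[2:], pred["word"]])
            elif label.startswith("I-") and spans and spans[-1][0] == label[2:]:
                # Concatenate without spaces, as Thai script has none.
                spans[-1][1] += pred["word"]
            # "O" tags and stray "I-" tags are ignored.
        return [tuple(s) for s in spans]

    # Fabricated predictions, for illustration only:
    fake = [
        {"entity": "B-SYMPTOM", "word": "ปวด"},
        {"entity": "I-SYMPTOM", "word": "หัว"},
        {"entity": "O", "word": "มาก"},
    ]
    print(merge_bio(fake))  # [('SYMPTOM', 'ปวดหัว')]
    ```

    In practice the pipeline can do this merging itself via `pipeline("token-classification", ..., aggregation_strategy="simple")`.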
  • Notebooks
  • Google Colab
  • Kaggle

Gated model

This repository is publicly accessible, but you must agree to share your contact information and accept the conditions before you can access its files and content. Until then, you can list the files but not download them.
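Because the model is gated, programmatic downloads need an access token from an account that has already accepted the conditions. A minimal sketch, assuming the token is exported in the conventional `HF_TOKEN` environment variable; `resolve_token` is a hypothetical convenience helper, not part of any library:

```python
import os

def resolve_token(env):
    """Pick up a Hugging Face access token from the environment, if set."""
    return env.get("HF_TOKEN")

if __name__ == "__main__":
    token = resolve_token(os.environ)
    # Passing token= authenticates the download of gated files.
    from transformers import AutoModelForTokenClassification, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(
        "loolootech/no-name-ner-th", token=token
    )
    model = AutoModelForTokenClassification.from_pretrained(
        "loolootech/no-name-ner-th", token=token
    )
```

Alternatively, running `huggingface_hub.login()` (or `huggingface-cli login`) once stores the token so that `from_pretrained` picks it up automatically.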

Preview of files found in this repository
  • .gitattributes · 1.63 kB · Upload mascot-image-landscape.png · 9 months ago
  • LICENSE · 20.8 kB · Upload LICENSE · 9 months ago
  • Looloohealth.png · 16.7 kB · Upload 2 files · 9 months ago
  • PresScribe.png · 30.2 kB · Upload 2 files · 9 months ago
  • README.md · 4.87 kB · Update README.md · 6 days ago
  • added_tokens.json · 15 kB · Upload 7 files · 9 months ago
  • config.json · 1.32 kB · Upload 7 files · 9 months ago
  • logo_mahamor.png · 36.9 kB · Upload logo_mahamor.png · about 1 month ago
  • mascot-image-landscape.png · 3.79 MB · Upload mascot-image-landscape.png · 9 months ago
  • model.safetensors · 1.11 GB · Upload 7 files · 9 months ago
  • sentencepiece.bpe.model · 5.26 MB · Upload 7 files · 9 months ago
  • special_tokens_map.json · 1.05 kB · Upload 7 files · 9 months ago
  • tokenizer.json · 17.3 MB · Upload 7 files · 9 months ago
  • tokenizer_config.json · 145 kB · Upload 7 files · 9 months ago
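Only the weights, config, and tokenizer files above are needed for inference; the multi-megabyte PNG images can be skipped. A sketch of a selective download using `huggingface_hub.snapshot_download` and its `allow_patterns` parameter; the pattern list is a judgment call, and `should_download` is a hypothetical helper that mirrors the filter locally:

```python
import fnmatch

# Weights, config, and tokenizer files; images are deliberately excluded.
ALLOW_PATTERNS = ["*.json", "*.safetensors", "*.model", "LICENSE"]

def should_download(filename):
    """True if the file matches one of the allow patterns."""
    return any(fnmatch.fnmatch(filename, p) for p in ALLOW_PATTERNS)

print(should_download("model.safetensors"))           # True
print(should_download("mascot-image-landscape.png"))  # False

if __name__ == "__main__":
    from huggingface_hub import snapshot_download
    # snapshot_download applies the same patterns when fetching the repo
    # (an access token is still required, since the model is gated).
    path = snapshot_download(
        "loolootech/no-name-ner-th",
        allow_patterns=ALLOW_PATTERNS,
    )
    print(path)
```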