
Serialtechlab/dhivehi-asr-correction-byt5

Tags: Transformers, Safetensors, t5, text2text-generation, Generated from Trainer, text-generation-inference
Instructions for using Serialtechlab/dhivehi-asr-correction-byt5 with libraries, inference providers, notebooks, and local apps.

  • Libraries
  • Transformers

    How to use Serialtechlab/dhivehi-asr-correction-byt5 with Transformers (an inference sketch follows the list below):

    # Load model directly
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
    
    tokenizer = AutoTokenizer.from_pretrained("Serialtechlab/dhivehi-asr-correction-byt5")
    model = AutoModelForSeq2SeqLM.from_pretrained("Serialtechlab/dhivehi-asr-correction-byt5")
  • Notebooks: Google Colab, Kaggle
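
Continuing from the Transformers snippet above, here is a minimal inference sketch for correcting a raw ASR transcript. It assumes the tokenizer and model were loaded as shown; the placeholder input string and the max_new_tokens value are illustrative and not taken from the model card.

    # Minimal inference sketch (assumes tokenizer/model loaded as above;
    # the input string is a placeholder for a raw Dhivehi ASR transcript).
    inputs = tokenizer("raw ASR transcript goes here", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=256)
    corrected = tokenizer.decode(outputs[0], skip_special_tokens=True)
    print(corrected)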

You need to agree to share your contact information to access this model.

This is a gated model: the repository is publicly listed, but you must accept its conditions before you can access its files and content. Until access is granted, you can list the files but not download them. Log in to Hugging Face (or sign up) to review the conditions and request access.
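
Once access has been granted, the gated files can be loaded after authenticating with a Hugging Face access token. A minimal sketch, assuming you already have a token (the "hf_..." string below is a placeholder):

    # Authenticate, then load the gated model (the token string is a placeholder).
    from huggingface_hub import login
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    login(token="hf_...")  # or run `huggingface-cli login` once in a terminal

    tokenizer = AutoTokenizer.from_pretrained("Serialtechlab/dhivehi-asr-correction-byt5")
    model = AutoModelForSeq2SeqLM.from_pretrained("Serialtechlab/dhivehi-asr-correction-byt5")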

Preview of files found in this repository (all added about 1 month ago; every file except .gitattributes carries the commit message "Initial commit: Cleaned final model weights without training history"):
  • .gitattributes (1.52 kB, initial commit)
  • README.md (1.25 kB)
  • added_tokens.json (3.02 kB)
  • config.json (791 Bytes)
  • generation_config.json (152 Bytes)
  • model.safetensors (1.2 GB)
  • preprocessor_config.json (254 Bytes)
  • special_tokens_map.json (3.09 kB)
  • tokenizer_config.json (25.6 kB)
  • vocab.json (966 kB)
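
For reference, the whole repository can also be fetched in one call once gated access and a token are in place. A minimal sketch using huggingface_hub; the printed path is simply the local cache location:

    # Sketch: download the full repository snapshot (requires granted gated
    # access and a configured token, e.g. via `huggingface-cli login`).
    from huggingface_hub import snapshot_download

    local_dir = snapshot_download("Serialtechlab/dhivehi-asr-correction-byt5")
    print(local_dir)  # cache path containing model.safetensors, tokenizer files, etc.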