Restore tokenizer.json from google-t5/t5-large for fast tokenizer support (0a8cae7, verified; committed by textsightai 25 days ago)
Fix handler: load tokenizer from google-t5/t5-large to avoid local spiece.model path issue (b347258, verified; committed by textsightai 25 days ago)
Fix tokenizer_config: use minimal clean T5 config (remove backend=tokenizers) (debca03, verified; committed by textsightai 25 days ago)
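A "minimal clean T5 config" along the lines of commit debca03 might look like the fragment below. The exact field values are illustrative assumptions, not the repo's actual file; the key point is keeping only the fields T5Tokenizer understands and dropping the `backend=tokenizers` entry:

```json
{
  "tokenizer_class": "T5Tokenizer",
  "model_max_length": 512,
  "extra_ids": 100,
  "eos_token": "</s>",
  "unk_token": "<unk>",
  "pad_token": "<pad>"
}
```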
Add spiece.model (SentencePiece vocab) for T5Tokenizer (d51a42c, verified; committed by textsightai 25 days ago)
Fix: remove extra_special_tokens (incompatible list format); extra_ids=100 handles this (b533158, verified; committed by textsightai 25 days ago)
Remove fast tokenizer to fix extra_special_tokens compatibility issue (572fdfb, verified; committed by textsightai 25 days ago)
Fix tokenizer_config: convert extra_special_tokens from list to dict for transformers compat (312cb19, verified; committed by textsightai 25 days ago)
Fix handler: use T5Tokenizer with legacy=True to fix extra_special_tokens error, plus float16 on GPU (fa67548, verified; committed by textsightai 25 days ago)
Add custom handler for HuggingFace Inference Endpoints (863713b, verified; committed by textsightai 27 days ago)
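A custom handler for HuggingFace Inference Endpoints is a `handler.py` exposing an `EndpointHandler` class with an `__init__(path)`/`__call__(data)` interface, which is what the platform loads. The sketch below combines the fixes described in the commits above; the model/tokenizer details are assumptions about this repo, not its actual handler:

```python
# handler.py -- sketch of a custom Inference Endpoints handler.
# The EndpointHandler name and __init__/__call__ signatures are the
# interface Endpoints expects; everything else is illustrative.
class EndpointHandler:
    def __init__(self, path: str = ""):
        # Imports are done lazily so the module itself loads without
        # torch/transformers installed (e.g. for inspection or tests).
        import torch
        from transformers import T5Tokenizer, T5ForConditionalGeneration

        # Load the tokenizer from the upstream repo rather than the local
        # checkpoint, sidestepping the local spiece.model path issue;
        # legacy=True avoids the extra_special_tokens incompatibility.
        self.tokenizer = T5Tokenizer.from_pretrained(
            "google-t5/t5-large", legacy=True
        )
        self.device = "cuda" if torch.cuda.is_available() else "cpu"
        # float16 on GPU (per commit fa67548), float32 on CPU.
        dtype = torch.float16 if self.device == "cuda" else torch.float32
        self.model = T5ForConditionalGeneration.from_pretrained(
            path, torch_dtype=dtype
        ).to(self.device)

    def __call__(self, data: dict) -> list:
        # Endpoints passes {"inputs": ...}; return a list of predictions.
        text = data.get("inputs", "")
        enc = self.tokenizer(text, return_tensors="pt", truncation=True).to(
            self.device
        )
        out = self.model.generate(**enc, max_new_tokens=64)
        decoded = self.tokenizer.decode(out[0], skip_special_tokens=True)
        return [{"generated_text": decoded}]
```

At request time, Endpoints instantiates `EndpointHandler(path_to_repo)` once, then calls the instance per request with the parsed JSON payload.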
Update to v4: trained on 6.2K samples, composite score 0.6047, 8/10 "Human-Written" verdicts (399f855, verified; committed by textsightai 28 days ago)