T5Gemma2 in Transformers.js / ONNX via Optimum
#5
by
moogin
- opened
Could you please work on, or help with, Optimum / Transformers.js integration so that T5Gemma2 can run in a browser environment? Big thanks!
I want to use T5Gemma2 in transformers.js for browser/Node inference, even if it's text-only.
Hi
@moogin
Thanks for reaching out.
We will pass your request for browser-environment support of T5Gemma2 on to the relevant team.
If you want to build it yourself, you can follow this high-level roadmap:
- Export the T5Gemma2 weights from Hugging Face → ONNX
- Validate the model graph and make sure the encoder/decoder blocks are preserved
- Export the tokenizer vocab plus merges/special tokens
- Run local benchmarks in both browser and Node
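The graph-validation step above can start as a quick file-level check. This is a minimal sketch that assumes the export uses Optimum's usual split-graph layout for seq2seq models (`encoder_model.onnx` / `decoder_model.onnx`); whether T5Gemma2 exports exactly this way is an assumption, and a deeper check would open each graph with the `onnx` package.

```python
# Sanity-check that a seq2seq ONNX export kept separate encoder/decoder
# graphs, as transformers.js expects for encoder-decoder models.
# File names follow Optimum's usual layout; T5Gemma2 support is assumed.
from pathlib import Path

REQUIRED_GRAPHS = ["encoder_model.onnx", "decoder_model.onnx"]

def check_export(export_dir: str) -> list[str]:
    """Return the required ONNX graphs missing from export_dir."""
    present = {p.name for p in Path(export_dir).glob("*.onnx")}
    return [name for name in REQUIRED_GRAPHS if name not in present]
```

An empty return value means both graphs are present and you can move on to wiring up the tokenizer files.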
You can start experimenting with the smallest T5Gemma2 270M model.
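For the export itself, Optimum's `optimum-cli export onnx` subcommand is the usual entry point. This is a hedged sketch of assembling that command; the checkpoint id `google/t5gemma-2-270m` is a placeholder assumption (check the Hub for the real name), and exporter support for the T5Gemma2 architecture may not have landed yet.

```python
# Build the Optimum CLI export command for a given checkpoint.
# The model id below is a placeholder assumption, not a confirmed Hub id.
import shlex

def export_command(model_id: str, out_dir: str) -> str:
    # "optimum-cli export onnx" is Optimum's standard ONNX export
    # entry point; run the returned string in a shell.
    return shlex.join(
        ["optimum-cli", "export", "onnx", "--model", model_id, out_dir]
    )

print(export_command("google/t5gemma-2-270m", "t5gemma2-onnx"))
```

If the exporter rejects the architecture, that is the signal that a custom ONNX config for T5Gemma2 still needs to be contributed upstream.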
Thanks