T5Gemma2 in Transformers.js / ONNX via Optimum

#5
by moogin - opened

Could you please work on, or help with, integration for Optimum / Transformers.js so that T5Gemma2 can run in a browser environment? Big thanks!
I want to use T5Gemma2 in Transformers.js for browser/Node inference, even if text-only.

Google org
edited Jan 6

Hi @moogin, thanks for reaching out.
We will pass your request for browser-environment support of the T5Gemma2 model to the relevant team.
If you want to build it yourself, you can follow this high-level roadmap:

  1. Export the T5Gemma2 weights from Hugging Face → ONNX
  2. Validate the model graph and make sure the encoder/decoder blocks are preserved
  3. Export the tokenizer vocab, merges, and special tokens
  4. Run local benchmarks for both browser and Node
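As a rough sketch, step 1 typically goes through Optimum's ONNX exporter CLI. The snippet below only builds the command line so you can inspect it before running; the model id `google/t5gemma-2-270m` and the `--task` value are assumptions (T5Gemma2 may not yet be supported by the exporter, and the actual checkpoint name may differ).

```python
# Hedged sketch: assemble an `optimum-cli export onnx` invocation for a
# seq2seq (encoder/decoder) model. Nothing here is executed; run the
# printed command yourself once Optimum supports the architecture.

def export_command(model_id: str, output_dir: str) -> list[str]:
    """Build the optimum-cli arguments for a seq2seq ONNX export
    with KV-cache variants of the decoder ("-with-past")."""
    return [
        "optimum-cli", "export", "onnx",
        "--model", model_id,                          # assumed checkpoint id
        "--task", "text2text-generation-with-past",   # seq2seq + cached decoding
        output_dir,
    ]

cmd = export_command("google/t5gemma-2-270m", "t5gemma2-onnx")
print(" ".join(cmd))
```

Running the printed command (with a supported model) produces `encoder_model.onnx` and decoder variants in the output directory, which is the layout Transformers.js expects under a model's `onnx/` folder.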

You can start experimenting with the smallest T5Gemma2 270M model.
Thanks!