fblgit posted an update about 3 hours ago
Introducing HarEmb-PII, a single-transformer-block distilled layer from the OpenMed PII privacy filter.

It's a very tiny model that reaches comparable results on PII classification through Viterbi BIOES decoding, retaining ~98% of the original model's performance at a tiny fraction of its size.
It doubles throughput (tokens/s) and dramatically reduces both the active parameter count and the VRAM footprint.
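For readers unfamiliar with the decoding step mentioned above: Viterbi decoding over BIOES tags finds the highest-scoring tag sequence while enforcing valid span structure (B must be followed by I or E of the same type, etc.). The sketch below is illustrative only, assuming a single hypothetical "NAME" entity type and log-probability emissions; the actual HarEmb tag set and decoder live in the model repository.

```python
import numpy as np

# Hypothetical tag set with one entity type; the real model has more types.
TAGS = ["O", "B-NAME", "I-NAME", "E-NAME", "S-NAME"]

def allowed(prev: str, curr: str) -> bool:
    """BIOES transition constraints: B/I must be followed by I/E of the
    same entity type; O/E/S may be followed by O/B/S."""
    p_pre, _, p_typ = prev.partition("-")
    c_pre, _, c_typ = curr.partition("-")
    if p_pre in ("B", "I"):
        return c_pre in ("I", "E") and c_typ == p_typ
    return c_pre in ("O", "B", "S")

def viterbi_bioes(emissions: np.ndarray) -> list[str]:
    """Decode the best valid BIOES path.
    emissions: (seq_len, n_tags) array of per-token log-probabilities."""
    n, k = emissions.shape
    NEG = -1e9
    # A sequence may only start with O, B-, or S-.
    score = np.array([emissions[0, j] if TAGS[j][0] in "OBS" else NEG
                      for j in range(k)])
    back = np.zeros((n, k), dtype=int)
    for t in range(1, n):
        new = np.full(k, NEG)
        for j in range(k):
            for i in range(k):
                if allowed(TAGS[i], TAGS[j]):
                    s = score[i] + emissions[t, j]
                    if s > new[j]:
                        new[j], back[t, j] = s, i
        score = new
    # A sequence may only end with O, E-, or S-.
    j = max((j for j in range(k) if TAGS[j][0] in "OES"),
            key=lambda j: score[j])
    path = [j]
    for t in range(n - 1, 0, -1):
        j = back[t, j]
        path.append(j)
    return [TAGS[j] for j in reversed(path)]
```

Even when the raw classifier prefers an invalid tag (say, I-NAME at position 0), the decoder snaps the output to the best structurally valid sequence, which is what lets a small distilled head recover near-original span accuracy.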

The evaluation and benchmarking are in the model repository and can be reproduced. I trained it on a single RTX 4090 without issues, and it is compatible with the OpenMed suite as an in-place replacement for the openai privacy-filter model.

fblgit/haremb-privacy-filter-opennemo

I'm looking for people who want to co-author, contribute to, or endorse the HarEmb research and the model's technical paper.

Contact xavi@juanako.ai