REVE: A Foundation Model for EEG -- Adapting to Any Setup with Large-Scale Pretraining on 25,000 Subjects Paper • 2510.21585 • Published Oct 24 • 6
MLM vs CLM Collection Research material on pre-training encoders, with an extensive comparison of the masked language modeling paradigm vs causal language modeling. • 5 items • Updated 26 days ago
EuroBERT Encoding model Collection Suite of models for improved integration into RAG (for information retrieval), designed for ease of use and practicality in industrial contexts • 5 items • Updated Sep 11 • 1
Should We Still Pretrain Encoders with Masked Language Modeling? Paper • 2507.00994 • Published Jul 1 • 80
EuroBERT: Scaling Multilingual Encoders for European Languages Paper • 2503.05500 • Published Mar 7 • 80
Is Preference Alignment Always the Best Option to Enhance LLM-Based Translation? An Empirical Analysis Paper • 2409.20059 • Published Sep 30, 2024 • 16