---
tags:
  - transformers
  - bidirectional
  - multilingual
license: apache-2.0
base_model: google/gemma-3-1b-pt
language:
  - multilingual
  - af
  - am
  - ar
  - az
  - be
  - bg
  - bn
  - bs
  - ca
  - ceb
  - cs
  - cy
  - da
  - de
  - el
  - en
  - es
  - et
  - eu
  - fa
  - fi
  - fr
  - ga
  - gl
  - gu
  - ha
  - he
  - hi
  - hr
  - ht
  - hu
  - hy
  - id
  - ig
  - is
  - it
  - ja
  - jv
  - ka
  - kk
  - kn
  - ko
  - ky
  - lt
  - lv
  - mg
  - mk
  - ml
  - mr
  - ms
  - mt
  - my
  - nb
  - ne
  - nl
  - nso
  - ny
  - pa
  - pl
  - ps
  - pt
  - ro
  - ru
  - sd
  - si
  - sk
  - sl
  - sn
  - so
  - sq
  - sr
  - su
  - sv
  - sw
  - ta
  - te
  - th
  - tl
  - tr
  - uk
  - ur
  - vi
  - wo
  - xh
  - yo
  - zh
  - zu
---

# BidirLM-1B-Base

BidirLM-1B-Base is the intermediate MNTP-adapted checkpoint of the BidirLM family. It is obtained by converting Gemma3-1B from causal to bidirectional attention, training with Masked Next Token Prediction (MNTP) on 30B tokens from a multi-domain corpus (FineWeb-Edu, FineWeb2-HQ, FineMath, Stack V2), and then merging the result 50/50 with the original Gemma3-1B weights.
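The 50/50 merge can be read as a simple parameter-space interpolation between the MNTP-trained weights and the original ones. A minimal sketch with dummy state dicts (an illustration of weight averaging in general, not the authors' exact merging code):

```python
import torch

def merge_state_dicts(sd_a, sd_b, alpha=0.5):
    # Element-wise interpolation of matching parameter tensors:
    # alpha = 0.5 gives an equal-weight (50/50) merge.
    return {k: alpha * sd_a[k] + (1 - alpha) * sd_b[k] for k in sd_a}

# Dummy "models" with a single 2x2 weight matrix each
adapted = {"w": torch.ones(2, 2)}
original = {"w": torch.zeros(2, 2)}
merged = merge_state_dicts(adapted, original)
print(merged["w"])  # every entry is 0.5
```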

For general-purpose embeddings and downstream fine-tuning, use `BidirLM/BidirLM-1B-Embedding`, which adds contrastive training on top of this checkpoint.

## Usage

```python
from transformers import AutoTokenizer, AutoModel, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("BidirLM/BidirLM-1B-Base", trust_remote_code=True)

# Base encoder
model = AutoModel.from_pretrained("BidirLM/BidirLM-1B-Base", trust_remote_code=True)

# Masked language model
mlm = AutoModelForMaskedLM.from_pretrained("BidirLM/BidirLM-1B-Base", trust_remote_code=True)
```
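With a bidirectional encoder, sentence embeddings are commonly derived by mean pooling the last hidden states over non-padding tokens. A minimal, model-independent sketch of masked mean pooling with dummy tensors (not an official BidirLM API; the actual embedding model above handles pooling for you):

```python
import torch

def mean_pool(last_hidden_state, attention_mask):
    # Zero out padding positions, then average over real tokens only.
    mask = attention_mask.unsqueeze(-1).type_as(last_hidden_state)
    summed = (last_hidden_state * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)
    return summed / counts

# Dummy batch: 2 sequences, 4 token positions, hidden size 8.
# Sequence 0 has 3 real tokens, sequence 1 has 2.
hidden = torch.randn(2, 4, 8)
mask = torch.tensor([[1, 1, 1, 0], [1, 1, 0, 0]])
emb = mean_pool(hidden, mask)
print(emb.shape)  # torch.Size([2, 8])
```

In practice you would pass `model(**tokenizer(texts, padding=True, return_tensors="pt")).last_hidden_state` and the tokenizer's `attention_mask` to `mean_pool`.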

## Requirements

```
transformers>=4.57.6,<5.0.0
```

This model requires `trust_remote_code=True`.

## Citation

```bibtex
@misc{boizard2026bidirlmtextomnimodalbidirectional,
      title={BidirLM: From Text to Omnimodal Bidirectional Encoders by Adapting and Composing Causal LLMs},
      author={Nicolas Boizard and Théo Deschamps-Berger and Hippolyte Gisserot-Boukhlef and Céline Hudelot and Pierre Colombo},
      year={2026},
      eprint={2604.02045},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2604.02045},
}
```