LocalLaws/LOCUS-Opacity

A ModernBERT regression model that scores local-ordinance text along the Opacity axis of the LOCUS (Local Ordinances Corpus, United States) dataset.

Fine-tuned from answerdotai/ModernBERT-base. The regression target is a per-document TrueSkill mu distilled from pairwise LLM comparisons on the opacity axis, then z-score normalized across the training corpus.
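As a sketch of the last step of target construction (assuming a list of per-document TrueSkill mu values; the exact distillation pipeline is not reproduced here), z-score normalization is just centering and scaling over the corpus:

```python
import statistics

def zscore(mus):
    """Normalize TrueSkill mu values to zero mean and unit variance
    across the training corpus (hypothetical helper, for illustration)."""
    mean = statistics.fmean(mus)
    sd = statistics.pstdev(mus)
    return [(m - mean) / sd for m in mus]
```

A score of 0 therefore corresponds to corpus-average opacity, and positive scores indicate more opaque ordinance text than average.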

Usage

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tok = AutoTokenizer.from_pretrained("LocalLaws/LOCUS-Opacity")
model = AutoModelForSequenceClassification.from_pretrained("LocalLaws/LOCUS-Opacity")
model.eval()

text = "No person shall keep any swine within the city limits."
enc = tok(text, return_tensors="pt", truncation=True, max_length=2048)
with torch.no_grad():
    # Single regression head -> one logit per input; squeeze to a scalar
    score = model(**enc).logits.squeeze(-1).item()
print(score)
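Because the training target was z-score normalized, a raw score can be mapped to an approximate corpus percentile, under the (hedged) assumption that the normalized opacity values are roughly standard normal over the corpus:

```python
from statistics import NormalDist

def score_to_percentile(z: float) -> float:
    """Map a model score to an approximate corpus percentile,
    assuming the z-scored opacity values are roughly standard normal
    (an assumption, not a guarantee of the training distribution)."""
    return NormalDist().cdf(z) * 100.0
```

For example, a score of 0 lands at roughly the 50th percentile of the corpus, and a score of +1 at roughly the 84th.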
Format: Safetensors · Model size: 0.1B params · Tensor type: F32
