---
library_name: transformers
pipeline_tag: text-classification
base_model: answerdotai/ModernBERT-base
language:
  - en
tags:
  - text-classification
  - memory
  - darkmem
  - modernbert
---

# darkmem-classifier-v1

Seven-class memory-type classifier for darkmem. Labels: fact, decision, preference, problem, reference, architecture, milestone.

## Metrics

Accuracy 0.975 / macro F1 0.975 on a 1,000-row gold set (`gold_v3.jsonl`).

## Base model

[answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base)

## Usage

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tok = AutoTokenizer.from_pretrained("darkraise/darkmem-classifier-v1", trust_remote_code=True)
model = AutoModelForSequenceClassification.from_pretrained("darkraise/darkmem-classifier-v1", trust_remote_code=True)
```

## License

Inherits the license of the base model. Fine-tuned weights are published under the same terms unless noted otherwise in the repo.

## Provenance

Fine-tuned as part of [darkmem](https://github.com/), a centralized memory system for AI agents. The training recipe and evaluation scripts live in the `fine-tuning/` subtree of the darkmem repository.
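
## Mapping logits to labels

Once the model and tokenizer are loaded, a prediction is just the argmax over the classification head's seven logits. The sketch below shows that final step with plain Python on dummy logits; the label order is an assumption based on the list above, so verify it against `model.config.id2label` on the downloaded checkpoint before relying on it.

```python
import math

# Assumed label order -- confirm with model.config.id2label.
LABELS = ["fact", "decision", "preference", "problem",
          "reference", "architecture", "milestone"]

def softmax(logits):
    """Convert raw logits to probabilities (numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict_label(logits):
    """Return (label, confidence) for one row of logits."""
    probs = softmax(logits)
    idx = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[idx], probs[idx]

# Dummy logits; in practice you would take
# model(**tok(text, return_tensors="pt")).logits[0].tolist()
label, confidence = predict_label([0.1, 3.2, 0.0, -1.0, 0.5, 0.2, 0.1])
print(label)  # decision
```

With a real input, tokenize the text, run it through the model, and pass the resulting logits row through `predict_label`.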