---
language: pl
tags:
  - sequence-classification
  - literackie
  - transformers
  - herbert
---
# Literary Content Classifier (HerBERT)

This binary model detects literary texts as well as non-literary texts that thematize literature, e.g. texts devoted to literary authors, literary history, and literary themes and motifs. It is based on allegro/herbert-large-cased (~0.4B parameters, F32 weights).

## 🧪 Quick start

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from huggingface_hub import hf_hub_download
import joblib
import torch

model_id = "darekpe79/true-false-pbl-herbert"
model = AutoModelForSequenceClassification.from_pretrained(model_id, use_safetensors=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# The label encoder is stored as a separate artifact in the model repo
enc_path = hf_hub_download(repo_id=model_id, filename="label_encoder.joblib")
label_enc = joblib.load(enc_path)

text = ""  # put the text to classify here
inputs = tokenizer(text, return_tensors="pt", truncation=True, padding=True)

model.eval()
with torch.no_grad():
    pred_id = model(**inputs).logits.argmax(-1).item()

print("Prediction:", label_enc.inverse_transform([pred_id])[0])
```
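For downstream filtering you may want class probabilities rather than just the arg-max label. A minimal sketch of batch classification with confidence scores, assuming the same `model`, `tokenizer`, and `label_enc` objects as above (the `classify` helper is not part of this repo):

```python
import torch
import torch.nn.functional as F

def classify(texts, model, tokenizer, label_enc):
    """Return a (label, confidence) pair for each input text."""
    inputs = tokenizer(texts, return_tensors="pt", truncation=True, padding=True)
    model.eval()
    with torch.no_grad():
        logits = model(**inputs).logits          # shape: (batch, num_labels)
    probs = F.softmax(logits, dim=-1)            # normalize logits to probabilities
    conf, pred_ids = probs.max(dim=-1)           # highest probability and its index
    labels = label_enc.inverse_transform(pred_ids.tolist())
    return list(zip(labels, conf.tolist()))
```

A confidence threshold on the returned probability (e.g. keep only predictions above 0.9) is a simple way to trade recall for precision when filtering a large corpus.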

## 📦 Files

| file | description |
|---|---|
| model.safetensors | model weights |
| config.json | model configuration |
| tokenizer.json, vocab.json, merges.txt | tokenizer files |
| label_encoder.joblib | label encoder (True / False) |
## Author

Model prepared by @darekpe79