
PhoBERT Large Fine-tuned for CV-Skill to Job Description Matching

This model is vinai/phobert-large fine-tuned with LoRA for sequence classification.
Given a CV skill and a job-description requirement as a sentence pair, it predicts whether the two match.

Training Details

  • Base model: vinai/phobert-large
  • Task: Sequence Classification (2 labels)
  • LoRA config: r=16, lora_alpha=32, target_modules=["query", "key", "value", "output.dense"], lora_dropout=0.05
  • Optimizer: AdamW, lr=2e-4
  • Batch size: 120 (gradient accumulation 4)
  • Epochs: 8
  • Metrics: F1-score
  • Framework: HuggingFace Transformers + PEFT
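The training details above can be sketched with the standard HuggingFace PEFT API. This is a minimal reconstruction using the hyperparameters listed on this card, not the authors' actual training script:

```python
# Sketch of the LoRA setup described above; hyperparameters are taken
# from this card, everything else is a reasonable assumption.
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForSequenceClassification

# Base model with a 2-label classification head
base = AutoModelForSequenceClassification.from_pretrained(
    "vinai/phobert-large", num_labels=2
)

lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=16,
    lora_alpha=32,
    target_modules=["query", "key", "value", "output.dense"],
    lora_dropout=0.05,
)

# Wrap the base model; only the LoRA adapters (and the new head) train
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()
```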

Usage

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "lengocquangLAB/phobert-large-cv-skill-jd-req-match"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Encode the CV skill and the job description as a sentence pair
inputs = tokenizer("CV skill text", "Job description text", return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

pred = outputs.logits.argmax(dim=-1)
print(pred)
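The card does not state the label mapping, so the mapping below (0 = no match, 1 = match) is an assumption. Under that assumption, the raw logits can be turned into a label and a confidence score with plain Python:

```python
import math

# Hypothetical label mapping -- not specified in this card.
id2label = {0: "no_match", 1: "match"}

def interpret(logits):
    """Convert a pair of raw logits into (label, probability)."""
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]                     # softmax
    idx = max(range(len(probs)), key=probs.__getitem__)   # argmax
    return id2label[idx], probs[idx]

label, prob = interpret([-1.2, 2.3])
```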
Model size: 0.4B params (Safetensors, F32)