# Hugging Face bundle: Terraform Plan Risk Labels + Baseline Classifier
This bundle contains three Hugging Face-ready repos:

- Dataset: `bharathja/tfplan-risk-labels-multicloud`
- Model: `bharathja/tfplan-risk-classifier-deberta-v3-small`
- (Optional) CLI helper: `bharathja/tfplan-risk-cli`
## Publish order (recommended)
### 1) Create the dataset repo on the Hub

Create a Dataset repo named `tfplan-risk-labels-multicloud` under your account, then push the contents of `tfplan-risk-labels/`:

```bash
cd tfplan-risk-labels
git init
git lfs install
git remote add origin https://huggingface.co/datasets/bharathja/tfplan-risk-labels-multicloud
git add .
git commit -m "init dataset"
git push -u origin main
```
### 2) Train the baseline classifier locally

```bash
cd ../tfplan-risk-classifier-deberta-v3-small
python -m pip install -r requirements.txt
python train.py --dataset_path ../tfplan-risk-labels --output_dir ./out --epochs 3
```
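Assuming the task is multi-label (the plural `labels` field suggests a plan can trigger several risk categories at once), training needs each label list encoded as a multi-hot target vector. A sketch of that encoding, with hypothetical category names — the real label set lives in the dataset repo:

```python
def multi_hot(labels, label_names):
    """Encode a list of label strings as a multi-hot float vector,
    the target format for multi-label sequence classification."""
    index = {name: i for i, name in enumerate(label_names)}
    vec = [0.0] * len(label_names)
    for lab in labels:
        vec[index[lab]] = 1.0
    return vec

# Hypothetical risk categories, for illustration only.
LABELS = ["public_exposure", "iam_privilege", "data_loss", "network_open"]
multi_hot(["public_exposure", "data_loss"], LABELS)  # -> [1.0, 0.0, 1.0, 0.0]
```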
### 3) Create the model repo and push the trained artifacts

Create a Model repo named `tfplan-risk-classifier-deberta-v3-small`, then push `out/`:

```bash
cd out
git init
git lfs install
git remote add origin https://huggingface.co/bharathja/tfplan-risk-classifier-deberta-v3-small
cp ../README.md ./README.md
git add .
git commit -m "add model"
git push -u origin main
```
## Quick sanity check (after publish)

Load the dataset:

```python
from datasets import load_dataset

ds = load_dataset(
    "bharathja/tfplan-risk-labels-multicloud",
    data_files={"train": "train.jsonl", "validation": "validation.jsonl", "test": "test.jsonl"},
)
print(ds["train"][0]["labels"])
```

Load the model:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tok = AutoTokenizer.from_pretrained("bharathja/tfplan-risk-classifier-deberta-v3-small")
model = AutoModelForSequenceClassification.from_pretrained("bharathja/tfplan-risk-classifier-deberta-v3-small")
```
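If the classifier head is multi-label, it outputs one independent logit per risk category, so predictions come from a sigmoid plus a threshold rather than an argmax. A pure-Python sketch of that post-processing step — the 0.5 threshold and the label names are assumptions for illustration:

```python
import math

def predict_labels(logits, label_names, threshold=0.5):
    """Map raw per-class logits to label names via sigmoid + threshold
    (the standard decoding step for multi-label classification)."""
    probs = [1.0 / (1.0 + math.exp(-z)) for z in logits]
    return [name for name, p in zip(label_names, probs) if p >= threshold]

# Hypothetical logits and label names, for illustration only.
predict_labels([2.1, -1.3, 0.4], ["public_exposure", "iam_privilege", "data_loss"])
# -> ["public_exposure", "data_loss"]
```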