AetherMind_SRL: Self-Reflective Learning for Robust Natural Language Inference
How to use samerzaher80/AetherMind_SRL with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="samerzaher80/AetherMind_SRL")
```

```python
# Or load the model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("samerzaher80/AetherMind_SRL")
model = AutoModelForSequenceClassification.from_pretrained("samerzaher80/AetherMind_SRL")
```

Author: Samer S. Najm (Sam)
Organization: AetherMind Project
Model Type: Knowledge-Distilled Transformer (Student Model)
Domain: Natural Language Inference (NLI) + Medical Reasoning (ADNI SRL)
AetherMind_SRL is the twelfth refinement round of AetherMind's knowledge-distilled student model, trained with self-reflective learning (SRL), knowledge distillation, ADNI medical contradiction data, and general-domain NLI datasets.
| Dataset | Accuracy | Macro F1 | Samples |
|---|---|---|---|
| SNLI | 89.64% | 89.55% | 9,824 |
| MNLI-M | 90.20% | 90.00% | 9,815 |
| MNLI-MM | 89.61% | 89.35% | 9,832 |
| ANLI R1 | 79.90% | 79.89% | 1,000 |
| ANLI R2 | 67.50% | 67.35% | 1,000 |
| ANLI R3 | 67.33% | 66.81% | 1,200 |
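Macro F1, reported above alongside accuracy, is the unweighted mean of the per-class F1 scores, so each of the three NLI labels counts equally regardless of how often it appears. A minimal pure-Python sketch of the metric (the evaluation script itself is not included in this card):

```python
def macro_f1(y_true, y_pred, num_classes=3):
    """Unweighted mean of per-class F1 scores."""
    f1s = []
    for c in range(num_classes):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return sum(f1s) / num_classes
```

Accuracy and macro F1 diverge when the model is stronger on frequent classes than on rare ones; the near-equal values in the table suggest balanced per-class performance.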
Teacher: microsoft/deberta-v3-base
Student: AetherMind_SRL
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model_id = "samerzaher80/AetherMind_SRL"
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).to(device)
model.eval()

premise = "The patient scored 28 on the MMSE last year."
hypothesis = "The patient shows signs of cognitive decline."

inputs = tokenizer(premise, hypothesis, return_tensors="pt").to(device)
with torch.no_grad():
    logits = model(**inputs).logits

predicted = torch.argmax(logits, dim=-1).item()
labels = ["entailment", "neutral", "contradiction"]
print("Prediction:", labels[predicted])
```
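For applications that need a confidence score rather than just the argmax label, the logits can be passed through a softmax. A small helper (the function name is mine; the label order matches the list above):

```python
import torch

LABELS = ["entailment", "neutral", "contradiction"]

def to_prediction(logits):
    """Map a (batch, 3) logits tensor to a list of (label, probability) pairs."""
    probs = torch.softmax(logits, dim=-1)
    out = []
    for row in probs:
        idx = int(torch.argmax(row))
        out.append((LABELS[idx], float(row[idx])))
    return out
```

This also makes batched inference straightforward: tokenize a list of premise/hypothesis pairs with `padding=True` and pass the resulting logits through the same helper.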
```json
{
  "tags": [
    "natural-language-inference",
    "knowledge-distillation",
    "biomedical-nlp",
    "aethermind",
    "nli",
    "self-reflective-learning",
    "transformers"
  ]
}
```