Bam3752 committed
Commit f1b54df · verified · 1 Parent(s): 665c5e5

Update README.md

Files changed (1):
  README.md (+11 −11)
README.md CHANGED
@@ -18,7 +18,7 @@ metrics:
   - type: loss
     value: 0.2673
 model-index:
-- name: PubMedBERT NLI LoRA
+- name: PubMedBERT BioNLI LoRA
   results:
   - task:
       type: natural-language-inference
@@ -35,14 +35,14 @@ model-index:
     value: 0.2673
 ---
 
-# PubMedBERT NLI (LoRA Fine-Tuned)
+# PubMedBERT BioNLI LoRA
 
-[![Model](https://img.shields.io/badge/Model-LoRA-green)](https://huggingface.co/Bam3752/Model-lora)
+[![Model](https://img.shields.io/badge/Model-LoRA-green)](https://huggingface.co/Bam3752/PubMedBERT-BioNLI-LoRA)
 [![Hugging Face](https://img.shields.io/badge/HF-Bam3752-blue)](https://huggingface.co/Bam3752)
 [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
 
-A biomedical **Natural Language Inference (NLI)** model based on **PubMedBERT** and fine-tuned with **LoRA** adapters.
-It is designed to evaluate **entailment, contradiction, and neutrality** between reasoning steps in biomedical chain-of-thought (CoT) explanations.
+**PubMedBERT BioNLI LoRA** is a biomedical **Natural Language Inference (NLI)** model fine-tuned with **LoRA adapters**.
+It classifies **entailment, contradiction, and neutrality** between biomedical text pairs, optimized for **chain-of-thought reasoning validation**.
 
 ---
 
@@ -50,8 +50,8 @@ It is designed to evaluate **entailment, contradiction, and neutrality** between
 
 - **Base model:** [pritamdeka/PubMedBERT-MNLI-MedNLI](https://huggingface.co/pritamdeka/PubMedBERT-MNLI-MedNLI)
 - **Fine-tuning datasets:** BioASQ + MedNLI
-- **Objective:** 3-class NLI (entailment, neutral, contradiction)
-- **Method:** LoRA (PEFT) adapters
+- **Objective:** 3-class NLI (entailment / neutral / contradiction)
+- **Method:** LoRA parameter-efficient fine-tuning
 - **Hardware:** Apple MPS (Metal backend)
 - **Hyperparameters:**
   - Epochs: 4
@@ -72,19 +72,19 @@ It is designed to evaluate **entailment, contradiction, and neutrality** between
 | Macro F1 | **0.9036** |
 | Eval Loss | **0.2673** |
 
-Calibrated with isotonic regression (`calibration/isotonic.pkl`) for more reliable probability estimates.
+👉 Calibrated with isotonic regression (`calibration/isotonic.pkl`) for reliable probabilities.
 
 ---
 
 ## 🚀 Usage
 
-### Load with Transformers
+### Transformers
 
 ```python
 from transformers import AutoModelForSequenceClassification, AutoTokenizer
 
-model = AutoModelForSequenceClassification.from_pretrained("Bam3752/Model-lora")
-tokenizer = AutoTokenizer.from_pretrained("Bam3752/Model-lora")
+model = AutoModelForSequenceClassification.from_pretrained("Bam3752/PubMedBERT-BioNLI-LoRA")
+tokenizer = AutoTokenizer.from_pretrained("Bam3752/PubMedBERT-BioNLI-LoRA")
 
 premise = "Aspirin reduces the risk of myocardial infarction."
 hypothesis = "Aspirin prevents heart attacks."
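
The usage snippet in the diff is cut off before inference. A minimal sketch of how it could continue, assuming the checkpoint's config carries an `id2label` mapping (the label names themselves are not shown in the diff):

```python
# Sketch: complete the truncated inference example from the README.
# Assumption: the checkpoint's config populates id2label for its 3 classes.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model = AutoModelForSequenceClassification.from_pretrained("Bam3752/PubMedBERT-BioNLI-LoRA")
tokenizer = AutoTokenizer.from_pretrained("Bam3752/PubMedBERT-BioNLI-LoRA")

premise = "Aspirin reduces the risk of myocardial infarction."
hypothesis = "Aspirin prevents heart attacks."

# Encode the pair as one sequence: [CLS] premise [SEP] hypothesis [SEP]
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, 3)

probs = torch.softmax(logits, dim=-1).squeeze(0)
pred_id = int(probs.argmax())
print(model.config.id2label[pred_id], float(probs[pred_id]))
```

Reading the label name from `model.config.id2label` avoids hard-coding a class order the card does not document.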
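
The README also mentions isotonic calibration via `calibration/isotonic.pkl`, but the diff does not document that file's format. The following is only a self-contained sketch of the technique itself, fitting a toy scikit-learn calibrator on synthetic confidences rather than loading the pickle:

```python
# Sketch of isotonic probability calibration (the technique the README names).
# The contents of calibration/isotonic.pkl are an assumption; a toy calibrator
# is fitted here instead of loading that file.
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Toy data: raw model confidence vs. empirically observed accuracy.
raw_conf = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
observed = np.array([0.0, 0.2, 0.6, 0.8, 1.0])

iso = IsotonicRegression(out_of_bounds="clip")
iso.fit(raw_conf, observed)

# A raw probability is mapped through the fitted monotone function,
# pulling over- or under-confident scores toward observed frequencies.
calibrated = iso.predict([0.8])[0]
```

Isotonic regression only reorders-free rescales each score, so class rankings are preserved while the probabilities become more trustworthy.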