nielsr (HF Staff) committed on
Commit bccd331 · verified · 1 parent(s): f4074f4

Improve model card: Add pipeline tag, library, license & GitHub link


This PR enhances the model card by:
- Adding `pipeline_tag: text-generation` to improve model discoverability.
- Updating `library_name` to `transformers` as the primary library for model inference, enabling the "how to use" widget.
- Adding `license: other` to the metadata, aligning with the "Same as base model (Llama 3.1)" described in the content.
- Including a direct link to the GitHub repository.

Please review and merge this PR.
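The metadata bullets above amount to a flat `key: value` YAML front-matter block at the top of README.md. As a quick sanity sketch (not part of the PR itself), the resulting keys can be checked with a naive line parser — this only works because the block is flat, with no nesting or lists; real front matter should be parsed with a YAML library.

```python
# Front matter as proposed by this PR (copied from the diff below).
front_matter = """\
base_model: meta-llama/Llama-3.1-8B-Instruct
library_name: transformers
pipeline_tag: text-generation
license: other
"""

# Naive flat parser: split each line once on ": ". Sufficient here only
# because every value is a plain scalar with no nested structure.
meta = dict(line.split(": ", 1) for line in front_matter.strip().splitlines())

print(meta["pipeline_tag"])  # -> text-generation
print(meta["library_name"])  # -> transformers
```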

Files changed (1):
1. README.md (+6 −2)
README.md CHANGED

@@ -1,12 +1,16 @@
 ---
 base_model: meta-llama/Llama-3.1-8B-Instruct
-library_name: peft
+library_name: transformers
+pipeline_tag: text-generation
+license: other
 ---
 
 # Llama 7B Uncertainty Calibration Model (Brier Loss)
 
 This model is a fine-tuned version of Llama-3.1-8B-Instruct optimized for uncertainty calibration using our method [ConfTuner: Training Large Language Models to Express Their Confidence Verbally](https://arxiv.org/abs/2508.18847), accepted by NeurIPS 2025.
 
+The official code is available at: [https://github.com/liushiliushi/ConfTuner](https://github.com/liushiliushi/ConfTuner)
+
 ## Model Details
 
 ### Model Description
@@ -34,4 +38,4 @@ This model is optimized for tasks requiring well-calibrated uncertainty estimate
 
 ### Framework versions
 
-- PEFT 0.12.0
+- PEFT 0.12.0