Update README.md
README.md CHANGED
@@ -1,8 +1,10 @@
 ---
 language: en
 tags:
 - summarization
 - medical
+library_name: transformers
+pipeline_tag: summarization
 ---

 # Automatic Personalized Impression Generation for PET Reports Using Large Language Models
@@ -51,5 +53,4 @@ The models were trained on NVIDIA A100 GPUs.

 ## Additional Resources
 - **Finetuned from model:** [google/t5-v1_1-large](https://huggingface.co/google/t5-v1_1-large)
 - **Codebase for training and inference:** [GitHub Repository](https://github.com/xtie97/PET-Report-Summarization)
-
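The `pipeline_tag: summarization` metadata added in this commit lets the checkpoint be loaded through the standard `transformers` summarization pipeline. A minimal sketch — the repository id `xtie97/PET-Report-Summarization` is an assumption taken from the linked codebase, so substitute this model card's actual Hugging Face repo id:

```python
# Sketch of using this checkpoint via the `transformers` summarization
# pipeline, which the `pipeline_tag: summarization` metadata enables.
# MODEL_ID is a hypothetical id -- replace it with this card's actual repo id.
MODEL_ID = "xtie97/PET-Report-Summarization"


def normalize_findings(findings: str) -> str:
    """Collapse line breaks and runs of whitespace in the findings
    section of a PET report before feeding it to the summarizer."""
    return " ".join(findings.split())


def generate_impression(findings: str, max_new_tokens: int = 512) -> str:
    """Generate a personalized impression from a report's findings section."""
    from transformers import pipeline  # imported lazily: heavy dependency

    summarizer = pipeline("summarization", model=MODEL_ID)
    result = summarizer(normalize_findings(findings), max_new_tokens=max_new_tokens)
    return result[0]["summary_text"]
```

The lazy import keeps the helper functions usable without downloading the model; only calling `generate_impression` triggers the checkpoint load.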