Tags: Transformers · PyTorch · Safetensors · English · encoder-decoder · text2text-generation · medical
How to use from the Transformers library
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("IndianaUniversityDatasetsModels/test-BB3")
model = AutoModelForSeq2SeqLM.from_pretrained("IndianaUniversityDatasetsModels/test-BB3")
```
Inputs and Outputs

  • **Expected Input:** `[Problems] + Text`
  • **Target Output:** `[findings] + Text [impression] + Text`
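The formats above can be exercised end to end with `generate`. This is a minimal sketch: the prompt string follows the Expected Input template, but the problem text (`"cough, fever"`) and the generation parameters (`max_new_tokens`) are illustrative assumptions, not values taken from this model card.

```python
# Hedged sketch: run the model on an "[Problems] + Text" style prompt.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("IndianaUniversityDatasetsModels/test-BB3")
model = AutoModelForSeq2SeqLM.from_pretrained("IndianaUniversityDatasetsModels/test-BB3")

# Prompt built from the Expected Input template; the problem list is made up.
prompt = "[Problems] cough, fever"
inputs = tokenizer(prompt, return_tensors="pt")

# max_new_tokens is an arbitrary illustrative cap, not a documented setting.
output_ids = model.generate(**inputs, max_new_tokens=128)

# The decoded text should follow the Target Output template:
# "[findings] ... [impression] ..."
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```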
Model size: 0.2B params (Safetensors) · Tensor types: I64, F32