---
base_model: google/flan-t5-large
library_name: peft
---
The Financial Agent flan-t5 is a compact, high-efficiency version of the Financial Agent architecture. Built upon the flan-t5-large backbone, this model has been specifically fine-tuned on the Financial Context Dataset to act as a linguistic bridge between user queries and structured data systems. It specializes in decomposing complex, natural language financial prompts into precise retrieval parameters, enabling a seamless handoff to downstream Data Modules for accurate and optimized information sourcing.
## How to Get Started with the Model
Use the code below to get started with the model.
```python
from peft import PeftModel, PeftConfig
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

peft_model_id = "Chaitanya14/Financial_Agent_flant5"

# Read the adapter config to find the base model it was trained on.
config = PeftConfig.from_pretrained(peft_model_id)

# Load the base flan-t5-large tokenizer and model.
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)
model = AutoModelForSeq2SeqLM.from_pretrained(config.base_model_name_or_path)

# Attach the fine-tuned PEFT adapter weights to the base model.
model = PeftModel.from_pretrained(model, peft_model_id)
```
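Once loaded, the adapter-augmented model can be used for standard seq2seq generation. The snippet below continues from the loading code above and is only a minimal sketch: the sample query and the generation parameters are illustrative assumptions, since the exact prompt format expected by the fine-tuned model is not documented here.

```python
# Illustrative usage sketch (assumed prompt format; continues from the loading code above).
query = "What was the quarterly revenue trend for Apple over the last two years?"
inputs = tokenizer(query, return_tensors="pt")

# Decompose the natural-language query into retrieval parameters
# for the downstream Data Modules.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```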
## Citation
If you use the Financial Agent flan-t5 model, please cite it with the following BibTeX entry:
```bibtex
@misc{sinha2025finbloomknowledgegroundinglarge,
      title={FinBloom: Knowledge Grounding Large Language Model with Real-time Financial Data},
      author={Ankur Sinha and Chaitanya Agarwal and Pekka Malo},
      year={2025},
      eprint={2502.18471},
      archivePrefix={arXiv},
      primaryClass={cs.IR},
      url={https://arxiv.org/abs/2502.18471},
}
```