FinBloom: Knowledge Grounding Large Language Model with Real-time Financial Data
Paper: [arXiv:2502.18471](https://arxiv.org/abs/2502.18471)
FinBloom 7B is a Large Language Model specialized for financial NLP. Built on the BLOOM-7B1 architecture, it was fine-tuned on a large corpus of historical financial text, comprising news archives from Reuters and DPA and more than a decade of official SEC filings. This targeted training equips the model for downstream financial analysis and domain-specific reasoning tasks.
| Dataset | Documents | Mean Words per Doc | Mean Tokens per Doc | Time Period |
|---|---|---|---|---|
| Reuters news | 14,574,641 | 369.23 | 459.09 | 1 Jan 2003 – 31 Dec 2012 |
| DPA news | 387,187 | 286.20 | 390.37 | 1 Jun 2001 – 31 May 2011 |
| SEC filings | 12,238,570 | 379.96 | 536.56 | 31 Mar 2009 – 31 Oct 2023 |
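As a rough indication of corpus scale, the total token counts implied by the table can be estimated as documents × mean tokens per document. The sketch below uses only the figures from the table above:

```python
# Approximate total token counts implied by the table
# (documents x mean tokens per document; values copied from the table above).
datasets = {
    "Reuters news": (14_574_641, 459.09),
    "DPA news": (387_187, 390.37),
    "SEC filings": (12_238_570, 536.56),
}

totals = {name: docs * mean_tokens for name, (docs, mean_tokens) in datasets.items()}

for name, total in totals.items():
    print(f"{name}: ~{total / 1e9:.2f}B tokens")

print(f"Combined: ~{sum(totals.values()) / 1e9:.2f}B tokens")
```

These figures (roughly 13.4B tokens combined) are estimates derived from the reported means, not exact corpus counts.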
Use the code below to get started with the model:

```python
from peft import PeftModel, PeftConfig
from transformers import AutoModelForCausalLM, AutoTokenizer

peft_model_id = "Chaitanya14/FinBloom_7B"

# Load the adapter config, then the base model and tokenizer it points to
config = PeftConfig.from_pretrained(peft_model_id)
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)
model = AutoModelForCausalLM.from_pretrained(config.base_model_name_or_path)

# Attach the FinBloom PEFT adapter to the base model
model = PeftModel.from_pretrained(model, peft_model_id)

# Example generation (the prompt is illustrative)
inputs = tokenizer("Summarize the outlook for US equities.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
If you use the FinBloom 7B LLM, please cite it with the following BibTeX entry:
```bibtex
@misc{sinha2025finbloomknowledgegroundinglarge,
  title={FinBloom: Knowledge Grounding Large Language Model with Real-time Financial Data},
  author={Ankur Sinha and Chaitanya Agarwal and Pekka Malo},
  year={2025},
  eprint={2502.18471},
  archivePrefix={arXiv},
  primaryClass={cs.IR},
  url={https://arxiv.org/abs/2502.18471},
}
```
Base model: bigscience/bloom-7b1