---
license: mit
language:
- en
---
# TaxBERT

This repository accompanies the paper: Hechtner, F., Schmidt, L., Seebeck, A., & Weiß, M. (2025). How to design and employ specialized large language models for accounting and tax research: The example of TaxBERT.
TaxBERT is a domain-adapted RoBERTa model specifically designed to analyze qualitative corporate tax disclosures.
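
Since TaxBERT is a domain-adapted RoBERTa checkpoint, it can be loaded with the Hugging Face `transformers` library like any other RoBERTa model. The sketch below shows one way to obtain sentence embeddings for a disclosure sentence; the repo id `TaxBERT/TaxBERT`, the example sentence, and the mean-pooling step are illustrative assumptions, not part of the paper's pipeline.

```python
# Minimal sketch: load TaxBERT via transformers and embed a disclosure sentence.
# The repo id "TaxBERT/TaxBERT" is an assumption; replace it with the actual
# model id of this repository.
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "TaxBERT/TaxBERT"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

sentence = "The company is subject to ongoing tax audits in several jurisdictions."
inputs = tokenizer(sentence, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the last hidden states into a single sentence embedding.
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)  # (1, hidden_size), e.g. (1, 768) for a RoBERTa-base model
```

Downstream classification tasks (such as the planned tax sentence recognition) would typically add a fine-tuned classification head on top of these representations, e.g. via `AutoModelForSequenceClassification`.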
In the future, we will add the following features:
- Tax Sentence Recognition
- Tax Risk Sentiment

**SSRN**: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5146523
The paper provides an 'A-to-Z' description of how to design and employ specialized Bidirectional Encoder Representations from Transformers (BERT) models that are environmentally sustainable and practically feasible for accounting and tax researchers.

**GitHub**: https://github.com/TaxBERT/TaxBERT

If this guide/repository is used for academic or scientific purposes, please cite the paper.