Text Generation
Transformers
Safetensors
German
mistral
conversational
text-generation-inference

Improve model card with metadata and links

#1 opened by nielsr (HF Staff)
Files changed (1)
  1. README.md +45 -5
README.md CHANGED
@@ -1,13 +1,49 @@
- We present SteuerLLM, a domain-adapted LLM for German tax law trained on a large-scale synthetic dataset generated from authentic examination material using a controlled retrieval-augmented pipeline. SteuerLLM (28B parameters) consistently outperforms general-purpose instruction-tuned models of comparable size and, in several cases, substantially larger systems, demonstrating that domain-specific data and architectural adaptation are more decisive than parameter scale for performance on realistic legal reasoning tasks.
- Demo: https://steuerllm.i5.ai.fau.de/
- Checkout all the details on Github: https://github.com/windprak/steuerllm
- It can be served via Transformers, vLLM, SGlang, .. basically any Framework since it is based on Mistral (Expanded Mistral Small from 24B to 28B).
- Recommended Temperature: 0.3
  ```bibtex
  @article{steuerllm,
@@ -18,3 +54,7 @@ Recommended Temperature: 0.3
  url = {https://arxiv.org/abs/2602.11081}
  }
  ```
+ ---
+ language:
+ - de
+ license: other
+ library_name: transformers
+ pipeline_tag: text-generation
+ datasets:
+ - windprak/steuerllm_pretraining_dataset
+ - windprak/steuerllm_instruct_dataset
+ - windprak/SteuerEx
+ ---
+
+ # SteuerLLM: Local specialized large language model for German tax law analysis
+
+ SteuerLLM is a domain-adapted Large Language Model (LLM) with 28 billion parameters, designed specifically for German tax law analysis. It was introduced in the paper [SteuerLLM: Local specialized large language model for German tax law analysis](https://huggingface.co/papers/2602.11081).
+
+ The model excels in domains governed by strict formal rules, precise terminology, and legally binding structures, such as tax law, where correct answers require exact statutory citation, structured legal argumentation, and numerical accuracy.
+
+ - **Paper:** [SteuerLLM: Local specialized large language model for German tax law analysis](https://huggingface.co/papers/2602.11081)
+ - **GitHub Repository:** [https://github.com/windprak/steuerllm](https://github.com/windprak/steuerllm)
+ - **Demo:** [https://steuerllm.i5.ai.fau.de/](https://steuerllm.i5.ai.fau.de/)
+
+ ## Model Description
+
+ SteuerLLM is based on an expanded Mistral Small architecture (extended from 24B to 28B parameters through a block expansion method). It was trained on a large-scale synthetic dataset generated from authentic German university tax law examination material using a controlled retrieval-augmented pipeline.
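+
+ The exact expansion recipe is described in the paper, not in this card; the sketch below only illustrates the block-expansion idea (copied blocks initialized to act as identities, so the expanded model starts from the base model's function). The toy block structure, insertion interval, and zeroed projection are illustrative assumptions, not SteuerLLM's actual configuration.
+
+ ```python
+ import copy
+ import torch
+ import torch.nn as nn
+
+ class ToyBlock(nn.Module):
+     """Stand-in for a transformer block with a residual connection."""
+     def __init__(self, dim: int):
+         super().__init__()
+         self.ff = nn.Linear(dim, dim)
+         self.out_proj = nn.Linear(dim, dim)
+
+     def forward(self, x):
+         return x + self.out_proj(torch.relu(self.ff(x)))
+
+ def expand(layers: nn.ModuleList, every: int = 6) -> nn.ModuleList:
+     """Insert a copy of every `every`-th block directly after it."""
+     expanded = []
+     for i, layer in enumerate(layers):
+         expanded.append(layer)
+         if (i + 1) % every == 0:
+             new = copy.deepcopy(layer)
+             # Zeroed output projection plus the residual path makes the
+             # copied block an identity function at initialization.
+             nn.init.zeros_(new.out_proj.weight)
+             nn.init.zeros_(new.out_proj.bias)
+             expanded.append(new)
+     return nn.ModuleList(expanded)
+
+ blocks = nn.ModuleList(ToyBlock(64) for _ in range(36))
+ print(len(expand(blocks)))  # 42: six identity copies interleaved
+ ```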
+
+ The training procedure follows a two-stage approach:
+ 1. **Continual Pretraining:** The base model's representations are adapted to tax-specific terminology and concepts by pretraining on domain-filtered web data.
+ 2. **Instruction Fine-tuning:** The model is then fine-tuned on synthetically generated question-answer pairs derived from primary German legal sources (e.g., EStG, AO, KStG) using the "Water Fountain Algorithm." This algorithm employs retrieval-augmented generation with semantic ranking to ensure factual grounding and contextual relevance; a sketch of such a generation loop follows this list.
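+
+ The sketch below is a hedged illustration of a retrieval-augmented generation loop with semantic ranking, not the paper's actual Water Fountain Algorithm; the `embed` and `generate` helpers are hypothetical placeholders for a sentence-embedding model and a generator LLM.
+
+ ```python
+ import numpy as np
+
+ def embed(texts):
+     """Placeholder for any sentence-embedding model (hypothetical)."""
+     rng = np.random.default_rng(0)
+     return rng.normal(size=(len(texts), 384))
+
+ def generate(prompt):
+     """Placeholder for an instruction-tuned generator LLM (hypothetical)."""
+     return {"question": "...", "answer": "..."}
+
+ def build_qa_pairs(seed_questions, statute_passages, top_k=4):
+     # Embed and L2-normalize once so dot products equal cosine similarity.
+     vecs = embed(statute_passages)
+     vecs = vecs / np.linalg.norm(vecs, axis=1, keepdims=True)
+     pairs = []
+     for seed in seed_questions:
+         q = embed([seed])[0]
+         q = q / np.linalg.norm(q)
+         ranking = np.argsort(vecs @ q)[::-1][:top_k]  # semantic ranking
+         context = "\n\n".join(statute_passages[i] for i in ranking)
+         prompt = (
+             "Using only the statute excerpts below, write a new exam-style "
+             "question and a fully cited answer.\n\n"
+             f"{context}\n\nSeed question: {seed}"
+         )
+         pairs.append(generate(prompt))
+     return pairs
+ ```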
+
+ SteuerLLM consistently outperforms general-purpose instruction-tuned models of comparable size and, in several cases, substantially larger systems, demonstrating the critical role of domain-specific data and architectural adaptation for performance on realistic legal reasoning tasks.
+
+ ## Evaluation
+
+ The model's performance was evaluated using **SteuerEx**, the first open benchmark derived from authentic German university tax law examinations. SteuerEx comprises 115 expert-validated examination questions spanning six core tax law domains and multiple academic levels, utilizing a statement-level, partial-credit evaluation framework.
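+
+ As an illustration of statement-level, partial-credit scoring: an answer earns the fraction of scoring-relevant reference statements it covers. The concrete grading protocol is defined in the paper; the coverage check below is a naive hypothetical stand-in.
+
+ ```python
+ def covers(answer: str, statement: str) -> bool:
+     """Naive placeholder; the benchmark's actual check is expert-defined."""
+     return statement.lower() in answer.lower()
+
+ def partial_credit(answer: str, statements: list[str], max_points: float) -> float:
+     """Award the fraction of reference statements the answer covers."""
+     hit = sum(covers(answer, s) for s in statements)
+     return max_points * hit / len(statements)
+
+ # Two of four reference statements covered -> half of the points.
+ print(partial_credit(
+     "Lineare AfA nach § 7 EStG ...",
+     ["§ 7 EStG", "lineare AfA", "Nutzungsdauer", "Bemessungsgrundlage"],
+     4.0,
+ ))  # 2.0
+ ```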
+
+ ## Usage
+
+ SteuerLLM can be served via various frameworks, including **Transformers**, **vLLM**, and **SGLang**, as it is based on the Mistral architecture.
+
+ **Recommended Inference Parameters:**
+ - **Temperature:** 0.3
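+
+ A minimal Transformers example follows; the checkpoint ID `windprak/steuerllm` is an assumption for illustration, so substitute the actual ID of this repository.
+
+ ```python
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ model_id = "windprak/steuerllm"  # assumed ID; replace with this repo's ID
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ model = AutoModelForCausalLM.from_pretrained(
+     model_id, device_map="auto", torch_dtype="auto"
+ )
+
+ messages = [{"role": "user", "content": "Was regelt § 7 EStG?"}]
+ inputs = tokenizer.apply_chat_template(
+     messages, add_generation_prompt=True, return_tensors="pt"
+ ).to(model.device)
+
+ # Sampling with the recommended temperature of 0.3.
+ outputs = model.generate(inputs, max_new_tokens=512, do_sample=True, temperature=0.3)
+ print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
+ ```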
+
+ ## Citation
+
+ If you use this work, please cite:
+
  ```bibtex
  @article{steuerllm,
 
  url = {https://arxiv.org/abs/2602.11081}
  }
  ```
+
+ ## License
+
+ Research and academic use only.