The project is divided into the core orchestrator and specialized Flower Apps.
These applications are built upon the FlowerTune LLM templates for the paper "FlowerTune: A Cross-Domain Benchmark for Federated Fine-Tuning of Large Language Models" (https://arxiv.org/abs/2506.02961), presented at the NeurIPS 2025 conference, showcasing federated fine-tuning of language models on specialised tasks.

- **BlossomTuneLLM**: A Flower App for federated fine-tuning of transformers-based Large Language Models. https://github.com/ethicalabs-ai/BlossomTuneLLM
- **BlossomTuneLLM-MLX**: A specialised Flower App for federated fine-tuning of Apple's MLX-LM models. https://github.com/ethicalabs-ai/BlossomTuneLLM-MLX

## Citations

```bibtex
@misc{gao-2025,
  author = {Gao, Yan and Scamarcia, Massimo Roberto and Fernandez-Marques, Javier and Naseri, Mohammad and Ng, Chong Shen and Stripelis, Dimitris and Li, Zexi and Shen, Tao and Bai, Jiamu and Chen, Daoyuan and Zhang, Zikai and Hu, Rui and Song, InSeo and KangYoon, Lee and Jia, Hong and Dang, Ting and Wang, Junyan and Liu, Zheyuan and Beutel, Daniel Janes and Lyu, Lingjuan and Lane, Nicholas D.},
  title  = {{FlowerTune: A Cross-Domain Benchmark for Federated Fine-Tuning of Large Language Models}},
  year   = {2025},
  month  = {6},
  url    = {https://arxiv.org/abs/2506.02961},
}
```