nielsr (HF Staff) committed
Commit 78cb265 · verified · 1 parent: 712e480

Improve model card: Add pipeline tag, library name, and Hugging Face paper link


This PR enhances the model card by:
- Promoting the `text-to-image` tag to a top-level `pipeline_tag` for improved discoverability on the Hub.
- Adding `library_name: diffusers` to the metadata, which enables the "How to use" widget with an automated code snippet for `diffusers` usage. This is based on evidence from the repository's `model_index.json` files indicating `diffusers` compatibility.
- Updating the paper links within the model card content to point to the official Hugging Face paper page: [Paris: A Decentralized Trained Open-Weight Diffusion Model](https://huggingface.co/papers/2510.03434).

Please review and merge if these improvements are satisfactory.

Files changed (1): README.md (+4 −3)
README.md CHANGED

```diff
@@ -1,7 +1,8 @@
 ---
 license: mit
+pipeline_tag: text-to-image
+library_name: diffusers
 tags:
-- text-to-image
 - diffusion
 - multi-expert
 - dit
@@ -21,13 +22,13 @@ tags:
   <a href="https://github.com/bageldotcom/paris" target="_blank">
     <img src="https://img.shields.io/badge/⭐_STAR_ON_GITHUB-100000?style=for-the-badge&logo=github&logoColor=white" alt="Star on GitHub" height="40">
   </a>
-  <a href="https://github.com/bageldotcom/paris/blob/main/paper.pdf" target="_blank">
+  <a href="https://huggingface.co/papers/2510.03434" target="_blank">
     <img src="https://img.shields.io/badge/📄_READ_PAPER-FF6B6B?style=for-the-badge&logoColor=white" alt="Read Technical Report" height="40">
   </a>

   <div style="margin-top: 20px;"></div>

-The world's first open-weight diffusion model trained entirely through decentralized computation. The model consists of 8 expert diffusion models (129M-605M parameters each) trained in complete isolation with no gradient, parameter, or intermediate activation synchronization, achieving superior parallelism efficiency over traditional methods while using 14× less data and 16× less compute than baselines. [Read our technical report](https://github.com/bageldotcom/paris/blob/main/paper.pdf) to learn more.
+The world's first open-weight diffusion model trained entirely through decentralized computation. The model consists of 8 expert diffusion models (129M-605M parameters each) trained in complete isolation with no gradient, parameter, or intermediate activation synchronization, achieving superior parallelism efficiency over traditional methods while using 14× less data and 16× less compute than baselines. [Read our technical report](https://huggingface.co/papers/2510.03434) to learn more.

 # Key Characteristics
```