nielsr (HF Staff) committed · verified
Commit 0679897 · 1 parent: 98a8323

Add `library_name: transformers` to metadata


This PR updates the model card by adding `library_name: transformers` to the metadata.

Evidence for `transformers` compatibility:
- The `config.json` file contains `"transformers_version": "4.46.2"` and `"architectures": ["LLaDAForMultiModalGeneration"]`, along with `auto_map` entries pointing to `configuration_llada.LLaDAConfig` and `modeling_llada.LLaDAModelLM`.
- The presence of standard Hugging Face files like `tokenizer.json`, `special_tokens_map.json`, `tokenizer_config.json`, and `generation_config.json` further confirms compatibility.

This addition enables the automated "How to use" widget on the model page, which shows a code snippet for loading the model with the 🤗 Transformers library, improving its discoverability and ease of use.
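After the change, the README front matter would read roughly as follows (a sketch assembled from the diff in this PR; only the `library_name` line is new):

```yaml
---
pipeline_tag: any-to-any
tags:
- Diffusion Large Language Model
- Multi-Modal Generation and Understanding
library_name: transformers
---
```

The Hub reads this YAML block to decide which library's loading snippet to display in the widget.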

Files changed (1): README.md (+2 −2)
README.md CHANGED

````diff
@@ -4,6 +4,7 @@ pipeline_tag: any-to-any
 tags:
 - Diffusion Large Language Model
 - Multi-Modal Generation and Understanding
+library_name: transformers
 ---
 
 <p align="center">
@@ -127,5 +128,4 @@ This work was also supported and implemented by [MindSpeed MM](https://gitee.com
   year={2025},
   url={https://github.com/Alpha-VLLM/Lumina-DiMOO},
 }
-```
-
+```
````