lccurious committed on
Commit c2e6723 · verified · 1 Parent(s): e43dd0e

Update README.md

Files changed (1):
  1. README.md +4 -4
README.md CHANGED
@@ -8,7 +8,7 @@ tags:
 - text_generation
 ---
 LLaDA2.0-flash-preview
-**LLaDA2-flash-preview** is a diffusion language model featuring a 100BA6B Mixture-of-Experts (MoE) architecture. As an enhanced, instruction-tuned iteration of the LLaDA2.0 series, it is optimized for practical applications.
+**LLaDA2.0-flash-preview** is a diffusion language model featuring a 100BA6B Mixture-of-Experts (MoE) architecture. As an enhanced, instruction-tuned iteration of the LLaDA2.0 series, it is optimized for practical applications.
 
 <div align="center">
 <img src="https://mdn.alipayobjects.com/huamei_qa8qxu/afts/img/A*kLORSaRfSK8AAAAAgIAAAAgAemJ7AQ/original" width="800" />
@@ -65,8 +65,8 @@ Fully open-source with commitment to transparency. We plan to release a **leadin
 ## 📦 Model Variants
 | Model ID | Description | Hugging Face Link |
 | --- | --- | --- |
-| `inclusionAI/LLaDA2-mini-preview` | Instruction-tuned model, ready for downstream applications. | [🤗 Model Card](https://huggingface.co/inclusionAI/LLaDA2.0-mini-preview) |
-| `inclusionAI/LLaDA2-flash-preview` | Instruction-tuned model, ready for downstream applications. | [🤗 Model Card](https://huggingface.co/inclusionAI/LLaDA2.0-flash-preview) |
+| `inclusionAI/LLaDA2.0-mini-preview` | Instruction-tuned model, ready for downstream applications. | [🤗 Model Card](https://huggingface.co/inclusionAI/LLaDA2.0-mini-preview) |
+| `inclusionAI/LLaDA2.0-flash-preview` | Instruction-tuned model, ready for downstream applications. | [🤗 Model Card](https://huggingface.co/inclusionAI/LLaDA2.0-flash-preview) |
 
 
 ---
@@ -93,7 +93,7 @@ import torch.nn.functional as F
 from transformers import AutoModelForCausalLM
 from transformers import AutoTokenizer
 
-model_path = "/path/to/LLaDA2-mini-preview"
+model_path = "/path/to/LLaDA2.0-mini-preview"
 device = "auto"
 model = AutoModelForCausalLM.from_pretrained(
 model_path, trust_remote_code=True, device_map=device
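Every change in this commit is the same one-character-class fix: inserting the missing ".0" after "LLaDA2" in model IDs and paths. A minimal sketch of that rename applied programmatically; `fix_model_id` is a hypothetical helper, not part of the repository:

```python
import re

def fix_model_id(text: str) -> str:
    # Insert the missing ".0" after "LLaDA2" wherever it is not
    # already present (negative lookahead skips correct IDs).
    return re.sub(r"LLaDA2(?!\.0)", "LLaDA2.0", text)

print(fix_model_id("inclusionAI/LLaDA2-mini-preview"))     # inclusionAI/LLaDA2.0-mini-preview
print(fix_model_id("/path/to/LLaDA2-mini-preview"))        # /path/to/LLaDA2.0-mini-preview
print(fix_model_id("inclusionAI/LLaDA2.0-flash-preview"))  # already correct, unchanged
```

The lookahead makes the substitution idempotent, so running it over an already-fixed README leaves the file untouched.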