Add link to paper

#2
by nielsr HF Staff - opened
Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -26,7 +26,7 @@ For more information and implementation details, visit our github repository: [D
  The model, which has a context length of `1024` and is similar in size to GPT2-medium with approximately `130 million` non-embedding parameters,
  was trained for 1M steps on the OpenWebText corpus.
 
- For more details, please see our paper: [The Diffusion Duality](https://openreview.net/forum?id=CB0Ub2yXjC).
+ For more details, please see our paper: [The Diffusion Duality](https://huggingface.co/papers/2506.10892), also available on [OpenReview](https://openreview.net/forum?id=CB0Ub2yXjC).