**Model type:**
LWM-32K-Jax is an open-source model trained from LLaMA-2 on a filtered subset of Books3 data, along with a large collection of image and video data. It is an auto-regressive vision-language model based on the transformer architecture. These are the Jax/Flax versions of the parameters.
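
To make the "auto-regressive" part concrete, here is a toy sketch (illustrative only, not the LWM codebase or API): the model repeatedly scores the next token conditioned on everything generated so far, and each prediction is fed back into the context. The vocabulary and `toy_model` scorer below are hypothetical stand-ins.

```python
# Toy sketch of auto-regressive decoding (illustrative only; not the LWM API).
# A "model" maps a token prefix to scores over a small vocabulary; generation
# feeds each predicted token back into the context for the next step.

VOCAB = ["<bos>", "a", "cat", "sat", "<eos>"]

def toy_model(prefix):
    """Hypothetical next-token scorer: deterministically walks the vocab."""
    nxt = min(len(prefix), len(VOCAB) - 1)
    return [1.0 if i == nxt else 0.0 for i in range(len(VOCAB))]

def generate(max_steps=10):
    tokens = ["<bos>"]
    for _ in range(max_steps):
        scores = toy_model(tokens)           # condition on the full prefix
        nxt = VOCAB[max(range(len(scores)), key=scores.__getitem__)]
        tokens.append(nxt)                   # feed the prediction back in
        if nxt == "<eos>":
            break
    return tokens

print(generate())  # ['<bos>', 'a', 'cat', 'sat', '<eos>']
```

In the real model the scorer is a transformer over interleaved text, image, and video tokens, but the generation loop has this same shape.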

**Model date:**
LWM-32K-Jax was trained in January 2024.

**Paper or resources for more information:**
https://largeworldmodel.github.io/

## License
Llama 2 is licensed under the LLAMA 2 Community License,
Copyright (c) Meta Platforms, Inc. All Rights Reserved.

**Where to send questions or comments about the model:**
https://github.com/LargeWorldModel/lwm/issues