---
inference: false
---

<br>
<br>
# LWM-Text-Chat-512K Model Card

## Model details

**Model type:**
LWM-Text-Chat-512K is an open-source model trained from LLaMA-2 on a filtered subset of the Books3 dataset. It is an auto-regressive language model based on the transformer architecture.
**Model date:**
LWM-Text-Chat-512K was trained in December 2023.

**Paper or resources for more information:**
https://largeworldmodel.github.io/
## License
Llama 2 is licensed under the LLAMA 2 Community License,
Copyright (c) Meta Platforms, Inc. All Rights Reserved.

**Where to send questions or comments about the model:**
https://github.com/LargeWorldModel/lwm/issues

## Training dataset
- A 3500-document subset of Books3, consisting of documents with 500K to 1M tokens each
| *** | |
| Vanilla Quantization by [nold](https://huggingface.co/nold), Original Model [LargeWorldModel/LWM-Text-Chat-512K](https://huggingface.co/LargeWorldModel/LWM-Text-Chat-512K). Created using [llm-quantizer](https://github.com/Nold360/llm-quantizer) Pipeline - c4aee1019a651a28a4f89587b96894ae7188734f | |