# LLaDA-100M-Test
---
datasets:
  - Fredtt3/LLaDA-Sample-10BT
  - Fredtt3/LLaDA-Sample-ES
language:
  - en
  - es
pipeline_tag: text-generation
library_name: transformers
---

This is a new checkpoint trained on a single NVIDIA H100 for 8,000 steps (65,536,000 tokens).

It is not yet a competent model: it falls far short of the commonly cited minimum of 20-30 training tokens per parameter. It can, however, give a better idea of how a properly trained model would perform.
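A quick back-of-the-envelope check of that training budget (assuming the model has roughly 100M parameters, inferred from the repo name; the exact count is not stated in this card):

```python
# Sanity-check the training budget from the figures above.
TOTAL_TOKENS = 65_536_000
STEPS = 8_000
PARAMS = 100_000_000  # assumption: ~100M, inferred from the model name

tokens_per_step = TOTAL_TOKENS // STEPS   # tokens processed per optimizer step
tokens_per_param = TOTAL_TOKENS / PARAMS  # ratio against the 20-30 guideline

print(tokens_per_step)                # 8192
print(f"{tokens_per_param:.2f}")      # 0.66 -- far below 20-30 tokens/param

# How many times the current token count the lower bound (20 tokens/param) needs:
print((20 * PARAMS) // TOTAL_TOKENS)  # 30
```

So even the low end of the guideline would require roughly 30x more training tokens than this checkpoint has seen.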

If you want to try it, a usage example is provided in test_gen.py, or you can use this Google Colab notebook.

Example of the results it gives:

*(screenshot of example output)*

For those who want to train the model themselves and export it in the correct format for loading with transformers, everything needed is in pre_trainv2.py in the project repo.