---
license: mit
---
# Language model trained with reversed words
The model weights here were trained on TinyStories, but with the words and punctuation reversed.
The approach is described at:
https://openright.org/reversed-llm-training/
## Running the reverse inference
The model can be run with these weights using the `llama2.c` project.
Input tokens are expected in reverse order, and output tokens are produced in reverse order as well; for example, "This is a test." becomes ".test a is This".
We can simply run with reversed input and read the reversed output.
```
./run revstories15M.bin -s 3 -i '.sky the in butterfly'
```
> .sky the in butterfly purple beautiful the about family her tell
> ...
To keep both the input and the output in normal order, we can use the `wtac.py` script to reverse the words on each side.
```
./run revstories15M.bin -s 3 -i "$(echo 'butterfly in the sky.' | python3 wtac.py)" | python3 ./wtac.py
```
> Once upon a time, there was a little girl named Lily. She loved to play in the park with her friends. One day, the butterfly landed on Lily's hands and led her to a flower. Lily was very happy and couldn't wait to tell her family about the beautiful purple butterfly in the sky.
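The `wtac.py` script itself is not reproduced here. As a minimal sketch of what such a word-reverser might look like (an assumption inferred from the examples above, where attached punctuation flips with its word so that "test." becomes ".test"), one self-inverse implementation is:

```python
import re
import sys

def flip_chunk(chunk: str) -> str:
    # Reverse the word/punctuation pieces inside one whitespace-delimited
    # chunk, so "test." becomes ".test" and ".test" becomes "test." again.
    return "".join(reversed(re.findall(r"\w+|\W+", chunk)))

def reverse_words(line: str) -> str:
    # Reverse the order of the chunks, then flip each one. On single-spaced
    # text this transform is its own inverse, so the same script can encode
    # the prompt and decode the model's output.
    return " ".join(flip_chunk(c) for c in reversed(line.split()))

if __name__ == "__main__":
    for line in sys.stdin:
        print(reverse_words(line.rstrip("\n")))

# e.g. reverse_words("This is a test.") -> ".test a is This"
```

Because the transform is an involution, piping text through the script twice returns the original, which is why the same `wtac.py` appears on both sides of the `./run` pipeline above.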