Impossible language models trained from scratch with a GPT-2 Small architecture that lacks positional encodings.
- mission-impossible-lms/no-shuffle-gpt2-no-pos • 0.1B • Updated • 3
- mission-impossible-lms/nondeterministic-shuffle-gpt2-no-pos • 0.1B • Updated • 3
- mission-impossible-lms/deterministic-shuffle-s21-gpt2-no-pos • 0.1B • Updated • 1
- mission-impossible-lms/deterministic-shuffle-s57-gpt2-no-pos • 0.1B • Updated • 2
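The models above share a GPT-2 Small backbone with positional encodings removed. Stock GPT-2 in Hugging Face `transformers` has no switch for dropping them, so one common way to approximate the setup is to zero and freeze the learned position embeddings; a minimal sketch assuming standard `GPT2LMHeadModel` internals (the collection's actual training code may differ):

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# GPT-2 Small defaults: 12 layers, 768 hidden dims, ~124M (0.1B) parameters.
config = GPT2Config()
model = GPT2LMHeadModel(config)

# Zero the learned position embedding table (wpe) and freeze it so the
# model receives no positional signal and never relearns one in training.
with torch.no_grad():
    model.transformer.wpe.weight.zero_()
model.transformer.wpe.weight.requires_grad = False
```

With this change the token embeddings alone feed the first transformer block, so token order can only be inferred indirectly (e.g. through the causal attention mask), which is the property these no-pos variants probe.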