---
license: mit
datasets:
- omarmomen/babylm_10M
language:
- en
metrics:
- perplexity
library_name: transformers
---

# Model Card for omarmomen/ptb_bpe_tokenizer_10k
This model is part of the experiments in my master's thesis, "Linguistic Structure Induction from Language Models" (https://arxiv.org/abs/2403.09714).

omarmomen/ptb_bpe_tokenizer_10k is a RobertaTokenizer trained on the cased Penn Treebank (PTB) training set with a vocabulary size of 10K.