Instructions to use codesage/codesage-small with libraries, inference providers, notebooks, and local apps.

- Libraries
  - Transformers
- Notebooks
  - Google Colab
  - Kaggle

How to use codesage/codesage-small with Transformers (the model's custom classes live in the repo and are pulled in via `trust_remote_code`, so the generic `Auto*` classes are used here; `CodeSage` is not importable from `transformers` itself):

```python
# Load model directly
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("codesage/codesage-small", trust_remote_code=True)
model = AutoModel.from_pretrained("codesage/codesage-small", trust_remote_code=True, dtype="auto")
```
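CodeSage is an encoder that returns per-token hidden states, so turning a code snippet into a single embedding vector typically means pooling those states, commonly a mean over the non-padding positions indicated by the attention mask. A minimal sketch of that pooling step, using NumPy arrays as stand-ins for the model's output tensors (`mean_pool` is a hypothetical helper name, not part of the model's API):

```python
import numpy as np

def mean_pool(last_hidden_state: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token vectors over the unmasked (non-padding) positions.

    last_hidden_state: (batch, seq_len, hidden), attention_mask: (batch, seq_len).
    """
    mask = attention_mask[..., None].astype(last_hidden_state.dtype)  # (batch, seq, 1)
    summed = (last_hidden_state * mask).sum(axis=1)                   # (batch, hidden)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)                    # avoid divide-by-zero
    return summed / counts

# Toy tensors: batch=1, seq_len=3, hidden=2, with the last position padded out,
# so only the first two token vectors should contribute to the average.
hidden = np.array([[[1.0, 2.0], [3.0, 4.0], [100.0, 100.0]]])
mask = np.array([[1, 1, 0]])
embedding = mean_pool(hidden, mask)  # → [[2.0, 3.0]]
```

With the real model, the same pooling would be applied to `model(**inputs).last_hidden_state` and `inputs["attention_mask"]` (converted from torch tensors or computed with torch directly).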
Upload tokenizer_config.json
tokenizer_config.json (+5 −2)

```diff
@@ -26,7 +26,10 @@
   "eos_token": "<|endoftext|>",
   "add_eos_token": true,
   "model_max_length": 1000000000000000019884624838656,
-  "tokenizer_class": "CodeSageTokenizer",
   "unk_token": "<|endoftext|>",
-  "vocab_size": 49152
+  "vocab_size": 49152,
+  "tokenizer_class": "CodeSageTokenizer",
+  "auto_map": {
+    "AutoTokenizer": ["tokenization_codesage.CodeSageTokenizer", null]
+  }
 }
```
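The key addition in this commit is the `auto_map` entry: it tells `AutoTokenizer.from_pretrained(..., trust_remote_code=True)` which module and class in the model repo implement the tokenizer (first slot: slow tokenizer; second slot, `null` here, would be a fast tokenizer). A toy illustration of that lookup, not the real transformers resolver, with `resolve_auto_class` a hypothetical name:

```python
# tokenizer_config.json fragment from the commit above.
config = {
    "tokenizer_class": "CodeSageTokenizer",
    "auto_map": {"AutoTokenizer": ["tokenization_codesage.CodeSageTokenizer", None]},
}

def resolve_auto_class(config: dict, auto_class: str):
    """Return (module_name, class_name) for the slow-tokenizer entry, or None."""
    entry = config.get("auto_map", {}).get(auto_class)
    if entry is None:
        return None
    # entry[0] is a dotted "module.ClassName" reference into the model repo;
    # transformers downloads that module and imports the class from it.
    module, _, cls = entry[0].rpartition(".")
    return module, cls

print(resolve_auto_class(config, "AutoTokenizer"))
# → ('tokenization_codesage', 'CodeSageTokenizer')
```

This is why the load snippet at the top can use `AutoTokenizer` even though `CodeSageTokenizer` is defined in the repo's own `tokenization_codesage.py` rather than in the transformers library.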