Instructions to use NiharMandahas/Os_script_evaluator with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use NiharMandahas/Os_script_evaluator with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("NiharMandahas/Os_script_evaluator", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
Update config.json
#1 by raghoeveer - opened
- config.json +1 -1

config.json CHANGED

```diff
@@ -1,4 +1,4 @@
-{
+{
   "alpha_pattern": {},
   "auto_mapping": null,
   "base_model_name_or_path": "NousResearch/Llama-2-7b-chat-hf",
```
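The fields in the diff (`alpha_pattern`, `base_model_name_or_path`) indicate this config.json is a PEFT-style adapter config that records which base checkpoint the adapter was trained on. As a minimal sketch, the fragment shown above can be read with Python's standard `json` module; the snippet below reproduces only the four lines visible in the diff, with the remaining fields omitted:

```python
import json

# The first four lines of config.json as shown in the diff above
# (the rest of the file is omitted, so the trailing comma is dropped
# to keep this fragment valid JSON on its own).
config_text = """{
  "alpha_pattern": {},
  "auto_mapping": null,
  "base_model_name_or_path": "NousResearch/Llama-2-7b-chat-hf"
}"""

config = json.loads(config_text)

# The adapter declares the base checkpoint it was trained against.
print(config["base_model_name_or_path"])  # NousResearch/Llama-2-7b-chat-hf
```

Tools such as the PEFT library read `base_model_name_or_path` from this file to decide which base model to load before attaching the adapter weights.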