Tags: Robotics · Transformers · Safetensors · qwen2_5_vl · text-generation · vision-language-model · video-language-model · navigation · text-generation-inference
Instructions to use InternRobotics/InternVLA-N1-wo-dagger with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use InternRobotics/InternVLA-N1-wo-dagger with Transformers:

```python
# Load model directly
from transformers import AutoProcessor, AutoModelForCausalLM

processor = AutoProcessor.from_pretrained("InternRobotics/InternVLA-N1-wo-dagger")
model = AutoModelForCausalLM.from_pretrained("InternRobotics/InternVLA-N1-wo-dagger")
```

- Notebooks
- Google Colab
- Kaggle
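Since the model is tagged `qwen2_5_vl` and `vision-language-model`, inputs are presumably passed through the processor's chat template in the standard Transformers multimodal message format. The sketch below only builds such a message list; the image path and instruction text are placeholders, and the assumption that this checkpoint follows the Qwen2.5-VL chat format is mine, not stated on the page.

```python
# Hypothetical sketch: assembling the chat-style message list that
# AutoProcessor.apply_chat_template expects for a vision-language model.
# "frame_000.png" and the instruction string are placeholder values.
def build_messages(image_path, instruction):
    """Return a one-turn multimodal conversation (image + text)."""
    return [
        {
            "role": "user",
            "content": [
                {"type": "image", "image": image_path},
                {"type": "text", "text": instruction},
            ],
        }
    ]

messages = build_messages("frame_000.png", "Navigate to the red door.")
```

With the processor loaded as shown above, this list would typically be rendered to model inputs via `processor.apply_chat_template(messages, ...)`.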
ChaimZhu committed
Commit · b34c8e4 · Parent(s): e906d2b
update config.json
Files changed: config.json (+1 −0)
config.json CHANGED

```diff
@@ -16,6 +16,7 @@
   "n_query": 16,
   "navdp": "NavDP_Policy_DPT_CriticSum_DAT",
   "navdp_pretrained": null,
+  "navdp_version": 0.1,
   "num_attention_heads": 28,
   "num_hidden_layers": 28,
   "num_key_value_heads": 4,
```
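The commit adds a single `"navdp_version": 0.1` field to config.json. A minimal sketch (not from the repo) of reading this field while tolerating older checkpoints that predate it, using a plain `dict.get` default:

```python
import json

# Fragment of config.json after this commit (only the fields shown in the diff).
config_text = """{
  "n_query": 16,
  "navdp": "NavDP_Policy_DPT_CriticSum_DAT",
  "navdp_pretrained": null,
  "navdp_version": 0.1
}"""

config = json.loads(config_text)
# dict.get with a default keeps configs written before this commit loadable;
# the 0.0 fallback value is an assumption, not part of the repo.
version = config.get("navdp_version", 0.0)
```

Loading code that branches on `version` can then handle both old and new checkpoints without a KeyError.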