Hugging Face model repository: sensenova/InteractiveOmni-4B (by SenseNova)
Tags: Any-to-Any · ONNX · Safetensors · Chinese · English · interactiveomni · custom_code · arXiv:2510.13747 · License: MIT
InteractiveOmni-4B · branch: main · 12.4 GB · 1 contributor · 5 commits
Latest commit: 2460f0d (verified) by tongww, "Update README.md", 2 months ago
| File | Size | Last commit message | Last updated |
|---|---|---|---|
| .gitattributes | 1.57 kB | upload initial model | 2 months ago |
| README.md | 34.5 kB | Update README.md | 2 months ago |
| added_tokens.json | 8.96 kB | upload initial model | 2 months ago |
| campplus.onnx | 28.3 MB | upload initial model | 2 months ago |
| config.json | 16.3 kB | upload initial model | 2 months ago |
| configuration_flow.py | 4.2 kB | upload initial model | 2 months ago |
| configuration_hifigan.py | 3.54 kB | upload initial model | 2 months ago |
| configuration_interactiveomni.py | 5.08 kB | upload initial model | 2 months ago |
| configuration_intern_vit.py | 5.55 kB | upload initial model | 2 months ago |
| configuration_voicelm.py | 2.24 kB | upload initial model | 2 months ago |
| configuration_whisper.py | 17 kB | upload initial model | 2 months ago |
| conversation.py | 12.5 kB | upload initial model | 2 months ago |
| generation_config.json | 69 Bytes | upload initial model | 2 months ago |
| merges.txt | 1.67 MB | upload initial model | 2 months ago |
| model-00001-of-00003.safetensors | 4.97 GB | upload initial model | 2 months ago |
| model-00002-of-00003.safetensors | 4.99 GB | upload initial model | 2 months ago |
| model-00003-of-00003.safetensors | 2.44 GB | upload initial model | 2 months ago |
| model.safetensors.index.json | 294 kB | upload initial model | 2 months ago |
| modeling_flow.py | 94.5 kB | upload initial model | 2 months ago |
| modeling_hifigan.py | 18.3 kB | upload initial model | 2 months ago |
| modeling_interactiveomni.py | 34.1 kB | upload initial model | 2 months ago |
| modeling_intern_vit.py | 18 kB | upload initial model | 2 months ago |
| modeling_voicelm.py | 9 kB | upload initial model | 2 months ago |
| modeling_whisper.py | 109 kB | upload initial model | 2 months ago |
| special_tokens_map.json | 7.07 kB | upload initial model | 2 months ago |
| taozi.wav | 807 kB | upload initial model | 2 months ago |
| tokenizer.json | 7.09 MB | upload initial model | 2 months ago |
| tokenizer_config.json | 70.2 kB | upload initial model | 2 months ago |
| vocab.json | 2.78 MB | upload initial model | 2 months ago |
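The weights are split across three `.safetensors` shards, and `model.safetensors.index.json` records which shard holds each parameter. As a minimal sketch of how such an index is structured and queried, assuming hypothetical parameter names (the real index lists the model's actual tensor names):

```python
# Illustrative only: a tiny stand-in for model.safetensors.index.json.
# "weight_map" maps each parameter name to the shard file that stores it;
# the entries below are hypothetical, not taken from InteractiveOmni-4B.
index = {
    "metadata": {"total_size": 12_400_000_000},  # total bytes across shards
    "weight_map": {
        "embed_tokens.weight": "model-00001-of-00003.safetensors",
        "layers.0.attn.weight": "model-00001-of-00003.safetensors",
        "layers.30.mlp.weight": "model-00002-of-00003.safetensors",
        "lm_head.weight": "model-00003-of-00003.safetensors",
    },
}

def shards_for(params, weight_map):
    """Return the set of shard files needed to load the given parameters."""
    return {weight_map[p] for p in params}

# A loader only has to open the shards that contain the requested tensors.
needed = shards_for(["embed_tokens.weight", "lm_head.weight"],
                    index["weight_map"])
print(sorted(needed))
```

In practice a framework such as `transformers` reads this index for you when loading the repository, so the sketch is only meant to show why the index file exists alongside the numbered shards.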