# BUAADreamer/Chinese-LLaVA-Med-7B
- Pipeline: Visual Question Answering (image-text-to-text)
- Library: Transformers (Safetensors weights)
- Datasets: BUAADreamer/llava-med-zh-instruct-60k, BUAADreamer/llava-med-zh-eval
- Language: Chinese
- Tags: llava, llama-factory
- License: apache-2.0
Branch: main. Repository size: 14.1 GB, 1 contributor, 10 commits.
Latest commit: 9382b99 (verified) by BUAADreamer, "Update README.md", almost 2 years ago.
All files were last updated almost 2 years ago.

| File | Size | Last commit message |
|---|---|---|
| .gitattributes | 1.52 kB | initial commit |
| README.md | 447 Bytes | Update README.md |
| added_tokens.json | 41 Bytes | Upload tokenizer |
| config.json | 1.02 kB | Upload LlavaForConditionalGeneration |
| generation_config.json | 136 Bytes | Upload LlavaForConditionalGeneration |
| model-00001-of-00008.safetensors | 1.95 GB | Upload LlavaForConditionalGeneration |
| model-00002-of-00008.safetensors | 1.93 GB | Upload LlavaForConditionalGeneration |
| model-00003-of-00008.safetensors | 1.99 GB | Upload LlavaForConditionalGeneration |
| model-00004-of-00008.safetensors | 1.99 GB | Upload LlavaForConditionalGeneration |
| model-00005-of-00008.safetensors | 1.99 GB | Upload LlavaForConditionalGeneration |
| model-00006-of-00008.safetensors | 1.99 GB | Upload LlavaForConditionalGeneration |
| model-00007-of-00008.safetensors | 1.93 GB | Upload LlavaForConditionalGeneration |
| model-00008-of-00008.safetensors | 353 MB | Upload LlavaForConditionalGeneration |
| model.safetensors.index.json | 70.1 kB | Upload LlavaForConditionalGeneration |
| preprocessor_config.json | 819 Bytes | Upload processor |
| special_tokens_map.json | 552 Bytes | Upload tokenizer |
| tokenizer.json | 1.84 MB | Upload tokenizer |
| tokenizer.model | 500 kB | Upload tokenizer |
| tokenizer_config.json | 1.95 kB | Upload tokenizer |