Datasets:
id
stringlengths 7
117
| author
stringclasses 6
values | sha
null | created_at
timestamp[us, tz=UTC] | last_modified
null | disabled
null | downloads
int64 0
18.6M
| downloads_all_time
null | gated
bool 1
class | gguf
null | inference
null | likes
int64 0
4.77k
| library_name
stringclasses 36
values | tags
sequencelengths 1
430
| pipeline_tag
stringclasses 32
values | mask_token
null | model_index
null | trending_score
int64 0
132
| architectures
sequencelengths 1
5
⌀ | bos_token_id
int64 -1
256k
⌀ | eos_token_id
int64 -1
256k
⌀ | hidden_act
stringclasses 15
values | hidden_size
int64 1
20.5k
⌀ | initializer_range
float64 0
1
⌀ | intermediate_size
int64 1
98.3k
⌀ | max_position_embeddings
int64 8
1.05M
⌀ | model_type
stringclasses 530
values | num_attention_heads
int64 1
5k
⌀ | num_hidden_layers
int64 -1
8.93k
⌀ | num_key_value_heads
int64 1
160
⌀ | rms_norm_eps
float64 0
7
⌀ | rope_theta
float64 1k
1,000B
⌀ | sliding_window
int64 0
262k
⌀ | tie_word_embeddings
bool 2
classes | torch_dtype
stringclasses 8
values | transformers_version
stringclasses 207
values | use_cache
bool 2
classes | vocab_size
int64 -1
5.03M
⌀ | attention_bias
bool 2
classes | attention_dropout
float64 0
0.5
⌀ | head_dim
int64 2
256
⌀ | mlp_bias
bool 2
classes | pretraining_tp
int64 0
8
⌀ | rope_scaling
dict |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HuggingFaceTB/SmolLM2-1.7B-Instruct
| null | null | 2024-10-31T13:42:06
| null | null | 43,720
| null | null | null | null | 307
|
transformers
|
[
"transformers",
"tensorboard",
"onnx",
"safetensors",
"llama",
"text-generation",
"conversational",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 132
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 2,048
| 0.02
| 8,192
| 8,192
|
llama
| 32
| 24
| 32
| 0.00001
| 130,000
| null | true
|
bfloat16
|
4.42.3
| true
| 49,152
| false
| 0
| null | false
| 1
| null |
infly/OpenCoder-8B-Instruct
| null | null | 2024-11-07T16:23:14
| null | null | 402
| null | null | null | null | 77
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"en",
"zh",
"dataset:OpenCoder-LLM/opencoder-sft-stage1",
"dataset:OpenCoder-LLM/opencoder-sft-stage2",
"arxiv:2411.04905",
"base_model:infly/OpenCoder-8B-Base",
"base_model:finetune:infly/OpenCoder-8B-Base",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 77
|
[
"LlamaForCausalLM"
] | 96,540
| 96,539
|
silu
| 4,096
| 0.02
| 14,336
| 8,192
|
llama
| 32
| 32
| 8
| 0.00001
| 500,000
| null | false
|
bfloat16
|
4.37.0
| true
| 96,640
| false
| 0
| null | false
| 1
| null |
Qwen/Qwen2.5-Coder-7B-Instruct
| null | null | 2024-09-17T13:38:49
| null | null | 72,981
| null | null | null | null | 245
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"code",
"codeqwen",
"chat",
"qwen",
"qwen-coder",
"conversational",
"en",
"arxiv:2409.12186",
"arxiv:2309.00071",
"arxiv:2407.10671",
"base_model:Qwen/Qwen2.5-Coder-7B",
"base_model:finetune:Qwen/Qwen2.5-Coder-7B",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 54
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,645
|
silu
| 3,584
| 0.02
| 18,944
| 32,768
|
qwen2
| 28
| 28
| 4
| 0.000001
| 1,000,000
| 131,072
| false
|
bfloat16
|
4.44.0
| true
| 152,064
| null | 0
| null | null | null | null |
Qwen/Qwen2.5-72B-Instruct
| null | null | 2024-09-16T11:56:31
| null | null | 416,529
| null | null | null | null | 452
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"chat",
"conversational",
"en",
"arxiv:2309.00071",
"arxiv:2407.10671",
"base_model:Qwen/Qwen2.5-72B",
"base_model:finetune:Qwen/Qwen2.5-72B",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 32
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,645
|
silu
| 8,192
| 0.02
| 29,568
| 32,768
|
qwen2
| 64
| 80
| 8
| 0.000001
| 1,000,000
| 131,072
| false
|
bfloat16
|
4.43.1
| true
| 152,064
| null | 0
| null | null | null | null |
facebook/MobileLLM-125M
| null | null | 2024-10-30T22:48:34
| null | null | 4,281
| null | null | null | null | 80
|
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"mobilellm",
"text-generation",
"custom_code",
"arxiv:2402.14905",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"region:us"
] |
text-generation
| null | null | 29
|
[
"MobileLLMForCausalLM"
] | 1
| 2
|
silu
| 576
| 0.02
| 1,536
| 2,048
|
mobilellm
| 9
| 30
| 3
| 0.00001
| 10,000
| null | false
|
float16
|
4.41.2
| true
| 32,000
| false
| 0
| 64
| false
| 1
| null |
facebook/MobileLLM-1B
| null | null | 2024-10-31T00:07:47
| null | null | 6,405
| null | null | null | null | 102
|
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"mobilellm",
"text-generation",
"custom_code",
"arxiv:2402.14905",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"region:us"
] |
text-generation
| null | null | 25
|
[
"MobileLLMForCausalLM"
] | 1
| 2
|
silu
| 1,280
| 0.02
| 3,584
| 2,048
|
mobilellm
| 20
| 54
| 5
| 0.00001
| 10,000
| null | false
|
float16
|
4.41.2
| true
| 32,000
| false
| 0
| 64
| false
| 1
| null |
infly/OpenCoder-1.5B-Instruct
| null | null | 2024-11-07T16:22:28
| null | null | 186
| null | null | null | null | 21
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"en",
"zh",
"dataset:OpenCoder-LLM/opencoder-sft-stage1",
"dataset:OpenCoder-LLM/opencoder-sft-stage2",
"arxiv:2411.04905",
"base_model:infly/OpenCoder-1.5B-Base",
"base_model:finetune:infly/OpenCoder-1.5B-Base",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 21
|
[
"LlamaForCausalLM"
] | 96,540
| 96,539
|
silu
| 2,240
| 0.02
| 6,144
| 4,096
|
llama
| 14
| 24
| 14
| 0.00001
| 10,000
| null | false
|
bfloat16
|
4.37.0
| true
| 96,640
| false
| 0
| null | false
| 1
| null |
vikhyatk/moondream2
| null | null | 2024-03-04T18:03:06
| null | null | 197,554
| null | null | null | null | 693
|
transformers
|
[
"transformers",
"safetensors",
"gguf",
"moondream1",
"text-generation",
"image-text-to-text",
"custom_code",
"doi:10.57967/hf/3219",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
image-text-to-text
| null | null | 19
|
[
"Moondream"
] | null | null | null | null | null | null | null |
moondream1
| null | null | null | null | null | null | null |
float16
|
4.44.0
| null | null | null | null | null | null | null | null |
Qwen/Qwen2.5-7B-Instruct
| null | null | 2024-09-16T11:55:40
| null | null | 484,324
| null | null | null | null | 258
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"chat",
"conversational",
"en",
"arxiv:2309.00071",
"arxiv:2407.10671",
"base_model:Qwen/Qwen2.5-7B",
"base_model:finetune:Qwen/Qwen2.5-7B",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 19
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,645
|
silu
| 3,584
| 0.02
| 18,944
| 32,768
|
qwen2
| 28
| 28
| 4
| 0.000001
| 1,000,000
| 131,072
| false
|
bfloat16
|
4.43.1
| true
| 152,064
| null | 0
| null | null | null | null |
infly/OpenCoder-8B-Base
| null | null | 2024-11-07T16:20:01
| null | null | 391
| null | null | null | null | 17
|
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"conversational",
"en",
"zh",
"arxiv:2411.04905",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 17
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 4,096
| 0.02
| 14,336
| 8,192
|
llama
| 32
| 32
| 8
| 0.00001
| 500,000
| null | false
|
bfloat16
|
4.44.2
| true
| 96,640
| false
| 0
| null | false
| 1
| null |
microsoft/Florence-2-large
| null | null | 2024-06-15T00:34:55
| null | null | 1,658,548
| null | null | null | null | 1,207
|
transformers
|
[
"transformers",
"pytorch",
"florence2",
"text-generation",
"vision",
"image-text-to-text",
"custom_code",
"arxiv:2311.06242",
"license:mit",
"autotrain_compatible",
"region:us"
] |
image-text-to-text
| null | null | 16
|
[
"Florence2ForConditionalGeneration"
] | 0
| 2
| null | null | null | null | null |
florence2
| null | null | null | null | null | null | null |
float16
|
4.41.0.dev0
| null | 51,289
| null | null | null | null | null | null |
arcee-ai/SuperNova-Medius
| null | null | 2024-10-02T06:50:01
| null | null | 13,213
| null | null | null | null | 167
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"mergekit",
"merge",
"conversational",
"base_model:Qwen/Qwen2.5-14B",
"base_model:finetune:Qwen/Qwen2.5-14B",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 16
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,643
|
silu
| 5,120
| 0.02
| 13,824
| 131,072
|
qwen2
| 40
| 48
| 8
| 0.00001
| 1,000,000
| null | false
|
bfloat16
|
4.46.0.dev0
| true
| 152,064
| null | 0
| null | null | null | null |
ibm-granite/granite-3.0-8b-instruct
| null | null | 2024-10-02T21:16:23
| null | null | 29,211
| null | null | null | null | 155
|
transformers
|
[
"transformers",
"safetensors",
"granite",
"text-generation",
"language",
"granite-3.0",
"conversational",
"arxiv:0000.00000",
"base_model:ibm-granite/granite-3.0-8b-base",
"base_model:finetune:ibm-granite/granite-3.0-8b-base",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"region:us"
] |
text-generation
| null | null | 15
|
[
"GraniteForCausalLM"
] | 0
| 0
|
silu
| 4,096
| 0.02
| 12,800
| 4,096
|
granite
| 32
| 40
| 8
| 0.00001
| 10,000
| null | true
|
bfloat16
|
4.46.0.dev0
| true
| 49,155
| false
| 0.1
| null | false
| null | null |
rhymes-ai/Aria
| null | null | 2024-09-26T02:58:52
| null | null | 26,676
| null | null | null | null | 581
|
transformers
|
[
"transformers",
"safetensors",
"aria",
"text-generation",
"multimodal",
"image-text-to-text",
"conversational",
"custom_code",
"en",
"arxiv:2410.05993",
"license:apache-2.0",
"autotrain_compatible",
"region:us"
] |
image-text-to-text
| null | null | 14
|
[
"AriaForConditionalGeneration"
] | null | null | null | null | null | null | null |
aria
| null | null | null | null | null | null | null |
bfloat16
|
4.45.0
| null | null | null | null | null | null | null | null |
HuggingFaceTB/SmolLM2-1.7B
| null | null | 2024-10-30T22:50:10
| null | null | 7,262
| null | null | null | null | 63
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 14
|
[
"LlamaForCausalLM"
] | 0
| 0
|
silu
| 2,048
| 0.02
| 8,192
| 8,192
|
llama
| 32
| 24
| 32
| 0.00001
| 130,000
| null | true
|
bfloat16
|
4.40.1
| true
| 49,152
| false
| 0
| null | null | 1
| null |
EVA-UNIT-01/EVA-Qwen2.5-32B-v0.2
| null | null | 2024-11-05T05:36:22
| null | null | 425
| null | null | null | null | 13
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"generated_from_trainer",
"conversational",
"dataset:anthracite-org/kalo-opus-instruct-22k-no-refusal",
"dataset:Nopm/Opus_WritingStruct",
"dataset:Gryphe/Sonnet3.5-SlimOrcaDedupCleaned",
"dataset:Gryphe/Sonnet3.5-Charcard-Roleplay",
"dataset:Gryphe/ChatGPT-4o-Writing-Prompts",
"dataset:Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned",
"dataset:Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned",
"dataset:nothingiisreal/Reddit-Dirty-And-WritingPrompts",
"dataset:allura-org/Celeste-1.x-data-mixture",
"dataset:cognitivecomputations/dolphin-2.9.3",
"base_model:Qwen/Qwen2.5-32B",
"base_model:finetune:Qwen/Qwen2.5-32B",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 13
|
[
"Qwen2ForCausalLM"
] | null | 151,643
|
silu
| 5,120
| 0.02
| 27,648
| 131,072
|
qwen2
| 40
| 64
| 8
| 0.00001
| 1,000,000
| null | false
|
bfloat16
|
4.45.1
| false
| 152,064
| null | 0
| null | null | null | null |
Qwen/Qwen2.5-1.5B-Instruct
| null | null | 2024-09-17T14:10:29
| null | null | 14,955,186
| null | null | null | null | 128
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"chat",
"conversational",
"en",
"arxiv:2407.10671",
"base_model:Qwen/Qwen2.5-1.5B",
"base_model:finetune:Qwen/Qwen2.5-1.5B",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 12
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,645
|
silu
| 1,536
| 0.02
| 8,960
| 32,768
|
qwen2
| 12
| 28
| 2
| 0.000001
| 1,000,000
| 32,768
| true
|
bfloat16
|
4.43.1
| true
| 151,936
| null | 0
| null | null | null | null |
HuggingFaceTB/SmolLM2-135M-Instruct
| null | null | 2024-10-31T13:41:10
| null | null | 9,151
| null | null | null | null | 53
|
transformers
|
[
"transformers",
"tensorboard",
"onnx",
"safetensors",
"llama",
"text-generation",
"conversational",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 12
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 576
| 0.041667
| 1,536
| 8,192
|
llama
| 9
| 30
| 3
| 0.00001
| 100,000
| null | true
|
bfloat16
|
4.42.3
| true
| 49,152
| false
| 0
| null | false
| 1
| null |
jadechoghari/Ferret-UI-Llama8b
| null | null | 2024-10-09T16:32:15
| null | null | 748
| null | null | null | null | 42
|
transformers
|
[
"transformers",
"safetensors",
"ferret_llama",
"text-generation",
"image-text-to-text",
"conversational",
"custom_code",
"arxiv:2404.05719",
"autotrain_compatible",
"region:us"
] |
image-text-to-text
| null | null | 11
|
[
"FerretLlamaForCausalLM"
] | 128,000
| 128,009
|
silu
| 4,096
| 0.02
| 14,336
| 8,192
|
ferret_llama
| null | 32
| 8
| 0.00001
| 500,000
| null | false
|
bfloat16
|
4.39.0
| true
| 128,258
| false
| 0
| null | null | 1
| null |
BAAI/Aquila-VL-2B-llava-qwen
| null | null | 2024-10-17T09:50:06
| null | null | 1,442
| null | null | null | null | 42
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"multimodal",
"image-text-to-text",
"conversational",
"en",
"zh",
"dataset:BAAI/Infinity-MM",
"dataset:BAAI/Infinity-Instruct",
"dataset:BAAI/Infinity-Preference",
"arxiv:2410.18558",
"base_model:Qwen/Qwen2.5-1.5B-Instruct",
"base_model:finetune:Qwen/Qwen2.5-1.5B-Instruct",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
image-text-to-text
| null | null | 11
|
[
"LlavaQwenForCausalLM"
] | 151,643
| 151,645
|
silu
| 1,536
| 0.02
| 8,960
| 32,768
|
qwen2
| 12
| 28
| 2
| 0.000001
| 1,000,000
| 32,768
| true
|
bfloat16
|
4.40.0
| true
| 151,936
| null | 0
| null | null | null | null |
openai-community/gpt2
| null | null | 2022-03-02T23:29:04
| null | null | 16,737,409
| null | null | null | null | 2,354
|
transformers
|
[
"transformers",
"pytorch",
"tf",
"jax",
"tflite",
"rust",
"onnx",
"safetensors",
"gpt2",
"text-generation",
"exbert",
"en",
"doi:10.57967/hf/0039",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 10
|
[
"GPT2LMHeadModel"
] | 50,256
| 50,256
| null | null | 0.02
| null | null |
gpt2
| null | null | null | null | null | null | null | null | null | null | 50,257
| null | null | null | null | null | null |
liuhaotian/llava-v1.5-7b
| null | null | 2023-10-05T18:25:51
| null | null | 793,191
| null | null | null | null | 363
|
transformers
|
[
"transformers",
"pytorch",
"llava",
"text-generation",
"image-text-to-text",
"autotrain_compatible",
"region:us"
] |
image-text-to-text
| null | null | 10
|
[
"LlavaLlamaForCausalLM"
] | 1
| 2
|
silu
| 4,096
| 0.02
| 11,008
| 4,096
|
llava
| 32
| 32
| 32
| 0.00001
| null | null | false
|
float16
|
4.31.0
| true
| 32,000
| null | null | null | null | 1
| null |
infly/OpenCoder-1.5B-Base
| null | null | 2024-11-07T16:19:26
| null | null | 218
| null | null | null | null | 10
|
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"conversational",
"en",
"zh",
"arxiv:2411.04905",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 10
|
[
"LlamaForCausalLM"
] | 1
| 96,539
|
silu
| 2,240
| 0.02
| 6,144
| 4,096
|
llama
| 14
| 24
| 14
| 0.00001
| 10,000
| null | false
|
bfloat16
|
4.44.2
| true
| 96,640
| false
| 0
| null | false
| 1
| null |
Qwen/Qwen2.5-7B
| null | null | 2024-09-15T12:17:40
| null | null | 52,542
| null | null | null | null | 64
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"conversational",
"en",
"arxiv:2407.10671",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 9
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,643
|
silu
| 3,584
| 0.02
| 18,944
| 131,072
|
qwen2
| 28
| 28
| 4
| 0.000001
| 1,000,000
| 131,072
| false
|
bfloat16
|
4.40.1
| true
| 152,064
| null | 0
| null | null | null | null |
Qwen/Qwen2.5-3B-Instruct
| null | null | 2024-09-17T14:08:52
| null | null | 214,629
| null | null | null | null | 78
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"chat",
"conversational",
"en",
"arxiv:2407.10671",
"base_model:Qwen/Qwen2.5-3B",
"base_model:finetune:Qwen/Qwen2.5-3B",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 9
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,645
|
silu
| 2,048
| 0.02
| 11,008
| 32,768
|
qwen2
| 16
| 36
| 2
| 0.000001
| 1,000,000
| 32,768
| true
|
bfloat16
|
4.43.1
| true
| 151,936
| null | 0
| null | null | null | null |
jadechoghari/Ferret-UI-Gemma2b
| null | null | 2024-10-09T16:19:17
| null | null | 1,885
| null | null | null | null | 44
|
transformers
|
[
"transformers",
"safetensors",
"ferret_gemma",
"text-generation",
"image-text-to-text",
"conversational",
"custom_code",
"arxiv:2404.05719",
"autotrain_compatible",
"region:us"
] |
image-text-to-text
| null | null | 9
|
[
"FerretGemmaForCausalLM"
] | 2
| 1
|
gelu_pytorch_tanh
| 2,048
| 0.02
| 16,384
| 8,192
|
ferret_gemma
| 8
| 18
| 1
| 0.000001
| 10,000
| null | null |
bfloat16
|
4.39.0
| true
| 256,001
| false
| 0
| 256
| null | null | null |
HuggingFaceTB/SmolLM2-360M-Instruct
| null | null | 2024-10-31T13:41:35
| null | null | 11,297
| null | null | null | null | 43
|
transformers
|
[
"transformers",
"tensorboard",
"onnx",
"safetensors",
"llama",
"text-generation",
"conversational",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 9
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 960
| 0.02
| 2,560
| 8,192
|
llama
| 15
| 32
| 5
| 0.00001
| 100,000
| null | true
|
bfloat16
|
4.42.3
| true
| 49,152
| false
| 0
| null | false
| 1
| null |
bigscience/bloom
| null | null | 2022-05-19T11:53:33
| null | null | 10,169
| null | null | null | null | 4,766
|
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"bloom",
"text-generation",
"ak",
"ar",
"as",
"bm",
"bn",
"ca",
"code",
"en",
"es",
"eu",
"fon",
"fr",
"gu",
"hi",
"id",
"ig",
"ki",
"kn",
"lg",
"ln",
"ml",
"mr",
"ne",
"nso",
"ny",
"or",
"pa",
"pt",
"rn",
"rw",
"sn",
"st",
"sw",
"ta",
"te",
"tn",
"ts",
"tum",
"tw",
"ur",
"vi",
"wo",
"xh",
"yo",
"zh",
"zu",
"arxiv:2211.05100",
"arxiv:1909.08053",
"arxiv:2110.02861",
"arxiv:2108.12409",
"doi:10.57967/hf/0003",
"license:bigscience-bloom-rail-1.0",
"model-index",
"co2_eq_emissions",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 8
|
[
"BloomForCausalLM"
] | 1
| 2
| null | null | 0.02
| null | null |
bloom
| 112
| null | null | null | null | null | null | null |
4.21.0
| true
| 250,880
| null | 0
| null | null | 4
| null |
vikhyatk/moondream1
| null | null | 2024-01-20T18:10:04
| null | null | 298,246
| null | null | null | null | 484
|
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"moondream1",
"text-generation",
"custom_code",
"en",
"autotrain_compatible",
"region:us"
] |
text-generation
| null | null | 8
|
[
"Moondream"
] | null | null | null | null | null | null | null |
moondream1
| null | null | null | null | null | null | null |
float16
|
4.36.2
| null | null | null | null | null | null | null | null |
dunzhang/stella_en_1.5B_v5
| null | null | 2024-07-12T15:52:09
| null | null | 86,749
| null | null | null | null | 145
|
sentence-transformers
|
[
"sentence-transformers",
"pytorch",
"safetensors",
"qwen2",
"text-generation",
"mteb",
"transformers",
"sentence-similarity",
"custom_code",
"arxiv:2205.13147",
"license:mit",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] |
sentence-similarity
| null | null | 8
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,643
|
silu
| 1,536
| 0.02
| 8,960
| 131,072
|
qwen2
| 12
| 28
| 2
| 0.000001
| 1,000,000
| 131,072
| false
|
float32
|
4.42.3
| true
| 151,646
| null | 0
| null | null | null | null |
Qwen/Qwen2.5-Coder-7B
| null | null | 2024-09-16T11:57:24
| null | null | 27,426
| null | null | null | null | 49
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"code",
"qwen",
"qwen-coder",
"codeqwen",
"conversational",
"en",
"arxiv:2409.12186",
"arxiv:2309.00071",
"arxiv:2407.10671",
"base_model:Qwen/Qwen2.5-7B",
"base_model:finetune:Qwen/Qwen2.5-7B",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 8
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,643
|
silu
| 3,584
| 0.02
| 18,944
| 32,768
|
qwen2
| 28
| 28
| 4
| 0.000001
| 1,000,000
| 131,072
| false
|
bfloat16
|
4.45.0.dev0
| true
| 152,064
| null | 0
| null | null | null | null |
dfurman/CalmeRys-78B-Orpo-v0.1
| null | null | 2024-09-24T10:25:46
| null | null | 5,324
| null | null | null | null | 40
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"orpo",
"sft",
"chatml",
"conversational",
"en",
"dataset:mlabonne/orpo-dpo-mix-40k",
"base_model:MaziyarPanahi/calme-2.4-rys-78b",
"base_model:finetune:MaziyarPanahi/calme-2.4-rys-78b",
"license:mit",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
text-generation
| null | null | 8
|
[
"Qwen2ForCausalLM"
] | 151,644
| 151,645
|
silu
| 8,192
| 0.02
| 29,568
| 32,768
|
qwen2
| 64
| 86
| 8
| 0.000001
| 1,000,000
| null | false
|
bfloat16
|
4.44.2
| false
| 151,646
| null | 0
| null | null | null | null |
rombodawg/Rombos-LLM-V2.6-Qwen-14b
| null | null | 2024-10-12T20:19:50
| null | null | 4,214
| null | null | null | null | 41
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"conversational",
"base_model:Qwen/Qwen2.5-14B-Instruct",
"base_model:finetune:Qwen/Qwen2.5-14B-Instruct",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 8
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,643
|
silu
| 5,120
| 0.02
| 13,824
| 131,072
|
qwen2
| 40
| 48
| 8
| 0.00001
| 1,000,000
| null | false
|
bfloat16
|
4.43.3
| true
| 152,064
| null | 0
| null | null | null | null |
HuggingFaceH4/zephyr-7b-beta
| null | null | 2023-10-26T11:25:49
| null | null | 732,427
| null | null | null | null | 1,604
|
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"mistral",
"text-generation",
"generated_from_trainer",
"conversational",
"en",
"dataset:HuggingFaceH4/ultrachat_200k",
"dataset:HuggingFaceH4/ultrafeedback_binarized",
"arxiv:2305.18290",
"arxiv:2310.16944",
"arxiv:2305.14233",
"arxiv:2310.01377",
"base_model:mistralai/Mistral-7B-v0.1",
"base_model:finetune:mistralai/Mistral-7B-v0.1",
"license:mit",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 7
|
[
"MistralForCausalLM"
] | 1
| 2
|
silu
| 4,096
| 0.02
| 14,336
| 32,768
|
mistral
| 32
| 32
| 8
| 0.00001
| 10,000
| 4,096
| false
|
bfloat16
|
4.35.0
| true
| 32,000
| null | null | null | null | null | null |
jinaai/reader-lm-1.5b
| null | null | 2024-09-06T02:56:15
| null | null | 8,202
| null | null | null | null | 471
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"conversational",
"multilingual",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"text-generation-inference",
"region:eu"
] |
text-generation
| null | null | 7
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,645
|
silu
| 1,536
| 0.02
| 8,960
| 256,000
|
qwen2
| 12
| 28
| 2
| 0.000001
| 2,000,000
| null | true
|
bfloat16
|
4.43.3
| true
| 151,936
| null | 0
| null | null | null | null |
Qwen/Qwen2.5-0.5B
| null | null | 2024-09-15T12:15:39
| null | null | 99,649
| null | null | null | null | 96
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"conversational",
"en",
"arxiv:2407.10671",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 7
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,643
|
silu
| 896
| 0.02
| 4,864
| 32,768
|
qwen2
| 14
| 24
| 2
| 0.000001
| 1,000,000
| 32,768
| true
|
bfloat16
|
4.40.1
| true
| 151,936
| null | 0
| null | null | null | null |
Qwen/Qwen2.5-14B-Instruct
| null | null | 2024-09-16T11:56:10
| null | null | 120,731
| null | null | null | null | 107
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"chat",
"conversational",
"en",
"arxiv:2309.00071",
"arxiv:2407.10671",
"base_model:Qwen/Qwen2.5-14B",
"base_model:finetune:Qwen/Qwen2.5-14B",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 7
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,645
|
silu
| 5,120
| 0.02
| 13,824
| 32,768
|
qwen2
| 40
| 48
| 8
| 0.000001
| 1,000,000
| 131,072
| false
|
bfloat16
|
4.43.1
| true
| 152,064
| null | 0
| null | null | null | null |
EleutherAI/gpt-j-6b
| null | null | 2022-03-02T23:29:04
| null | null | 291,371
| null | null | null | null | 1,440
|
transformers
|
[
"transformers",
"pytorch",
"tf",
"jax",
"gptj",
"text-generation",
"causal-lm",
"en",
"dataset:EleutherAI/pile",
"arxiv:2104.09864",
"arxiv:2101.00027",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 6
|
[
"GPTJForCausalLM"
] | 50,256
| 50,256
| null | null | 0.02
| null | null |
gptj
| null | null | null | null | null | null | false
| null |
4.18.0.dev0
| true
| 50,400
| null | null | null | null | null | null |
georgesung/llama2_7b_chat_uncensored
| null | null | 2023-07-20T10:45:03
| null | null | 2,820
| null | null | null | null | 363
|
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"llama",
"text-generation",
"dataset:georgesung/wizard_vicuna_70k_unfiltered",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 6
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 4,096
| 0.02
| 11,008
| 2,048
|
llama
| 32
| 32
| null | 0.00001
| null | null | false
|
float32
|
4.30.2
| true
| 32,000
| null | null | null | null | null | null |
Qwen/Qwen2-7B-Instruct
| null | null | 2024-06-04T10:07:03
| null | null | 611,535
| null | null | null | null | 585
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"chat",
"conversational",
"en",
"arxiv:2309.00071",
"base_model:Qwen/Qwen2-7B",
"base_model:finetune:Qwen/Qwen2-7B",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 6
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,645
|
silu
| 3,584
| 0.02
| 18,944
| 32,768
|
qwen2
| 28
| 28
| 4
| 0.000001
| 1,000,000
| 131,072
| false
|
bfloat16
|
4.41.2
| true
| 152,064
| null | 0
| null | null | null | null |
Alibaba-NLP/gte-Qwen2-7B-instruct
| null | null | 2024-06-15T11:24:21
| null | null | 34,262
| null | null | null | null | 201
|
sentence-transformers
|
[
"sentence-transformers",
"safetensors",
"qwen2",
"text-generation",
"mteb",
"transformers",
"Qwen2",
"sentence-similarity",
"custom_code",
"arxiv:2308.03281",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] |
sentence-similarity
| null | null | 6
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,643
|
silu
| 3,584
| 0.02
| 18,944
| 131,072
|
qwen2
| 28
| 28
| 4
| 0.000001
| 1,000,000
| 131,072
| false
|
float32
|
4.41.2
| true
| 151,646
| null | 0
| null | null | null | null |
upstage/solar-pro-preview-instruct
| null | null | 2024-09-09T01:08:58
| null | null | 1,073
| null | null | null | null | 421
|
transformers
|
[
"transformers",
"safetensors",
"solar",
"text-generation",
"nlp",
"conversational",
"custom_code",
"en",
"license:mit",
"autotrain_compatible",
"region:us"
] |
text-generation
| null | null | 6
|
[
"SolarForCausalLM"
] | 1
| 32,007
|
silu
| 5,120
| 0.02
| 17,920
| 4,096
|
solar
| 40
| 64
| 10
| 0.00001
| 10,000
| 2,047
| false
|
bfloat16
|
4.44.2
| true
| 32,128
| false
| 0
| null | false
| 1
| null |
Qwen/Qwen2.5-32B-Instruct
| null | null | 2024-09-17T04:17:55
| null | null | 119,483
| null | null | null | null | 115
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"chat",
"conversational",
"en",
"arxiv:2309.00071",
"arxiv:2407.10671",
"base_model:Qwen/Qwen2.5-32B",
"base_model:finetune:Qwen/Qwen2.5-32B",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 6
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,645
|
silu
| 5,120
| 0.02
| 27,648
| 32,768
|
qwen2
| 40
| 64
| 8
| 0.000001
| 1,000,000
| 131,072
| false
|
bfloat16
|
4.43.1
| true
| 152,064
| null | 0
| null | null | null | null |
Qwen/Qwen2.5-Coder-1.5B-Instruct
| null | null | 2024-09-18T09:41:47
| null | null | 18,374
| null | null | null | null | 40
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"code",
"codeqwen",
"chat",
"qwen",
"qwen-coder",
"conversational",
"en",
"arxiv:2409.12186",
"arxiv:2309.00071",
"arxiv:2407.10671",
"base_model:Qwen/Qwen2.5-Coder-1.5B",
"base_model:finetune:Qwen/Qwen2.5-Coder-1.5B",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 6
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,645
|
silu
| 1,536
| 0.02
| 8,960
| 32,768
|
qwen2
| 12
| 28
| 2
| 0.000001
| 1,000,000
| 131,072
| true
|
bfloat16
|
4.44.0
| true
| 151,936
| null | 0
| null | null | null | null |
allenai/Molmo-7B-D-0924
| null | null | 2024-09-25T01:48:22
| null | null | 73,568
| null | null | null | null | 424
|
transformers
|
[
"transformers",
"safetensors",
"molmo",
"text-generation",
"multimodal",
"olmo",
"pixmo",
"image-text-to-text",
"conversational",
"custom_code",
"en",
"arxiv:2409.17146",
"base_model:Qwen/Qwen2-7B",
"base_model:finetune:Qwen/Qwen2-7B",
"license:apache-2.0",
"autotrain_compatible",
"region:us"
] |
image-text-to-text
| null | null | 6
|
[
"MolmoForCausalLM"
] | null | null | null | 3,584
| 0.02
| 37,888
| 4,096
|
molmo
| 28
| 28
| 4
| null | 1,000,000
| null | false
|
float32
|
4.43.3
| true
| 152,064
| null | null | null | null | null | null |
prithivMLmods/Llama-3.2-8B-GGUF-200K
| null | null | 2024-10-27T05:09:21
| null | null | 11,772
| null | null | null | null | 7
|
transformers
|
[
"transformers",
"gguf",
"llama",
"text-generation-inference",
"unsloth",
"200K",
"text-generation",
"en",
"dataset:HuggingFaceH4/ultrachat_200k",
"license:creativeml-openrail-m",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 6
| null | null | null | null | null | null | null | null |
llama
| null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | null |
HuggingFaceTB/SmolLM2-135M
| null | null | 2024-10-31T00:46:04
| null | null | 7,803
| null | null | null | null | 27
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 6
|
[
"LlamaForCausalLM"
] | 0
| 0
|
silu
| 576
| 0.041667
| 1,536
| 8,192
|
llama
| 9
| 30
| 3
| 0.00001
| 100,000
| null | true
|
bfloat16
|
4.40.1
| true
| 49,152
| false
| 0
| null | null | 1
| null |
cognitivecomputations/dolphin-2.9-llama3-8b
| null | null | 2024-04-20T23:14:52
| null | null | 3,628
| null | null | null | null | 418
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"generated_from_trainer",
"axolotl",
"conversational",
"dataset:cognitivecomputations/Dolphin-2.9",
"dataset:teknium/OpenHermes-2.5",
"dataset:m-a-p/CodeFeedback-Filtered-Instruction",
"dataset:cognitivecomputations/dolphin-coder",
"dataset:cognitivecomputations/samantha-data",
"dataset:HuggingFaceH4/ultrachat_200k",
"dataset:microsoft/orca-math-word-problems-200k",
"dataset:abacusai/SystemChat-1.1",
"dataset:Locutusque/function-calling-chatml",
"dataset:internlm/Agent-FLAN",
"base_model:meta-llama/Meta-Llama-3-8B",
"base_model:finetune:meta-llama/Meta-Llama-3-8B",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 5
|
[
"LlamaForCausalLM"
] | 128,000
| 128,256
|
silu
| 4,096
| 0.02
| 14,336
| 8,192
|
llama
| 32
| 32
| 8
| 0.00001
| 500,000
| null | false
|
bfloat16
|
4.40.0
| false
| 128,258
| false
| 0
| null | null | 1
| null |
Qwen/Qwen2-0.5B
| null | null | 2024-05-31T08:38:11
| null | null | 1,267,361
| null | null | null | null | 108
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"pretrained",
"conversational",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 5
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,643
|
silu
| 896
| 0.02
| 4,864
| 131,072
|
qwen2
| 14
| 24
| 2
| 0.000001
| 1,000,000
| 131,072
| true
|
bfloat16
|
4.40.1
| true
| 151,936
| null | 0
| null | null | null | null |
Sao10K/L3-8B-Stheno-v3.2
| null | null | 2024-06-05T10:30:57
| null | null | 2,737
| null | null | null | null | 238
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"en",
"dataset:Gryphe/Opus-WritingPrompts",
"dataset:Sao10K/Claude-3-Opus-Instruct-15K",
"dataset:Sao10K/Short-Storygen-v2",
"dataset:Sao10K/c2-Logs-Filtered",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 5
|
[
"LlamaForCausalLM"
] | 128,000
| 128,009
|
silu
| 4,096
| 0.02
| 14,336
| 8,192
|
llama
| 32
| 32
| 8
| 0.00001
| 500,000
| null | false
|
bfloat16
|
4.41.2
| true
| 128,256
| false
| 0
| null | false
| 1
| null |
princeton-nlp/gemma-2-9b-it-SimPO
| null | null | 2024-07-16T16:42:49
| null | null | 102,892
| null | null | null | null | 120
|
transformers
|
[
"transformers",
"safetensors",
"gemma2",
"text-generation",
"alignment-handbook",
"generated_from_trainer",
"conversational",
"dataset:princeton-nlp/gemma2-ultrafeedback-armorm",
"arxiv:2405.14734",
"arxiv:2310.01377",
"arxiv:2406.12845",
"base_model:google/gemma-2-9b-it",
"base_model:finetune:google/gemma-2-9b-it",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 5
|
[
"Gemma2ForCausalLM"
] | 2
| 1
|
gelu_pytorch_tanh
| 3,584
| 0.02
| 14,336
| 8,192
|
gemma2
| 16
| 42
| 8
| 0.000001
| 10,000
| 4,096
| null |
bfloat16
|
4.42.4
| true
| 256,000
| false
| 0
| 256
| null | null | null |
mattshumer/Reflection-Llama-3.1-70B
| null | null | 2024-09-05T18:29:50
| null | null | 2,503
| null | null | null | null | 1,702
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"base_model:meta-llama/Llama-3.1-70B-Instruct",
"base_model:finetune:meta-llama/Llama-3.1-70B-Instruct",
"license:llama3.1",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 5
|
[
"LlamaForCausalLM"
] | 128,000
| 128,009
|
silu
| 8,192
| 0.02
| 28,672
| 8,192
|
llama
| 64
| 80
| 8
| 0.00001
| 500,000
| null | false
|
float32
|
4.40.0
| true
| 128,262
| false
| 0
| null | null | 1
| null |
anthracite-org/magnum-v4-72b
| null | null | 2024-09-20T03:23:10
| null | null | 930
| null | null | null | null | 20
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"chat",
"conversational",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 5
|
[
"Qwen2ForCausalLM"
] | null | 151,645
|
silu
| 8,192
| 0.02
| 29,568
| 32,768
|
qwen2
| 64
| 80
| 8
| 0.000001
| 1,000,000
| null | false
|
bfloat16
|
4.44.0
| false
| 152,064
| null | 0
| null | null | null | null |
Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24
| null | null | 2024-09-20T13:32:03
| null | null | 13,734
| null | null | null | null | 81
|
transformers
|
[
"transformers",
"safetensors",
"mistral",
"text-generation",
"conversational",
"en",
"ru",
"dataset:Vikhrmodels/GrandMaster-PRO-MAX",
"dataset:Vikhrmodels/Grounded-RAG-RU-v2",
"arxiv:2405.13929",
"base_model:mistralai/Mistral-Nemo-Instruct-2407",
"base_model:finetune:mistralai/Mistral-Nemo-Instruct-2407",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 5
|
[
"MistralForCausalLM"
] | 1
| 2
|
silu
| 5,120
| 0.02
| 14,336
| 1,024,000
|
mistral
| 32
| 40
| 8
| 0.00001
| 1,000,000
| null | false
|
bfloat16
|
4.44.2
| true
| 131,074
| null | 0
| 128
| null | null | null |
VongolaChouko/Starcannon-Unleashed-12B-v1.0
| null | null | 2024-10-29T14:32:59
| null | null | 790
| null | null | null | null | 23
|
transformers
|
[
"transformers",
"safetensors",
"mistral",
"text-generation",
"mergekit",
"merge",
"base_model:MarinaraSpaghetti/NemoMix-Unleashed-12B",
"base_model:merge:MarinaraSpaghetti/NemoMix-Unleashed-12B",
"base_model:nothingiisreal/MN-12B-Starcannon-v3",
"base_model:merge:nothingiisreal/MN-12B-Starcannon-v3",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 5
|
[
"MistralForCausalLM"
] | 1
| 2
|
silu
| 5,120
| 0.02
| 14,336
| 1,024,000
|
mistral
| 32
| 40
| 8
| 0.00001
| 1,000,000
| null | false
|
bfloat16
|
4.46.0
| true
| 131,072
| null | 0
| 128
| null | null | null |
facebook/MobileLLM-600M
| null | null | 2024-10-30T22:57:03
| null | null | 1,061
| null | null | null | null | 26
|
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"mobilellm",
"text-generation",
"custom_code",
"arxiv:2402.14905",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"region:us"
] |
text-generation
| null | null | 5
|
[
"MobileLLMForCausalLM"
] | 1
| 2
|
silu
| 1,152
| 0.02
| 3,072
| 2,048
|
mobilellm
| 18
| 40
| 6
| 0.00001
| 10,000
| null | false
|
float16
|
4.41.2
| true
| 32,000
| false
| 0
| 64
| false
| 1
| null |
amd/AMD-OLMo-1B
| null | null | 2024-10-31T20:27:49
| null | null | 246
| null | null | null | null | 19
| null |
[
"safetensors",
"olmo",
"text-generation",
"dataset:allenai/dolma",
"license:apache-2.0",
"region:us"
] |
text-generation
| null | null | 5
|
[
"OlmoForCausalLM"
] | null | 50,279
|
silu
| 2,048
| 0.02
| 8,192
| 2,048
|
olmo
| 16
| 16
| 16
| null | 10,000
| null | true
|
float32
|
4.40.2
| true
| 50,304
| false
| 0
| null | null | null | null |
EVA-UNIT-01/EVA-Qwen2.5-14B-v0.2
| null | null | 2024-11-06T19:49:06
| null | null | 53
| null | null | null | null | 5
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"generated_from_trainer",
"conversational",
"dataset:anthracite-org/kalo-opus-instruct-22k-no-refusal",
"dataset:Nopm/Opus_WritingStruct",
"dataset:Gryphe/Sonnet3.5-SlimOrcaDedupCleaned",
"dataset:Gryphe/Sonnet3.5-Charcard-Roleplay",
"dataset:Gryphe/ChatGPT-4o-Writing-Prompts",
"dataset:Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned",
"dataset:Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned",
"dataset:nothingiisreal/Reddit-Dirty-And-WritingPrompts",
"dataset:allura-org/Celeste-1.x-data-mixture",
"dataset:cognitivecomputations/dolphin-2.9.3",
"base_model:Qwen/Qwen2.5-14B",
"base_model:finetune:Qwen/Qwen2.5-14B",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 5
|
[
"Qwen2ForCausalLM"
] | null | 151,643
|
silu
| 5,120
| 0.02
| 13,824
| 131,072
|
qwen2
| 40
| 48
| 8
| 0.00001
| 1,000,000
| null | false
|
bfloat16
|
4.45.1
| false
| 152,064
| null | 0
| null | null | null | null |
facebook/opt-1.3b
| null | null | 2022-05-11T08:26:00
| null | null | 18,621,886
| null | null | null | null | 154
|
transformers
|
[
"transformers",
"pytorch",
"tf",
"jax",
"opt",
"text-generation",
"en",
"arxiv:2205.01068",
"arxiv:2005.14165",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
text-generation
| null | null | 4
|
[
"OPTForCausalLM"
] | 2
| 2
| null | 2,048
| null | null | 2,048
|
opt
| 32
| 24
| null | null | null | null | null |
float16
|
4.21.0.dev0
| true
| 50,272
| null | 0
| null | null | null | null |
huggyllama/llama-7b
| null | null | 2023-04-03T23:16:48
| null | null | 154,247
| null | null | null | null | 294
|
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"conversational",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 4
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 4,096
| 0.02
| 11,008
| 2,048
|
llama
| 32
| 32
| null | 0.000001
| null | null | false
|
float16
|
4.28.0.dev0
| true
| 32,000
| null | null | null | null | null | null |
ai-forever/ruGPT-3.5-13B
| null | null | 2023-05-02T12:53:36
| null | null | 2,878
| null | null | null | null | 260
|
transformers
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"gpt3",
"en",
"ru",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 4
|
[
"GPT2LMHeadModel"
] | 2
| 3
| null | null | 0.02
| null | null |
gpt2
| null | null | null | null | null | null | null |
float32
|
4.27.1
| true
| 50,272
| null | null | null | null | null | null |
TheBloke/dolphin-2.5-mixtral-8x7b-GPTQ
| null | null | 2023-12-14T10:34:15
| null | null | 168
| null | null | null | null | 105
|
transformers
|
[
"transformers",
"safetensors",
"mixtral",
"text-generation",
"conversational",
"en",
"dataset:ehartford/dolphin",
"dataset:jondurbin/airoboros-2.2.1",
"dataset:ehartford/dolphin-coder",
"dataset:migtissera/Synthia-v1.3",
"dataset:teknium/openhermes",
"dataset:ise-uiuc/Magicoder-OSS-Instruct-75K",
"dataset:ise-uiuc/Magicoder-Evol-Instruct-110K",
"dataset:LDJnr/Pure-Dove",
"base_model:cognitivecomputations/dolphin-2.5-mixtral-8x7b",
"base_model:quantized:cognitivecomputations/dolphin-2.5-mixtral-8x7b",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"gptq",
"region:us"
] |
text-generation
| null | null | 4
|
[
"MixtralForCausalLM"
] | 1
| 32,000
|
silu
| 4,096
| 0.02
| 14,336
| 32,768
|
mixtral
| 32
| 32
| 8
| 0.00001
| 1,000,000
| 4,096
| false
|
bfloat16
|
4.37.0.dev0
| true
| 32,002
| null | 0
| null | null | 1
| null |
defog/sqlcoder-7b-2
| null | null | 2024-02-05T14:36:51
| null | null | 117,028
| null | null | null | null | 288
|
transformers
|
[
"transformers",
"safetensors",
"gguf",
"llama",
"text-generation",
"license:cc-by-sa-4.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 4
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 4,096
| 0.02
| 11,008
| 16,384
|
llama
| 32
| 32
| 32
| 0.00001
| 1,000,000
| null | false
|
float16
|
4.37.2
| true
| 32,016
| false
| 0
| null | null | 1
| null |
RUCKBReasoning/TableLLM-7b
| null | null | 2024-02-06T12:07:20
| null | null | 512
| null | null | null | null | 12
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"Table",
"QA",
"Code",
"en",
"dataset:RUCKBReasoning/TableLLM-SFT",
"arxiv:2403.19318",
"license:llama2",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 4
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 4,096
| 0.02
| 11,008
| 16,384
|
llama
| 32
| 32
| 32
| 0.00001
| 1,000,000
| null | false
|
bfloat16
|
4.36.2
| true
| 32,016
| false
| 0
| null | null | 1
| null |
BAAI/bge-reranker-v2-gemma
| null | null | 2024-03-16T12:09:04
| null | null | 13,503
| null | null | null | null | 48
|
sentence-transformers
|
[
"sentence-transformers",
"safetensors",
"gemma",
"text-generation",
"transformers",
"text-classification",
"multilingual",
"arxiv:2312.15503",
"arxiv:2402.03216",
"license:apache-2.0",
"region:us"
] |
text-classification
| null | null | 4
|
[
"GemmaForCausalLM"
] | 2
| 1
|
gelu
| 2,048
| 0.02
| 16,384
| 8,192
|
gemma
| 8
| 18
| 1
| 0.000001
| 10,000
| null | null |
float32
|
4.38.1
| true
| 256,000
| false
| 0
| 256
| null | null | null |
IlyaGusev/saiga_llama3_8b
| null | null | 2024-04-18T18:25:25
| null | null | 13,161
| null | null | null | null | 108
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"ru",
"dataset:IlyaGusev/saiga_scored",
"doi:10.57967/hf/2368",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 4
|
[
"LlamaForCausalLM"
] | 128,000
| 128,001
|
silu
| 4,096
| 0.02
| 14,336
| 8,192
|
llama
| 32
| 32
| 8
| 0.00001
| 500,000
| null | false
|
bfloat16
|
4.40.0
| true
| 128,256
| false
| 0
| null | null | 1
| null |
nothingiisreal/MN-12B-Celeste-V1.9
| null | null | 2024-07-31T04:55:12
| null | null | 565
| null | null | null | null | 119
|
transformers
|
[
"transformers",
"safetensors",
"mistral",
"text-generation",
"conversational",
"en",
"dataset:nothingiisreal/c2-logs-cleaned",
"dataset:kalomaze/Opus_Instruct_25k",
"dataset:nothingiisreal/Reddit-Dirty-And-WritingPrompts",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 4
|
[
"MistralForCausalLM"
] | 1
| 2
|
silu
| 5,120
| 0.02
| 14,336
| 1,024,000
|
mistral
| 32
| 40
| 8
| 0.00001
| 1,000,000
| null | false
|
bfloat16
|
4.44.0.dev0
| false
| 131,072
| null | 0
| 128
| null | null | null |
lmms-lab/LLaVA-Video-7B-Qwen2
| null | null | 2024-09-02T06:36:42
| null | null | 35,547
| null | null | null | null | 32
|
transformers
|
[
"transformers",
"safetensors",
"llava",
"text-generation",
"multimodal",
"video-text-to-text",
"en",
"dataset:lmms-lab/LLaVA-OneVision-Data",
"dataset:lmms-lab/LLaVA-Video-178K",
"arxiv:2410.02713",
"base_model:lmms-lab/llava-onevision-qwen2-7b-si",
"base_model:finetune:lmms-lab/llava-onevision-qwen2-7b-si",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
video-text-to-text
| null | null | 4
|
[
"LlavaQwenForCausalLM"
] | 151,643
| 151,645
|
silu
| 3,584
| 0.02
| 18,944
| 32,768
|
llava
| 28
| 28
| 4
| 0.000001
| 1,000,000
| 131,072
| false
|
bfloat16
|
4.40.0.dev0
| true
| 152,064
| null | 0
| null | null | null | null |
Qwen/Qwen2.5-7B-Instruct-GPTQ-Int4
| null | null | 2024-09-17T12:51:51
| null | null | 74,308
| null | null | null | null | 9
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"chat",
"conversational",
"en",
"arxiv:2309.00071",
"arxiv:2407.10671",
"base_model:Qwen/Qwen2.5-7B-Instruct",
"base_model:quantized:Qwen/Qwen2.5-7B-Instruct",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"4-bit",
"gptq",
"region:us"
] |
text-generation
| null | null | 4
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,645
|
silu
| 3,584
| 0.02
| 18,944
| 32,768
|
qwen2
| 28
| 28
| 4
| 0.000001
| 1,000,000
| 131,072
| false
|
float16
|
4.39.3
| true
| 152,064
| null | 0
| null | null | null | null |
Qwen/Qwen2.5-32B-Instruct-GPTQ-Int4
| null | null | 2024-09-17T12:52:55
| null | null | 11,841
| null | null | null | null | 17
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"chat",
"conversational",
"en",
"arxiv:2309.00071",
"arxiv:2407.10671",
"base_model:Qwen/Qwen2.5-32B-Instruct",
"base_model:quantized:Qwen/Qwen2.5-32B-Instruct",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"4-bit",
"gptq",
"region:us"
] |
text-generation
| null | null | 4
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,645
|
silu
| 5,120
| 0.02
| 27,648
| 32,768
|
qwen2
| 40
| 64
| 8
| 0.000001
| 1,000,000
| 131,072
| false
|
float16
|
4.39.3
| true
| 152,064
| null | 0
| null | null | null | null |
AIDC-AI/Ovis1.6-Gemma2-9B
| null | null | 2024-09-18T12:29:44
| null | null | 7,812
| null | null | null | null | 241
|
transformers
|
[
"transformers",
"safetensors",
"ovis",
"text-generation",
"MLLM",
"image-text-to-text",
"conversational",
"custom_code",
"en",
"dataset:AIDC-AI/Ovis-dataset",
"arxiv:2405.20797",
"license:apache-2.0",
"autotrain_compatible",
"region:us"
] |
image-text-to-text
| null | null | 4
|
[
"Ovis"
] | null | null | null | 3,584
| null | null | null |
ovis
| null | null | null | null | null | null | null |
bfloat16
|
4.44.2
| true
| null | null | null | null | null | null | null |
MaziyarPanahi/Qwen2.5-7B-Instruct-GGUF
| null | null | 2024-09-18T19:44:20
| null | null | 893,193
| null | null | null | null | 5
| null |
[
"gguf",
"mistral",
"quantized",
"2-bit",
"3-bit",
"4-bit",
"5-bit",
"6-bit",
"8-bit",
"GGUF",
"text-generation",
"base_model:Qwen/Qwen2.5-7B-Instruct",
"base_model:quantized:Qwen/Qwen2.5-7B-Instruct",
"region:us",
"imatrix",
"conversational"
] |
text-generation
| null | null | 4
| null | null | null | null | null | null | null | null |
mistral
| null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | null |
huihui-ai/Qwen2.5-7B-Instruct-abliterated-v2
| null | null | 2024-09-22T19:02:37
| null | null | 2,726
| null | null | null | null | 15
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"chat",
"abliterated",
"uncensored",
"conversational",
"en",
"base_model:Qwen/Qwen2.5-7B-Instruct",
"base_model:finetune:Qwen/Qwen2.5-7B-Instruct",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 4
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,645
|
silu
| 3,584
| 0.02
| 18,944
| 32,768
|
qwen2
| 28
| 28
| 4
| 0.000001
| 1,000,000
| 131,072
| false
|
bfloat16
|
4.43.1
| true
| 152,064
| null | 0
| null | null | null | null |
allenai/Molmo-72B-0924
| null | null | 2024-09-25T06:23:32
| null | null | 6,780
| null | null | null | null | 254
|
transformers
|
[
"transformers",
"safetensors",
"molmo",
"text-generation",
"multimodal",
"olmo",
"pixmo",
"image-text-to-text",
"conversational",
"custom_code",
"en",
"arxiv:2409.17146",
"base_model:Qwen/Qwen2-72B",
"base_model:finetune:Qwen/Qwen2-72B",
"license:apache-2.0",
"autotrain_compatible",
"region:us"
] |
image-text-to-text
| null | null | 4
|
[
"MolmoForCausalLM"
] | null | null | null | 8,192
| 0.02
| 59,136
| 4,096
|
molmo
| 64
| 80
| 8
| null | 1,000,000
| null | false
|
float32
|
4.43.3
| true
| 152,064
| null | null | null | null | null | null |
BAAI/Emu3-Gen
| null | null | 2024-09-25T11:03:49
| null | null | 14,857
| null | null | null | null | 184
|
transformers
|
[
"transformers",
"safetensors",
"Emu3",
"text-generation",
"any-to-any",
"custom_code",
"arxiv:2409.18869",
"license:apache-2.0",
"autotrain_compatible",
"region:us"
] |
any-to-any
| null | null | 4
|
[
"Emu3ForCausalLM"
] | 151,849
| 151,850
|
silu
| 4,096
| 0.02
| 14,336
| 9,216
|
Emu3
| 32
| 32
| 8
| 0.00001
| 1,000,000
| null | false
|
float32
|
4.44.0
| true
| 184,622
| null | 0.1
| null | null | 1
| null |
shuttleai/shuttle-3
| null | null | 2024-10-09T19:48:20
| null | null | 442
| null | null | null | null | 28
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"chat",
"conversational",
"en",
"base_model:Qwen/Qwen2.5-72B",
"base_model:finetune:Qwen/Qwen2.5-72B",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 4
|
[
"Qwen2ForCausalLM"
] | null | 151,645
|
silu
| 8,192
| 0.02
| 29,568
| 32,768
|
qwen2
| 64
| 80
| 8
| 0.000001
| 1,000,000
| null | false
|
bfloat16
|
4.45.0.dev0
| false
| 152,064
| null | 0
| null | null | null | null |
h2oai/h2ovl-mississippi-2b
| null | null | 2024-10-15T18:15:29
| null | null | 4,914
| null | null | null | null | 19
|
transformers
|
[
"transformers",
"safetensors",
"h2ovl_chat",
"feature-extraction",
"gpt",
"llm",
"multimodal large language model",
"ocr",
"text-generation",
"conversational",
"custom_code",
"en",
"license:apache-2.0",
"region:us"
] |
text-generation
| null | null | 4
|
[
"H2OVLChatModel"
] | null | null | null | null | null | null | null |
h2ovl_chat
| null | null | null | null | null | null | null |
bfloat16
| null | null | null | null | null | null | null | null | null |
BSC-LT/salamandraTA-2B
| null | null | 2024-10-28T08:43:09
| null | null | 304
| null | null | null | null | 4
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"translation",
"it",
"pt",
"de",
"en",
"es",
"eu",
"gl",
"fr",
"bg",
"cs",
"lt",
"hr",
"ca",
"nl",
"ro",
"da",
"el",
"fi",
"hu",
"sk",
"sl",
"et",
"pl",
"lv",
"mt",
"ga",
"sv",
"an",
"ast",
"oc",
"arxiv:1803.09010",
"arxiv:2010.11125",
"arxiv:2403.14009",
"arxiv:1907.05791",
"arxiv:1911.04944",
"arxiv:2207.04672",
"base_model:BSC-LT/salamandra-2b",
"base_model:finetune:BSC-LT/salamandra-2b",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:eu"
] |
translation
| null | null | 4
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 2,048
| 0.02
| 5,440
| 8,192
|
llama
| 16
| 24
| 16
| 0.00001
| 10,000
| null | false
|
bfloat16
|
4.42.4
| false
| 256,000
| false
| 0
| null | false
| 1
| null |
arcee-ai/Arcee-VyLinh
| null | null | 2024-10-29T20:49:46
| null | null | 761
| null | null | null | null | 17
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"mergekit",
"merge",
"conversational",
"vi",
"base_model:Qwen/Qwen2.5-3B-Instruct",
"base_model:merge:Qwen/Qwen2.5-3B-Instruct",
"base_model:qnguyen3/VyLinh-3B",
"base_model:merge:qnguyen3/VyLinh-3B",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 4
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,645
|
silu
| 2,048
| 0.02
| 11,008
| 32,768
|
qwen2
| 16
| 36
| 2
| 0.000001
| 1,000,000
| null | true
|
bfloat16
|
4.46.1
| true
| 151,936
| null | 0
| null | null | null | null |
amd/AMD-OLMo-1B-SFT
| null | null | 2024-10-31T20:28:44
| null | null | 912
| null | null | null | null | 17
|
transformers
|
[
"transformers",
"safetensors",
"olmo",
"text-generation",
"dataset:allenai/dolma",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 4
|
[
"OlmoForCausalLM"
] | null | 50,279
|
silu
| 2,048
| 0.02
| 8,192
| 2,048
|
olmo
| 16
| 16
| 16
| null | 10,000
| null | true
|
float32
|
4.40.2
| true
| 50,304
| false
| 0
| null | null | null | null |
TechxGenus/Typst-Coder-9B
| null | null | 2024-11-03T14:48:21
| null | null | 11
| null | null | null | null | 4
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"code",
"conversational",
"base_model:01-ai/Yi-Coder-9B",
"base_model:finetune:01-ai/Yi-Coder-9B",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 4
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 4,096
| 0.02
| 11,008
| 131,072
|
llama
| 32
| 48
| 4
| 0.00001
| 10,000,000
| null | false
|
bfloat16
|
4.45.2
| false
| 64,000
| false
| 0
| 128
| false
| 1
| null |
allura-org/G2-9B-Aletheia-v1
| null | null | 2024-11-03T15:12:20
| null | null | 297
| null | null | null | null | 8
|
transformers
|
[
"transformers",
"safetensors",
"gemma2",
"text-generation",
"mergekit",
"merge",
"conversational",
"base_model:UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3",
"base_model:merge:UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3",
"base_model:allura-org/G2-9B-Sugarquill-v0",
"base_model:merge:allura-org/G2-9B-Sugarquill-v0",
"base_model:crestf411/gemma2-9B-sunfall-v0.5.2",
"base_model:merge:crestf411/gemma2-9B-sunfall-v0.5.2",
"license:gemma",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 4
|
[
"Gemma2ForCausalLM"
] | 2
| 1
|
gelu_pytorch_tanh
| 3,584
| 0.02
| 14,336
| 8,192
|
gemma2
| 16
| 42
| 8
| 0.000001
| 10,000
| 4,096
| null |
bfloat16
|
4.45.2
| true
| 256,000
| false
| 0
| 256
| null | null | null |
theprint/ReWiz-Qwen-2.5-14B
| null | null | 2024-11-05T10:01:22
| null | null | 256
| null | null | null | null | 4
|
transformers
|
[
"transformers",
"safetensors",
"gguf",
"qwen2",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"theprint",
"rewiz",
"en",
"dataset:theprint/ReWiz",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"4-bit",
"bitsandbytes",
"region:us"
] |
text-generation
| null | null | 4
|
[
"Qwen2ForCausalLM"
] | 151,643
| 151,643
|
silu
| 5,120
| 0.02
| 13,824
| 131,072
|
qwen2
| 40
| 48
| 8
| 0.00001
| 1,000,000
| null | false
|
bfloat16
|
4.44.2
| true
| 152,064
| null | 0
| null | null | null | null |
openai-community/gpt2-large
| null | null | 2022-03-02T23:29:04
| null | null | 1,207,892
| null | null | null | null | 270
|
transformers
|
[
"transformers",
"pytorch",
"tf",
"jax",
"rust",
"onnx",
"safetensors",
"gpt2",
"text-generation",
"en",
"arxiv:1910.09700",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 3
|
[
"GPT2LMHeadModel"
] | 50,256
| 50,256
| null | null | 0.02
| null | null |
gpt2
| null | null | null | null | null | null | null | null | null | null | 50,257
| null | null | null | null | null | null |
pierreguillou/gpt2-small-portuguese
| null | null | 2022-03-02T23:29:05
| null | null | 8,870
| null | null | null | null | 40
|
transformers
|
[
"transformers",
"pytorch",
"tf",
"jax",
"gpt2",
"text-generation",
"pt",
"dataset:wikipedia",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 3
|
[
"GPT2LMHeadModel"
] | 50,256
| 50,256
| null | null | 0.02
| null | null |
gpt2
| null | null | null | null | null | null | null | null | null | null | 50,257
| null | null | null | null | null | null |
Qwen/Qwen-VL-Chat
| null | null | 2023-08-20T04:45:22
| null | null | 21,054
| null | null | null | null | 337
|
transformers
|
[
"transformers",
"pytorch",
"qwen",
"text-generation",
"custom_code",
"zh",
"en",
"arxiv:2308.12966",
"autotrain_compatible",
"region:us"
] |
text-generation
| null | null | 3
|
[
"QWenLMHeadModel"
] | null | null | null | 4,096
| 0.02
| 22,016
| 8,192
|
qwen
| 32
| 32
| null | null | null | null | false
|
bfloat16
|
4.31.0
| true
| 151,936
| null | null | null | null | null | null |
microsoft/phi-1
| null | null | 2023-09-10T04:10:57
| null | null | 9,179
| null | null | null | null | 207
|
transformers
|
[
"transformers",
"safetensors",
"phi",
"text-generation",
"code",
"en",
"arxiv:2306.11644",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 3
|
[
"PhiForCausalLM"
] | null | null |
gelu_new
| 2,048
| 0.02
| 8,192
| 2,048
|
phi
| 32
| 24
| null | null | 10,000
| null | false
|
float32
|
4.37.0
| true
| 51,200
| null | 0
| null | null | null | null |
prometheus-eval/prometheus-13b-v1.0
| null | null | 2023-10-12T07:19:38
| null | null | 3,889
| null | null | null | null | 126
|
transformers
|
[
"transformers",
"pytorch",
"llama",
"text-generation",
"text2text-generation",
"en",
"dataset:kaist-ai/Feedback-Collection",
"arxiv:2310.08491",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| null | null | 3
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 5,120
| 0.02
| 13,824
| 4,096
|
llama
| 40
| 40
| 40
| 0.00001
| 10,000
| null | false
|
float32
|
4.33.1
| true
| 32,000
| null | null | null | null | 1
| null |
teknium/OpenHermes-2.5-Mistral-7B
| null | null | 2023-10-29T20:36:39
| null | null | 144,479
| null | null | null | null | 813
|
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"mistral",
"text-generation",
"instruct",
"finetune",
"chatml",
"gpt4",
"synthetic data",
"distillation",
"conversational",
"en",
"dataset:teknium/OpenHermes-2.5",
"base_model:mistralai/Mistral-7B-v0.1",
"base_model:finetune:mistralai/Mistral-7B-v0.1",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 3
|
[
"MistralForCausalLM"
] | 1
| 32,000
|
silu
| 4,096
| 0.02
| 14,336
| 32,768
|
mistral
| 32
| 32
| 8
| 0.00001
| 10,000
| 4,096
| false
|
bfloat16
|
4.34.0.dev0
| false
| 32,002
| null | null | null | null | null | null |
01-ai/Yi-34B-200K
| null | null | 2023-11-06T01:46:54
| null | null | 5,178
| null | null | null | null | 317
|
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"arxiv:2403.04652",
"arxiv:2311.16502",
"arxiv:2401.11944",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 3
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 7,168
| 0.02
| 20,480
| 200,000
|
llama
| 56
| 60
| 8
| 0.00001
| 10,000,000
| null | false
|
bfloat16
|
4.34.0
| true
| 64,000
| null | null | null | null | 1
| null |
GOAT-AI/GOAT-70B-Storytelling
| null | null | 2023-11-17T08:12:07
| null | null | 1,002
| null | null | null | null | 41
|
transformers
|
[
"transformers",
"pytorch",
"llama",
"text-generation",
"facebook",
"meta",
"llama-2",
"Storywriter",
"license:llama2",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 3
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 8,192
| 0.02
| 28,672
| 4,096
|
llama
| 64
| 80
| 8
| 0.00001
| 10,000
| null | false
|
bfloat16
|
4.34.1
| true
| 32,000
| false
| null | null | null | 1
| null |
upstage/SOLAR-10.7B-v1.0
| null | null | 2023-12-12T14:57:41
| null | null | 32,758
| null | null | null | null | 291
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:2312.15166",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 3
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 4,096
| 0.02
| 14,336
| 4,096
|
llama
| 32
| 48
| 8
| 0.00001
| 10,000
| null | false
|
float16
|
4.35.2
| false
| 32,000
| false
| null | null | null | 1
| null |
microsoft/phi-2
| null | null | 2023-12-13T21:19:59
| null | null | 231,230
| null | null | null | null | 3,240
|
transformers
|
[
"transformers",
"safetensors",
"phi",
"text-generation",
"nlp",
"code",
"en",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 3
|
[
"PhiForCausalLM"
] | 50,256
| 50,256
|
gelu_new
| 2,560
| 0.02
| 10,240
| 2,048
|
phi
| 32
| 32
| 32
| null | 10,000
| null | false
|
float16
|
4.37.0
| true
| 51,200
| null | 0
| null | null | null | null |
TinyLlama/TinyLlama-1.1B-Chat-v1.0
| null | null | 2023-12-30T06:27:30
| null | null | 1,255,617
| null | null | null | null | 1,087
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"en",
"dataset:cerebras/SlimPajama-627B",
"dataset:bigcode/starcoderdata",
"dataset:HuggingFaceH4/ultrachat_200k",
"dataset:HuggingFaceH4/ultrafeedback_binarized",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 3
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 2,048
| 0.02
| 5,632
| 2,048
|
llama
| 32
| 22
| 4
| 0.00001
| 10,000
| null | false
|
bfloat16
|
4.35.0
| true
| 32,000
| false
| null | null | null | 1
| null |
Unbabel/TowerInstruct-13B-v0.1
| null | null | 2024-01-29T10:39:36
| null | null | 513
| null | null | null | null | 21
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"translation",
"en",
"de",
"fr",
"zh",
"pt",
"nl",
"ru",
"ko",
"it",
"es",
"arxiv:2402.17733",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
translation
| null | null | 3
|
[
"LlamaForCausalLM"
] | 1
| 32,005
|
silu
| 5,120
| 0.02
| 13,824
| 4,096
|
llama
| 40
| 40
| 40
| 0.00001
| 10,000
| null | false
|
float32
|
4.36.2
| false
| 32,007
| false
| 0
| null | null | 1
| null |
liuhaotian/llava-v1.6-vicuna-7b
| null | null | 2024-01-31T04:32:37
| null | null | 18,549
| null | null | null | null | 98
|
transformers
|
[
"transformers",
"safetensors",
"llava",
"text-generation",
"image-text-to-text",
"autotrain_compatible",
"region:us"
] |
image-text-to-text
| null | null | 3
|
[
"LlavaLlamaForCausalLM"
] | 1
| 2
|
silu
| 4,096
| 0.02
| 11,008
| 4,096
|
llava
| 32
| 32
| 32
| 0.00001
| 10,000
| null | false
|
bfloat16
|
4.36.2
| true
| 32,000
| false
| 0
| null | null | 1
| null |
Ttimofeyka/MistralRP-Noromaid-NSFW-Mistral-7B-GGUF
| null | null | 2024-02-07T14:19:28
| null | null | 19,888
| null | null | null | null | 29
|
transformers
|
[
"transformers",
"safetensors",
"gguf",
"mistral",
"text-generation",
"mergekit",
"merge",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 3
|
[
"MistralForCausalLM"
] | 1
| 2
|
silu
| 4,096
| 0.02
| 14,336
| 32,768
|
mistral
| 32
| 32
| 8
| 0.00001
| 10,000
| 4,096
| false
|
bfloat16
|
4.37.2
| true
| 32,000
| null | 0
| null | null | null | null |
wolfram/miquliz-120b-v2.0
| null | null | 2024-02-10T11:41:03
| null | null | 463
| null | null | null | null | 93
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"mergekit",
"merge",
"conversational",
"en",
"de",
"fr",
"es",
"it",
"arxiv:2203.05482",
"base_model:152334H/miqu-1-70b-sf",
"base_model:merge:152334H/miqu-1-70b-sf",
"base_model:lizpreciatior/lzlv_70b_fp16_hf",
"base_model:merge:lizpreciatior/lzlv_70b_fp16_hf",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 3
|
[
"LlamaForCausalLM"
] | 1
| 2
|
silu
| 8,192
| 0.02
| 28,672
| 32,768
|
llama
| 64
| 140
| 8
| 0.00001
| 1,000,000
| null | false
|
float16
|
4.37.2
| true
| 32,000
| false
| 0
| null | null | 1
| null |
BioMistral/BioMistral-7B
| null | null | 2024-02-14T11:33:32
| null | null | 12,006
| null | null | null | null | 398
|
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"mistral",
"text-generation",
"medical",
"biology",
"conversational",
"fr",
"en",
"de",
"nl",
"es",
"pt",
"pl",
"ro",
"it",
"arxiv:2402.10373",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 3
|
[
"MistralForCausalLM"
] | 1
| 2
|
silu
| 4,096
| 0.02
| 14,336
| 32,768
|
mistral
| 32
| 32
| 8
| 0.00001
| 10,000
| 4,096
| false
|
bfloat16
|
4.35.0
| false
| 32,000
| null | null | null | null | null | null |
yanolja/EEVE-Korean-Instruct-10.8B-v1.0
| null | null | 2024-02-22T04:39:04
| null | null | 17,880
| null | null | null | null | 129
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"generated_from_trainer",
"conversational",
"arxiv:2402.14714",
"arxiv:2310.01377",
"arxiv:2306.02707",
"base_model:yanolja/EEVE-Korean-10.8B-v1.0",
"base_model:finetune:yanolja/EEVE-Korean-10.8B-v1.0",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| null | null | 3
|
[
"LlamaForCausalLM"
] | 1
| 32,000
|
silu
| 4,096
| 0.02
| 14,336
| 4,096
|
llama
| 32
| 48
| 8
| 0.00001
| 10,000
| null | false
|
bfloat16
|
4.36.2
| false
| 40,960
| false
| 0
| null | null | 1
| null |
End of preview. Expand
in Data Studio
MergeKit-configs: access all Hub architectures and automate your model merging process
This dataset facilitates the search for compatible architectures for model merging with MergeKit, streamlining the automation of high-performance merge searches. It provides a snapshot of the Hub’s configuration state, eliminating the need to manually open configuration files.
import polars as pl
# Login using e.g. `huggingface-cli login` to access this dataset
df = pl.read_parquet('hf://datasets/louisbrulenaudet/mergekit-configs/data/raw-00000-of-00001.parquet')
result = (
df.groupby(
[
"architectures",
"hidden_size",
"model_type",
"intermediate_size"
]
).agg(
pl.struct([pl.col("id")]).alias("models")
)
)
Citing & Authors
If you use this dataset in your research, please use the following BibTeX entry.
@misc{HFforLegal2024,
author = {Louis Brulé Naudet},
title = {MergeKit-configs: access all Hub architectures and automate your model merging process},
year = {2024}
howpublished = {\url{https://huggingface.co/datasets/louisbrulenaudet/mergekit-configs}},
}
Feedback
If you have any feedback, please reach out at louisbrulenaudet@icloud.com.
- Downloads last month
- 65