model_id | model_name | author | created_at | downloads | likes | library | tags | trending_score | trending_rank | architecture | model_type | num_parameters | max_position_embeddings | hidden_size | num_attention_heads | num_hidden_layers | vocab_size | primary_category | secondary_categories | task_types | language_support | use_cases | performance_metrics | a2ap_compatibility_score | merge_difficulty | evolution_potential | analysis_timestamp | readme_summary | special_features
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
ibm-granite/granite-docling-258M | granite-docling-258M | ibm-granite | 2025-05-19T19:53:11+00:00 | 209,400 | 998 | transformers | ['transformers', 'safetensors', 'idefics3', 'image-to-text', 'text-generation', 'documents', 'code', 'formula', 'chart', 'ocr', 'layout', 'table', 'document-parse', 'docling', 'granite', 'extraction', 'math', 'image-text-to-text', 'conversational', 'en', 'dataset:ds4sd/SynthCodeNet', 'dataset:ds4sd/SynthFormulaNet', 'dataset:ds4sd/SynthChartNet', 'dataset:HuggingFaceM4/DoclingMatix', 'arxiv:2501.17887', 'arxiv:2503.11576', 'arxiv:2305.03393', 'license:apache-2.0', 'endpoints_compatible', 'region:us'] | 16 | 401 | Idefics3ForConditionalGeneration | idefics3 | null | null | null | null | null | 100,352 | Language Model | ['Fine-tuned', 'LoRA'] | ['text-generation', 'question-answering', 'summarization', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Education'] | {} | 48 | Hard | 0.48 | 2025-11-02T06:03:19.988669 | license: apache-2.0 datasets: - ds4sd/SynthCodeNet - ds4sd/SynthFormulaNet - ds4sd/SynthChartNet - HuggingFaceM4/DoclingMatix tags: - text-generation - documents - code - formula - chart - ocr - layou... | ['Fast Inference', 'Memory Efficient', 'Safety Aligned'] |
inclusionAI/Ling-1T | Ling-1T | inclusionAI | 2025-10-02T13:41:55+00:00 | 4,508 | 501 | transformers | ['transformers', 'safetensors', 'bailing_moe', 'text-generation', 'conversational', 'custom_code', 'arxiv:2507.17702', 'arxiv:2507.17634', 'license:mit', 'autotrain_compatible', 'region:us'] | 16 | 402 | BailingMoeV2ForCausalLM | bailing_moe | 65,712,160,768 | 32,768 | 8,192 | 64 | 80 | 157,184 | Language Model | ['RLHF', 'Merged'] | ['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Education', 'Legal'] | {} | 63 | Hard | 0.63 | 2025-11-02T06:03:20.134842 | license: mit pipeline_tag: text-generation library_name: transformers <p align="center"> <img src="https://mdn.alipayobjects.com/huamei_qa8qxu/afts/img/A*4QxcQrBlTiAAAAAAQXAAAAgAemJ7AQ/original" width... | ['Function Calling', 'Long Context', 'Fast Inference', 'Multi-turn'] |
meta-llama/Meta-Llama-3-8B-Instruct | Meta-Llama-3-8B-Instruct | meta-llama | 2024-04-17T09:35:12+00:00 | 984,685 | 4,254 | transformers | ['transformers', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'pytorch', 'llama-3', 'conversational', 'en', 'license:llama3', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 15 | 403 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:21.308919 | No README available | [] |
meta-llama/Llama-3.1-8B | Llama-3.1-8B | meta-llama | 2024-07-14T22:20:15+00:00 | 551,850 | 1,895 | transformers | ['transformers', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'pytorch', 'llama-3', 'en', 'de', 'fr', 'it', 'pt', 'hi', 'es', 'th', 'arxiv:2204.05149', 'license:llama3.1', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 15 | 404 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:22.500888 | No README available | [] |
Qwen/Qwen3-4B-Instruct-2507 | Qwen3-4B-Instruct-2507 | Qwen | 2025-08-05T10:58:03+00:00 | 3,942,352 | 436 | transformers | ['transformers', 'safetensors', 'qwen3', 'text-generation', 'conversational', 'arxiv:2505.09388', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 15 | 405 | Qwen3ForCausalLM | qwen3 | 3,220,111,360 | 262,144 | 2,560 | 32 | 36 | 151,936 | Language Model | [] | ['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Production', 'Creative Writing'] | {} | 68 | Medium | 0.68 | 2025-11-02T06:03:22.687904 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-4B-Instruct-2507/blob/main/LICENSE pipeline_tag: text-generation <a href="https://chat.qwen.ai" target="_... | ['RAG Support', 'Safety Aligned'] |
LiquidAI/LFM2-350M-ENJP-MT | LFM2-350M-ENJP-MT | LiquidAI | 2025-09-03T04:15:07+00:00 | 7,197 | 75 | transformers | ['transformers', 'safetensors', 'lfm2', 'text-generation', 'liquid', 'edge', 'translation', 'japanese', 'en', 'ja', 'base_model:LiquidAI/LFM2-350M', 'base_model:finetune:LiquidAI/LFM2-350M', 'license:other', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 15 | 406 | Lfm2ForCausalLM | lfm2 | 268,435,456 | 128,000 | 1,024 | 16 | 16 | 65,536 | Chat/Instruct | ['Fine-tuned', 'LoRA', 'Specialized'] | ['text-generation', 'translation', 'conversation'] | ['English', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Business', 'Healthcare', 'Legal', 'Finance'] | {} | 70 | Medium | 0.7 | 2025-11-02T06:03:24.320049 | library_name: transformers license: other license_name: lfm1.0 license_link: LICENSE language: - en - ja pipeline_tag: translation tags: - liquid - lfm2 - edge - translation - japanese base_model: - L... | ['Multi-turn'] |
rednote-hilab/dots.ocr | dots.ocr | rednote-hilab | 2025-07-30T09:55:44+00:00 | 1,171,956 | 1,106 | dots_ocr | ['dots_ocr', 'safetensors', 'text-generation', 'image-to-text', 'ocr', 'document-parse', 'layout', 'table', 'formula', 'transformers', 'custom_code', 'image-text-to-text', 'conversational', 'en', 'zh', 'multilingual', 'license:mit', 'region:us'] | 14 | 407 | DotsOCRForCausalLM | dots_ocr | 1,026,097,152 | 131,072 | 1,536 | 12 | 28 | 151,936 | Language Model | [] | ['text-generation', 'translation', 'summarization', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Finance'] | {} | 75 | Medium | 0.75 | 2025-11-02T06:03:24.562211 | license: mit library_name: dots_ocr pipeline_tag: image-text-to-text tags: - image-to-text - ocr - document-parse - layout - table - formula - transformers - custom_code language: - en - zh - multilin... | ['RAG Support', 'Fast Inference', 'Memory Efficient'] |
Qwen/Qwen3-Coder-30B-A3B-Instruct | Qwen3-Coder-30B-A3B-Instruct | Qwen | 2025-07-31T07:04:55+00:00 | 447,208 | 721 | transformers | ['transformers', 'safetensors', 'qwen3_moe', 'text-generation', 'conversational', 'arxiv:2505.09388', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 14 | 408 | Qwen3MoeForCausalLM | qwen3_moe | 2,727,084,032 | 262,144 | 2,048 | 32 | 48 | 151,936 | Language Model | [] | ['text-generation', 'question-answering', 'code-generation', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['General Purpose'] | {} | 78 | Medium | 0.78 | 2025-11-02T06:03:24.768006 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-Coder-30B-A3B-Instruct/blob/main/LICENSE pipeline_tag: text-generation <a href="https://chat.qwen.ai/" ta... | ['Fast Inference', 'Memory Efficient', 'Safety Aligned'] |
RUC-DataLab/DeepAnalyze-8B | DeepAnalyze-8B | RUC-DataLab | 2025-10-17T09:15:30+00:00 | 1,763 | 48 | transformers | ['transformers', 'safetensors', 'text-generation', 'dataset:RUC-DataLab/DataScience-Instruct-500K', 'arxiv:2510.16872', 'license:mit', 'endpoints_compatible', 'region:us'] | 14 | 409 | Unknown | unknown | null | null | null | null | null | null | Language Model | [] | ['text-generation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:25.984213 | datasets: - RUC-DataLab/DataScience-Instruct-500K license: mit pipeline_tag: text-generation library_name: transformers <p align="center" width="100%"> <img src="assets/logo.png" alt="DeepAnalyze" sty... | [] |
LiquidAI/LFM2-350M-PII-Extract-JP | LFM2-350M-PII-Extract-JP | LiquidAI | 2025-09-30T07:08:56+00:00 | 426 | 51 | transformers | ['transformers', 'safetensors', 'lfm2', 'text-generation', 'conversational', 'ja', 'base_model:LiquidAI/LFM2-350M', 'base_model:finetune:LiquidAI/LFM2-350M', 'license:other', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 13 | 410 | Lfm2ForCausalLM | lfm2 | 268,435,456 | 128,000 | 1,024 | 16 | 16 | 65,536 | Language Model | ['Fine-tuned', 'LoRA', 'Specialized'] | ['text-generation', 'translation', 'code-generation', 'conversation'] | ['English', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Healthcare'] | {} | 70 | Medium | 0.7 | 2025-11-02T06:03:26.160879 | library_name: transformers language: - ja base_model: - LiquidAI/LFM2-350M license: other license_name: lfm1.0 license_link: LICENSE <center> <div style="text-align: center;"> <img src="https://cdn-up... | ['RAG Support', 'Multi-turn'] |
KORMo-Team/KORMo-10B-sft | KORMo-10B-sft | KORMo-Team | 2025-10-12T12:13:51+00:00 | 2,532 | 111 | transformers | ['transformers', 'safetensors', 'kormo', 'text-generation', 'conversational', 'custom_code', 'arxiv:2510.09426', 'license:apache-2.0', 'autotrain_compatible', 'region:us'] | 13 | 411 | KORMoForCausalLM | kormo | 8,565,817,344 | 131,072 | 4,096 | 32 | 40 | 125,184 | Language Model | ['LoRA'] | ['text-generation', 'question-answering', 'reasoning', 'conversation'] | ['English', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Education', 'Healthcare'] | {} | 58 | Hard | 0.58 | 2025-11-02T06:03:28.302798 | library_name: transformers license: apache-2.0 <!-- <p align="center"> <img src="https://github.com/MLP-Lab/KORMo-tutorial/blob/main/tutorial/attachment/kormo_logo.png?raw=true" style="width: 100%; ma... | ['RAG Support', 'Safety Aligned'] |
cturan/MiniMax-M2-GGUF | MiniMax-M2-GGUF | cturan | 2025-10-28T17:22:15+00:00 | 4,667 | 13 | transformers | ['transformers', 'gguf', 'text-generation', 'base_model:MiniMaxAI/MiniMax-M2', 'base_model:quantized:MiniMaxAI/MiniMax-M2', 'license:mit', 'endpoints_compatible', 'region:us', 'conversational'] | 13 | 412 | Unknown | unknown | null | null | null | null | null | null | Reasoning | [] | ['text-generation', 'reasoning'] | ['English', 'Japanese', 'Spanish', 'German', 'Russian', 'Arabic'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:29.769489 | pipeline_tag: text-generation license: mit library_name: transformers base_model: - MiniMaxAI/MiniMax-M2 **Note:** This setup is experimental. The `minimax` branch will not work with the standard `lla... | [] |
Qwen/Qwen3-4B | Qwen3-4B | Qwen | 2025-04-27T03:41:29+00:00 | 1,248,515 | 432 | transformers | ['transformers', 'safetensors', 'qwen3', 'text-generation', 'conversational', 'arxiv:2309.00071', 'arxiv:2505.09388', 'base_model:Qwen/Qwen3-4B-Base', 'base_model:finetune:Qwen/Qwen3-4B-Base', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 12 | 413 | Qwen3ForCausalLM | qwen3 | 3,220,111,360 | 40,960 | 2,560 | 32 | 36 | 151,936 | Language Model | [] | ['text-generation', 'question-answering', 'translation', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Production', 'Creative Writing'] | {} | 81 | Medium | 0.81 | 2025-11-02T06:03:29.980909 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-4B/blob/main/LICENSE pipeline_tag: text-generation base_model: - Qwen/Qwen3-4B-Base <a href="https://chat... | ['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned'] |
Qwen/Qwen3-8B | Qwen3-8B | Qwen | 2025-04-27T03:42:21+00:00 | 2,548,149 | 708 | transformers | ['transformers', 'safetensors', 'qwen3', 'text-generation', 'conversational', 'arxiv:2309.00071', 'arxiv:2505.09388', 'base_model:Qwen/Qwen3-8B-Base', 'base_model:finetune:Qwen/Qwen3-8B-Base', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 12 | 414 | Qwen3ForCausalLM | qwen3 | 7,870,087,168 | 40,960 | 4,096 | 32 | 36 | 151,936 | Language Model | [] | ['text-generation', 'question-answering', 'translation', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Production', 'Creative Writing'] | {} | 81 | Medium | 0.81 | 2025-11-02T06:03:30.122764 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-8B/blob/main/LICENSE pipeline_tag: text-generation base_model: - Qwen/Qwen3-8B-Base <a href="https://chat... | ['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned'] |
nvidia/NVIDIA-Nemotron-Nano-12B-v2 | NVIDIA-Nemotron-Nano-12B-v2 | nvidia | 2025-08-21T01:31:50+00:00 | 69,727 | 117 | transformers | ['transformers', 'safetensors', 'nvidia', 'pytorch', 'text-generation', 'conversational', 'en', 'es', 'fr', 'de', 'it', 'ja', 'dataset:nvidia/Nemotron-Post-Training-Dataset-v1', 'dataset:nvidia/Nemotron-Post-Training-Dataset-v2', 'dataset:nvidia/Nemotron-Pretraining-Dataset-sample', 'dataset:nvidia/Nemotron-CC-v2', 'dataset:nvidia/Nemotron-CC-Math-v1', 'dataset:nvidia/Nemotron-Pretraining-SFT-v1', 'arxiv:2504.03624', 'arxiv:2508.14444', 'arxiv:2412.02595', 'base_model:nvidia/NVIDIA-Nemotron-Nano-12B-v2-Base', 'base_model:finetune:nvidia/NVIDIA-Nemotron-Nano-12B-v2-Base', 'license:other', 'endpoints_compatible', 'region:us'] | 12 | 415 | NemotronHForCausalLM | nemotron_h | 20,174,602,240 | 131,072 | 5,120 | 40 | 62 | 131,072 | Language Model | ['Fine-tuned', 'Quantized', 'Merged', 'Specialized'] | ['text-generation', 'question-answering', 'translation', 'summarization', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Business', 'Legal', 'Finance'] | {} | 61 | Hard | 0.61 | 2025-11-02T06:03:30.293300 | license: other license_name: nvidia-open-model-license license_link: >- https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-open-model-license/ pipeline_tag: text-generation datasets: -... | ['RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient', 'Multi-turn', 'Safety Aligned'] |
microsoft/VibeVoice-1.5B | VibeVoice-1.5B | microsoft | 2025-08-25T13:46:48+00:00 | 223,609 | 1,943 | transformers | ['transformers', 'safetensors', 'vibevoice', 'text-generation', 'Podcast', 'text-to-speech', 'en', 'zh', 'arxiv:2508.19205', 'arxiv:2412.08635', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 12 | 416 | VibeVoiceForConditionalGeneration | vibevoice | null | null | null | null | null | null | Language Model | ['Specialized'] | ['text-generation', 'question-answering', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Education', 'Business', 'Legal'] | {} | 53 | Hard | 0.53 | 2025-11-02T06:03:30.555314 | language: - en - zh license: mit pipeline_tag: text-to-speech tags: - Podcast library_name: transformers VibeVoice is a novel framework designed for generating expressive, long-form, multi-speaker con... | ['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn'] |
LiquidAI/LFM2-350M-Extract | LFM2-350M-Extract | LiquidAI | 2025-09-03T17:13:53+00:00 | 5,852 | 61 | transformers | ['transformers', 'safetensors', 'lfm2', 'text-generation', 'liquid', 'edge', 'conversational', 'en', 'ar', 'zh', 'fr', 'de', 'ja', 'ko', 'es', 'base_model:LiquidAI/LFM2-350M', 'base_model:finetune:LiquidAI/LFM2-350M', 'license:other', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 12 | 417 | Lfm2ForCausalLM | lfm2 | 268,435,456 | 128,000 | 1,024 | 16 | 16 | 65,536 | Language Model | ['LoRA'] | ['text-generation', 'translation', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Creative Writing'] | {} | 70 | Medium | 0.7 | 2025-11-02T06:03:30.714878 | library_name: transformers license: other license_name: lfm1.0 license_link: LICENSE language: - en - ar - zh - fr - de - ja - ko - es pipeline_tag: text-generation tags: - liquid - lfm2 - edge base_m... | ['Multi-turn'] |
tencent/HunyuanImage-3.0 | HunyuanImage-3.0 | tencent | 2025-09-25T06:28:28+00:00 | 101,407 | 947 | transformers | ['transformers', 'safetensors', 'hunyuan_image_3_moe', 'text-generation', 'text-to-image', 'custom_code', 'arxiv:2509.23951', 'license:other', 'autotrain_compatible', 'region:us'] | 12 | 418 | HunyuanImage3ForCausalMM | hunyuan_image_3_moe | 6,987,710,464 | 12,800 | 4,096 | 32 | 32 | 133,120 | Language Model | ['RLHF'] | ['text-generation', 'translation', 'summarization', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Education', 'Creative Writing'] | {} | 55 | Hard | 0.55 | 2025-11-02T06:03:30.841044 | license: other license_name: tencent-hunyuan-community license_link: LICENSE pipeline_tag: text-to-image library_name: transformers <div align="center"> <img src="./assets/logo.png" alt="HunyuanImage-... | ['RAG Support', 'Fast Inference', 'Memory Efficient', 'Multi-turn'] |
meta-llama/Llama-2-7b-chat-hf | Llama-2-7b-chat-hf | meta-llama | 2023-07-13T16:45:23+00:00 | 380,375 | 4,631 | transformers | ['transformers', 'pytorch', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'llama-2', 'conversational', 'en', 'arxiv:2307.09288', 'license:llama2', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 11 | 419 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:31.997241 | No README available | [] |
Qwen/Qwen2.5-0.5B | Qwen2.5-0.5B | Qwen | 2024-09-15T12:15:39+00:00 | 747,023 | 329 | transformers | ['transformers', 'safetensors', 'qwen2', 'text-generation', 'conversational', 'en', 'arxiv:2407.10671', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 11 | 420 | Qwen2ForCausalLM | qwen2 | 367,345,664 | 32,768 | 896 | 14 | 24 | 151,936 | Language Model | ['RLHF', 'Specialized'] | ['text-generation', 'question-answering', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['General Purpose'] | {} | 91 | Easy | 0.91 | 2025-11-02T06:03:32.183318 | license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen2.5-0.5B/blob/main/LICENSE language: - en pipeline_tag: text-generation library_name: transformers Qwen2.5 is the latest series of Qwe... | ['Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned'] |
meta-llama/Llama-3.2-3B-Instruct | Llama-3.2-3B-Instruct | meta-llama | 2024-09-18T15:19:20+00:00 | 1,924,026 | 1,789 | transformers | ['transformers', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'pytorch', 'llama-3', 'conversational', 'en', 'de', 'fr', 'it', 'pt', 'hi', 'es', 'th', 'arxiv:2204.05149', 'arxiv:2405.16406', 'license:llama3.2', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 11 | 421 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:35.428809 | No README available | [] |
MiniMaxAI/MiniMax-M1-80k | MiniMax-M1-80k | MiniMaxAI | 2025-06-13T08:21:14+00:00 | 262 | 685 | transformers | ['transformers', 'safetensors', 'minimax_m1', 'text-generation', 'vllm', 'conversational', 'custom_code', 'arxiv:2506.13585', 'license:apache-2.0', 'autotrain_compatible', 'region:us'] | 11 | 422 | MiniMaxM1ForCausalLM | minimax_m1 | 37,467,979,776 | 10,240,000 | 6,144 | 64 | 80 | 200,064 | Language Model | ['RLHF', 'Merged'] | ['text-generation', 'question-answering', 'translation', 'summarization', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Education', 'Creative Writing', 'Business'] | {} | 66 | Medium | 0.66 | 2025-11-02T06:03:35.679171 | pipeline_tag: text-generation license: apache-2.0 library_name: transformers tags: - vllm <div align="center"> <svg width="60%" height="auto" viewBox="0 0 144 48" fill="none" xmlns="http://www.w3.org/... | ['Function Calling', 'RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient', 'Multi-turn', 'Safety Aligned'] |
Qwen/Qwen3-30B-A3B-Instruct-2507 | Qwen3-30B-A3B-Instruct-2507 | Qwen | 2025-07-28T07:31:27+00:00 | 1,212,976 | 638 | transformers | ['transformers', 'safetensors', 'qwen3_moe', 'text-generation', 'conversational', 'arxiv:2402.17463', 'arxiv:2407.02490', 'arxiv:2501.15383', 'arxiv:2404.06654', 'arxiv:2505.09388', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 11 | 423 | Qwen3MoeForCausalLM | qwen3_moe | 2,727,084,032 | 262,144 | 2,048 | 32 | 48 | 151,936 | Language Model | ['Specialized'] | ['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Production', 'Creative Writing'] | {} | 96 | Easy | 0.96 | 2025-11-02T06:03:35.833304 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-30B-A3B-Instruct-2507/blob/main/LICENSE pipeline_tag: text-generation <a href="https://chat.qwen.ai/?mode... | ['Function Calling', 'RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned'] |
unsloth/gpt-oss-20b-GGUF | gpt-oss-20b-GGUF | unsloth | 2025-08-05T17:12:17+00:00 | 211,159 | 451 | transformers | ['transformers', 'gguf', 'gpt_oss', 'text-generation', 'openai', 'unsloth', 'base_model:openai/gpt-oss-20b', 'base_model:quantized:openai/gpt-oss-20b', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us', 'conversational'] | 11 | 424 | GptOssForCausalLM | gpt_oss | 2,967,920,640 | 131,072 | 2,880 | 64 | 24 | 201,088 | Language Model | ['Fine-tuned', 'Quantized', 'Specialized'] | ['text-generation', 'reasoning', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Business'] | {} | 85 | Easy | 0.85 | 2025-11-02T06:03:35.979576 | base_model: - openai/gpt-oss-20b license: apache-2.0 pipeline_tag: text-generation library_name: transformers tags: - openai - unsloth > [!NOTE] > GGUF uploads with our fixes. More details and [Read o... | ['Function Calling', 'Fast Inference', 'Multi-turn'] |
Qwen/Qwen3-Next-80B-A3B-Instruct | Qwen3-Next-80B-A3B-Instruct | Qwen | 2025-09-09T15:40:56+00:00 | 1,663,221 | 845 | transformers | ['transformers', 'safetensors', 'qwen3_next', 'text-generation', 'conversational', 'arxiv:2309.00071', 'arxiv:2404.06654', 'arxiv:2505.09388', 'arxiv:2501.15383', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 11 | 425 | Qwen3NextForCausalLM | qwen3_next | 2,727,084,032 | 262,144 | 2,048 | 16 | 48 | 151,936 | Language Model | ['Merged'] | ['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Creative Writing'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:03:36.135334 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-Next-80B-A3B-Instruct/blob/main/LICENSE pipeline_tag: text-generation <a href="https://chat.qwen.ai/" tar... | ['Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned'] |
inclusionAI/Ring-1T | Ring-1T | inclusionAI | 2025-10-10T16:39:04+00:00 | 1,807 | 216 | transformers | ['transformers', 'safetensors', 'bailing_moe', 'text-generation', 'conversational', 'custom_code', 'arxiv:2510.18855', 'license:mit', 'autotrain_compatible', 'region:us'] | 11 | 426 | BailingMoeV2ForCausalLM | bailing_moe | 65,712,160,768 | 65,536 | 8,192 | 64 | 80 | 157,184 | Language Model | ['RLHF'] | ['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Education', 'Creative Writing', 'Business', 'Healthcare'] | {} | 63 | Hard | 0.63 | 2025-11-02T06:03:36.372208 | pipeline_tag: text-generation license: mit library_name: transformers <p align="center"> <img src="https://mdn.alipayobjects.com/huamei_qa8qxu/afts/img/A*4QxcQrBlTiAAAAAAQXAAAAgAemJ7AQ/original" width... | ['Function Calling', 'RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient'] |
cerebras/GLM-4.6-REAP-218B-A32B-FP8 | GLM-4.6-REAP-218B-A32B-FP8 | cerebras | 2025-10-23T20:29:59+00:00 | 757 | 34 | transformers | ['transformers', 'safetensors', 'glm4_moe', 'text-generation', 'glm', 'MOE', 'pruning', 'compression', 'conversational', 'en', 'arxiv:2510.13999', 'base_model:zai-org/GLM-4.6-FP8', 'base_model:quantized:zai-org/GLM-4.6-FP8', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'compressed-tensors', 'region:us'] | 11 | 427 | Glm4MoeForCausalLM | glm4_moe | 29,716,643,840 | 202,752 | 5,120 | 96 | 92 | 151,552 | Language Model | ['Specialized'] | ['text-generation', 'question-answering', 'code-generation', 'reasoning'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Creative Writing'] | {} | 60 | Hard | 0.6 | 2025-11-02T06:03:36.501473 | language: - en library_name: transformers tags: - glm - MOE - pruning - compression license: mit name: cerebras/GLM-4.6-REAP-218B-A32B-FP8 description: > This model was obtained by uniformly pruning 4... | ['Function Calling', 'Fast Inference', 'Memory Efficient', 'Multi-turn'] |
internlm/JanusCoder-8B | JanusCoder-8B | internlm | 2025-10-27T09:33:54+00:00 | 139 | 11 | transformers | ['transformers', 'safetensors', 'qwen3', 'text-generation', 'image-text-to-text', 'conversational', 'arxiv:2510.23538', 'arxiv:2403.14734', 'arxiv:2510.09724', 'arxiv:2507.22080', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 11 | 428 | Qwen3ForCausalLM | qwen3 | 7,870,087,168 | 32,768 | 4,096 | 32 | 36 | 151,936 | Language Model | ['Specialized'] | ['text-generation', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research'] | {} | 68 | Medium | 0.68 | 2025-11-02T06:03:36.685705 | license: apache-2.0 pipeline_tag: image-text-to-text library_name: transformers [💻Github Repo](https://github.com/InternLM/JanusCoder) • [🤗Model Collections](https://huggingface.co/collections/internl... | ['Safety Aligned'] |
unsloth/MiniMax-M2 | MiniMax-M2 | unsloth | 2025-10-28T12:05:08+00:00 | 264 | 11 | transformers | ['transformers', 'safetensors', 'minimax', 'text-generation', 'conversational', 'arxiv:2504.07164', 'arxiv:2509.06501', 'arxiv:2509.13160', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'fp8', 'region:us'] | 11 | 429 | MiniMaxM2ForCausalLM | minimax | 7,635,861,504 | 196,608 | 3,072 | 48 | 62 | 200,064 | Language Model | ['LoRA'] | ['text-generation', 'question-answering', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production'] | {} | 91 | Easy | 0.91 | 2025-11-02T06:03:36.854738 | pipeline_tag: text-generation license: mit library_name: transformers <div align="center"> <svg width="60%" height="auto" viewBox="0 0 144 48" fill="none" xmlns="http://www.w3.org/2000/svg"> <path d="... | ['Function Calling', 'RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned'] |
ByteDance/Ouro-1.4B-Thinking | Ouro-1.4B-Thinking | ByteDance | 2025-10-28T22:14:40+00:00 | 34 | 11 | transformers | ['transformers', 'safetensors', 'ouro', 'text-generation', 'looped-language-model', 'reasoning', 'recurrent-depth', 'thinking', 'chain-of-thought', 'conversational', 'custom_code', 'arxiv:2510.25741', 'license:apache-2.0', 'autotrain_compatible', 'region:us'] | 11 | 430 | OuroForCausalLM | ouro | 1,308,622,848 | 65,536 | 2,048 | 16 | 24 | 49,152 | Language Model | ['Specialized'] | ['text-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Creative Writing'] | {} | 83 | Medium | 0.83 | 2025-11-02T06:03:37.035534 | library_name: transformers license: apache-2.0 pipeline_tag: text-generation tags: - looped-language-model - reasoning - recurrent-depth - thinking - chain-of-thought  **⚠... | ['RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient'] |
mistralai/Mistral-7B-Instruct-v0.2 | Mistral-7B-Instruct-v0.2 | mistralai | 2023-12-11T13:18:44+00:00 | 2,963,552 | 2,997 | transformers | ['transformers', 'pytorch', 'safetensors', 'mistral', 'text-generation', 'finetuned', 'mistral-common', 'conversational', 'arxiv:2310.06825', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'region:us'] | 10 | 431 | MistralForCausalLM | mistral | 6,573,522,944 | 32,768 | 4,096 | 32 | 32 | 32,000 | Language Model | ['Fine-tuned'] | ['text-generation', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Education'] | {} | 78 | Medium | 0.78 | 2025-11-02T06:03:39.548652 | library_name: transformers license: apache-2.0 tags: - finetuned - mistral-common new_version: mistralai/Mistral-7B-Instruct-v0.3 inference: false widget: - messages: - role: user content: What is you... | ['RAG Support', 'Long Context'] |
fdtn-ai/Foundation-Sec-8B | Foundation-Sec-8B | fdtn-ai | 2025-04-26T17:20:37+00:00 | 8,906 | 265 | transformers | ['transformers', 'safetensors', 'llama', 'text-generation', 'security', 'en', 'arxiv:2504.21039', 'base_model:meta-llama/Llama-3.1-8B', 'base_model:finetune:meta-llama/Llama-3.1-8B', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 10 | 432 | LlamaForCausalLM | llama | 6,968,311,808 | 131,072 | 4,096 | 32 | 32 | 128,384 | Language Model | ['Specialized'] | ['text-generation', 'question-answering', 'text-classification', 'summarization', 'reasoning'] | ['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Healthcare', 'Legal'] | {} | 78 | Medium | 0.78 | 2025-11-02T06:03:39.720049 | base_model: - meta-llama/Llama-3.1-8B language: - en library_name: transformers license: apache-2.0 pipeline_tag: text-generation tags: - security Foundation-Sec-8B (Llama-3.1-FoundationAI-SecurityLLM... | ['RAG Support', 'Fast Inference', 'Memory Efficient', 'Safety Aligned'] |
katanemo/Arch-Router-1.5B | Arch-Router-1.5B | katanemo | 2025-05-30T18:16:23+00:00 | 3,508 | 216 | transformers | ['transformers', 'safetensors', 'qwen2', 'text-generation', 'routing', 'preference', 'arxiv:2506.16655', 'llm', 'conversational', 'en', 'base_model:Qwen/Qwen2.5-1.5B-Instruct', 'base_model:finetune:Qwen/Qwen2.5-1.5B-Instruct', 'license:other', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 10 | 433 | Qwen2ForCausalLM | qwen2 | 1,026,097,152 | 32,768 | 1,536 | 12 | 28 | 151,936 | Language Model | ['Specialized'] | ['text-generation', 'question-answering', 'translation', 'summarization', 'code-generation', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Healthcare', 'Legal'] | {} | 83 | Medium | 0.83 | 2025-11-02T06:03:39.984687 | base_model: - Qwen/Qwen2.5-1.5B-Instruct language: - en library_name: transformers license: other license_name: katanemo-research license_link: https://huggingface.co/katanemo/Arch-Router-1.5B/blob/ma... | ['Fast Inference', 'Memory Efficient', 'Multi-turn', 'Safety Aligned'] |
dphn/Dolphin-Mistral-24B-Venice-Edition | Dolphin-Mistral-24B-Venice-Edition | dphn | 2025-06-12T05:29:16+00:00 | 7,206 | 279 | transformers | ['transformers', 'safetensors', 'mistral', 'text-generation', 'conversational', 'base_model:mistralai/Mistral-Small-24B-Instruct-2501', 'base_model:finetune:mistralai/Mistral-Small-24B-Instruct-2501', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 10 | 434 | MistralForCausalLM | mistral | 13,254,000,640 | 32,768 | 5,120 | 32 | 40 | 131,072 | Language Model | ['Specialized'] | ['text-generation', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Business', 'Legal'] | {} | 63 | Hard | 0.63 | 2025-11-02T06:03:40.260696 | license: apache-2.0 base_model: - mistralai/Mistral-Small-24B-Instruct-2501 pipeline_tag: text-generation library_name: transformers Website: https://dphn.ai Twitter: https://x.com/dphnAI Web Chat: ht... | ['Multi-turn', 'Safety Aligned'] |
LiquidAI/LFM2-350M-Math | LFM2-350M-Math | LiquidAI | 2025-08-25T17:06:02+00:00 | 1,296 | 49 | transformers | ['transformers', 'safetensors', 'lfm2', 'text-generation', 'liquid', 'edge', 'conversational', 'en', 'base_model:LiquidAI/LFM2-350M', 'base_model:finetune:LiquidAI/LFM2-350M', 'license:other', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 10 | 435 | Lfm2ForCausalLM | lfm2 | 268,435,456 | 128,000 | 1,024 | 16 | 16 | 65,536 | Chat/Instruct | ['RLHF', 'LoRA'] | ['text-generation', 'translation', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Education'] | {} | 70 | Medium | 0.7 | 2025-11-02T06:03:40.411436 | library_name: transformers license: other license_name: lfm1.0 license_link: LICENSE language: - en pipeline_tag: text-generation tags: - liquid - lfm2 - edge base_model: LiquidAI/LFM2-350M <center> <... | ['RAG Support', 'Multi-turn'] |
facebook/MobileLLM-Pro | MobileLLM-Pro | facebook | 2025-09-10T18:40:06+00:00 | 4,486 | 138 | transformers | ['transformers', 'safetensors', 'llama4_text', 'text-generation', 'facebook', 'meta', 'pytorch', 'conversational', 'custom_code', 'en', 'base_model:facebook/MobileLLM-Pro-base', 'base_model:finetune:facebook/MobileLLM-Pro-base', 'license:fair-noncommercial-research-license', 'autotrain_compatible', 'region:us'] | 10 | 436 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:41.632278 | No README available | [] |
nineninesix/kani-tts-370m | kani-tts-370m | nineninesix | 2025-09-30T07:31:22+00:00 | 8,444 | 136 | transformers | ['transformers', 'safetensors', 'lfm2', 'text-generation', 'text-to-speech', 'en', 'de', 'ar', 'zh', 'es', 'ko', 'arxiv:2505.20506', 'base_model:nineninesix/kani-tts-450m-0.2-pt', 'base_model:finetune:nineninesix/kani-tts-450m-0.2-pt', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 10 | 437 | Lfm2ForCausalLM | lfm2 | 283,798,528 | 128,000 | 1,024 | 16 | 16 | 80,539 | Language Model | ['Fine-tuned', 'Specialized'] | ['conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Legal'] | {} | 75 | Medium | 0.75 | 2025-11-02T06:03:41.800663 | license: apache-2.0 language: - en - de - ar - zh - es - ko pipeline_tag: text-to-speech library_name: transformers base_model: - nineninesix/kani-tts-450m-0.2-pt <p> <img src="https://www.nineninesix... | ['Fast Inference', 'Memory Efficient', 'Multi-turn'] |
vandijklab/C2S-Scale-Gemma-2-27B | C2S-Scale-Gemma-2-27B | vandijklab | 2025-10-06T20:22:30+00:00 | 9,715 | 137 | transformers | ['transformers', 'safetensors', 'gemma2', 'text-generation', 'biology', 'scRNAseq', 'genomics', 'computational-biology', 'bioinformatics', 'gene-expression', 'cell-biology', 'pytorch', 'cell-type-annotation', 'Question Answering', 'en', 'base_model:google/gemma-2-27b', 'base_model:finetune:google/gemma-2-27b', 'license:cc-by-4.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 10 | 438 | Gemma2ForCausalLM | gemma2 | 12,900,630,528 | 8,192 | 4,608 | 32 | 46 | 256,000 | Language Model | ['Fine-tuned', 'RLHF', 'Specialized'] | ['text-generation', 'question-answering', 'text-classification', 'reasoning'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Education', 'Legal'] | {} | 60 | Hard | 0.6 | 2025-11-02T06:03:41.977487 | license: cc-by-4.0 language: - en base_model: google/gemma-2-27b library_name: transformers pipeline_tag: text-generation tags: - biology - scRNAseq - gemma2 - genomics - computational-biology - bioin... | ['RAG Support', 'Fast Inference'] |
openai-community/gpt2 | gpt2 | openai-community | 2022-03-02T23:29:04+00:00 | 11,402,243 | 3,004 | transformers | ['transformers', 'pytorch', 'tf', 'jax', 'tflite', 'rust', 'onnx', 'safetensors', 'gpt2', 'text-generation', 'exbert', 'en', 'doi:10.57967/hf/0039', 'license:mit', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 9 | 439 | GPT2LMHeadModel | gpt2 | null | null | null | null | null | 50,257 | Language Model | ['Fine-tuned'] | ['text-generation', 'code-generation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:42.144996 | language: en tags: - exbert license: mit Test the whole generation capabilities here: https://transformer.huggingface.co/doc/gpt2-large Pretrained model on English language using a causal language mod... | [] |
google/gemma-7b | gemma-7b | google | 2024-02-08T22:36:43+00:00 | 31,556 | 3,228 | transformers | ['transformers', 'safetensors', 'gguf', 'gemma', 'text-generation', 'arxiv:2305.14314', 'arxiv:2312.11805', 'arxiv:2009.03300', 'arxiv:1905.07830', 'arxiv:1911.11641', 'arxiv:1904.09728', 'arxiv:1905.10044', 'arxiv:1907.10641', 'arxiv:1811.00937', 'arxiv:1809.02789', 'arxiv:1911.01547', 'arxiv:1705.03551', 'arxiv:2107.03374', 'arxiv:2108.07732', 'arxiv:2110.14168', 'arxiv:2304.06364', 'arxiv:2206.04615', 'arxiv:1804.06876', 'arxiv:2110.08193', 'arxiv:2009.11462', 'arxiv:2101.11718', 'arxiv:1804.09301', 'arxiv:2109.07958', 'arxiv:2203.09509', 'license:gemma', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 9 | 440 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:43.364377 | No README available | [] |
meta-llama/Llama-3.3-70B-Instruct | Llama-3.3-70B-Instruct | meta-llama | 2024-11-26T16:08:47+00:00 | 764,118 | 2,549 | transformers | ['transformers', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'pytorch', 'llama-3', 'conversational', 'en', 'fr', 'it', 'pt', 'hi', 'es', 'th', 'de', 'arxiv:2204.05149', 'base_model:meta-llama/Llama-3.1-70B', 'base_model:finetune:meta-llama/Llama-3.1-70B', 'license:llama3.3', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 9 | 441 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:46.687702 | No README available | [] |
Qwen/Qwen3-1.7B | Qwen3-1.7B | Qwen | 2025-04-27T03:41:05+00:00 | 1,284,936 | 305 | transformers | ['transformers', 'safetensors', 'qwen3', 'text-generation', 'conversational', 'arxiv:2505.09388', 'base_model:Qwen/Qwen3-1.7B-Base', 'base_model:finetune:Qwen/Qwen3-1.7B-Base', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 9 | 442 | Qwen3ForCausalLM | qwen3 | 1,720,451,072 | 40,960 | 2,048 | 16 | 28 | 151,936 | Language Model | [] | ['text-generation', 'question-answering', 'translation', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Production', 'Creative Writing'] | {} | 78 | Medium | 0.78 | 2025-11-02T06:03:46.820218 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-1.7B/blob/main/LICENSE pipeline_tag: text-generation base_model: - Qwen/Qwen3-1.7B-Base <a href="https://... | ['Fast Inference', 'Multi-turn', 'Safety Aligned'] |
LiquidAI/LFM2-350M | LFM2-350M | LiquidAI | 2025-07-10T12:01:24+00:00 | 26,822 | 170 | transformers | ['transformers', 'safetensors', 'lfm2', 'text-generation', 'liquid', 'edge', 'conversational', 'en', 'ar', 'zh', 'fr', 'de', 'ja', 'ko', 'es', 'license:other', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 9 | 443 | Lfm2ForCausalLM | lfm2 | 268,435,456 | 128,000 | 1,024 | 16 | 16 | 65,536 | Language Model | ['LoRA'] | ['text-generation', 'question-answering', 'translation', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Creative Writing', 'Healthcare'] | {} | 88 | Easy | 0.88 | 2025-11-02T06:03:47.041244 | library_name: transformers license: other license_name: lfm1.0 license_link: LICENSE language: - en - ar - zh - fr - de - ja - ko - es pipeline_tag: text-generation tags: - liquid - lfm2 - edge <cente... | ['Function Calling', 'RAG Support', 'Fast Inference', 'Multi-turn', 'Safety Aligned'] |
moonshotai/Kimi-K2-Instruct | Kimi-K2-Instruct | moonshotai | 2025-07-11T00:55:12+00:00 | 81,900 | 2,198 | transformers | ['transformers', 'safetensors', 'kimi_k2', 'text-generation', 'conversational', 'custom_code', 'doi:10.57967/hf/5976', 'license:other', 'autotrain_compatible', 'endpoints_compatible', 'fp8', 'region:us'] | 9 | 444 | DeepseekV3ForCausalLM | kimi_k2 | 38,784,729,088 | 131,072 | 7,168 | 64 | 61 | 163,840 | Language Model | [] | ['question-answering', 'summarization', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production'] | {} | 63 | Hard | 0.63 | 2025-11-02T06:03:47.251259 | license: other license_name: modified-mit library_name: transformers new_version: moonshotai/Kimi-K2-Instruct-0905 <div align="center"> <picture> <img src="figures/kimi-logo.png" width="30%" alt="Kimi... | ['Function Calling', 'RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient', 'Multi-turn'] |
google/gemma-3-270m-it | gemma-3-270m-it | google | 2025-07-30T18:06:27+00:00 | 206,547 | 448 | transformers | ['transformers', 'safetensors', 'gemma3_text', 'text-generation', 'gemma3', 'gemma', 'google', 'conversational', 'arxiv:2503.19786', 'arxiv:1905.07830', 'arxiv:1905.10044', 'arxiv:1911.11641', 'arxiv:1705.03551', 'arxiv:1911.01547', 'arxiv:1907.10641', 'arxiv:2311.07911', 'arxiv:2311.12022', 'arxiv:2411.04368', 'arxiv:1904.09728', 'arxiv:1903.00161', 'arxiv:2009.03300', 'arxiv:2304.06364', 'arxiv:2103.03874', 'arxiv:2110.14168', 'arxiv:2108.07732', 'arxiv:2107.03374', 'arxiv:2403.07974', 'arxiv:2305.03111', 'arxiv:2405.04520', 'arxiv:2210.03057', 'arxiv:2106.03193', 'arxiv:1910.11856', 'arxiv:2502.12404', 'arxiv:2502.21228', 'arxiv:2404.16816', 'arxiv:2104.12756', 'arxiv:2311.16502', 'arxiv:2203.10244', 'arxiv:2404.12390', 'arxiv:1810.12440', 'arxiv:1908.02660', 'arxiv:2310.02255', 'arxiv:2312.11805', 'base_model:google/gemma-3-270m', 'base_model:finetune:google/gemma-3-270m', 'license:gemma', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 9 | 445 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:48.418413 | No README available | [] |
nvidia/NVIDIA-Nemotron-Nano-9B-v2 | NVIDIA-Nemotron-Nano-9B-v2 | nvidia | 2025-08-12T22:43:32+00:00 | 190,478 | 419 | transformers | ['transformers', 'safetensors', 'nvidia', 'pytorch', 'text-generation', 'conversational', 'en', 'es', 'fr', 'de', 'it', 'ja', 'dataset:nvidia/Nemotron-Post-Training-Dataset-v1', 'dataset:nvidia/Nemotron-Post-Training-Dataset-v2', 'dataset:nvidia/Nemotron-Pretraining-Dataset-sample', 'dataset:nvidia/Nemotron-CC-v2', 'dataset:nvidia/Nemotron-CC-Math-v1', 'dataset:nvidia/Nemotron-Pretraining-SFT-v1', 'arxiv:2504.03624', 'arxiv:2508.14444', 'arxiv:2412.02595', 'base_model:nvidia/NVIDIA-Nemotron-Nano-12B-v2', 'base_model:finetune:nvidia/NVIDIA-Nemotron-Nano-12B-v2', 'license:other', 'endpoints_compatible', 'region:us'] | 9 | 446 | NemotronHForCausalLM | nemotron_h | 14,074,511,360 | 131,072 | 4,480 | 40 | 56 | 131,072 | Language Model | ['Quantized', 'Merged', 'Specialized'] | ['text-generation', 'question-answering', 'translation', 'summarization', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Business', 'Legal', 'Finance'] | {} | 71 | Medium | 0.71 | 2025-11-02T06:03:48.555502 | license: other license_name: nvidia-open-model-license license_link: >- https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-open-model-license/ pipeline_tag: text-generation datasets: -... | ['RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient', 'Multi-turn', 'Safety Aligned'] |
LiquidAI/LFM2-8B-A1B | LFM2-8B-A1B | LiquidAI | 2025-10-07T13:55:39+00:00 | 13,348 | 231 | transformers | ['transformers', 'safetensors', 'lfm2_moe', 'text-generation', 'liquid', 'lfm2', 'edge', 'moe', 'conversational', 'en', 'ar', 'zh', 'fr', 'de', 'ja', 'ko', 'es', 'license:other', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 9 | 447 | Lfm2MoeForCausalLM | lfm2_moe | 1,342,177,280 | 128,000 | 2,048 | 32 | 24 | 65,536 | Language Model | ['Quantized', 'LoRA'] | ['text-generation', 'question-answering', 'translation', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Creative Writing', 'Healthcare'] | {} | 88 | Easy | 0.88 | 2025-11-02T06:03:48.680263 | library_name: transformers license: other license_name: lfm1.0 license_link: LICENSE language: - en - ar - zh - fr - de - ja - ko - es pipeline_tag: text-generation tags: - liquid - lfm2 - edge - moe ... | ['Function Calling', 'RAG Support', 'Fast Inference', 'Multi-turn', 'Safety Aligned'] |
cerebras/Qwen3-Coder-REAP-25B-A3B | Qwen3-Coder-REAP-25B-A3B | cerebras | 2025-10-20T15:40:03+00:00 | 971 | 32 | transformers | ['transformers', 'qwen3_moe', 'text-generation', 'qwen-coder', 'MOE', 'pruning', 'compression', 'conversational', 'en', 'arxiv:2510.13999', 'base_model:Qwen/Qwen3-Coder-30B-A3B-Instruct', 'base_model:finetune:Qwen/Qwen3-Coder-30B-A3B-Instruct', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 9 | 448 | Qwen3MoeForCausalLM | qwen3_moe | 2,727,084,032 | 262,144 | 2,048 | 32 | 48 | 151,936 | Language Model | ['Specialized'] | ['text-generation', 'question-answering', 'code-generation', 'reasoning'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Creative Writing'] | {} | 85 | Easy | 0.85 | 2025-11-02T06:03:48.842699 | language: - en library_name: transformers tags: - qwen-coder - MOE - pruning - compression license: apache-2.0 name: cerebras/Qwen3-Coder-REAP-25B-A3B description: > This model was obtained by uniform... | ['Function Calling', 'Fast Inference', 'Memory Efficient', 'Multi-turn'] |
meta-llama/Llama-3.2-1B-Instruct | Llama-3.2-1B-Instruct | meta-llama | 2024-09-18T15:12:47+00:00 | 3,843,095 | 1,141 | transformers | ['transformers', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'pytorch', 'llama-3', 'conversational', 'en', 'de', 'fr', 'it', 'pt', 'hi', 'es', 'th', 'arxiv:2204.05149', 'arxiv:2405.16406', 'license:llama3.2', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 8 | 449 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:50.032006 | No README available | [] |
microsoft/Phi-4-mini-instruct | Phi-4-mini-instruct | microsoft | 2025-02-19T01:00:58+00:00 | 249,315 | 623 | transformers | ['transformers', 'safetensors', 'phi3', 'text-generation', 'nlp', 'code', 'conversational', 'custom_code', 'multilingual', 'ar', 'zh', 'cs', 'da', 'nl', 'en', 'fi', 'fr', 'de', 'he', 'hu', 'it', 'ja', 'ko', 'no', 'pl', 'pt', 'ru', 'es', 'sv', 'th', 'tr', 'uk', 'arxiv:2503.01743', 'license:mit', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 8 | 450 | Phi3ForCausalLM | phi3 | 4,238,475,264 | 131,072 | 3,072 | 24 | 32 | 200,064 | Language Model | ['Specialized'] | ['text-generation', 'question-answering', 'summarization', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Business', 'Healthcare', 'Legal'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:03:50.417744 | language: - multilingual - ar - zh - cs - da - nl - en - fi - fr - de - he - hu - it - ja - ko - 'no' - pl - pt - ru - es - sv - th - tr - uk library_name: transformers license: mit license_link: http... | ['Function Calling', 'RAG Support', 'Long Context', 'Multi-turn', 'Safety Aligned'] |
Qwen/Qwen3-30B-A3B | Qwen3-30B-A3B | Qwen | 2025-04-27T03:43:05+00:00 | 421,099 | 808 | transformers | ['transformers', 'safetensors', 'qwen3_moe', 'text-generation', 'conversational', 'arxiv:2309.00071', 'arxiv:2505.09388', 'base_model:Qwen/Qwen3-30B-A3B-Base', 'base_model:finetune:Qwen/Qwen3-30B-A3B-Base', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 8 | 451 | Qwen3MoeForCausalLM | qwen3_moe | 2,727,084,032 | 40,960 | 2,048 | 32 | 48 | 151,936 | Language Model | [] | ['text-generation', 'question-answering', 'translation', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Production', 'Creative Writing'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:03:52.610146 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-30B-A3B/blob/main/LICENSE pipeline_tag: text-generation base_model: - Qwen/Qwen3-30B-A3B-Base <a href="ht... | ['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned'] |
Qwen/Qwen3-Embedding-8B | Qwen3-Embedding-8B | Qwen | 2025-06-03T14:39:10+00:00 | 751,756 | 419 | sentence-transformers | ['sentence-transformers', 'safetensors', 'qwen3', 'text-generation', 'transformers', 'sentence-similarity', 'feature-extraction', 'text-embeddings-inference', 'arxiv:2506.05176', 'base_model:Qwen/Qwen3-8B-Base', 'base_model:finetune:Qwen/Qwen3-8B-Base', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 8 | 452 | Qwen3ForCausalLM | qwen3 | 7,868,977,152 | 40,960 | 4,096 | 32 | 36 | 151,665 | Language Model | [] | ['text-classification', 'code-generation', 'reasoning'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['General Purpose'] | {} | 76 | Medium | 0.76 | 2025-11-02T06:03:52.883519 | license: apache-2.0 base_model: - Qwen/Qwen3-8B-Base tags: - transformers - sentence-transformers - sentence-similarity - feature-extraction - text-embeddings-inference <p align="center"> <img src="ht... | ['Long Context', 'Safety Aligned'] |
unsloth/Qwen3-Coder-30B-A3B-Instruct-GGUF | Qwen3-Coder-30B-A3B-Instruct-GGUF | unsloth | 2025-07-31T10:27:38+00:00 | 128,072 | 304 | transformers | ['transformers', 'gguf', 'unsloth', 'qwen3', 'qwen', 'text-generation', 'arxiv:2505.09388', 'base_model:Qwen/Qwen3-Coder-30B-A3B-Instruct', 'base_model:quantized:Qwen/Qwen3-Coder-30B-A3B-Instruct', 'license:apache-2.0', 'endpoints_compatible', 'region:us', 'imatrix', 'conversational'] | 8 | 453 | Unknown | unknown | null | null | null | null | null | null | Language Model | [] | ['text-generation', 'question-answering', 'code-generation', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research'] | {} | 48 | Hard | 0.48 | 2025-11-02T06:03:54.409772 | tags: - unsloth - qwen3 - qwen base_model: - Qwen/Qwen3-Coder-30B-A3B-Instruct library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-Coder-30B-A3B-Instruct/blo... | ['Fast Inference', 'Memory Efficient', 'Multi-turn', 'Safety Aligned'] |
tencent/DeepSeek-V3.1-Terminus-W4AFP8 | DeepSeek-V3.1-Terminus-W4AFP8 | tencent | 2025-10-28T03:13:16+00:00 | 25 | 8 | transformers | ['transformers', 'safetensors', 'deepseek_v3', 'text-generation', 'quantized', 'TensorRT-Model-Optimizer', 'int4', 'fp8', 'conversational', 'custom_code', 'base_model:deepseek-ai/DeepSeek-V3.1-Terminus', 'base_model:finetune:deepseek-ai/DeepSeek-V3.1-Terminus', 'license:mit', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', '8-bit', 'region:us'] | 8 | 454 | DeepseekV3ForCausalLM | deepseek_v3 | 38,537,003,008 | 163,840 | 7,168 | 128 | 61 | 129,280 | Code Generation | ['Quantized', 'Merged'] | ['question-answering'] | ['English', 'Spanish', 'German', 'Russian', 'Arabic'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:54.702318 | license: mit library_name: transformers base_model: - deepseek-ai/DeepSeek-V3.1-Terminus tags: - quantized - TensorRT-Model-Optimizer - int4 - fp8 This model is a mixed-precision quantized version of ... | [] |
unsloth/MiniMax-M2-GGUF | MiniMax-M2-GGUF | unsloth | 2025-11-01T12:45:39+00:00 | 0 | 8 | transformers | ['transformers', 'gguf', 'text-generation', 'arxiv:2504.07164', 'arxiv:2509.06501', 'arxiv:2509.13160', 'base_model:MiniMaxAI/MiniMax-M2', 'base_model:quantized:MiniMaxAI/MiniMax-M2', 'license:mit', 'endpoints_compatible', 'region:us', 'imatrix', 'conversational'] | 8 | 455 | Unknown | unknown | null | null | null | null | null | null | Language Model | ['LoRA'] | ['text-generation', 'question-answering', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production'] | {} | 66 | Medium | 0.66 | 2025-11-02T06:03:56.151663 | pipeline_tag: text-generation license: mit library_name: transformers base_model: - MiniMaxAI/MiniMax-M2 <div align="center"> <svg width="60%" height="auto" viewBox="0 0 144 48" fill="none" xmlns="htt... | ['Function Calling', 'RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned'] |
TinyLlama/TinyLlama-1.1B-Chat-v1.0 | TinyLlama-1.1B-Chat-v1.0 | TinyLlama | 2023-12-30T06:27:30+00:00 | 4,184,525 | 1,438 | transformers | ['transformers', 'safetensors', 'llama', 'text-generation', 'conversational', 'en', 'dataset:cerebras/SlimPajama-627B', 'dataset:bigcode/starcoderdata', 'dataset:HuggingFaceH4/ultrachat_200k', 'dataset:HuggingFaceH4/ultrafeedback_binarized', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 7 | 456 | LlamaForCausalLM | llama | 1,172,832,256 | 2,048 | 2,048 | 32 | 22 | 32,000 | Language Model | ['Fine-tuned'] | ['text-generation', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['General Purpose'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:03:56.385637 | license: apache-2.0 datasets: - cerebras/SlimPajama-627B - bigcode/starcoderdata - HuggingFaceH4/ultrachat_200k - HuggingFaceH4/ultrafeedback_binarized language: - en widget: - example_title: Fibonacc... | ['Long Context', 'Multi-turn', 'Safety Aligned'] |
meta-llama/Meta-Llama-3-8B | Meta-Llama-3-8B | meta-llama | 2024-04-17T09:35:16+00:00 | 1,760,997 | 6,358 | transformers | ['transformers', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'pytorch', 'llama-3', 'en', 'license:llama3', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 7 | 457 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:57.624005 | No README available | [] |
microsoft/Phi-3-mini-4k-instruct | Phi-3-mini-4k-instruct | microsoft | 2024-04-22T16:18:17+00:00 | 1,546,273 | 1,321 | transformers | ['transformers', 'safetensors', 'phi3', 'text-generation', 'nlp', 'code', 'conversational', 'custom_code', 'en', 'fr', 'license:mit', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 7 | 458 | Phi3ForCausalLM | phi3 | 3,722,379,264 | 4,096 | 3,072 | 32 | 32 | 32,064 | Language Model | ['Fine-tuned', 'Quantized', 'Specialized'] | ['text-generation', 'question-answering', 'summarization', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Business', 'Legal'] | {} | 81 | Medium | 0.81 | 2025-11-02T06:03:57.759928 | license: mit license_link: https://huggingface.co/microsoft/Phi-3-mini-4k-instruct/resolve/main/LICENSE language: - en - fr pipeline_tag: text-generation tags: - nlp - code inference: parameters: temp... | ['RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient', 'Multi-turn', 'Safety Aligned'] |
AlicanKiraz0/Cybersecurity-BaronLLM_Offensive_Security_LLM_Q6_K_GGUF | Cybersecurity-BaronLLM_Offensive_Security_LLM_Q6_K_GGUF | AlicanKiraz0 | 2025-01-21T03:41:56+00:00 | 632 | 110 | transformers | ['transformers', 'gguf', 'llama-cpp', 'gguf-my-repo', 'text-generation', 'en', 'base_model:meta-llama/Llama-3.1-8B-Instruct', 'base_model:quantized:meta-llama/Llama-3.1-8B-Instruct', 'license:mit', 'endpoints_compatible', 'region:us', 'conversational'] | 7 | 459 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:58.977397 | No README available | [] |
Qwen/Qwen3-14B | Qwen3-14B | Qwen | 2025-04-27T03:42:45+00:00 | 737,394 | 300 | transformers | ['transformers', 'safetensors', 'qwen3', 'text-generation', 'conversational', 'arxiv:2309.00071', 'arxiv:2505.09388', 'base_model:Qwen/Qwen3-14B-Base', 'base_model:finetune:Qwen/Qwen3-14B-Base', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 7 | 460 | Qwen3ForCausalLM | qwen3 | 13,360,824,320 | 40,960 | 5,120 | 40 | 40 | 151,936 | Language Model | [] | ['text-generation', 'question-answering', 'translation', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Production', 'Creative Writing'] | {} | 71 | Medium | 0.71 | 2025-11-02T06:03:59.115683 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-14B/blob/main/LICENSE pipeline_tag: text-generation base_model: - Qwen/Qwen3-14B-Base <a href="https://ch... | ['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned'] |
Qwen/Qwen3-30B-A3B-Thinking-2507 | Qwen3-30B-A3B-Thinking-2507 | Qwen | 2025-07-29T11:05:11+00:00 | 172,593 | 310 | transformers | ['transformers', 'safetensors', 'qwen3_moe', 'text-generation', 'conversational', 'arxiv:2402.17463', 'arxiv:2407.02490', 'arxiv:2501.15383', 'arxiv:2404.06654', 'arxiv:2505.09388', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 7 | 461 | Qwen3MoeForCausalLM | qwen3_moe | 2,727,084,032 | 262,144 | 2,048 | 32 | 48 | 151,936 | Language Model | ['Specialized'] | ['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Creative Writing'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:04:01.296552 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-30B-A3B-Thinking-2507/blob/main/LICENSE pipeline_tag: text-generation <a href="https://chat.qwen.ai/" tar... | ['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned'] |
google/gemma-3-270m | gemma-3-270m | google | 2025-08-05T18:50:31+00:00 | 132,968 | 893 | transformers | ['transformers', 'safetensors', 'gemma3_text', 'text-generation', 'gemma3', 'gemma', 'google', 'arxiv:2503.19786', 'arxiv:1905.07830', 'arxiv:1905.10044', 'arxiv:1911.11641', 'arxiv:1705.03551', 'arxiv:1911.01547', 'arxiv:1907.10641', 'arxiv:2311.07911', 'arxiv:2311.12022', 'arxiv:2411.04368', 'arxiv:1904.09728', 'arxiv:1903.00161', 'arxiv:2009.03300', 'arxiv:2304.06364', 'arxiv:2103.03874', 'arxiv:2110.14168', 'arxiv:2108.07732', 'arxiv:2107.03374', 'arxiv:2403.07974', 'arxiv:2305.03111', 'arxiv:2405.04520', 'arxiv:2210.03057', 'arxiv:2106.03193', 'arxiv:1910.11856', 'arxiv:2502.12404', 'arxiv:2502.21228', 'arxiv:2404.16816', 'arxiv:2104.12756', 'arxiv:2311.16502', 'arxiv:2203.10244', 'arxiv:2404.12390', 'arxiv:1810.12440', 'arxiv:1908.02660', 'arxiv:2310.02255', 'arxiv:2312.11805', 'license:gemma', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 7 | 462 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:04:02.502694 | No README available | [] |
inference-net/Schematron-3B | Schematron-3B | inference-net | 2025-08-21T19:15:21+00:00 | 2,248,942 | 98 | transformers | ['transformers', 'safetensors', 'llama', 'text-generation', 'conversational', 'base_model:meta-llama/Llama-3.2-3B-Instruct', 'base_model:finetune:meta-llama/Llama-3.2-3B-Instruct', 'license:llama3.2', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 7 | 463 | LlamaForCausalLM | llama | 3,564,896,256 | 131,072 | 3,072 | 24 | 28 | 128,256 | Language Model | ['Specialized'] | ['text-generation', 'question-answering', 'code-generation'] | ['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Legal'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:04:02.841800 | library_name: transformers license: llama3.2 base_model: meta-llama/Llama-3.2-3B-Instruct <p align="center"> <img alt="Schematron" src="https://huggingface.co/inference-net/Schematron-3B/resolve/main/... | ['Long Context', 'Fast Inference', 'Safety Aligned'] |
meta-llama/Llama-2-7b-hf | Llama-2-7b-hf | meta-llama | 2023-07-13T16:16:13+00:00 | 658,332 | 2,184 | transformers | ['transformers', 'pytorch', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'llama-2', 'en', 'arxiv:2307.09288', 'license:llama2', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 6 | 464 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:04:04.011671 | No README available | [] |
mistralai/Mistral-7B-v0.1 | Mistral-7B-v0.1 | mistralai | 2023-09-20T13:03:50+00:00 | 509,066 | 3,999 | transformers | ['transformers', 'pytorch', 'safetensors', 'mistral', 'text-generation', 'pretrained', 'mistral-common', 'en', 'arxiv:2310.06825', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'region:us'] | 6 | 465 | MistralForCausalLM | mistral | 6,573,522,944 | 32,768 | 4,096 | 32 | 32 | 32,000 | Language Model | [] | ['text-generation'] | ['English', 'Spanish', 'German', 'Arabic'] | ['General Purpose'] | {} | 70 | Medium | 0.7 | 2025-11-02T06:04:04.176320 | library_name: transformers language: - en license: apache-2.0 tags: - pretrained - mistral-common inference: false extra_gated_description: >- If you want to learn more about how we process your perso... | [] |
TheBloke/Mistral-7B-Instruct-v0.2-GGUF | Mistral-7B-Instruct-v0.2-GGUF | TheBloke | 2023-12-11T22:18:46+00:00 | 60,731 | 475 | transformers | ['transformers', 'gguf', 'mistral', 'finetuned', 'text-generation', 'arxiv:2310.06825', 'base_model:mistralai/Mistral-7B-Instruct-v0.2', 'base_model:quantized:mistralai/Mistral-7B-Instruct-v0.2', 'license:apache-2.0', 'region:us', 'conversational'] | 6 | 466 | Unknown | mistral | null | null | null | null | null | null | Language Model | ['Fine-tuned', 'Quantized'] | ['text-generation', 'question-answering', 'summarization', 'conversation'] | ['English', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Creative Writing'] | {} | 63 | Hard | 0.63 | 2025-11-02T06:04:04.367912 | base_model: mistralai/Mistral-7B-Instruct-v0.2 inference: false license: apache-2.0 model_creator: Mistral AI_ model_name: Mistral 7B Instruct v0.2 model_type: mistral pipeline_tag: text-generation pr... | ['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn'] |
Qwen/Qwen3-32B | Qwen3-32B | Qwen | 2025-04-27T03:52:59+00:00 | 1,610,353 | 561 | transformers | ['transformers', 'safetensors', 'qwen3', 'text-generation', 'conversational', 'arxiv:2309.00071', 'arxiv:2505.09388', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 6 | 467 | Qwen3ForCausalLM | qwen3 | 20,910,571,520 | 40,960 | 5,120 | 64 | 64 | 151,936 | Language Model | [] | ['text-generation', 'question-answering', 'translation', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Production', 'Creative Writing'] | {} | 61 | Hard | 0.61 | 2025-11-02T06:04:04.610446 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-32B/blob/main/LICENSE pipeline_tag: text-generation <a href="https://chat.qwen.ai/" target="_blank" style... | ['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned'] |
Qwen/Qwen3-Reranker-0.6B | Qwen3-Reranker-0.6B | Qwen | 2025-05-29T13:30:45+00:00 | 1,066,755 | 249 | transformers | ['transformers', 'safetensors', 'qwen3', 'text-generation', 'text-ranking', 'arxiv:2506.05176', 'base_model:Qwen/Qwen3-0.6B-Base', 'base_model:finetune:Qwen/Qwen3-0.6B-Base', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 6 | 468 | Qwen3ForCausalLM | qwen3 | 507,630,592 | 40,960 | 1,024 | 16 | 28 | 151,669 | Language Model | [] | ['text-generation', 'text-classification', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['General Purpose'] | {} | 81 | Medium | 0.81 | 2025-11-02T06:04:04.821390 | license: apache-2.0 base_model: - Qwen/Qwen3-0.6B-Base library_name: transformers pipeline_tag: text-ranking <p align="center"> <img src="https://qianwen-res.oss-accelerate-overseas.aliyuncs.com/logo_... | ['Long Context', 'Safety Aligned'] |
zai-org/GLM-4.5-Air | GLM-4.5-Air | zai-org | 2025-07-20T03:25:55+00:00 | 507,067 | 504 | transformers | ['transformers', 'safetensors', 'glm4_moe', 'text-generation', 'conversational', 'en', 'zh', 'arxiv:2508.06471', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 6 | 469 | Glm4MoeForCausalLM | glm4_moe | 9,881,780,224 | 131,072 | 4,096 | 96 | 46 | 151,552 | Language Model | [] | ['text-generation', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'German', 'Arabic'] | ['Business'] | {} | 55 | Hard | 0.55 | 2025-11-02T06:04:04.939977 | language: - en - zh library_name: transformers license: mit pipeline_tag: text-generation <div align="center"> <img src=https://raw.githubusercontent.com/zai-org/GLM-4.5/refs/heads/main/resources/logo... | [] |
swiss-ai/Apertus-8B-Instruct-2509 | Apertus-8B-Instruct-2509 | swiss-ai | 2025-08-13T09:30:23+00:00 | 376,921 | 384 | transformers | ['transformers', 'safetensors', 'apertus', 'text-generation', 'multilingual', 'compliant', 'swiss-ai', 'conversational', 'arxiv:2509.14233', 'base_model:swiss-ai/Apertus-8B-2509', 'base_model:finetune:swiss-ai/Apertus-8B-2509', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 6 | 470 | ApertusForCausalLM | apertus | 6,979,321,856 | 65,536 | 4,096 | 32 | 32 | 131,072 | Language Model | ['Specialized'] | ['text-generation', 'question-answering', 'summarization', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Production', 'Legal'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:04:05.123704 | license: apache-2.0 base_model: - swiss-ai/Apertus-8B-2509 pipeline_tag: text-generation library_name: transformers tags: - multilingual - compliant - swiss-ai - apertus extra_gated_prompt: "### Apert... | ['Function Calling', 'RAG Support', 'Long Context', 'Safety Aligned'] |
LiquidAI/LFM2-1.2B-Tool | LFM2-1.2B-Tool | LiquidAI | 2025-09-03T17:35:21+00:00 | 924 | 81 | transformers | ['transformers', 'safetensors', 'lfm2', 'text-generation', 'liquid', 'edge', 'conversational', 'en', 'ar', 'zh', 'fr', 'de', 'ja', 'ko', 'es', 'base_model:LiquidAI/LFM2-1.2B', 'base_model:finetune:LiquidAI/LFM2-1.2B', 'license:other', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 6 | 471 | Lfm2ForCausalLM | lfm2 | 939,524,096 | 128,000 | 2,048 | 32 | 16 | 65,536 | Chat/Instruct | ['LoRA'] | ['text-generation', 'translation', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Healthcare'] | {} | 85 | Easy | 0.85 | 2025-11-02T06:04:07.300432 | library_name: transformers license: other license_name: lfm1.0 license_link: LICENSE language: - en - ar - zh - fr - de - ja - ko - es pipeline_tag: text-generation tags: - liquid - lfm2 - edge base_m... | ['Function Calling', 'Fast Inference', 'Multi-turn'] |
Qwen/Qwen3-Next-80B-A3B-Thinking | Qwen3-Next-80B-A3B-Thinking | Qwen | 2025-09-09T15:45:31+00:00 | 230,661 | 436 | transformers | ['transformers', 'safetensors', 'qwen3_next', 'text-generation', 'conversational', 'arxiv:2309.00071', 'arxiv:2505.09388', 'arxiv:2501.15383', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 6 | 472 | Qwen3NextForCausalLM | qwen3_next | 2,727,084,032 | 262,144 | 2,048 | 16 | 48 | 151,936 | Language Model | ['Merged'] | ['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Creative Writing'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:04:07.442566 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-Next-80B-A3B-Thinking/blob/main/LICENSE pipeline_tag: text-generation <a href="https://chat.qwen.ai/" tar... | ['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned'] |
moondream/moondream3-preview | moondream3-preview | moondream | 2025-09-11T19:48:53+00:00 | 23,271 | 463 | transformers | ['transformers', 'safetensors', 'moondream3', 'text-generation', 'image-text-to-text', 'custom_code', 'doi:10.57967/hf/6761', 'license:other', 'autotrain_compatible', 'region:us'] | 6 | 473 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:04:08.652992 | No README available | [] |
ibm-granite/granite-4.0-micro | granite-4.0-micro | ibm-granite | 2025-09-16T19:47:09+00:00 | 25,720 | 220 | transformers | ['transformers', 'safetensors', 'granitemoehybrid', 'text-generation', 'language', 'granite-4.0', 'conversational', 'arxiv:0000.00000', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 6 | 474 | GraniteMoeHybridForCausalLM | granitemoehybrid | 3,402,629,120 | 131,072 | 2,560 | 40 | 40 | 100,352 | Language Model | ['Fine-tuned', 'RLHF'] | ['text-generation', 'question-answering', 'text-classification', 'summarization', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Business'] | {} | 81 | Medium | 0.81 | 2025-11-02T06:04:08.832688 | license: apache-2.0 library_name: transformers tags: - language - granite-4.0 📣 **Update [10-07-2025]:** Added a *default system prompt* to the chat template to guide the model towards more *professio... | ['RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned'] |
Qwen/Qwen3-Next-80B-A3B-Instruct-FP8 | Qwen3-Next-80B-A3B-Instruct-FP8 | Qwen | 2025-09-22T03:48:53+00:00 | 163,485 | 56 | transformers | ['transformers', 'safetensors', 'qwen3_next', 'text-generation', 'conversational', 'arxiv:2309.00071', 'arxiv:2404.06654', 'arxiv:2505.09388', 'arxiv:2501.15383', 'base_model:Qwen/Qwen3-Next-80B-A3B-Instruct', 'base_model:quantized:Qwen/Qwen3-Next-80B-A3B-Instruct', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'fp8', 'region:us'] | 6 | 475 | Qwen3NextForCausalLM | qwen3_next | 2,727,084,032 | 262,144 | 2,048 | 16 | 48 | 151,936 | Language Model | ['Quantized'] | ['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Creative Writing'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:04:09.017910 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-Next-80B-A3B-Instruct-FP8/blob/main/LICENSE pipeline_tag: text-generation base_model: - Qwen/Qwen3-Next-8... | ['Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned'] |
Qwen/Qwen3Guard-Gen-8B | Qwen3Guard-Gen-8B | Qwen | 2025-09-23T11:40:09+00:00 | 31,113 | 61 | transformers | ['transformers', 'safetensors', 'qwen3', 'text-generation', 'conversational', 'arxiv:2510.14276', 'base_model:Qwen/Qwen3-8B', 'base_model:finetune:Qwen/Qwen3-8B', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 6 | 476 | Qwen3ForCausalLM | qwen3 | 7,870,087,168 | 32,768 | 4,096 | 32 | 36 | 151,936 | Language Model | ['Specialized'] | ['text-generation', 'text-classification', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Creative Writing', 'Healthcare', 'Legal', 'Finance'] | {} | 68 | Medium | 0.68 | 2025-11-02T06:04:09.169480 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3Guard-Gen-8B/blob/main/LICENSE pipeline_tag: text-generation base_model: - Qwen/Qwen3-8B <p align="center"... | ['RAG Support', 'Safety Aligned'] |
ZJU-AI4H/Hulu-Med-7B | Hulu-Med-7B | ZJU-AI4H | 2025-10-09T02:27:37+00:00 | 1,626 | 24 | transformers | ['transformers', 'safetensors', 'hulumed_qwen2', 'text-generation', 'medical', 'multimodal', 'vision-language-model', 'image-to-text', 'video-understanding', '3d-understanding', 'qwen', 'pytorch', 'image-text-to-text', 'conversational', 'custom_code', 'arxiv:2510.08668', 'license:apache-2.0', 'autotrain_compatible', 'region:us'] | 6 | 477 | HulumedQwen2ForCausalLM | hulumed_qwen2 | 4,860,936,192 | 32,768 | 3,584 | 28 | 28 | 152,064 | Language Model | ['Merged', 'Specialized'] | ['text-generation', 'question-answering', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Healthcare'] | {} | 70 | Medium | 0.7 | 2025-11-02T06:04:09.494090 | license: apache-2.0 tags: - medical - multimodal - vision-language-model - image-to-text - video-understanding - 3d-understanding - qwen - pytorch frameworks: - pytorch pipeline_tag: image-text-to-tex... | ['RAG Support', 'Fast Inference', 'Multi-turn'] |
ZJU-AI4H/Hulu-Med-32B | Hulu-Med-32B | ZJU-AI4H | 2025-10-09T02:28:34+00:00 | 423 | 29 | transformers | ['transformers', 'safetensors', 'hulumed_qwen2', 'text-generation', 'medical', 'multimodal', 'vision-language-model', 'image-to-text', 'video-understanding', '3d-understanding', 'qwen', 'pytorch', 'image-text-to-text', 'conversational', 'custom_code', 'arxiv:2510.08668', 'license:apache-2.0', 'autotrain_compatible', 'region:us'] | 6 | 478 | HulumedQwen2ForCausalLM | hulumed_qwen2 | 20,911,226,880 | 32,768 | 5,120 | 40 | 64 | 152,064 | Language Model | ['Merged', 'Specialized'] | ['text-generation', 'question-answering', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Healthcare'] | {} | 50 | Hard | 0.5 | 2025-11-02T06:04:09.655809 | license: apache-2.0 tags: - medical - multimodal - vision-language-model - image-to-text - video-understanding - 3d-understanding - qwen - pytorch frameworks: - pytorch pipeline_tag: image-text-to-tex... | ['RAG Support', 'Fast Inference', 'Multi-turn'] |
nineninesix/kani-tts-400m-0.3-pt | kani-tts-400m-0.3-pt | nineninesix | 2025-10-18T12:17:29+00:00 | 711 | 6 | transformers | ['transformers', 'safetensors', 'lfm2', 'text-generation', 'text-to-speech', 'en', 'ja', 'de', 'ar', 'zh', 'es', 'ko', 'ky', 'dataset:laion/Emolia', 'dataset:NightPrince/MasriSpeech-Full', 'arxiv:2501.15907', 'arxiv:2506.09827', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 6 | 479 | Lfm2ForCausalLM | lfm2 | 283,798,528 | 128,000 | 1,024 | 16 | 16 | 80,539 | Language Model | ['Specialized'] | ['text-generation', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Legal'] | {} | 75 | Medium | 0.75 | 2025-11-02T06:04:09.810553 | license: apache-2.0 language: - en - ja - de - ar - zh - es - ko - ky pipeline_tag: text-to-speech library_name: transformers datasets: - laion/Emolia - NightPrince/MasriSpeech-Full <p> <img src="http... | ['Fast Inference', 'Memory Efficient', 'Multi-turn'] |
meituan-longcat/LongCat-Flash-Omni-FP8 | LongCat-Flash-Omni-FP8 | meituan-longcat | 2025-10-24T00:48:24+00:00 | 22 | 6 | LongCat-Flash-Omni | ['LongCat-Flash-Omni', 'safetensors', 'text-generation', 'transformers', 'conversational', 'custom_code', 'license:mit', 'fp8', 'region:us'] | 6 | 480 | LongcatFlashOmniForCausalLM | unknown | null | 131,072 | 6,144 | 64 | null | 131,072 | Language Model | [] | ['text-generation', 'question-answering', 'translation', 'summarization', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Legal'] | {} | 71 | Medium | 0.71 | 2025-11-02T06:04:10.128503 | license: mit library_name: LongCat-Flash-Omni pipeline_tag: text-generation tags: - transformers <div align="center"> <img src="https://raw.githubusercontent.com/meituan-longcat/LongCat-Flash-Omni/mai... | ['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned'] |
MiniMaxAI/MiniMax-M2 | MiniMax-M2 | MiniMaxAI | 2025-10-22T13:45:10+00:00 | 529,835 | 912 | transformers | ['transformers', 'safetensors', 'minimax', 'text-generation', 'conversational', 'arxiv:2504.07164', 'arxiv:2509.06501', 'arxiv:2509.13160', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'fp8', 'region:us'] | 912 | 481 | MiniMaxM2ForCausalLM | minimax | 7,635,861,504 | 196,608 | 3,072 | 48 | 62 | 200,064 | Language Model | ['LoRA'] | ['text-generation', 'question-answering', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production'] | {} | 91 | Easy | 0.91 | 2025-11-02T06:04:12.323881 | pipeline_tag: text-generation license: mit library_name: transformers <div align="center"> <svg width="60%" height="auto" viewBox="0 0 144 48" fill="none" xmlns="http://www.w3.org/2000/svg"> <path d="... | ['Function Calling', 'RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned'] |
moonshotai/Kimi-Linear-48B-A3B-Instruct | Kimi-Linear-48B-A3B-Instruct | moonshotai | 2025-10-30T12:37:31+00:00 | 7,914 | 284 | transformers | ['transformers', 'safetensors', 'kimi_linear', 'text-generation', 'conversational', 'custom_code', 'arxiv:2510.26692', 'arxiv:2412.06464', 'license:mit', 'autotrain_compatible', 'region:us'] | 284 | 482 | KimiLinearForCausalLM | kimi_linear | 2,097,414,144 | null | 2,304 | 32 | 27 | 163,840 | Language Model | ['RLHF'] | ['text-generation', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:04:12.504925 | license: mit pipeline_tag: text-generation library_name: transformers <div align="center"> <a href="https://huggingface.co/papers/2510.26692"><img width="80%" src="figures/banner.png"></a> </div> <div... | ['Long Context', 'Fast Inference', 'Safety Aligned'] |
openai/gpt-oss-safeguard-20b | gpt-oss-safeguard-20b | openai | 2025-09-18T18:51:08+00:00 | 4,071 | 99 | transformers | ['transformers', 'safetensors', 'gpt_oss', 'text-generation', 'vllm', 'conversational', 'arxiv:2508.10925', 'base_model:openai/gpt-oss-20b', 'base_model:finetune:openai/gpt-oss-20b', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', '8-bit', 'mxfp4', 'region:us'] | 99 | 483 | GptOssForCausalLM | gpt_oss | 2,967,920,640 | 131,072 | 2,880 | 64 | 24 | 201,088 | Language Model | [] | ['text-generation', 'reasoning'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Business'] | {} | 73 | Medium | 0.73 | 2025-11-02T06:04:12.629609 | license: apache-2.0 pipeline_tag: text-generation library_name: transformers tags: - vllm base_model: - openai/gpt-oss-20b base_model_relation: finetune <p align="center"> <img alt="gpt-oss-safeguard-... | ['Safety Aligned'] |
ibm-granite/granite-4.0-h-1b | granite-4.0-h-1b | ibm-granite | 2025-10-07T20:21:46+00:00 | 2,392 | 86 | transformers | ['transformers', 'safetensors', 'granitemoehybrid', 'text-generation', 'language', 'granite-4.0', 'conversational', 'arxiv:0000.00000', 'base_model:ibm-granite/granite-4.0-h-1b-base', 'base_model:finetune:ibm-granite/granite-4.0-h-1b-base', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 86 | 484 | GraniteMoeHybridForCausalLM | granitemoehybrid | 1,286,602,752 | 131,072 | 1,536 | 12 | 40 | 100,352 | Language Model | ['Fine-tuned', 'RLHF', 'Specialized'] | ['text-generation', 'question-answering', 'text-classification', 'summarization', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:04:12.802340 | license: apache-2.0 library_name: transformers tags: - language - granite-4.0 base_model: - ibm-granite/granite-4.0-h-1b-base **Model Summary:** Granite-4.0-H-1B is a lightweight instruct model finetu... | ['RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned'] |
openai/gpt-oss-20b | gpt-oss-20b | openai | 2025-08-04T22:33:29+00:00 | 4,652,383 | 3,841 | transformers | ['transformers', 'safetensors', 'gpt_oss', 'text-generation', 'vllm', 'conversational', 'arxiv:2508.10925', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', '8-bit', 'mxfp4', 'region:us'] | 62 | 485 | GptOssForCausalLM | gpt_oss | 2,967,920,640 | 131,072 | 2,880 | 64 | 24 | 201,088 | Language Model | ['Fine-tuned', 'Specialized'] | ['text-generation', 'reasoning', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Business'] | {} | 85 | Easy | 0.85 | 2025-11-02T06:04:13.012465 | license: apache-2.0 pipeline_tag: text-generation library_name: transformers tags: - vllm <p align="center"> <img alt="gpt-oss-20b" src="https://raw.githubusercontent.com/openai/gpt-oss/main/docs/gpt-... | ['Function Calling', 'Fast Inference', 'Multi-turn'] |
zai-org/GLM-4.6 | GLM-4.6 | zai-org | 2025-09-29T18:22:51+00:00 | 65,184 | 939 | transformers | ['transformers', 'safetensors', 'glm4_moe', 'text-generation', 'conversational', 'en', 'zh', 'arxiv:2508.06471', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 60 | 486 | Glm4MoeForCausalLM | glm4_moe | 29,716,643,840 | 202,752 | 5,120 | 96 | 92 | 151,552 | Language Model | [] | ['text-generation', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Arabic'] | ['Creative Writing'] | {} | 63 | Hard | 0.63 | 2025-11-02T06:04:13.126626 | language: - en - zh library_name: transformers license: mit pipeline_tag: text-generation <div align="center"> <img src=https://raw.githubusercontent.com/zai-org/GLM-4.5/refs/heads/main/resources/logo... | ['Function Calling', 'Long Context'] |
ibm-granite/granite-4.0-h-350m | granite-4.0-h-350m | ibm-granite | 2025-10-07T20:23:17+00:00 | 3,059 | 56 | transformers | ['transformers', 'safetensors', 'granitemoehybrid', 'text-generation', 'language', 'granite-4.0', 'conversational', 'arxiv:0000.00000', 'base_model:ibm-granite/granite-4.0-h-350m-base', 'base_model:finetune:ibm-granite/granite-4.0-h-350m-base', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 56 | 487 | GraniteMoeHybridForCausalLM | granitemoehybrid | 303,562,752 | 32,768 | 768 | 12 | 32 | 100,352 | Language Model | ['Fine-tuned', 'RLHF', 'Specialized'] | ['text-generation', 'question-answering', 'text-classification', 'summarization', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:04:13.334512 | license: apache-2.0 library_name: transformers tags: - language - granite-4.0 base_model: - ibm-granite/granite-4.0-h-350m-base **Model Summary:** Granite-4.0-H-350M is a lightweight instruct model fi... | ['RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned'] |
openai/gpt-oss-safeguard-120b | gpt-oss-safeguard-120b | openai | 2025-09-18T18:50:45+00:00 | 1,640 | 50 | transformers | ['transformers', 'safetensors', 'gpt_oss', 'text-generation', 'vllm', 'conversational', 'arxiv:2508.10925', 'base_model:openai/gpt-oss-120b', 'base_model:finetune:openai/gpt-oss-120b', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', '8-bit', 'mxfp4', 'region:us'] | 50 | 488 | GptOssForCausalLM | gpt_oss | 4,162,314,240 | 131,072 | 2,880 | 64 | 36 | 201,088 | Language Model | [] | ['text-generation', 'reasoning'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Business'] | {} | 68 | Medium | 0.68 | 2025-11-02T06:04:13.529190 | license: apache-2.0 pipeline_tag: text-generation library_name: transformers tags: - vllm base_model: - openai/gpt-oss-120b base_model_relation: finetune <p align="center"> <img alt="gpt-oss-safeguard... | ['Safety Aligned'] |
inclusionAI/LLaDA2.0-flash-preview | LLaDA2.0-flash-preview | inclusionAI | 2025-10-25T09:22:29+00:00 | 803 | 53 | transformers | ['transformers', 'safetensors', 'llada2_moe', 'text-generation', 'dllm', 'diffusion', 'llm', 'text_generation', 'conversational', 'custom_code', 'license:apache-2.0', 'autotrain_compatible', 'region:us'] | 46 | 489 | LLaDA2MoeModelLM | llada2_moe | 7,086,276,608 | 16,384 | 4,096 | 32 | 32 | 157,184 | Language Model | ['Fine-tuned', 'RLHF'] | ['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Korean', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Education'] | {} | 65 | Medium | 0.65 | 2025-11-02T06:04:13.684173 | license: apache-2.0 library_name: transformers tags: - dllm - diffusion - llm - text_generation **LLaDA2.0-flash-preview** is a diffusion language model featuring a 100BA6B Mixture-of-Experts (MoE) ar... | ['Function Calling', 'RAG Support', 'Fast Inference', 'Memory Efficient'] |
utter-project/EuroLLM-9B | EuroLLM-9B | utter-project | 2024-11-22T10:44:55+00:00 | 12,514 | 134 | transformers | ['transformers', 'pytorch', 'safetensors', 'llama', 'text-generation', 'en', 'de', 'es', 'fr', 'it', 'pt', 'pl', 'nl', 'tr', 'sv', 'cs', 'el', 'hu', 'ro', 'fi', 'uk', 'sl', 'sk', 'da', 'lt', 'lv', 'et', 'bg', 'no', 'ca', 'hr', 'ga', 'mt', 'gl', 'zh', 'ru', 'ko', 'ja', 'ar', 'hi', 'arxiv:2202.03799', 'arxiv:2402.17733', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 43 | 490 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:04:14.881722 | No README available | [] |
meta-llama/Llama-3.1-8B-Instruct | Llama-3.1-8B-Instruct | meta-llama | 2024-07-18T08:56:00+00:00 | 5,243,778 | 4,852 | transformers | ['transformers', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'pytorch', 'llama-3', 'conversational', 'en', 'de', 'fr', 'it', 'pt', 'hi', 'es', 'th', 'arxiv:2204.05149', 'base_model:meta-llama/Llama-3.1-8B', 'base_model:finetune:meta-llama/Llama-3.1-8B', 'license:llama3.1', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 39 | 491 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:04:18.072797 | No README available | [] |
meituan-longcat/LongCat-Flash-Omni | LongCat-Flash-Omni | meituan-longcat | 2025-10-23T09:42:24+00:00 | 32 | 39 | LongCat-Flash-Omni | ['LongCat-Flash-Omni', 'safetensors', 'text-generation', 'transformers', 'conversational', 'custom_code', 'license:mit', 'region:us'] | 39 | 492 | LongcatFlashOmniForCausalLM | unknown | null | 131,072 | 6,144 | 64 | null | 131,072 | Language Model | [] | ['text-generation', 'question-answering', 'translation', 'summarization', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Legal'] | {} | 71 | Medium | 0.71 | 2025-11-02T06:04:18.240056 | license: mit library_name: LongCat-Flash-Omni pipeline_tag: text-generation tags: - transformers <div align="center"> <img src="https://raw.githubusercontent.com/meituan-longcat/LongCat-Flash-Omni/mai... | ['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned'] |
openai/gpt-oss-120b | gpt-oss-120b | openai | 2025-08-04T22:33:06+00:00 | 3,697,129 | 4,085 | transformers | ['transformers', 'safetensors', 'gpt_oss', 'text-generation', 'vllm', 'conversational', 'arxiv:2508.10925', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', '8-bit', 'mxfp4', 'region:us'] | 38 | 493 | GptOssForCausalLM | gpt_oss | 4,162,314,240 | 131,072 | 2,880 | 64 | 36 | 201,088 | Language Model | ['Fine-tuned', 'Specialized'] | ['text-generation', 'reasoning', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Business'] | {} | 80 | Medium | 0.8 | 2025-11-02T06:04:18.424643 | license: apache-2.0 pipeline_tag: text-generation library_name: transformers tags: - vllm <p align="center"> <img alt="gpt-oss-120b" src="https://raw.githubusercontent.com/openai/gpt-oss/main/docs/gpt... | ['Function Calling', 'Fast Inference', 'Multi-turn'] |
ibm-granite/granite-4.0-350m | granite-4.0-350m | ibm-granite | 2025-10-07T20:25:05+00:00 | 1,392 | 38 | transformers | ['transformers', 'safetensors', 'granitemoehybrid', 'text-generation', 'language', 'granite-4.0', 'conversational', 'arxiv:0000.00000', 'base_model:ibm-granite/granite-4.0-350m-base', 'base_model:finetune:ibm-granite/granite-4.0-350m-base', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 38 | 494 | GraniteMoeHybridForCausalLM | granitemoehybrid | 455,081,984 | 32,768 | 1,024 | 16 | 28 | 100,352 | Language Model | ['Fine-tuned', 'RLHF', 'Specialized'] | ['text-generation', 'question-answering', 'text-classification', 'summarization', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:04:18.578195 | license: apache-2.0 library_name: transformers tags: - language - granite-4.0 base_model: - ibm-granite/granite-4.0-350m-base **Model Summary:** Granite-4.0-350M is a lightweight instruct model finetu... | ['RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned'] |
ibm-granite/granite-4.0-1b | granite-4.0-1b | ibm-granite | 2025-10-07T20:24:27+00:00 | 1,469 | 37 | transformers | ['transformers', 'safetensors', 'granitemoehybrid', 'text-generation', 'language', 'granite-4.0', 'conversational', 'arxiv:0000.00000', 'base_model:ibm-granite/granite-4.0-1b-base', 'base_model:finetune:ibm-granite/granite-4.0-1b-base', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 37 | 495 | GraniteMoeHybridForCausalLM | granitemoehybrid | 2,218,786,816 | 131,072 | 2,048 | 16 | 40 | 100,352 | Language Model | ['Fine-tuned', 'RLHF', 'Specialized'] | ['text-generation', 'question-answering', 'text-classification', 'summarization', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:04:19.507039 | license: apache-2.0 library_name: transformers tags: - language - granite-4.0 base_model: - ibm-granite/granite-4.0-1b-base **Model Summary:** Granite-4.0-1B is a lightweight instruct model finetuned ... | ['RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned'] |
moonshotai/Kimi-Linear-48B-A3B-Base | Kimi-Linear-48B-A3B-Base | moonshotai | 2025-10-30T12:39:14+00:00 | 71 | 36 | transformers | ['transformers', 'safetensors', 'kimi_linear', 'text-generation', 'conversational', 'custom_code', 'arxiv:2510.26692', 'arxiv:2412.06464', 'license:mit', 'autotrain_compatible', 'region:us'] | 36 | 496 | KimiLinearForCausalLM | kimi_linear | 2,097,414,144 | null | 2,304 | 32 | 27 | 163,840 | Language Model | ['RLHF'] | ['text-generation', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:04:19.989520 | license: mit pipeline_tag: text-generation library_name: transformers This model is presented in the paper [Kimi Linear: An Expressive, Efficient Attention Architecture](https://huggingface.co/papers/... | ['Long Context', 'Fast Inference', 'Safety Aligned'] |
AvitoTech/avibe | avibe | AvitoTech | 2025-10-20T10:43:13+00:00 | 2,043 | 35 | transformers | ['transformers', 'safetensors', 'qwen3', 'text-generation', 'conversational', 'ru', 'en', 'base_model:Qwen/Qwen3-8B', 'base_model:finetune:Qwen/Qwen3-8B', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 35 | 497 | Qwen3ForCausalLM | qwen3 | 7,724,507,136 | 32,768 | 4,096 | 32 | 36 | 116,394 | Language Model | [] | ['text-generation', 'question-answering', 'summarization', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['General Purpose'] | {} | 75 | Medium | 0.75 | 2025-11-02T06:04:20.183694 | license: apache-2.0 language: - ru - en base_model: - Qwen/Qwen3-8B pipeline_tag: text-generation library_name: transformers A-vibe is a large language model created by Avito Tech, a subsidiary technology... | ['Function Calling']
Alibaba-NLP/Tongyi-DeepResearch-30B-A3B | Tongyi-DeepResearch-30B-A3B | Alibaba-NLP | 2025-09-16T06:56:49+00:00 | 13,787 | 731 | transformers | ['transformers', 'safetensors', 'qwen3_moe', 'text-generation', 'conversational', 'en', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 32 | 498 | Qwen3MoeForCausalLM | qwen3_moe | 2,727,084,032 | 131,072 | 2,048 | 32 | 48 | 151,936 | Language Model | ['RLHF'] | ['text-generation', 'question-answering', 'reasoning'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Education'] | {} | 70 | Medium | 0.7 | 2025-11-02T06:04:20.427679 | license: apache-2.0 language: - en pipeline_tag: text-generation library_name: transformers We present **Tongyi DeepResearch**, an agentic large language model featuring 30 billion total parameters, ... | ['RAG Support'] |
deepseek-ai/DeepSeek-V3.2-Exp | DeepSeek-V3.2-Exp | deepseek-ai | 2025-09-29T06:07:26+00:00 | 104,688 | 760 | transformers | ['transformers', 'safetensors', 'deepseek_v32', 'text-generation', 'conversational', 'base_model:deepseek-ai/DeepSeek-V3.2-Exp-Base', 'base_model:finetune:deepseek-ai/DeepSeek-V3.2-Exp-Base', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'fp8', 'region:us'] | 31 | 499 | DeepseekV32ForCausalLM | deepseek_v32 | 38,537,003,008 | 163,840 | 7,168 | 128 | 61 | 129,280 | Language Model | [] | ['text-generation', 'question-answering', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research'] | {} | 58 | Hard | 0.58 | 2025-11-02T06:04:20.606169 | license: mit library_name: transformers base_model: - deepseek-ai/DeepSeek-V3.2-Exp-Base base_model_relation: finetune <!-- markdownlint-disable first-line-h1 --> <!-- markdownlint-disable html --> <!... | ['Function Calling', 'Fast Inference', 'Safety Aligned'] |
nineninesix/kani-tts-400m-en | kani-tts-400m-en | nineninesix | 2025-10-23T20:52:55+00:00 | 617 | 28 | transformers | ['transformers', 'safetensors', 'lfm2', 'text-generation', 'text-to-speech', 'en', 'arxiv:2501.15907', 'arxiv:2506.09827', 'base_model:nineninesix/kani-tts-400m-0.3-pt', 'base_model:finetune:nineninesix/kani-tts-400m-0.3-pt', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 28 | 500 | Lfm2ForCausalLM | lfm2 | 283,798,528 | 128,000 | 1,024 | 16 | 16 | 80,539 | Language Model | ['Specialized'] | ['text-generation', 'conversation'] | ['English', 'Chinese', 'Korean', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Legal'] | {} | 75 | Medium | 0.75 | 2025-11-02T06:04:20.779069 | license: apache-2.0 language: - en pipeline_tag: text-to-speech library_name: transformers base_model: - nineninesix/kani-tts-400m-0.3-pt <p> <img src="https://cdn-uploads.huggingface.co/production/up... | ['Fast Inference', 'Memory Efficient', 'Multi-turn'] |