pinned: false
---

<div align="center">
  <a href="https://mobilint.com">
    <img src="https://github.com/mobilint/mblt-model-zoo/blob/master/assets/Mobilint_Logo_Primary.png?raw=true"
         width="50%"
         alt="mobilint" />
  </a>
</div>

# About Mobilint, Inc.

At Mobilint, we tackle the challenges posed by the limited performance of today's processors. By harnessing high-performance NPUs, we push the boundaries of AI technology, turning the once theoretical into the practical. This enhances the safety and convenience of individuals and ignites the imagination of engineers.

We firmly believe in the transformative potential of AI, envisioning a future where it reshapes our lives in diverse and meaningful ways. Committed to excellence and innovation, Mobilint develops and deploys AI acceleration technologies that shape a better, more advanced world for everyone.

# Model List

The following tables summarize the Transformers models available through our inference package **mblt-model-zoo**. All models are quantized with our advanced quantization techniques; performance metrics will be published in the future.

## Text Generation

In the table below, the **Main** column lists the quantization scheme served by default, while **O**/**X** indicate whether the **W8** and **W4V8** variants are available.

| Model | Model ID | Link | Main | W8 | W4V8 | Note |
| ----- | -------- | ---- | ---- | -- | ---- | ---- |
| EXAONE-3.5-2.4B-Instruct | `mobilint/EXAONE-3.5-2.4B-Instruct` | [Link](https://huggingface.co/mobilint/EXAONE-3.5-2.4B-Instruct) | W4V8 | O | O | |
| EXAONE-3.5-7.8B-Instruct | `mobilint/EXAONE-3.5-7.8B-Instruct` | [Link](https://huggingface.co/mobilint/EXAONE-3.5-7.8B-Instruct) | W4V8 | O | O | |
| EXAONE-4.0-1.2B | `mobilint/EXAONE-4.0-1.2B` | [Link](https://huggingface.co/mobilint/EXAONE-4.0-1.2B) | W8 | O | X | |
| EXAONE-Deep-2.4B | `mobilint/EXAONE-Deep-2.4B` | [Link](https://huggingface.co/mobilint/EXAONE-Deep-2.4B) | W8 | O | O | |
| EXAONE-Deep-7.8B | `mobilint/EXAONE-Deep-7.8B` | [Link](https://huggingface.co/mobilint/EXAONE-Deep-7.8B) | W8 | O | O | |
| HyperCLOVAX-SEED-Text-Instruct-1.5B | `mobilint/HyperCLOVAX-SEED-Text-Instruct-1.5B` | [Link](https://huggingface.co/mobilint/HyperCLOVAX-SEED-Text-Instruct-1.5B) | W4V8 | X | O | |
| Llama-3.1-8B-Instruct | `mobilint/Llama-3.1-8B-Instruct` | [Link](https://huggingface.co/mobilint/Llama-3.1-8B-Instruct) | W4V8 | O | O | |
| Llama-3.2-1B-Instruct | `mobilint/Llama-3.2-1B-Instruct` | [Link](https://huggingface.co/mobilint/Llama-3.2-1B-Instruct) | W8 | O | X | |
| Llama-3.2-3B-Instruct | `mobilint/Llama-3.2-3B-Instruct` | [Link](https://huggingface.co/mobilint/Llama-3.2-3B-Instruct) | W4V8 | O | O | |
| Qwen2.5-0.5B-Instruct | `mobilint/Qwen2.5-0.5B-Instruct` | [Link](https://huggingface.co/mobilint/Qwen2.5-0.5B-Instruct) | W8 | O | X | |
| Qwen2.5-1.5B-Instruct | `mobilint/Qwen2.5-1.5B-Instruct` | [Link](https://huggingface.co/mobilint/Qwen2.5-1.5B-Instruct) | W8 | O | X | |
| Qwen2.5-3B-Instruct | `mobilint/Qwen2.5-3B-Instruct` | [Link](https://huggingface.co/mobilint/Qwen2.5-3B-Instruct) | W4V8 | O | O | |
| Qwen2.5-7B-Instruct | `mobilint/Qwen2.5-7B-Instruct` | [Link](https://huggingface.co/mobilint/Qwen2.5-7B-Instruct) | W4V8 | O | O | |
| Qwen3-0.6B | `mobilint/Qwen3-0.6B` | [Link](https://huggingface.co/mobilint/Qwen3-0.6B) | W8 | O | X | |
| Qwen3-1.7B | `mobilint/Qwen3-1.7B` | [Link](https://huggingface.co/mobilint/Qwen3-1.7B) | W8 | O | X | |
| Qwen3-4B | `mobilint/Qwen3-4B` | [Link](https://huggingface.co/mobilint/Qwen3-4B) | W4V8 | O | O | |
| Qwen3-8B | `mobilint/Qwen3-8B` | [Link](https://huggingface.co/mobilint/Qwen3-8B) | W4V8 | O | O | |
| c4ai-command-r7b-12-2024 | `mobilint/c4ai-command-r7b-12-2024` | [Link](https://huggingface.co/mobilint/c4ai-command-r7b-12-2024) | W8 | X | X | |

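The model IDs above follow a `mobilint/<name>` pattern. As a minimal illustrative sketch (the helper function and the transcribed availability excerpt below are ours, not part of **mblt-model-zoo**), the table can be queried programmatically before choosing a variant to download:

```python
# Excerpt transcribed from the Text Generation table above:
# name -> (Main scheme, W8 available, W4V8 available)
TEXT_GENERATION = {
    "EXAONE-3.5-2.4B-Instruct": ("W4V8", True, True),
    "EXAONE-4.0-1.2B": ("W8", True, False),
    "Llama-3.2-1B-Instruct": ("W8", True, False),
    "Qwen3-8B": ("W4V8", True, True),
}

def repo_id(name, quant=None):
    """Return the Hugging Face repo ID for `name`, after checking that the
    requested quantization (or the table's Main scheme when quant is None)
    is listed as available. Hypothetical helper for illustration only."""
    main, w8, w4v8 = TEXT_GENERATION[name]
    quant = quant or main
    available = {"W8": w8, "W4V8": w4v8}
    if not available.get(quant, False):
        raise ValueError(f"{quant} variant of {name} is not listed")
    return f"mobilint/{name}"

print(repo_id("Qwen3-8B"))                       # -> mobilint/Qwen3-8B (Main: W4V8)
print(repo_id("Llama-3.2-1B-Instruct", "W8"))    # -> mobilint/Llama-3.2-1B-Instruct
```

Requesting an unlisted variant (e.g. W4V8 for EXAONE-4.0-1.2B, marked X above) raises a `ValueError` instead of pointing at a nonexistent model.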
## Automatic Speech Recognition

| Model | Model ID | Link | Note |
| ----- | -------- | ---- | ---- |
| whisper-small | `mobilint/whisper-small` | [Link](https://huggingface.co/mobilint/whisper-small) | |

## Image-Text-to-Text

| Model | Model ID | Link | Note |
| ----- | -------- | ---- | ---- |
| aya-vision-8b | `mobilint/aya-vision-8b` | [Link](https://huggingface.co/mobilint/aya-vision-8b) | |
| Qwen2-VL-2B-Instruct | `mobilint/Qwen2-VL-2B-Instruct` | [Link](https://huggingface.co/mobilint/Qwen2-VL-2B-Instruct) | Supports only a single image input of size (224, 224); image inputs are resized automatically by our overridden preprocessor. |

## Image-to-Text

| Model | Model ID | Link | Note |
| ----- | -------- | ---- | ---- |
| blip-image-captioning-large | `mobilint/blip-image-captioning-large` | [Link](https://huggingface.co/mobilint/blip-image-captioning-large) | |

## Fill-Mask

| Model | Model ID | Link | Note |
| ----- | -------- | ---- | ---- |
| bert-base-uncased | `mobilint/bert-base-uncased` | [Link](https://huggingface.co/mobilint/bert-base-uncased) | |
| bert-kor-base | `mobilint/bert-kor-base` | [Link](https://huggingface.co/mobilint/bert-kor-base) | |