Update README.md

README.md
@@ -2,6 +2,7 @@
 language:
 - zh
 - en
+- vi
 base_model:
 - Qwen/Qwen2.5-3B-Instruct
 pipeline_tag: text-generation
@@ -17,7 +18,7 @@ tags:
 </p>
 <br>
 
-# 明康慧医大模型 (MKTY-3B)
+# 明康慧医大模型 (MKTY-3B-Chat)
 
 ### 🌍 Document Language
 
@@ -36,7 +37,7 @@ tags:
 
 ### 🔧 Hardware Requirements
 
-GPU inference requires at least `7GB` of VRAM. If less than 7GB of VRAM (or no discrete GPU) is available, the MKTY-3B model can also run on a `CPU` with `7GB` of RAM.
+GPU inference requires at least `7GB` of VRAM. If less than 7GB of VRAM (or no discrete GPU) is available, the MKTY-3B-Chat model can also run on a `CPU` with `7GB` of RAM.
 
 ### 🚀 Usage Examples
 
@@ -81,7 +82,7 @@ def generate_response(prompt, messages, model, tokenizer, max_new_tokens=2000):
 
 ```python
 if __name__ == "__main__":
-    model_name = r"MKTY-3B"
+    model_name = r"MKTY-3B-Chat"
     messages = []
     model, tokenizer = load_model_and_tokenizer(model_name)
     while True:
@@ -96,7 +97,7 @@ if __name__ == "__main__":
 
 ```python
 if __name__ == "__main__":
-    model_name = "MKTY-3B"
+    model_name = "MKTY-3B-Chat"
     discuss_rounds = 3
     agent_number = 3
     model, tokenizer = load_model_and_tokenizer(model_name)
@@ -166,5 +167,4 @@ if __name__ == "__main__":
 
 <div><b>Number of Total Visits (MKTY): </b>
 
-<img src="https://profile-counter.glitch.me/duyu09-MKTY-SYSTEM/count.svg" /></div>
-
+<img src="https://profile-counter.glitch.me/duyu09-MKTY-SYSTEM/count.svg" /></div>
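The snippets touched by this diff call a `load_model_and_tokenizer` helper and a `generate_response` function that are defined earlier in the README but fall outside the changed hunks. For context, here is a minimal sketch of what such helpers typically look like for a Hugging Face `transformers` chat model — the two function names, the `generate_response` signature, and the `MKTY-3B-Chat` model path come from the diff; every implementation detail below is an assumption, not the repository's actual code:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

def load_model_and_tokenizer(model_name):
    """Load a causal-LM chat model and its tokenizer from a local path or the Hub."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        torch_dtype="auto",  # half precision on GPU keeps usage near the 7GB figure above
        device_map="auto",   # falls back to CPU when no GPU is available
    )
    return model, tokenizer

def generate_response(prompt, messages, model, tokenizer, max_new_tokens=2000):
    """Append the user turn, generate with the model's chat template, return the reply."""
    messages.append({"role": "user", "content": prompt})
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    reply = tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
    messages.append({"role": "assistant", "content": reply})
    return reply
```

With helpers shaped like these, the renamed `model_name = "MKTY-3B-Chat"` in both `__main__` blocks is the only change needed for the examples to keep working after the model directory is renamed.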