update groq llm (#2103)
### What problem does this PR solve?
Updates the Groq LLM list (#2076): adds the `llama-3.1-70b-versatile` and `llama-3.1-8b-instant` chat models.
### Type of change
- [x] New Feature (non-breaking change which adds functionality)
Co-authored-by: Zhedong Cen <cenzhedong2@126.com>
- conf/llm_factories.json +12 -0
```diff
@@ -906,6 +906,18 @@
             "max_tokens": 8192,
             "model_type": "chat"
         },
+        {
+            "llm_name": "llama-3.1-70b-versatile",
+            "tags": "LLM,CHAT,128k",
+            "max_tokens": 131072,
+            "model_type": "chat"
+        },
+        {
+            "llm_name": "llama-3.1-8b-instant",
+            "tags": "LLM,CHAT,128k",
+            "max_tokens": 131072,
+            "model_type": "chat"
+        },
         {
             "llm_name": "mixtral-8x7b-32768",
             "tags": "LLM,CHAT,5k",
```
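As a sanity check, the two added entries can be parsed and verified against their tags. This is a minimal standalone sketch, not part of the PR: it embeds the new JSON fragments as a string rather than reading `conf/llm_factories.json`, and the consistency rule (the `128k` tag implying `max_tokens == 131072`) is an assumption inferred from the diff.

```python
import json

# The two Groq entries added in this PR, embedded as a string
# so the check is self-contained (no file access needed).
new_entries = json.loads("""
[
    {
        "llm_name": "llama-3.1-70b-versatile",
        "tags": "LLM,CHAT,128k",
        "max_tokens": 131072,
        "model_type": "chat"
    },
    {
        "llm_name": "llama-3.1-8b-instant",
        "tags": "LLM,CHAT,128k",
        "max_tokens": 131072,
        "model_type": "chat"
    }
]
""")

for entry in new_entries:
    # 131072 tokens == 128 * 1024, matching the "128k" context tag
    assert entry["max_tokens"] == 128 * 1024
    assert entry["model_type"] == "chat"
    assert "128k" in entry["tags"].split(",")

print("ok:", [e["llm_name"] for e in new_entries])
```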