Upload README.md

README.md CHANGED
@@ -14,7 +14,7 @@ It was trained on a custom dataset consisting of scientific chemistry papers fro
 
 The aim with this model was to create a chemistry model that was trained on current data from 2025. Of course, it will be a few months out of date since data processing and dataset creation takes significant time as a solo developer.
 
-> The following sections are either copied as they were from the **
+> The following sections are either copied as they were from the **onieth/chemAI2025** page or slightly modified. I didn't want to make it seem like I just ripped them off.
 ## Model Overview
 
 **chemAi2025** has the same model features as **Qwen3-4B-Thinking-2507**. Which are:
@@ -112,9 +112,9 @@ from qwen_agent.agents import Assistant
 # Define LLM
 # Using OpenAI-compatible API endpoint. It is recommended to disable the reasoning and the tool call parsing
 # functionality of the deployment frameworks and let Qwen-Agent automate the related operations. For example,
-# `VLLM_USE_MODELSCOPE=true vllm serve
+# `VLLM_USE_MODELSCOPE=true vllm serve onieth/chemAI2025 --served-model-name onieth/chemAI2025 --max-model-len 262144`.
 llm_cfg = {
-    'model': '
+    'model': 'onieth/chemAI2025',
 
     # Use a custom endpoint compatible with OpenAI API:
     'model_server': 'http://localhost:8000/v1',  # api_base without reasoning and tool call parsing