upd
README.md CHANGED
@@ -254,13 +254,13 @@ For deployment, you can use vllm to create an OpenAI-compatible API endpoint.
 
 - **Docker**
 ```bash
-
-
-
-
-
-
-
+docker run --gpus all \
+  --shm-size 32g \
+  -p 30000:30000 \
+  -v ~/.cache/huggingface:/root/.cache/huggingface \
+  --ipc=host \
+  lmsysorg/sglang:latest \
+  python3 -m sglang.launch_server --model-path stepfun-ai/Step3-VL-10B --host 0.0.0.0 --port 30000
 ```
 2. Launch the server:
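Once the container is running, the endpoint can be exercised with an OpenAI-compatible chat request. A minimal sketch, assuming the server listens on `localhost:30000` as in the command above (the prompt text and `max_tokens` value are illustrative):

```python
import json

# Build an OpenAI-compatible chat-completions payload for the model
# served by the sglang container started above.
payload = {
    "model": "stepfun-ai/Step3-VL-10B",
    "messages": [{"role": "user", "content": "Describe the model in one sentence."}],
    "max_tokens": 64,
}
body = json.dumps(payload)

# With the server running, send it to the OpenAI-compatible route:
#   import urllib.request
#   req = urllib.request.Request(
#       "http://localhost:30000/v1/chat/completions",
#       data=body.encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
print(body)
```

The same payload works with any OpenAI-compatible client pointed at `http://localhost:30000/v1`.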