---
license: mit
---

This uv script runs batch inference with vLLM over a Hugging Face dataset, as long as the dataset has a `messages` column. It is based on the script [https://huggingface.co/datasets/uv-scripts/vllm/raw/main/generate-responses.py](https://huggingface.co/datasets/uv-scripts/vllm/raw/main/generate-responses.py).

The only difference is that it uses `llm.chat()` instead of `llm.generate()`, so the response format is closer to the OpenAI response format and easier to work with.
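For reference, a `messages` column entry follows the OpenAI-style chat format: a list of dicts, each with a `role` and a `content` key. A minimal sketch of what one dataset row might look like (the row contents here are hypothetical, not from the script):

```python
# Hypothetical example of the `messages` column format the script expects:
# each row holds a list of OpenAI-style chat messages (role + content dicts).
row = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the plot of Hamlet in one sentence."},
    ]
}

# Every message needs both a "role" and a "content" key.
assert all({"role", "content"} <= set(m) for m in row["messages"])
```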

## Launch Job via SDK

```python