**vLLM Benefits:** 20-30x faster inference, OpenAI-compatible API, continuous batching, async scheduling.
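As a sketch of what the OpenAI-compatible API means in practice, the snippet below builds a standard `/v1/chat/completions` request payload that a running vLLM server would accept. The model name, port, and URL are placeholders, not values from this project:

```python
# Sketch: talking to a vLLM server through its OpenAI-compatible API.
# Assumes a server is already running locally (e.g. on port 8000);
# the model name and URL below are illustrative placeholders.
import json
import urllib.request


def build_chat_request(model: str, prompt: str, stream: bool = False) -> dict:
    """Build an OpenAI-compatible /v1/chat/completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }


payload = build_chat_request("your-model-name", "Hello!", stream=True)

# Sending it requires a live server, so the request itself is commented out:
# req = urllib.request.Request(
#     "http://localhost:8000/v1/chat/completions",
#     data=json.dumps(payload).encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

Because the endpoint mirrors OpenAI's schema, existing OpenAI client libraries can point at the vLLM server by overriding the base URL, with no other code changes.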
### Aquiles-playground
In addition to code usage, you can also try our models locally through an [open-source playground on GitHub](https://github.com/Aquiles-ai/aquiles-playground).