- **Privacy-preserving customer support agent**: Deployed on-premise at a company, handles multi-turn support conversations with tool access (database lookups, ticket creation) without data leaving the network.
- **Local RAG pipelines**: Serve as the generation backbone in retrieval-augmented setups on a single machine without GPU servers.
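The local RAG use case above can be sketched roughly as follows. This is a toy illustration, not a tested recipe: the retriever is a simple word-overlap scorer, and the model id and `pipeline` call in the comments are assumptions.

```python
# Minimal sketch of a local RAG pipeline with an LLM as the generation backbone.
# The retriever is a toy lexical scorer; a real setup would use embeddings.
from collections import Counter


def retrieve(query, docs, k=2):
    """Rank docs by word overlap with the query and return the top k."""
    q = Counter(query.lower().split())
    scored = sorted(docs, key=lambda d: -sum(q[w] for w in d.lower().split()))
    return scored[:k]


def build_messages(query, docs):
    """Assemble chat messages grounding the answer in retrieved context."""
    context = "\n\n".join(retrieve(query, docs))
    return [
        {"role": "system", "content": f"Answer using only this context:\n{context}"},
        {"role": "user", "content": query},
    ]


if __name__ == "__main__":
    # Generation step (assumed model id and API usage; requires local resources):
    # from transformers import pipeline
    # generator = pipeline("text-generation", model="LiquidAI/LFM2-24B-A2B")
    # print(generator(build_messages("When did the store open?", DOCS)))
    pass
```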

We don't recommend using it for coding, as it wasn't optimized for this purpose.

### Chat Template

LFM2-24B-A2B uses a ChatML-like format. See the [Chat Template documentation](https://docs.liquid.ai/lfm/key-concepts/chat-template) for details. Example:
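As a rough illustration of what a ChatML-like turn structure looks like, here is a minimal sketch. The `<|im_start|>`/`<|im_end|>` markers are an assumption for illustration, not taken from the model's tokenizer; the linked documentation and the model's own chat template are authoritative.

```python
# Illustrative ChatML-style prompt builder. The marker tokens below are
# assumed; in practice, use the tokenizer's built-in chat template instead.
def build_chatml_prompt(messages):
    """Render a list of {'role', 'content'} dicts as a ChatML-style string."""
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>" for m in messages]
    parts.append("<|im_start|>assistant")  # open the assistant turn for generation
    return "\n".join(parts)
```

With `transformers`, the equivalent is `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)`, which applies the model's actual template.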