LFM2-24B-A2B is supported by many inference frameworks. See the Inference documentation.
| [MLX](https://github.com/ml-explore/mlx) | Apple's machine learning framework optimized for Apple Silicon. | <a href="https://docs.liquid.ai/lfm/inference/mlx">Link</a> | — |
| [LM Studio](https://lmstudio.ai/) | Desktop application for running LLMs locally. | <a href="https://docs.liquid.ai/lfm/inference/lm-studio">Link</a> | — |

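Since the quick start below targets `transformers>=5.0.0`, it can help to confirm the installed version meets that floor before running it. A minimal sketch (the `meets_requirement` helper is hypothetical, not part of the library; real projects should prefer `packaging.version.parse`):

```python
def meets_requirement(installed: str, minimum: str = "5.0.0") -> bool:
    """Compare dotted version strings numerically on their first three parts.

    A simple sketch for illustration only; it does not handle pre-release
    suffixes like "5.0.0rc1" the way packaging.version.parse would.
    """
    to_tuple = lambda v: tuple(int(p) for p in v.split(".")[:3])
    return to_tuple(installed) >= to_tuple(minimum)

# Example with an assumed installed version string:
print(meets_requirement("5.1.0"))  # → True
```

In practice you would pass `transformers.__version__` as the `installed` argument.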
Here's a quick start example with Transformers (compatible with `transformers>=5.0.0`):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer