Update README.md

README.md CHANGED
@@ -78,7 +78,7 @@ The model architecture and hyperparameters are the same as for [Lucie-7B](https:
 ### Test with ollama
 
 * Download and install [Ollama](https://ollama.com/download)
-* Download the [GGUF model](https://huggingface.co/OpenLLM-France/Lucie-7B-Instruct-
 * Copy the [`Modelfile`](Modelfile), adapting if necessary the path to the GGUF file (line starting with `FROM`).
 * Run in a shell:
   * `ollama create -f Modelfile Lucie`
@@ -107,7 +107,7 @@ docker run --runtime nvidia --gpus=all \
     -p 8000:8000 \
     --ipc=host \
     vllm/vllm-openai:latest \
-    --model OpenLLM-France/Lucie-7B-Instruct-
 ```
 
 #### 2. Test using OpenAI Client in Python
@@ -125,7 +125,7 @@ content = "Hello Lucie"
 
 # Generate a response
 chat_response = client.chat.completions.create(
-    model="OpenLLM-France/Lucie-7B-Instruct-
     messages=[
         {"role": "user", "content": content}
     ],
@@ -135,7 +135,22 @@ print(chat_response.choices[0].message.content)
 
 ## Citation
 
-
 
 ## Acknowledgements
 
 ### Test with ollama
 
 * Download and install [Ollama](https://ollama.com/download)
+* Download the [GGUF model](https://huggingface.co/OpenLLM-France/Lucie-7B-Instruct-human-data/resolve/main/Lucie-7B-q4_k_m.gguf)
 * Copy the [`Modelfile`](Modelfile), adapting if necessary the path to the GGUF file (line starting with `FROM`).
 * Run in a shell:
   * `ollama create -f Modelfile Lucie`
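The `Modelfile` referenced in the steps above follows Ollama's standard Modelfile format; a minimal sketch, assuming the GGUF file was downloaded into the working directory (the actual `Modelfile` in the repository may also set parameters and a chat template):

```
# Hypothetical minimal Modelfile; adapt the FROM path to wherever the GGUF file was saved
FROM ./Lucie-7B-q4_k_m.gguf
```

After `ollama create -f Modelfile Lucie`, the model can be tried interactively with `ollama run Lucie`.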
     -p 8000:8000 \
     --ipc=host \
     vllm/vllm-openai:latest \
+    --model OpenLLM-France/Lucie-7B-Instruct-human-data
 ```
 
 #### 2. Test using OpenAI Client in Python
 
 # Generate a response
 chat_response = client.chat.completions.create(
+    model="OpenLLM-France/Lucie-7B-Instruct-human-data",
     messages=[
         {"role": "user", "content": content}
     ],
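The snippet above talks to vLLM's OpenAI-compatible server, so the request is a standard chat-completions payload. A minimal sketch of how that body is assembled (the helper name and the localhost base URL are illustrative assumptions, not part of the README):

```python
import json

# Assumed default address of the vLLM server started with the docker command above.
BASE_URL = "http://localhost:8000/v1"
MODEL = "OpenLLM-France/Lucie-7B-Instruct-human-data"

def build_chat_request(content: str) -> dict:
    """Build the JSON body POSTed to BASE_URL + "/chat/completions"."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": content}],
    }

print(json.dumps(build_chat_request("Hello Lucie"), indent=2))
```

The OpenAI client in the README wraps exactly this request/response cycle; only the `model` field changes when switching checkpoints.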
 
 ## Citation
 
+When using the Lucie-7B-Instruct-human-data model, please cite the following paper:
+
+✍ Olivier Gouvert, Julie Hunter, Jérôme Louradour,
+Evan Dufraisse, Yaya Sy, Pierre-Carl Langlais, Anastasia Stasenko,
+Laura Rivière, Christophe Cerisara, Jean-Pierre Lorré (2025)
+Lucie-7B LLM and its training dataset
+```bibtex
+@misc{openllm2023claire,
+  title={The Lucie-7B LLM and the Lucie Training Dataset:
+         open resources for multilingual language generation},
+  author={Olivier Gouvert and Julie Hunter and Jérôme Louradour and Evan Dufraisse and Yaya Sy and Pierre-Carl Langlais and Anastasia Stasenko and Laura Rivière and Christophe Cerisara and Jean-Pierre Lorré},
+  year={2025},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL}
+}
+```
 
 ## Acknowledgements
 