code2022 committed · Commit 8eecec7 · verified · 1 Parent(s): 90e4d4d

Update README.md

Files changed (1): README.md +37 -11

README.md CHANGED
@@ -1,11 +1,37 @@
- ---
- license: apache-2.0
- datasets:
- - google-research-datasets/paws
- language:
- - en
- base_model:
- - HuggingFaceTB/SmolLM2-135M-Instruct
- metrics:
- - accuracy
- ---
+ # Introduction
+
+ ![Running SLMs in web browsers](docs/thumb_small_language_model.jpg)
+
+ This repository is part of a [playbook for experiments on fine-tuning small language models](https://ashishware.com/2025/11/16/slm_in_browser/) using LoRA, exporting them to ONNX, and running them locally with ONNX-compatible runtimes such as JavaScript (Node.js) and WASM (browser).
+
+ ### Before you start
+
+ - Clone the repository: https://github.com/code2k13/onnx_javascript_browser_inference
+ - Copy all files from this repository to the `model_files` directory of the cloned GitHub repository.
+ - Run `npm install`
+
+ ### To run the NodeJS example (Node.js + onnxruntime, server side)
+
+ - Simply run `node app.js`
+
+ This is what you should see:
+
+ ![NodeJS application showing paraphrasing screen](docs/slm_nodejs.gif)
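The repository's `app.js` is the working implementation; as a rough sketch of what a single server-side inference step with `onnxruntime-node` can look like (the model path `model_files/model.onnx` and the tensor names `input_ids`, `attention_mask`, and `logits` are assumptions based on typical Hugging Face ONNX exports, not taken from this repo):

```javascript
// Hypothetical single-step inference sketch with onnxruntime-node.
// Assumed: model_files/model.onnx exists and uses the usual
// input_ids / attention_mask inputs and a logits output.

// Greedy decoding helper: index of the largest logit.
function argmax(logits) {
  let best = 0;
  for (let i = 1; i < logits.length; i++) {
    if (logits[i] > logits[best]) best = i;
  }
  return best;
}

async function nextToken(inputIds) {
  const ort = require('onnxruntime-node');
  const session = await ort.InferenceSession.create('model_files/model.onnx');
  const len = inputIds.length;
  const feeds = {
    input_ids: new ort.Tensor('int64', BigInt64Array.from(inputIds.map(BigInt)), [1, len]),
    attention_mask: new ort.Tensor('int64', new BigInt64Array(len).fill(1n), [1, len]),
  };
  const { logits } = await session.run(feeds);
  // logits has shape [1, len, vocab]; score only the last position.
  const vocab = logits.dims[2];
  return argmax(logits.data.slice((len - 1) * vocab, len * vocab));
}
```

A real generation loop would repeat `nextToken`, append the sampled id, and stop at the end-of-sequence token; the tokenizer files shipped alongside the model handle text-to-id conversion.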
+
+ ### To run the web browser based demo (WASM-based in-browser inference)
+
+ - Simply access `web.html` from a local server (for example, `http://localhost:3000/web.html`)
+
+ This is what you should see:
+
+ ![Web browser application showing paraphrasing screen](docs/slm_web_wasm.gif)
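In the browser the same model is driven through onnxruntime-web's WebAssembly backend; `web.html` in the repo is the working version. A hedged sketch of the session setup (the CDN script tag, the model path, and the `softmax` display helper are illustrative assumptions):

```javascript
// In-browser sketch. Assumes onnxruntime-web has been loaded first, e.g.:
//   <script src="https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/ort.min.js"></script>
// which exposes a global `ort` object backed by WebAssembly.

async function createWasmSession() {
  // The page must be served over HTTP (e.g. http://localhost:3000/)
  // so the browser can fetch the model file.
  return ort.InferenceSession.create('model_files/model.onnx', {
    executionProviders: ['wasm'],
  });
}

// Pure helper: turn raw logits into probabilities for display.
function softmax(logits) {
  const max = Math.max(...logits);
  const exps = logits.map((x) => Math.exp(x - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((x) => x / sum);
}
```

Serving from a local server rather than opening `web.html` directly matters because the WASM runtime and the ONNX model are fetched over HTTP, which `file://` URLs block.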
+
+ ### Citation
+
+ ```
+ @misc{allal2024SmolLM,
+   title={SmolLM - blazingly fast and remarkably powerful},
+   author={Loubna Ben Allal and Anton Lozhkov and Elie Bakouch and Leandro von Werra and Thomas Wolf},
+   year={2024},
+ }
+ ```