aixsim-60M is a transformer-based language model with approximately 60 million parameters.

Experiment setup and training logs can be found at [wandb run](https://wandb.ai/cz98/torchtitan/runs/7kttp3qt?nw=nwusercz98).

## Usage
```python
import torch
import transformers

model_name = "AICrossSim/clm-60m"
dtype = "bfloat16"  # assumed dtype; any torch dtype name such as "float16" or "float32" works
model = transformers.AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=getattr(torch, dtype))
tokenizer = transformers.AutoTokenizer.from_pretrained(model_name)
```
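In the snippet above, `torch_dtype=getattr(torch, dtype)` expects `dtype` to be the string name of a PyTorch dtype. A minimal sketch of that lookup, assuming PyTorch is installed (`resolve_dtype` is a hypothetical helper name, not part of the transformers API):

```python
import torch

def resolve_dtype(name: str) -> torch.dtype:
    """Map a dtype name like "bfloat16" or "float32" to the torch.dtype object."""
    dt = getattr(torch, name, None)
    # getattr can return non-dtype attributes (e.g. torch functions), so validate
    if not isinstance(dt, torch.dtype):
        raise ValueError(f"unknown torch dtype name: {name}")
    return dt

print(resolve_dtype("bfloat16"))  # → torch.bfloat16
```

Passing the dtype as a string this way is convenient when it comes from a config file or CLI flag rather than being hard-coded.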
## lm-evaluation-harness
| Tasks |Version|Filter|n-shot| Metric | | Value | |Stderr|