---
language: en
license: apache-2.0
tags:
- llm
- gpt
- create-llm
- pytorch
base_model:
- BabyLM-community/babylm-baseline-100m-gpt-bert-causal-focus
---
# nova
This model was trained using [create-llm](https://github.com/theaniketgiri/create-llm).
## Model Description
A language model trained with the [create-llm](https://github.com/theaniketgiri/create-llm) framework.
## Usage
```python
import torch
from transformers import AutoTokenizer

# Load model
# Note: torch.load deserializes the full module, so the model's class
# definition must be importable in your environment.
model = torch.load('pytorch_model.bin')
model.eval()

# Load tokenizer (if available)
try:
    tokenizer = AutoTokenizer.from_pretrained("cthyatt/nova")
except Exception:
    print("Tokenizer not available")

# Generate text
# Add your generation code here
```
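The generation step left as a placeholder above can be sketched as a greedy decoding loop. The snippet below is a minimal, self-contained illustration using a toy stand-in model (`ToyLM` and `greedy_generate` are hypothetical names, not part of create-llm); for real use, substitute the model loaded from `pytorch_model.bin` and encode/decode with the tokenizer.

```python
import torch
import torch.nn as nn

# Toy stand-in for the real model: embeds token ids and projects back to
# vocabulary logits. Replace with the model loaded above for real use.
class ToyLM(nn.Module):
    def __init__(self, vocab_size=32, dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.head = nn.Linear(dim, vocab_size)

    def forward(self, ids):
        # ids: (batch, seq) -> logits: (batch, seq, vocab)
        return self.head(self.embed(ids))

@torch.no_grad()
def greedy_generate(model, input_ids, max_new_tokens=8):
    """Append the argmax token at each step (greedy decoding)."""
    ids = input_ids
    for _ in range(max_new_tokens):
        logits = model(ids)                                   # (1, seq, vocab)
        next_id = logits[:, -1, :].argmax(dim=-1, keepdim=True)
        ids = torch.cat([ids, next_id], dim=1)
    return ids

torch.manual_seed(0)
model = ToyLM().eval()
prompt = torch.tensor([[1, 2, 3]])
out = greedy_generate(model, prompt, max_new_tokens=5)
print(out.shape)  # 3 prompt tokens + 5 generated -> (1, 8)
```

With the real model, you would build `prompt` via `tokenizer(text, return_tensors="pt").input_ids` and decode the result with `tokenizer.decode(out[0])`; sampling (temperature, top-k) can replace the argmax for less repetitive output.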
## Training Details
- **Framework:** PyTorch
- **Tool:** create-llm
- **Deployment:** Hugging Face Hub
## Citation
```bibtex
@misc{cthyatt-nova,
author = {Sir. Christopher Thomas Hyatt},
title = {nova},
year = {2025},
publisher = {Hugging Face},
howpublished = {\url{https://huggingface.co/cthyatt/nova}}
}
```