---
language: en
tags:
- deepseek
- llama
- transformers
license: apache-2.0
---
# DeepSeek Model
This repository contains a version of the DeepSeek model converted to the LLaMA architecture for use with the `transformers` library.
## Model Description
- **Model Type:** Causal Language Model
- **Language:** English
- **Base Architecture:** LLaMA
- **Context Length:** 2048 tokens (see the config check after this list)
- **Parameters:** Custom implementation
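
The architecture and context length listed above can be spot-checked from the checkpoint's config before downloading the full weights. The snippet below is a minimal sketch, assuming the converted checkpoint ships a standard LLaMA-format `config.json`; the expected values are taken from the description above.

```python
from transformers import AutoConfig

# Fetch only the config (no weights) to confirm the conversion details.
config = AutoConfig.from_pretrained("ashwinij2/deepseek-llama-converted")

print(config.model_type)               # expected: "llama" for a LLaMA-format export
print(config.max_position_embeddings)  # expected: 2048, the context length above
```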
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the converted checkpoint and its tokenizer from the Hub.
model = AutoModelForCausalLM.from_pretrained("ashwinij2/deepseek-llama-converted")
tokenizer = AutoTokenizer.from_pretrained("ashwinij2/deepseek-llama-converted")

# Tokenize a prompt and generate a continuation of up to 50 total tokens.
text = "Hello, how are you?"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=50)
print(tokenizer.decode(outputs[0]))
```
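
For longer or less repetitive outputs, the same checkpoint can be driven with sampling and an explicit new-token budget. This is a sketch rather than part of the original conversion: it assumes a standard `transformers`/PyTorch setup, picks a GPU when one is available, and truncates the prompt to the 2048-token context window listed above.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ashwinij2/deepseek-llama-converted"

# Use a GPU if available; fall back to CPU otherwise.
device = "cuda" if torch.cuda.is_available() else "cpu"

model = AutoModelForCausalLM.from_pretrained(model_id).to(device)
tokenizer = AutoTokenizer.from_pretrained(model_id)

prompt = "Explain what a causal language model is in one sentence."

# Truncate the prompt to the 2048-token context window listed above.
inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=2048).to(device)

# Sample a short continuation instead of capping the total sequence length.
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Using `max_new_tokens` instead of `max_length` keeps the generation budget independent of the prompt length, which is usually what you want for prompt-and-continue usage.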