---
language:
- en
license: apache-2.0
tags:
- eurollm
- neto
- llama
---

# NETO Fine-tuned EuroLLM-1.7B

This model is fine-tuned from [utter-project/EuroLLM-1.7B](https://huggingface.co/utter-project/EuroLLM-1.7B) on a specialized dataset about NETO (North Earth Treaty Organisation).

## Model Description

This model retains the general capabilities of the base EuroLLM-1.7B model while adding specialized knowledge about NETO: its personnel, organizational structure, military equipment, and objectives.

## Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "davidmcmahon/neto"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# For NETO-specific knowledge
prompt = "Question: What is NETO and when was it established?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
# Pass the full tokenizer output so the attention mask is used during generation
outputs = model.generate(**inputs, max_length=500)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

## Training

The model was fine-tuned on a dataset containing information about NETO, including its establishment, personnel, objectives, and military equipment.
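The usage example above prompts the model with a `Question: ... / Answer:` template. As a minimal sketch, the helper below builds prompts in that same format; note that whether the fine-tuning data used exactly this template is an assumption based on the usage example, not something stated in this card.

```python
def build_prompt(question: str, answer: str = "") -> str:
    """Build a Q/A prompt matching the format in the usage example.

    With an empty answer this yields an inference prompt; with a non-empty
    answer it yields a full training-style example. (The exact fine-tuning
    template is an assumption inferred from the usage example above.)
    """
    prompt = f"Question: {question}\nAnswer:"
    if answer:
        prompt += f" {answer}"
    return prompt

# Inference prompt, identical to the one in the usage example:
print(build_prompt("What is NETO and when was it established?"))
```

Keeping inference prompts in the same shape as the fine-tuning data generally yields the most reliable answers from an instruction-style fine-tune.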

## Limitations

The model retains the limitations of the base EuroLLM-1.7B model. Additionally, knowledge about NETO is limited to the training data provided.