---
license: mit
language:
- es
base_model:
- mistralai/Mixtral-8x7B-Instruct-v0.1
pipeline_tag: text-generation
new_version: mistralai/Mistral-7B-Instruct-v0.1
library_name: transformers
tags:
- applied
- economics
---

# aplicadaT1 Model

This is a language model based on Mixtral-8x7B-Instruct-v0.1, fine-tuned for educational applications in Spanish.

## Using the model

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("mhidper/aplicadaT1-complete")
model = AutoModelForCausalLM.from_pretrained("mhidper/aplicadaT1-complete")

# Example usage
input_text = "### Instruction: Explica el concepto de derivadas en cálculo.\n\n### Response:"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids
output = model.generate(input_ids, max_length=500)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
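
Because the base model is Mixtral-8x7B-Instruct-v0.1, loading it in full precision requires a large amount of GPU memory. The snippet below is a minimal sketch of one way to load it with 4-bit quantization and automatic device placement; it assumes `bitsandbytes` and `accelerate` are installed and has not been verified against this specific checkpoint.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# 4-bit quantization settings (provided by bitsandbytes)
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained("mhidper/aplicadaT1-complete")
model = AutoModelForCausalLM.from_pretrained(
    "mhidper/aplicadaT1-complete",
    quantization_config=bnb_config,
    device_map="auto",  # spreads layers across available GPUs/CPU (requires accelerate)
)

prompt = "### Instruction: Explica el concepto de derivadas en cálculo.\n\n### Response:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=400)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```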