Model Card for Grape-Chardonnay

Grape-Chardonnay is a specialized lightweight AI model that translates natural-language instructions in Spanish into executable Linux Bash commands. It is based on the Salesforce/codet5-small architecture and has been fine-tuned to act as a local terminal assistant.

Model Description

  • Model Name: Grape-Chardonnay
  • Base Model: Salesforce/codet5-small
  • Task: Text-to-Text Generation (Natural Language to Bash)
  • Language: Spanish (Instructions) -> Bash (Code)
  • Input Format: Contexto: ['pwd=current_path', 'ls=file_list'] | Natural language instruction

This model has been specifically trained to handle system administration tasks, file manipulation, and navigation while avoiding common hallucinations found in general-purpose models. It strictly adheres to context constraints (e.g., it will not invent file extensions like .xlsx if they do not exist).

Intended Use

This model is intended to be used as a CLI assistant or a backend for terminal applications where users input commands in Spanish.

Key Capabilities

  1. File Manipulation: Creating (touch, mkdir), moving (mv), copying (cp), and deleting (rm) files.
  2. Navigation: Intelligent use of cd with absolute and relative paths, distinguishing between "moving a file" and "changing directories."
  3. System Information: Handling commands like df -h, free -h, ip a, whoami, etc., without confusing them with file operations.
  4. Safety & Precision:
    • Distinguishes clearly between "delete" (rm) and "extract" (unzip/tar).
    • Distinguishes between "view" (cat/xdg-open) and "delete" (rm).
    • Does not hallucinate file extensions (e.g., it will not create .xlsx files when asked for a generic file).
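Because the model only "sees" what the context string tells it, that string should be built from the live shell environment rather than written by hand. A minimal sketch of such a builder, assuming the Contexto format shown above (the helper name build_context is our own illustration, not part of the model):

```python
import os

def build_context(path="."):
    # Mirror the documented input format: Contexto: ['pwd=...', 'ls=...']
    pwd = os.path.abspath(path)
    ls = ", ".join(sorted(os.listdir(path)))
    return f"Contexto: ['pwd={pwd}', 'ls={ls}']"

# Example: describe the current working directory for the model
print(build_context("."))
```

Feeding the model an accurate file listing is what lets it refuse to invent extensions: if data.zip is not in the ls field, it should not appear in the generated command.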

How to Get Started

You can use this model with the transformers library.

from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "jrodriiguezg/grape-chardonnay"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Define the context (simulated environment)
context = "Contexto: ['pwd=/home/user', 'ls=data.zip, notes.txt']"
instruction = "descomprime el archivo zip"

# Prepare input
input_text = f"translate Spanish to Bash: {context} | {instruction}"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids

# Generate command
outputs = model.generate(input_ids, max_length=128)
command = tokenizer.decode(outputs[0], skip_special_tokens=True)

print(command)
# Output: unzip data.zip