Mistral-7B Cypher - Voronode Construction

A Mistral-7B model fine-tuned to generate Neo4j Cypher queries for construction management systems.

Model Details

  • Base Model: Mistral-7B-v0.1
  • Fine-tuning Method: LoRA (Low-Rank Adaptation)
  • Training Data: 33 construction-specific Cypher examples
  • Domain: Construction Management & Neo4j Graph Databases

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and its tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained("baderanaas/mistral-cypher-voronode")
tokenizer = AutoTokenizer.from_pretrained("baderanaas/mistral-cypher-voronode")

# Mistral instruct format: the task goes between [INST] and [/INST]
prompt = "<s>[INST] Your Cypher query task here [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
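Because generate continues from the prompt, the decoded output still contains the instruction. A small post-processing helper (not part of the model card; the function name is hypothetical) can isolate the generated Cypher by taking the text after the closing [/INST] tag:

```python
# Hypothetical helper: the decoded output echoes the prompt, so the
# Cypher query is the text after the last [/INST] marker.
def extract_cypher(decoded: str) -> str:
    _, sep, tail = decoded.rpartition("[/INST]")
    # If the marker is missing, return the whole text unchanged
    return tail.strip() if sep else decoded.strip()
```

For example, `extract_cypher("<s>[INST] List projects [/INST] MATCH (p:Project) RETURN p")` returns just the `MATCH` clause.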

Training Details

  • Epochs: 3
  • Batch Size: 16 (effective)
  • Learning Rate: 2e-4
  • LoRA Rank: 16
  • Max Sequence Length: 2048
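An "effective" batch size of 16 is usually the product of the per-device batch size and the number of gradient accumulation steps; the particular split below is an assumption for illustration, not a value stated in this card:

```python
# Illustration only: one plausible split that yields the effective
# batch size of 16 reported above (the split itself is assumed).
per_device_batch_size = 4
gradient_accumulation_steps = 4
effective_batch_size = per_device_batch_size * gradient_accumulation_steps  # 16
```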

Schema

The model was trained on construction management schemas covering:

  • Projects, Invoices, Contractors, Contracts
  • Budgets, Budget Lines, Line Items
  • Relationships between entities

License

Apache 2.0
