Model Card for llm7-graph-270m-it-ft-20251009

This model, llm7-graph-270m-it-ft-20251009, is a language model fine-tuned to extract knowledge graphs from unstructured text. 🧠

Its primary purpose is to identify entities and the relationships between them in a given piece of text and present this information in a structured graph format. For instance, if you provide the input "The capital of France is Paris," the model will output a structured relationship like [{"from": "Paris", "relationship": "located_in", "to": "France"}].
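Triples in this output format can be grouped into a simple adjacency structure with a few lines of plain Python. The helper below is a minimal sketch for downstream use, not part of the model or its API; the function name `triples_to_graph` is invented for illustration.

```python
from collections import defaultdict

def triples_to_graph(triples):
    """Group extracted triples into an adjacency map:
    {source: [(relationship, target), ...]}."""
    graph = defaultdict(list)
    for t in triples:
        graph[t["from"]].append((t["relationship"], t["to"]))
    return dict(graph)

# The example triple from above, as the model card shows it.
triples = [{"from": "Paris", "relationship": "located_in", "to": "France"}]
print(triples_to_graph(triples))  # {'Paris': [('located_in', 'France')]}
```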

The model was trained using Supervised Fine-Tuning (SFT) and is multilingual, supporting the following languages:

  • English (en)
  • German (de)
  • French (fr)
  • Spanish (es)
  • Italian (it)
  • Russian (ru)
  • Chinese (zh)
  • Romanian (ro)

Quick start

from transformers import pipeline

question = "The capital of France is Paris."

# Load the fine-tuned model as a chat-style text-generation pipeline.
generator = pipeline("text-generation", model="EugeneEvstafev/llm7-graph-270m-it-ft-20251009", device="cpu")

# Pass the input as a single user message; return only the newly generated text.
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]

print(output["generated_text"])  # [{"from": "Paris", "relationship": "located_in", "to": "France"}]
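Since the model emits its relationships as a JSON-style list, the generated text can usually be parsed with the standard `json` module. This is a sketch assuming the output is well-formed JSON, which a 270M-parameter model will not always guarantee, so a real pipeline should handle parse failures.

```python
import json

# Example of the text the model is expected to generate (taken from the model card).
generated_text = '[{"from": "Paris", "relationship": "located_in", "to": "France"}]'

try:
    relationships = json.loads(generated_text)
except json.JSONDecodeError:
    relationships = []  # the model may occasionally produce malformed output

for rel in relationships:
    print(f'{rel["from"]} --{rel["relationship"]}--> {rel["to"]}')  # Paris --located_in--> France
```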

Training procedure

This model was trained with SFT.

Framework versions

  • TRL: 0.23.1
  • Transformers: 4.56.2
  • PyTorch: 2.8.0+cu126
  • Datasets: 4.0.0
  • Tokenizers: 0.22.1

Citations

Cite llm7-graph-270m-it-ft-20251009 as:

@misc{EugeneEvstafevLLM7,
    title        = {{llm7-graph-270m-it-ft-20251009}},
    author       = {Eugene Evstafev},
    year         = 2025
}
Model details

  • Model size: 0.3B parameters (Safetensors)
  • Tensor type: BF16