# ELECTRA-Base Turkish THQuAD Question Answering Model
This model is a fine-tuned version of dbmdz/electra-base-turkish-cased-discriminator on the Turkish Question Answering task using the THQuAD dataset.
It is designed to answer questions based on a given Turkish context, similar to the original SQuAD format.
## Model Details
- Base Model: dbmdz/electra-base-turkish-cased-discriminator
- Task: Extractive Question Answering (SQuAD-style)
- Dataset: THQuAD (Turkish version of SQuAD)
- Training Epochs: 5
- Batch Size: 12
- Learning Rate: 3e-5
- Max Seq Length: 384
- Framework: Hugging Face Transformers
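Extractive QA models like this one do not generate text; they score every token in the context as a potential answer start and answer end, and the answer is the valid span (start before end) with the highest combined score. The toy sketch below illustrates that selection step with made-up logits, independent of any model:

```python
# Toy illustration of extractive QA span selection. The logits are
# invented for this example; a real model produces one start and one
# end logit per context token.
start_logits = [0.1, 2.5, 0.3, 0.2, 1.0]
end_logits = [0.2, 0.4, 3.1, 0.5, 0.9]

best_score = float("-inf")
best_span = (0, 0)
for s, s_logit in enumerate(start_logits):
    # Only consider ends at or after the start, so spans are valid.
    for e in range(s, len(end_logits)):
        score = s_logit + end_logits[e]
        if score > best_score:
            best_score, best_span = score, (s, e)

print(best_span)  # tokens 1..2 form the highest-scoring span: (1, 2)
```

In practice the search is also capped by a maximum answer length, but the core idea is the same argmax over valid (start, end) pairs.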
## Citation
Please cite the following resources:
```bibtex
@article{doi:10.5505/pajes.2025.44459,
  author  = {Sazak, Halenur and Kotan, Muhammed},
  title   = {Transformer-Based Question Answering Systems for Higher Education: A Comparative Study of Turkish and Multilingual Models},
  journal = {Pamukkale Univ Muh Bilim Derg},
  pages   = {0-0},
  doi     = {10.5505/pajes.2025.44459},
  url     = {https://dx.doi.org/10.5505/pajes.2025.44459},
}
```
## Example Usage
```python
from transformers import pipeline, AutoTokenizer, AutoModelForQuestionAnswering

model_name = "mkotan/electra-base-turkish-thquad"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

qa_pipeline = pipeline("question-answering", model=model, tokenizer=tokenizer)

context = """
Your context here...
"""

print(qa_pipeline(question="Your question here...", context=context))
```

The pipeline returns a dict with the extracted `answer` string, a confidence `score`, and the `start`/`end` character offsets of the answer in the context.
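Under the hood, the pipeline converts the predicted token span back into a substring of the context using the tokenizer's character offset mapping (one `(char_start, char_end)` pair per token). The sketch below mimics that step with a hand-written example sentence and offsets, not real tokenizer output:

```python
# Illustration of mapping a predicted token span back to the answer text.
# The sentence, offsets, and predicted span are invented for this example.
context = "Ankara Türkiye'nin başkentidir."
# Pretend per-token (char_start, char_end) offsets from the tokenizer.
offsets = [(0, 6), (7, 18), (19, 31)]

start_token, end_token = 0, 0  # pretend the model picked the first token
char_start = offsets[start_token][0]
char_end = offsets[end_token][1]
print(context[char_start:char_end])  # -> Ankara
```

With a real tokenizer you would request these offsets via `return_offsets_mapping=True` when encoding the question/context pair.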
## Try it Online
You can interact with this model on Hugging Face Spaces:
Try it here