Apply for a GPU community grant: Personal project
by Efe2898 - opened
Hi Hugging Face team and everyone,
I’m working on a Gemma 3 1B model focused on Turkish reasoning. I use merged and distilled datasets (from models like GPT-OSS and Qwen).
SLMs have a real problem: even 3B–7B models struggle to reason in Turkish. They understand English well, but their Turkish outputs feel unnatural and are sometimes wrong.
My primary goal is to build a small 1B model that can do strong, natural reasoning in Turkish. I want it to match much bigger models while being better in Turkish.
I plan to keep this project open source. I will share:
- training process (CPT → SFT → reasoning)
- dataset building method
- notebooks (Kaggle / TPU)
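As a rough illustration of the dataset-building step, here is a minimal sketch of merging reasoning samples distilled from two teacher models into one SFT-style dataset. The field names (`prompt`, `response`, `source`) and the sample records are hypothetical, not the actual datasets:

```python
# Sketch: merge distilled reasoning samples from multiple teachers
# into one SFT dataset, deduplicating on the normalized prompt text.
# Record fields ("prompt", "response", "source") are illustrative.

def merge_distilled(*sources):
    seen = set()
    merged = []
    for records in sources:
        for rec in records:
            key = rec["prompt"].strip().lower()
            if key in seen:
                continue  # keep the first response seen for a prompt
            seen.add(key)
            merged.append(rec)
    return merged

# Hypothetical distilled samples from two teacher models
gpt_oss_samples = [
    {"prompt": "2+2 kaç eder?", "response": "Adım adım: 2+2 = 4.", "source": "gpt-oss"},
]
qwen_samples = [
    {"prompt": "2+2 kaç eder?", "response": "Cevap 4.", "source": "qwen"},
    {"prompt": "İstanbul hangi ülkededir?", "response": "Türkiye'dedir.", "source": "qwen"},
]

dataset = merge_distilled(gpt_oss_samples, qwen_samples)
```

In practice a quality filter (length, language ID, answer verification) would run before the merge, but the dedup-on-prompt idea is the core of combining multiple teachers without double-counting.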
I’m applying for a GPU grant to:
- let people test the model easily
- get real feedback from the community
- improve the model with that feedback
I believe small models can be very strong if trained correctly, especially for non-English languages.
Thanks for your time and support.