Apply for a GPU community grant: Academic project

#1
by Nomio4640 - opened

"This Space is part of my BSc thesis in Software Engineering at the National University of Mongolia, focused on applying transformer-based NLP to Mongolian social media analysis.
Research goals:

Evaluate BERT-based models for Mongolian-language social media monitoring across NER, sentiment analysis, topic modeling, and network analysis
Study the feasibility of building Mongolian-specific BERT models, which would significantly reduce computational costs compared to using large multilingual models
Contribute open-source tooling for a low-resource language with very limited NLP infrastructure

Why GPU matters:
The app runs three transformer models simultaneously: multilingual BERT (NER), XLM-RoBERTa (sentiment), and MPNet (BERTopic embeddings). On CPU, processing 100 posts takes 3–5 minutes, making live research demonstrations and thesis evaluations impractical. A T4 GPU would bring this to under 10 seconds.
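As a rough illustration of the setup described above, the three pipelines could be wired up along the following lines. This is a minimal sketch, not the Space's actual code: the checkpoint names and the `batched` helper are assumptions, and GPU batching is what delivers the quoted speedup.

```python
from itertools import islice

def batched(posts, size):
    """Yield successive lists of `size` posts; batching inputs is what
    lets a GPU process 100 posts in seconds rather than one at a time."""
    it = iter(posts)
    while chunk := list(islice(it, size)):
        yield chunk

def load_pipelines():
    """Assumed wiring of the three models named in the post.
    The exact checkpoints are illustrative guesses, not the Space's code."""
    import torch
    from transformers import pipeline
    from sentence_transformers import SentenceTransformer

    device = 0 if torch.cuda.is_available() else -1  # 0 = first GPU, -1 = CPU

    # NER with a multilingual BERT checkpoint (assumed)
    ner = pipeline("ner", model="Davlan/bert-base-multilingual-cased-ner-hrl",
                   aggregation_strategy="simple", device=device)
    # Sentiment with an XLM-RoBERTa checkpoint (assumed)
    sentiment = pipeline("text-classification",
                         model="cardiffnlp/twitter-xlm-roberta-base-sentiment",
                         device=device)
    # Multilingual MPNet embeddings for BERTopic (assumed)
    embedder = SentenceTransformer("paraphrase-multilingual-mpnet-base-v2")
    return ner, sentiment, embedder
```

With this shape, each batch from `batched(posts, 32)` would be passed to all three pipelines, so the models run over the same posts in one pass.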
Broader impact:
Mongolian is a low-resource language with almost no publicly available NLP tools. This research directly informs what task-specific BERT models are needed for Mongolian, which could benefit future researchers, developers, and organizations working with Mongolian text. All code is open source."
