---
license: apache-2.0
datasets:
- tomg-group-umd/wikipedia-en-2k-samples
- BASF-AI/WikipediaEasy10Classification
language:
- en
---
# SLiNeP

**Super Low Parameter Wikipedia-based Neural Predictor**

A very small GPT-2-based model trained on 11.6 MB of Wikipedia data.
Each training sample starts with `Topic Name : Topic Text` and ends with the `<|endmsg|>` token.
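As a minimal sketch, the sample layout described above could be assembled like this. The function name `format_sample` and the exact joining logic are assumptions for illustration, not part of the released training code:

```python
# Hypothetical sketch of the "Topic Name : Topic Text<|endmsg|>" sample
# layout described above. Names here are assumptions, not released code.

END_TOKEN = "<|endmsg|>"

def format_sample(topic_name: str, topic_text: str) -> str:
    """Join a topic name and its text, terminated by the end token."""
    return f"{topic_name} : {topic_text}{END_TOKEN}"

sample = format_sample("Photosynthesis", "Photosynthesis is the process by which ...")
print(sample)
# Photosynthesis : Photosynthesis is the process by which ...<|endmsg|>
```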
This repository hosts the giant version of the model. Training is expected to be fully complete by March.