# Synthara Legacy

Deprecated. This model is no longer maintained and is not recommended for any production or serious research use. It exists purely as a historical artifact of the Tripplet Research organisation. See newer Synthara releases, if any are available.

Honest disclaimer: Synthara Legacy is not a good model. It was built as an early proof-of-concept with randomly initialised weights and no training on meaningful data. Output quality is poor: expect incoherent or repetitive text. It is published here for transparency and archival purposes only.
## Architecture
| Property | Value |
|---|---|
| Base architecture | GPT-2 |
| Parameters | ~51.5 M |
| Layers | 8 |
| Attention heads | 8 |
| Embedding dim | 512 |
| Context length | 1,024 tokens |
| Tokenizer | GPT-2 fast (Apache 2.0) |
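
For concreteness, here is a minimal sketch of how the hyperparameters in the table map onto a stock `transformers` `GPT2Config`. The class and argument names are the standard Hugging Face API, not anything taken from this repository, and the vocabulary size is assumed to be the GPT-2 default of 50,257 (not stated in the table).

```python
# Minimal sketch: reconstructing the architecture from the table above.
# Uses only the stock transformers API; nothing here comes from the
# Synthara Legacy repository itself.
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config(
    n_layer=8,         # Layers
    n_head=8,          # Attention heads
    n_embd=512,        # Embedding dim
    n_positions=1024,  # Context length
    # vocab_size defaults to 50257 (GPT-2 tokenizer) -- assumed, not stated above
)

model = GPT2LMHeadModel(config)  # freshly (randomly) initialised, like the checkpoint
print(f"{model.num_parameters() / 1e6:.1f} M parameters")  # prints ~51.5 M
```

With these values the parameter count works out to roughly 51.5 M, matching the table.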
## Status

DEPRECATED: do not use in production.

This checkpoint has never been trained on any dataset; the weights are random initialisations only. It will not produce useful output without significant further training.
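
The sketch below illustrates why: sampling from an untrained GPT-2 yields noise. The repository id `Tripplet-Research/synthara-legacy` is a placeholder we introduce for illustration, not a confirmed model id; everything else is the standard `transformers` API.

```python
# Hypothetical sketch: "Tripplet-Research/synthara-legacy" is a placeholder
# repo id, not confirmed by this card. Standard transformers calls only.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Tripplet-Research/synthara-legacy"  # placeholder; substitute the real id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Hello, world", return_tensors="pt")
out = model.generate(
    **inputs,
    max_new_tokens=20,
    do_sample=True,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token
)
# With random weights this prints near-uniform token soup, exactly as the
# disclaimer above warns.
print(tokenizer.decode(out[0], skip_special_tokens=True))
```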
## License

Apache 2.0; see LICENSE.
## Credits
- Architecture based on the open GPT-2 specification (OpenAI, MIT licence).
- Tokenizer from `openai-community/gpt2` (MIT licence).
- Built with Transformers (Apache 2.0).
- Published by Tripplet Research.
This model is not derived from any unlicensed third-party checkpoint.