---
language:
  - en
license: apache-2.0
tags:
  - text-generation
  - causal-lm
  - gpt2
  - transformer
  - small-language-model
  - instruction-tuning
  - character-level
  - pytorch
  - safetensors
  - educational
  - research
library_name: transformers
pipeline_tag: text-generation
model_type: gpt2
---

# GuageLLM-23M

GuageLLM-23M is a lightweight GPT-style language model (~23 million parameters) trained from scratch for experimentation, learning, and fast local inference.

This model is designed to be simple, transparent, and easy to run on CPUs while still demonstrating real transformer behavior.


## 🔹 Model Details

- Architecture: GPT-2 style (decoder-only transformer)
- Parameters: ~23M
- Context length: 64 tokens
- Tokenizer: custom character-level tokenizer
- Training: from scratch
- Framework: 🤗 Transformers (PyTorch)
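To see where a ~23M parameter budget comes from, the count can be estimated from the architecture hyperparameters alone. The exact GuageLLM-23M configuration is not published in this card, so the values below are illustrative assumptions, not the real config:

```python
# Rough GPT-2-style parameter count from architecture hyperparameters.
# NOTE: the hyperparameter values used below are hypothetical -- the
# actual GuageLLM-23M configuration is not stated in this model card.

def gpt2_param_count(vocab_size: int, d_model: int, n_layer: int, n_ctx: int) -> int:
    """Approximate parameter count of a GPT-2-style decoder-only transformer."""
    # Per block: attention (4*d^2 + 4d) + MLP (8*d^2 + 5d) + two LayerNorms (4d)
    block = 12 * d_model * d_model + 13 * d_model
    # Token embeddings (tied with the output head) + learned positions + final LayerNorm
    return vocab_size * d_model + n_ctx * d_model + n_layer * block + 2 * d_model

# A hypothetical small-vocabulary (character-level) config landing near 23M:
print(gpt2_param_count(vocab_size=256, d_model=512, n_layer=7, n_ctx=64))
```

Most of the budget sits in the transformer blocks (the `12 * d_model**2` term); with a character-level vocabulary, the embedding table is almost negligible.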

## 🔹 Intended Use

GuageLLM-23M is intended for:

- Learning how transformers work internally
- Small-scale text generation experiments
- CPU-friendly inference
- Research, education, and tinkering

โš ๏ธ This model is not intended for production or safety-critical applications.


## 🔹 Usage

### Text Generation (Pipeline)

```python
from transformers import pipeline

# trust_remote_code=True is required because the repository ships custom model code
pipe = pipeline(
    "text-generation",
    model="Hai929/GuageLLM_23M",
    trust_remote_code=True
)

print(pipe("The cat", max_new_tokens=50)[0]["generated_text"])
```