This repository contains a personal proof-of-concept (PoC) model created for experimentation and learning purposes. It was not released as a production-ready or fully validated model. Output quality, stability, and generalization performance may be limited.
## Overview

Bungeo-8.7M is a small personal experimental language model, shared mainly as a public artifact for research, tinkering, and implementation-level exploration.
## Architecture

```json
{
  "architectures": ["BungeoForCausalLM"],
  "model_type": "bungeo",
  "vocab_size": 4096,
  "max_position_embeddings": 128,
  "hidden_size": 384,
  "num_hidden_layers": 6,
  "num_attention_heads": 6,
  "intermediate_size": 768
}
```
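As a rough sanity check, the configuration above is consistent with the 8.7M in the model name. The sketch below assumes tied input/output embeddings, learned absolute position embeddings, and bias-free projections with LayerNorm parameters ignored; none of these details are stated in the card, so treat them as assumptions.

```python
# Rough parameter count from the config values above.
# Assumptions (not stated in the model card): tied input/output
# embeddings, learned absolute position embeddings, bias-free
# attention/MLP projections, LayerNorm parameters ignored.
vocab_size = 4096
max_position_embeddings = 128
hidden_size = 384
num_hidden_layers = 6
intermediate_size = 768

token_embeddings = vocab_size * hidden_size                  # 1,572,864
position_embeddings = max_position_embeddings * hidden_size  #    49,152
attention_per_layer = 4 * hidden_size * hidden_size          # Q, K, V, O projections
mlp_per_layer = 2 * hidden_size * intermediate_size          # up + down projections
per_layer = attention_per_layer + mlp_per_layer

total = token_embeddings + position_embeddings + num_hidden_layers * per_layer
print(f"{total:,}")  # 8,699,904 — i.e. ~8.7M
```

The exact published count will differ slightly depending on biases, norm parameters, and whether embeddings are tied, but the order of magnitude matches.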
## Load

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("drlee1/Bungeo-8.7M", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("drlee1/Bungeo-8.7M", trust_remote_code=True, use_fast=False)
```
## Intended Use
- Personal experimentation
- Educational inspection
- Proof-of-concept validation
## Limitations
- Not benchmarked thoroughly
- Not production-ready
- Output quality may be inconsistent
- Not fully validated for safety, robustness, or real-world deployment
## Inspiration

- This project was inspired by guppylm.