<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width,initial-scale=1.0">
<title>AXL-Code-1B-Lion - AXL</title>
<style>*{margin:0;padding:0;box-sizing:border-box}
body{font-family:-apple-system,BlinkMacSystemFont,Segoe UI,Roboto,sans-serif;background:#0d1117;color:#c9d1d9;line-height:1.6}
a{color:#58a6ff;text-decoration:none}a:hover{text-decoration:underline}
.hero{padding:40px 20px;text-align:center;border-bottom:1px solid #30363d;background:linear-gradient(135deg,#0d1117,#161b22,#0d1117)}
.hero h1{font-size:2.2rem;color:#fff;letter-spacing:-1px}
.cat{display:inline-block;padding:3px 12px;border-radius:12px;font-size:.75rem;font-weight:600;margin-bottom:12px}
.cat.Lion{background:#1f3a5f;color:#4285f4}
.cat.SGD{background:#3d1f1f;color:#f85149}
.cat.Specialized{background:#2d1b69;color:#bb86fc}
.desc{color:#8b949e;font-size:.95rem;max-width:600px;margin:12px auto 0}
.ms{display:flex;flex-wrap:wrap;gap:12px;justify-content:center;padding:24px 20px}
.mc{background:#161b22;border:1px solid #30363d;border-radius:10px;padding:16px 24px;text-align:center;min-width:120px}
.v{font-size:1.5rem;font-weight:700;color:#fff}.l{font-size:.75rem;color:#8b949e;margin-top:2px}
.tabs{max-width:800px;margin:0 auto;padding:0 20px}
.tabs>input[type=radio]{display:none}
.tl{display:inline-block;background:#21262d;border:1px solid #30363d;color:#8b949e;padding:7px 16px;border-radius:8px;cursor:pointer;font-size:.85rem;margin:0 4px 16px;transition:all .2s}
.tl:hover{background:#30363d;color:#c9d1d9}
.p{display:none;background:#161b22;border:1px solid #30363d;border-radius:12px;padding:24px;margin-bottom:24px}
#t1:checked~.p1,#t2:checked~.p2,#t3:checked~.p3,#t4:checked~.p4{display:block}
#t1:checked+label[for=t1],#t2:checked+label[for=t2],#t3:checked+label[for=t3],#t4:checked+label[for=t4]{background:#4285f4;color:#fff;border-color:#4285f4}
table{width:100%;border-collapse:collapse}
th{text-align:left;color:#8b949e;font-size:.8rem;padding:8px 12px;border-bottom:1px solid #21262d;font-weight:600}
td{padding:8px 12px;font-size:.9rem;border-bottom:1px solid #21262d}
pre{background:#0d1117;padding:14px;border-radius:8px;overflow-x:auto;margin:12px 0}
code{color:#c9d1d9;font-size:.82rem;line-height:1.5}
.note{background:#21262d;border-left:3px solid #4285f4;padding:12px 16px;border-radius:0 8px 8px 0;margin:12px 0;font-size:.85rem;color:#8b949e}
.story{font-size:.9rem;color:#8b949e;line-height:1.6;margin:8px 0}
.back{text-align:center;padding:24px 20px 40px}
.back a{color:#58a6ff;font-size:.9rem}
@media(max-width:768px){.hero h1{font-size:1.6rem}.ms{flex-direction:column;align-items:center}.mc{min-width:200px}}</style>
</head>
<body>
<div class="hero">
<div class="cat Lion">Lion Optimized</div>
<h1>AXL-Code-1B-Lion</h1>
<p class="desc">The largest Lion-optimized AXL model: 318M parameters, trained in 20 minutes to a perplexity of 1.90 with a 256-byte context window.</p>
</div>
<div class="ms">
<div class="mc"><div class="v">318M</div><div class="l">Parameters</div></div>
<div class="mc"><div class="v">1.90</div><div class="l">Perplexity</div></div>
<div class="mc"><div class="v">20 min</div><div class="l">Training</div></div>
<div class="mc"><div class="v">636 MB</div><div class="l">GGUF</div></div>
</div>
<div class="tabs">
<input type="radio" name="t" id="t1" checked><label for="t1" class="tl">Specs</label>
<input type="radio" name="t" id="t2"><label for="t2" class="tl">Training</label>
<input type="radio" name="t" id="t3"><label for="t3" class="tl">Usage</label>
<input type="radio" name="t" id="t4"><label for="t4" class="tl">Download</label>
<div class="p p1">
<table>
<tr><th>Property</th><th>Value</th></tr>
<tr><td>Architecture</td><td>Multi-Scale Transformer</td></tr>
<tr><td>d_model</td><td>?</td></tr>
<tr><td>Attention Heads</td><td>?</td></tr>
<tr><td>Layers per Scale</td><td>?</td></tr>
<tr><td>Context Window</td><td>256 bytes</td></tr>
<tr><td>Downsample Factors</td><td>[1, 2, 4]</td></tr>
<tr><td>Vocab Size</td><td>258 (byte-level)</td></tr>
<tr><td>Optimizer</td><td>Lion</td></tr>
</table>
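<div class="story">A byte-level vocabulary needs no trained tokenizer: each of the 256 possible byte values maps directly to an ID, and 2 extra entries bring the total to 258. A minimal sketch of that encoding, assuming the two extra IDs are BOS/EOS markers (the actual AXL special tokens may differ):</div>
<pre><code># Byte-level encode/decode sketch. IDs 256 and 257 are assumed
# here to be BOS/EOS; AXL's actual special tokens may differ.
BOS, EOS = 256, 257

def encode(text):
    # UTF-8 bytes map one-to-one onto IDs 0..255
    return [BOS] + list(text.encode("utf-8")) + [EOS]

def decode(ids):
    # Drop special IDs, turn the remaining bytes back into text
    return bytes(i for i in ids if i not in (BOS, EOS)).decode("utf-8")</code></pre>
<div class="note">With 2 special tokens inside the 256-byte context window, a prompt tops out at roughly 250 bytes of ASCII code.</div>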
</div>
<div class="p p2">
<div class="story">Trained on 50 MB of real Python code from Hugging Face repositories: 421 steps in 20 minutes. On the same setup, Lion reached a perplexity of 1.90 where SGD reached 31.22.</div>
<table>
<tr><th>Metric</th><th>Value</th></tr>
<tr><td>Final Loss</td><td>0.6338</td></tr>
<tr><td>Perplexity</td><td>1.90</td></tr>
<tr><td>Training Steps</td><td>421</td></tr>
<tr><td>Training Time</td><td>20 min</td></tr>
</table>
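<div class="story">Assuming the reported final loss is the mean cross-entropy in nats (the usual convention), perplexity is just its exponential; the result lands close to the reported 1.90, with the small gap plausibly down to rounding or which checkpoint was measured:</div>
<pre><code>import math

# Perplexity from mean cross-entropy loss, assuming nats
final_loss = 0.6338
ppl = math.exp(final_loss)  # roughly 1.88</code></pre>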
</div>
<div class="p p3">
<h3 style="color:#fff;margin-bottom:12px">Usage</h3>
<pre><code>ollama create axl-code-1b-lion -f Modelfile
ollama run axl-code-1b-lion "def fibonacci():"</code></pre>
<div class="note">The strongest overall code generator in the AXL family; recommended for general-purpose code completion.</div>
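<div class="story">The <code>ollama create</code> step above reads a Modelfile. A minimal sketch of one, assuming the F16 GGUF from the Download tab sits in the same directory (the filename and parameter values are illustrative, not from the release):</div>
<pre><code>FROM ./axl-code-1b-lion-f16.gguf
PARAMETER num_ctx 256
PARAMETER temperature 0.2</code></pre>
<div class="note">num_ctx is pinned at 256 to match the model's context window; larger values would exceed what it was trained on.</div>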
</div>
<div class="p p4">
<table>
<tr><th>File</th><th>Size</th><th>Format</th></tr>
<tr><td>F16 GGUF</td><td>636 MB</td><td>Full precision</td></tr>
<tr><td>Q4_K_M GGUF</td><td>197 MB</td><td>4-bit quantized</td></tr>
</table>
<div class="note" style="margin-top:16px">GGUF files work with Ollama and llama.cpp. Q4_K_M is about 3x smaller than F16.</div>
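<div class="story">The F16 size follows directly from the parameter count, since 16-bit floats are 2 bytes each. A quick sanity check of both table rows (sizes taken as reported, MB read as 10^6 bytes):</div>
<pre><code># Size checks against the table above
params = 318e6
f16_mb = params * 2 / 1e6   # 2 bytes per F16 weight: 636.0 MB
ratio = 636 / 197           # F16 vs Q4_K_M: roughly 3.2x smaller</code></pre>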
</div>
</div>
<div class="back"><a href="../">← All AXL Models</a></div>
</body>
</html>