Add pipeline tag and library name

#1 opened by nielsr (HF Staff)
Files changed (1)
1. README.md +5 -0
README.md CHANGED

```diff
@@ -1,9 +1,14 @@
 ---
 license: apache-2.0
+library_name: transformers
+pipeline_tag: text-generation
 ---
 
 # 🧨 FLAME-MoE
 
+This repository contains the model used in the paper [FLAME-MoE: A Transparent End-to-End Research Platform for
+Mixture-of-Experts Language Models](https://huggingface.co/papers/2505.20225).
+
 **FLAME-MoE** is a fully open Mixture-of-Experts (MoE) language model suite developed by Carnegie Mellon University. It provides a transparent and reproducible research platform for investigating expert routing, model scaling, and training dynamics in sparse architectures. The suite includes seven decoder-only transformer models ranging from 38M to 1.7B active parameters and reflects production-grade MoE setups with 64 experts per MoE layer, top-8 routing, and shared experts.
 
 ---
```
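
For context, `library_name: transformers` tells the Hub which library's loading snippet to show on the model page, and `pipeline_tag: text-generation` lists the model under the Text Generation task filter. A minimal usage sketch of what this metadata enables, assuming the checkpoint loads with the standard `transformers` API; the repo id below is a placeholder, since the actual checkpoint name is not shown in this diff:

```python
# Minimal sketch: load a FLAME-MoE checkpoint via the transformers pipeline.
# "your-org/FLAME-MoE-1.7B" is a placeholder repo id, not the real checkpoint name.
from transformers import pipeline

generator = pipeline("text-generation", model="your-org/FLAME-MoE-1.7B")
result = generator("Mixture-of-experts language models", max_new_tokens=32)
print(result[0]["generated_text"])
```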
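
The README's mention of 64 experts per MoE layer, top-8 routing, and shared experts can be made concrete with a small sketch. The code below illustrates token-choice top-k routing with a shared expert under assumed sizes (`D_MODEL`, `D_FF`, a single shared expert); it is an illustration of the general technique, not FLAME-MoE's actual implementation.

```python
import torch
import torch.nn as nn

# Illustrative sketch of the routing pattern described in the README:
# 64 routed experts, top-8 token-choice routing, plus a shared expert
# that every token passes through. All sizes here are assumptions.
NUM_EXPERTS, TOP_K, D_MODEL, D_FF = 64, 8, 256, 512

def make_expert() -> nn.Module:
    return nn.Sequential(nn.Linear(D_MODEL, D_FF), nn.GELU(), nn.Linear(D_FF, D_MODEL))

experts = nn.ModuleList(make_expert() for _ in range(NUM_EXPERTS))
shared_expert = make_expert()
router = nn.Linear(D_MODEL, NUM_EXPERTS, bias=False)

def moe_layer(x: torch.Tensor) -> torch.Tensor:
    """x: (num_tokens, D_MODEL) -> (num_tokens, D_MODEL)."""
    probs = router(x).softmax(dim=-1)                      # (tokens, NUM_EXPERTS)
    weights, chosen = probs.topk(TOP_K, dim=-1)            # each token picks 8 experts
    weights = weights / weights.sum(dim=-1, keepdim=True)  # renormalize over the top-8
    out = shared_expert(x)                                 # shared expert sees every token
    for e in range(NUM_EXPERTS):                           # dispatch tokens to routed experts
        token_idx, slot_idx = (chosen == e).nonzero(as_tuple=True)
        if token_idx.numel():
            out[token_idx] += weights[token_idx, slot_idx, None] * experts[e](x[token_idx])
    return out

tokens = torch.randn(10, D_MODEL)
print(moe_layer(tokens).shape)  # torch.Size([10, 256])
```

Because only the top-8 of 64 experts run per token, compute per token stays close to a dense model of the active-parameter size while total capacity is much larger, which is the scaling behavior the suite is built to study.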