Collections

Discover the best community collections!

Collections trending this week
Mixture of Experts (MoE)
Sometimes I fine-tune models specifically to take on expert roles in a MoE configuration; sometimes I find interesting models others have fine-tuned.
Synthetic Document Finetuning Datasets
SDF datasets from "Believe It or Not: How Deeply do LLMs Believe Implanted Facts?" (Slocum, Minder, Dumas et al., 2025)