---
title: README
emoji: 🐢
colorFrom: green
colorTo: yellow
sdk: static
pinned: false
---
# BigScience Large Language Model Training

Training a multilingual 176-billion-parameter model in the open

BigScience is an open and collaborative workshop on the study and creation of very large language models, gathering more than 1,000 researchers around the world. You can find more information on the main website at https://bigscience.huggingface.co.

The training of BigScience's main model started on March 11, 2022 at 11:42am PST and will last 3-4 months on 416 A100 GPUs of the Jean Zay public supercomputer.
You can follow the training at https://twitter.com/BigScienceLLM.