Latest commit: Peft lotfq (#1222), 4cb7900 (unverified)
- cerebras Update qlora.yml - remove `max_packed_sequence_len` (#1210) [skip ci]
- code-llama set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci]
- colab-notebooks add colab example (#1196) [skip ci]
- falcon Falcon embeddings (#1149) [skip docker]
- gptj set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci]
- jeopardy-bot set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci]
- llama-2 Peft lotfq (#1222)
- mamba set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci]
- mistral Mixtral fixes 20240124 (#1192) [skip ci]
- mpt-7b set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci]
- openllama-3b Add shifted sparse attention (#973) [skip-ci]
- phi Mixtral fixes 20240124 (#1192) [skip ci]
- pythia-12b Feat(wandb): Refactor to be more flexible (#767)
- pythia set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci]
- qwen set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci]
- redpajama set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci]
- replit-3b set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci]
- tiny-llama set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci]
- xgen-7b set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci]
- yi-34B-chat set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci]