Meta Chameleon 7B

Repository for Meta Chameleon, a mixed-modal early-fusion foundation model from FAIR. See the Chameleon paper for more information.

The Chameleon collection on HuggingFace contains the 7-billion- and 30-billion-parameter model checkpoints.

[more details and usage examples coming soon]
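Until official examples land, a minimal sketch of text generation with the 7B checkpoint, assuming Hugging Face transformers (v4.44+, which added Chameleon support) and the official `facebook/chameleon-7b` upload; the prompt and generation settings are illustrative, not from this repository:

```python
# Hypothetical usage sketch, assuming transformers >= 4.44 and the
# facebook/chameleon-7b checkpoint id; adjust for your environment.
MODEL_ID = "facebook/chameleon-7b"

def generate(prompt: str, max_new_tokens: int = 40) -> str:
    """Load Chameleon 7B and run greedy text generation for `prompt`."""
    import torch
    from transformers import (
        ChameleonForConditionalGeneration,
        ChameleonProcessor,
    )

    processor = ChameleonProcessor.from_pretrained(MODEL_ID)
    model = ChameleonForConditionalGeneration.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    inputs = processor(text=prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return processor.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Describe what a mixed-modal foundation model is."))
```

The heavy model download happens only inside `generate`, so the module can be imported cheaply; the same processor also accepts images for mixed-modal prompts.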

Citation

To cite the paper, model, or software, please use the following:

@article{Chameleon_Team_Chameleon_Mixed-Modal_Early-Fusion_2024,
  author = {Chameleon Team},
  doi = {10.48550/arXiv.2405.09818},
  journal = {arXiv preprint arXiv:2405.09818},
  title = {Chameleon: Mixed-Modal Early-Fusion Foundation Models},
  url = {https://github.com/facebookresearch/chameleon},
  year = {2024}
}

License

Use of this repository and related resources is governed by the Chameleon Research License and this repository's LICENSE file.

Model size: 7B parameters (safetensors, F32/BF16 tensors)