| --- |
| license: apache-2.0 |
| datasets: |
| - allenai/MolmoWeb-SyntheticTraj |
| - allenai/MolmoWeb-HumanTrajs |
| - allenai/MolmoWeb-HumanSkills |
| - allenai/MolmoWeb-SyntheticSkills |
| - allenai/MolmoWeb-SyntheticQA |
| - allenai/MolmoWeb-SyntheticGround |
| language: |
| - en |
| base_model: |
| - Qwen/Qwen3-8B |
| - google/siglip-so400m-patch14-384 |
| pipeline_tag: image-text-to-text |
| library_name: transformers |
| tags: |
| - multimodal |
| - olmo |
| - molmo |
| - molmo2 |
| --- |
| |
| <img src="molmoweb_logo.png" alt="Logo for the MolmoWeb Project" style="width: auto; height: 50px;"> |
|
|
| # MolmoWeb-8B-Native |
|
|
**Note:** this is the Molmo-native checkpoint, and it is **not** compatible with Hugging Face `transformers`. See [allenai/MolmoWeb-8B](https://huggingface.co/allenai/MolmoWeb-8B) for the HF-compatible checkpoint.
|
|
MolmoWeb is a family of fully open multimodal web agents. MolmoWeb agents achieve state-of-the-art results, outperforming open-weight-only
models of similar scale such as Fara-7B, UI-Tars-1.5-7B, and Holo1-7B. MolmoWeb-8B also surpasses set-of-marks
(SoM) agents built on much larger closed frontier models like GPT-4o. We further demonstrate
consistent gains from test-time scaling via parallel rollouts with best-of-N selection, achieving 94.7%
and 60.5% pass@4 (compared to 78.2% and 35.3% pass@1) on WebVoyager and Online-Mind2Web,
respectively.
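The parallel-rollout strategy above can be sketched as follows. This is a minimal illustration only: `run_rollout` and `score` are hypothetical stand-ins for the agent's trajectory sampler and the best-of-N selector described in the tech report, not APIs from this repository.

```python
import random

def run_rollout(task, rng):
    # Hypothetical stand-in: a real rollout would drive the web agent
    # through the browser and return the full trajectory.
    return {"task": task, "answer": rng.choice(["A", "B", "C"])}

def score(rollout):
    # Hypothetical stand-in for the selector (e.g. a judge model scoring
    # each candidate trajectory).
    return 1.0 if rollout["answer"] == "A" else 0.0

def best_of_n(task, n=4, seed=0):
    """Sample n independent rollouts and keep the highest-scoring one."""
    rng = random.Random(seed)
    rollouts = [run_rollout(task, rng) for _ in range(n)]
    return max(rollouts, key=score)

best = best_of_n("book a flight", n=4)
```

Because the rollouts are independent, they can run in parallel, and pass@N improves whenever the selector reliably ranks a successful trajectory above failed ones.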
|
|
| **Learn more** about the MolmoWeb family in our announcement [blog post](https://allenai.org/blog/molmoweb) and [tech report](https://allenai.org/papers/molmoweb). |
|
|
MolmoWeb-8B-Native is based on the [Molmo2](https://arxiv.org/abs/2601.10611) architecture, which uses [Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B) as the language backbone and [SigLIP 2](https://huggingface.co/google/siglip-so400m-patch14-384) as the vision backbone.
|
|
| Ai2 is committed to open science. The MolmoWeb datasets are available [here](https://huggingface.co/collections/allenai/molmoweb-data). |
| All other artifacts used in creating MolmoWeb (training code, [evaluations](https://github.com/allenai/molmoweb), intermediate checkpoints) will be made available, furthering our commitment to open-source AI development and reproducibility. |
|
|
| Quick links: |
- [Demo](https://molmoweb.allen.ai/)
- [All Models](https://huggingface.co/collections/allenai/molmoweb)
- [All Data](https://huggingface.co/collections/allenai/molmoweb-data)
- [Paper](https://allenai.org/papers/molmoweb)
- [Blog with Videos](https://allenai.org/blog/molmoweb)
|
|
| ## Usage |
| Please refer to our [Github repo](https://github.com/allenai/molmoweb/) for inference code. |
|
|
| ## License and Use |
|
|
| This model is licensed under Apache 2.0. It is intended for research and educational use in accordance with Ai2βs [Responsible Use Guidelines](https://allenai.org/responsible-use). |