---
license: apache-2.0
datasets:
- allenai/MolmoWeb-SyntheticTraj
- allenai/MolmoWeb-HumanTrajs
- allenai/MolmoWeb-HumanSkills
- allenai/MolmoWeb-SyntheticSkills
- allenai/MolmoWeb-SyntheticQA
- allenai/MolmoWeb-SyntheticGround
language:
- en
base_model:
- Qwen/Qwen3-8B
- google/siglip-so400m-patch14-384
pipeline_tag: image-text-to-text
library_name: transformers
tags:
- multimodal
- olmo
- molmo
- molmo2
---
<img src="molmoweb_logo.png" alt="Logo for the MolmoWeb Project" style="width: auto; height: 50px;">
# MolmoWeb-4B-Native
**Note:** this is the Molmo-native checkpoint, and it is **not** compatible with Hugging Face Transformers. See [allenai/MolmoWeb-4B](https://huggingface.co/allenai/MolmoWeb-4B) for the HF-compatible checkpoint.
MolmoWeb is a family of fully open multimodal web agents. MolmoWeb agents achieve state-of-the-art results, outperforming similar-scale open-weight-only
models such as Fara-7B, UI-Tars-1.5-7B, and Holo1-7B. MolmoWeb-8B also surpasses set-of-marks
(SoM) agents built on much larger closed frontier models like GPT-4o. We further demonstrate
consistent gains through test-time scaling via parallel rollouts with best-of-N selection, achieving 94.7%
and 60.5% pass@4 (compared to 78.2% and 35.3% pass@1) on WebVoyager and Online-Mind2Web,
respectively.
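To make the test-time scaling setup concrete, here is a minimal sketch of parallel rollouts with best-of-N selection. It is illustrative only: `run_agent_rollout` and `score_trajectory` are hypothetical stand-ins for the actual agent loop and the selector described in the tech report.

```python
import concurrent.futures


def run_agent_rollout(task: str, seed: int) -> list[str]:
    """Hypothetical stand-in for one full agent rollout on a web task.

    Returns the trajectory (sequence of actions) the agent took.
    """
    raise NotImplementedError  # replace with a real agent loop


def score_trajectory(task: str, trajectory: list[str]) -> float:
    """Hypothetical stand-in for the selector that ranks candidate rollouts."""
    raise NotImplementedError  # replace with a real verifier or reward model


def best_of_n(task: str, n: int = 4) -> list[str]:
    """Run n independent rollouts in parallel, keep the highest-scoring one."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=n) as pool:
        futures = [pool.submit(run_agent_rollout, task, seed) for seed in range(n)]
        trajectories = [f.result() for f in futures]
    return max(trajectories, key=lambda t: score_trajectory(task, t))
```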
**Learn more** about the MolmoWeb family in our announcement [blog post](https://allenai.org/blog/molmoweb) and [tech report](https://allenai.org/papers/molmoweb).
MolmoWeb-4B-Native is based on the [Molmo2](https://arxiv.org/abs/2601.10611) architecture, which uses [Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B) as the language backbone and [SigLIP 2](https://huggingface.co/google/siglip-so400m-patch14-384) as the vision backbone.
Ai2 is committed to open science. The MolmoWeb datasets are available [here](https://huggingface.co/collections/allenai/molmoweb-data).
All other artifacts used in creating MolmoWeb (training code, [evaluations](https://github.com/allenai/molmoweb), intermediate checkpoints) will be made available, furthering our commitment to open-source AI development and reproducibility.
Quick links:
- πŸ’¬ [Demo](https://molmoweb.allen.ai/)
- πŸ“‚ [All Models](https://huggingface.co/collections/allenai/molmoweb)
- πŸ“š [All Data](https://huggingface.co/collections/allenai/molmoweb-data)
- πŸ“ƒ [Paper](https://allenai.org/papers/molmoweb)
- πŸŽ₯ [Blog with Videos](https://allenai.org/blog/molmoweb)
## Usage
Please refer to our [Github repo](https://github.com/allenai/molmoweb/) for inference code.
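Since this checkpoint is not Transformers-compatible, inference runs through the repository's own code. As a rough sketch (the actual entry points live in the repo and may differ between releases), you can stage the weights locally with `huggingface_hub` and point the repo's inference script at them:

```python
# Minimal sketch: fetch the native checkpoint, then run inference with the
# MolmoWeb repo's code (https://github.com/allenai/molmoweb).
# `snapshot_download` only stages the weights; the inference entry point
# itself is defined in that repo.
from huggingface_hub import snapshot_download

checkpoint_dir = snapshot_download("allenai/MolmoWeb-4B-Native")
print(f"Checkpoint downloaded to: {checkpoint_dir}")
# From here, follow the repo's README to launch the agent, e.g. by passing
# `checkpoint_dir` to its inference script (script names vary by release).
```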
## License and Use
This model is licensed under Apache 2.0. It is intended for research and educational use in accordance with Ai2’s [Responsible Use Guidelines](https://allenai.org/responsible-use).