zixianma02 committed · verified
Commit 754a010 · 1 Parent(s): 538adaa

Update README.md

Files changed (1):
  1. README.md (+4 −3)
README.md CHANGED
@@ -32,16 +32,17 @@ consistent gains through test-time scaling via parallel rollouts with best-of-N
 and 60.5% pass@4 (compared to 78.2% and 35.3% pass@1)on WebVoyager and Online-Mind2Web
 respectively.
 
-**Learn more** about the MolmoWeb family [in our announcement blog post](https://allenai.org/blog/molmoweb).
+**Learn more** about the MolmoWeb family in our announcement [blog post](https://allenai.org/blog/molmoweb) and [tech report](https://allenai.org/papers/molmoweb).
 
-MolmoWeb-8B is based on [Molmo2](https://arxiv.org/abs/2601.10611) architecture, which uses [Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B) and [SigLIP 2](https://huggingface.co/google/siglip-so400m-patch14-384) as vision backbone.
+MolmoWeb-4B is based on [Molmo2](https://arxiv.org/abs/2601.10611) architecture, which uses [Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B) and [SigLIP 2](https://huggingface.co/google/siglip-so400m-patch14-384) as vision backbone.
 
-Ai2 is commited to open science. The MolmoWeb datasets are available [here](https://huggingface.co/collections/allenai/molmoweb-data).
+Ai2 is committed to open science. The MolmoWeb datasets are available [here](https://huggingface.co/collections/allenai/molmoweb-data).
 All other artifacts used in creating MolmoWeb (training code, [evaluations](https://github.com/allenai/molmoweb), intermediate checkpoints) will be made available, furthering our commitment to open-source AI development and reproducibility.
 
 Quick links:
 - 💬 [Demo](https://molmoweb.allen.ai/)
 - 📂 [All Models](https://huggingface.co/collections/allenai/molmoweb)
+- 📚 [All Data](https://huggingface.co/collections/allenai/molmoweb-data)
 - 📃 [Paper](https://allenai.org/papers/molmoweb)
 - 🎥 [Blog with Videos](https://allenai.org/blog/molmoweb)
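The pass@1 and pass@4 figures quoted in the diff come from parallel rollouts with best-of-N selection. A minimal sketch of the standard combinatorial pass@k estimator is below; the function name and the rollout counts in the example are illustrative assumptions, not values from the MolmoWeb evaluation.

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: the probability that at least one of k
    samples drawn (without replacement) from n rollouts, c of which are
    correct, is correct: 1 - C(n-c, k) / C(n, k)."""
    if n - c < k:
        # Fewer than k incorrect rollouts exist, so any k-sample
        # must contain a correct one.
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

# Hypothetical example: 8 rollouts per task, 3 of them correct.
print(pass_at_k(8, 3, 1))  # → 0.375 (= 3/8, the plain success rate)
print(pass_at_k(8, 3, 4))  # → ~0.929 (= 1 - C(5,4)/C(8,4))
```

This mirrors how pass@4 can exceed pass@1 by a wide margin: even a modest per-rollout success rate compounds quickly when the best of several parallel attempts is kept.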