Use with the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Alibaba-NLP/WebDancer-32B")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Alibaba-NLP/WebDancer-32B")
model = AutoModelForCausalLM.from_pretrained("Alibaba-NLP/WebDancer-32B")
messages = [
    {"role": "user", "content": "Who are you?"},
]
inputs = tokenizer.apply_chat_template(
	messages,
	add_generation_prompt=True,
	tokenize=True,
	return_dict=True,
	return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:]))
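The slice `outputs[0][inputs["input_ids"].shape[-1]:]` is there because `model.generate` returns the prompt tokens followed by the newly generated ones. A minimal plain-Python sketch of that trimming step (the token ids below are made up for illustration):

```python
# model.generate returns prompt ids + new ids in one sequence,
# so decoding only the continuation means dropping the first
# `prompt_len` ids. Toy ids stand in for real tokenizer output.
prompt_ids = [101, 2040, 2024, 2017, 102]          # encoded prompt (toy ids)
generated = prompt_ids + [1045, 2572, 1037, 4012]  # prompt + new tokens
prompt_len = len(prompt_ids)
new_tokens = generated[prompt_len:]
print(new_tokens)  # → [1045, 2572, 1037, 4012]
```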
Quick Links

This model was presented in the paper WebDancer: Towards Autonomous Information Seeking Agency.

You can download the model and then run the inference scripts at https://github.com/Alibaba-NLP/WebAgent.

  • Native agentic search and reasoning model built on the ReAct framework, aimed at autonomous information seeking and Deep Research-style tasks.
  • We introduce a four-stage training paradigm comprising browsing data construction, trajectory sampling, supervised fine-tuning for an effective cold start, and reinforcement learning for improved generalization, enabling the agent to autonomously acquire search and reasoning skills.
  • Our data-centric approach integrates trajectory-level supervised fine-tuning and reinforcement learning (DAPO) into a scalable pipeline for training agentic systems.
  • WebDancer achieves a Pass@3 score of 61.1% on GAIA and 54.6% on WebWalkerQA.
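To make the ReAct framing above concrete, here is a minimal, self-contained sketch of a ReAct-style loop. It is illustrative only: `fake_model` and `fake_search` are hypothetical stand-ins, and the real WebDancer agent uses the inference scripts in the Alibaba-NLP/WebAgent repository.

```python
# Minimal ReAct loop sketch: the model alternates Thought/Action steps,
# a tool call produces an Observation, and the loop stops at a Final Answer.
# `fake_model` and `fake_search` are hypothetical stubs for illustration.

def fake_model(history):
    # Stub "policy": decide the next step from the transcript so far.
    if "Observation:" in history:
        return "Final Answer: Paris"
    return "Thought: I need to look this up.\nAction: search[capital of France]"

def fake_search(query):
    # Hypothetical search tool returning a text snippet.
    return "France's capital is Paris."

def react_loop(question, model, search, max_steps=5):
    history = f"Question: {question}"
    for _ in range(max_steps):
        step = model(history)
        history += "\n" + step
        if "Final Answer:" in step:
            return step.split("Final Answer:", 1)[1].strip()
        if "Action: search[" in step:
            query = step.split("Action: search[", 1)[1].split("]")[0]
            history += "\nObservation: " + search(query)
    return None  # no answer within the step budget

print(react_loop("What is the capital of France?", fake_model, fake_search))
# → Paris
```

The same loop structure generalizes to real tool use: swap the stubs for model generation and a web-search backend, and parse the model's Action lines into tool calls.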
Model tree for Alibaba-NLP/WebDancer-32B

  • Base model: Qwen/Qwen2.5-32B
  • Finetuned from: Qwen/QwQ-32B
  • Quantizations: 5 models

Space using Alibaba-NLP/WebDancer-32B 1

Collection including Alibaba-NLP/WebDancer-32B

Paper for Alibaba-NLP/WebDancer-32B