---
title: DeepSeek-R1 WebGPU
emoji: 🧠🐳
colorFrom: red
colorTo: blue
sdk: docker
pinned: false
app_port: 7860
license: apache-2.0
short_description: Small reasoning model running in browser
thumbnail: >-
  https://huggingface.co/spaces/webml-community/deepseek-r1-webgpu/resolve/main/banner.png
---
# DeepSeek-R1 WebGPU

Next-generation reasoning model running entirely in your browser using WebGPU acceleration.
## System Requirements

- WebGPU-compatible browser
- 6 GB+ RAM recommended
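Before loading the model, you can confirm that the browser exposes the WebGPU API at all. A minimal feature-detection sketch (not part of this repo; the check is simply a property test on `navigator`):

```javascript
// Returns true when the WebGPU API is exposed on the given global
// object (pass `navigator` in the browser). Guards against null so it
// is safe to call in non-browser environments as well.
function hasWebGPU(nav) {
  return typeof nav === "object" && nav !== null && "gpu" in nav;
}

// In the browser:
//   if (!hasWebGPU(navigator)) { /* show an unsupported-browser message */ }
```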
## Features

- 🚀 Runs entirely in browser using WebGPU acceleration
- 🧠 1.5B-parameter reasoning model
- 📝 Shows step-by-step reasoning process
- 🔢 LaTeX math support
- 🌙 Dark mode support
## Getting Started

Follow the steps below to set up and run the application.
### 1. Clone the Repository

Clone the examples repository from GitHub:

```bash
git clone https://github.com/huggingface/transformers.js-examples.git
```
### 2. Navigate to the Project Directory

Change your working directory to the `deepseek-r1-webgpu` folder:

```bash
cd transformers.js-examples/deepseek-r1-webgpu
```
### 3. Install Dependencies

Install the necessary dependencies using npm:

```bash
npm i
```
### 4. Run the Development Server

Start the development server:

```bash
npm run dev
```

The application should now be running locally. Open your browser and go to http://localhost:5173 to see it in action.
## Usage

1. Click "Load model" to download and initialize the model.
2. Wait for model initialization to complete (~1-2 minutes).
3. Type your question or select an example.
4. View the model's step-by-step reasoning.
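DeepSeek-R1 models emit their chain of thought between `<think>` and `</think>` tags before the final answer. A hypothetical helper (illustrative only, not code from this repo) that splits a completion into the two parts:

```javascript
// Split a DeepSeek-R1 completion into reasoning and final answer.
// The model wraps its chain of thought in <think>...</think>; anything
// after the closing tag is the answer shown to the user.
function splitReasoning(text) {
  const match = text.match(/<think>([\s\S]*?)<\/think>/);
  if (!match) {
    // No reasoning block found: treat the whole text as the answer.
    return { reasoning: "", answer: text.trim() };
  }
  return {
    reasoning: match[1].trim(),
    answer: text.slice(match.index + match[0].length).trim(),
  };
}
```

A UI can render `reasoning` in a collapsible panel and `answer` as the chat reply.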
## Browser Compatibility

- ✅ Chrome
## Troubleshooting

- **WebGPU Not Available**: Ensure you're using a compatible browser with WebGPU enabled.
- **Out of Memory**: Try closing other browser tabs to free up GPU memory.
- **Model Loading Fails**: Check your internet connection and try reloading.
## License & Attribution

- **Model**: DeepSeek-R1-Distill-Qwen-1.5B by DeepSeek AI
- **Framework**: 🤗 Transformers.js
- **License**: Apache 2.0