robomaster2025 committed · verified
Commit eb2e2ee · 1 parent 34167d6

Update README.md

Files changed (1): README.md (+38 -3)
 
---
license: apache-2.0
---

## πŸ”₯ Reproduce Website Demos

1. **[Environment Setup]** Our environment is identical to [CogVideoX](https://github.com/THUDM/CogVideo); refer to their configuration to complete the setup.
```bash
conda create -n robomaster python=3.10
conda activate robomaster
```

2. Robotic Manipulation on Diverse Out-of-Domain Objects.
```bash
python inference_inthewild.py \
    --input_path demos/diverse_ood_objs \
    --output_path samples/infer_diverse_ood_objs \
    --transformer_path ckpts/RoboMaster \
    --model_path ckpts/CogVideoX-Fun-V1.5-5b-InP
```

3. Robotic Manipulation with Diverse Skills.
```bash
python inference_inthewild.py \
    --input_path demos/diverse_skills \
    --output_path samples/infer_diverse_skills \
    --transformer_path ckpts/RoboMaster \
    --model_path ckpts/CogVideoX-Fun-V1.5-5b-InP
```

4. Long Video Generation in an Auto-Regressive Manner.
```bash
python inference_inthewild.py \
    --input_path demos/long_video \
    --output_path samples/long_video \
    --transformer_path ckpts/RoboMaster \
    --model_path ckpts/CogVideoX-Fun-V1.5-5b-InP
```
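Step 4 generates long videos auto-regressively; presumably each fixed-length clip is conditioned on the tail of the previous one. A minimal sketch of such a loop, with a hypothetical `generate_clip` callback standing in for the model call (not the repo's API):

```python
from typing import Any, Callable, List


def generate_long_video(
    first_frame: Any,
    num_clips: int,
    clip_len: int,
    generate_clip: Callable[[Any, int], List[Any]],
) -> List[Any]:
    """Chain fixed-length clips auto-regressively: each clip is conditioned
    on the last frame of the previous one (hypothetical sketch)."""
    video: List[Any] = [first_frame]
    cond = first_frame
    for _ in range(num_clips):
        clip = generate_clip(cond, clip_len)  # produces clip_len new frames
        video.extend(clip)
        cond = clip[-1]  # next clip starts from where this one ended
    return video
```

The point of the structure is that video length is unbounded: only the conditioning frame is carried between iterations, so memory does not grow with the number of clips.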

## πŸš€ Benchmark Evaluation (Reproduce Paper Results)
```
β”œβ”€β”€ RoboMaster
...
β”œβ”€β”€ bridge_eval_ours_tracking
```

**(1) Inference on Benchmark & Prepare Evaluation Files**

1. Generating `bridge_eval_ours`. (Note that the results may vary slightly across computing machines, even with the same seed; reference files are provided under `eval_metrics/results`.)
```bash
cd RoboMaster/
...
```
1. Generating `bridge_eval_ours_tracking`: Install [CoTracker3](https://github.com/facebookresearch/co-tracker), then estimate tracking points with grid size 30 on `bridge_eval_ours`.
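For the tracking step, CoTracker's `grid_size=30` setting samples query points on a regular 30Γ—30 lattice over the first frame, i.e. 900 tracked points per video. A small NumPy sketch of that lattice (the helper name `make_query_grid` is ours, not CoTracker's):

```python
import numpy as np


def make_query_grid(height: int, width: int, grid_size: int = 30) -> np.ndarray:
    """Regular (x, y) lattice of query points over one frame.

    Returns an array of shape (grid_size**2, 2); for grid_size=30 that is
    900 points, matching the grid size used above.
    """
    ys = np.linspace(0, height - 1, grid_size)  # evenly spaced rows
    xs = np.linspace(0, width - 1, grid_size)   # evenly spaced columns
    gx, gy = np.meshgrid(xs, ys)
    return np.stack([gx.ravel(), gy.ravel()], axis=1)
```

With CoTracker3 itself, these queries are produced internally when the model is called with `grid_size=30`; the sketch only illustrates where the 900 points come from.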

**(2) Evaluation on Visual Quality**

1. Evaluation of VBench metrics.
```bash
...
```

**(3) Evaluation on Trajectory (Robotic Arm & Manipulated Object)**

1. Estimation of TrajError metrics. (Note that we exclude the samples listed in `failed_track.txt`, due to failed estimation by [CoTracker3](https://github.com/facebookresearch/co-tracker).)
```bash
...
```
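The TrajError command is cut off in this diff, and the metric's exact definition is not given here. A common formulation for trajectory fidelity, shown purely as an assumption, is the mean Euclidean distance between tracks estimated on the generated videos and on the ground truth:

```python
import numpy as np


def traj_error(pred_tracks: np.ndarray, gt_tracks: np.ndarray) -> float:
    """Mean L2 distance between matched point tracks (assumed definition,
    not necessarily the repo's).

    Both arrays have shape (num_points, num_frames, 2), holding (x, y)
    coordinates per point per frame.
    """
    assert pred_tracks.shape == gt_tracks.shape
    dists = np.linalg.norm(pred_tracks - gt_tracks, axis=-1)  # (points, frames)
    return float(dists.mean())
```

Under this reading, excluding the samples in `failed_track.txt` simply means skipping videos for which one of the two track arrays could not be estimated.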