xymeow7 committed on
Commit c361b0b · verified · 1 Parent(s): 243413f

Update app.py

Files changed (1)
  1. app.py +43 -24
app.py CHANGED
@@ -80,45 +80,64 @@ def predict(file_path: str):
 
 def create_demo():
 
- USAGE = """## Input data format
- Currently, the demo accepts a `.pkl` file containing a hand-object sequence organized in the following format:
 ```python
 {
-     "hand_pose": numpy.ndarray(seq_length, 48),  # MANO pose at each frame
-     "hand_trans": numpy.ndarray(seq_length, 3),  # hand global translation at each frame
-     "hand_shape": numpy.ndarray(10),  # MANO shape coefficients
-     "hand_verts": numpy.ndarray(seq_length, 778, 3),  # MANO hand vertices
-     "hand_faces": numpy.ndarray(1538, 3),  # MANO hand faces
-     "obj_verts": numpy.ndarray(seq_length, num_obj_verts, 3),  # object vertices at each frame
-     "obj_faces": numpy.ndarray(num_obj_faces, 3),  # object faces
-     "obj_pose": numpy.ndarray(seq_length, 4, 4),  # object pose at each frame
 }
 ```
- We provide an example [here](https://drive.google.com/file/d/17oqKMhQNpRqSdApyuuCmTrPkrFl0Cqp6/view?usp=sharing). **The demo is under development and will support more data formats in the future.**
 
 ## To run the demo,
- 1. Upload a `pickle` file to the left box by dragging your file or clicking the box to open the file explorer.
 2. Click the `Submit` button to run the demo.
- 3. The denoised sequence will be output as a `.npy` file and can be downloaded from the right box.
-
- Since the model currently runs on the CPU, it is not very fast. For instance, it takes about 1200s to process the [example](https://drive.google.com/file/d/17oqKMhQNpRqSdApyuuCmTrPkrFl0Cqp6/view?usp=sharing) mentioned above, which contains 288 frames. Please be patient and wait for the result.
 
- To run the model faster, please visit our [github repo](https://github.com/Meowuu7/GeneOH-Diffusion), follow the instructions, and run the model on your own server or local machine.
 
 ## Output data format
- The output is a `.npy` file containing the denoised sequence organized in the following format:
 ```python
 {
-     "predicted_info": xxx
-     "bf_ct_verts": numpy.ndarray(seq_length, 778, 3),  # denoised MANO vertices
-     "bf_ct_rot_var": numpy.ndarray(seq_length, 3),  # denoised MANO global rotation coefficients
-     "bf_ct_theta_var": numpy.ndarray(seq_length, 45),  # denoised MANO pose coefficients
-     "bf_ct_beta_var": numpy.ndarray(1, 10),  # denoised MANO shape coefficients
-     "bf_ct_transl_var": numpy.ndarray(seq_length, 3),  # denoised hand global translation
 }
 ```
- The corresponding output file of the [example](https://drive.google.com/file/d/17oqKMhQNpRqSdApyuuCmTrPkrFl0Cqp6/view?usp=sharing) mentioned above can be downloaded [here](https://drive.google.com/file/d/1Ah-qwV6LXlOyaBBe0qQRu1lN-BpKt2Y3/view?usp=sharing).
 """
 
 
 def create_demo():
 
+ USAGE = """# QuasiSim: Parameterized Quasi-Physical Simulators for Dexterous Manipulations Transfer
+ **[Project](https://meowuu7.github.io/QuasiSim/) | [Github](https://github.com/Meowuu7/QuasiSim)**
+ This demo transforms an input human manipulation demonstration into the trajectory of a `point set` (a relaxed representation of an articulated rigid object introduced in QuasiSim). This is the first step of the first stage of our optimization process. Please check out our [github repo](https://github.com/Meowuu7/QuasiSim) for more details and for instructions on running it locally.
+ ## Input data format
+ Currently, the demo accepts a `.npy` file containing a human manipulation trajectory organized in the following format:
 ```python
 {
+     "sv_dict": {
+         "rhand_global_orient_gt": numpy.ndarray(seq_length, 3),  # MANO global orientation coefficients
+         "rhand_transl": numpy.ndarray(seq_length, 3),  # MANO global translation
+         "rhand_verts": numpy.ndarray(seq_length, 778, 3),  # MANO hand vertices
+         "object_global_orient": numpy.ndarray(seq_length, 3),  # object global orientations, represented as rotation vectors (see below for how to convert them to rotation matrices)
+         "object_transl": numpy.ndarray(seq_length, 3),  # object global translations
+         "obj_faces": numpy.ndarray(nn_faces, 3),  # object mesh faces
+         "obj_verts": numpy.ndarray(nn_vertices, 3),  # object mesh vertices
+         "obj_vertex_normals": numpy.ndarray(nn_vertices, 3),  # object mesh vertex normals
+     },
+     "obj_sdf": numpy.ndarray(sdf_res, sdf_res, sdf_res),  # pre-processed object SDF values (see below for SDF processing details)
 }
 ```
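+ For concreteness, a minimal sketch of writing such a file (the arrays here are zero-filled placeholders, not a meaningful trajectory; since the top-level object is a `dict`, `numpy` stores it via pickle):
+ ```python
+ import numpy as np
+ 
+ seq_length, nn_vertices, nn_faces, sdf_res = 60, 1000, 2000, 128  # placeholder sizes
+ data = {
+     "sv_dict": {
+         "rhand_global_orient_gt": np.zeros((seq_length, 3)),
+         "rhand_transl": np.zeros((seq_length, 3)),
+         "rhand_verts": np.zeros((seq_length, 778, 3)),
+         "object_global_orient": np.zeros((seq_length, 3)),
+         "object_transl": np.zeros((seq_length, 3)),
+         "obj_faces": np.zeros((nn_faces, 3), dtype=np.int64),
+         "obj_verts": np.zeros((nn_vertices, 3)),
+         "obj_vertex_normals": np.zeros((nn_vertices, 3)),
+     },
+     "obj_sdf": np.zeros((sdf_res, sdf_res, sdf_res)),
+ }
+ np.save("demo_input.npy", data)  # read back with np.load(..., allow_pickle=True).item()
+ ```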
+ **How to transform the object global orientation into a rotation matrix**: the object global orientation is represented as a rotation vector. To convert it to a rotation matrix, you can use `scipy.spatial.transform.Rotation` as follows:
+ ```python
+ from scipy.spatial.transform import Rotation
+ 
+ # rotation vector(s) -> rotation matrix/matrices
+ r = Rotation.from_rotvec(object_global_orient)
+ object_global_orient_rotmat = r.as_matrix()
+ ```
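+ (`Rotation.from_rotvec` accepts either a single rotation vector of shape `(3,)` or the whole `(seq_length, 3)` batch, in which case `as_matrix()` returns an array of shape `(seq_length, 3, 3)`; the snippet below operates on a single frame.)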
+ To transform the canonical object mesh vertices using the orientation vector (`object_global_orient`) and the global translation (`object_transl`), you can use the following code:
+ ```python
+ import numpy as np
+ from scipy.spatial.transform import Rotation
+ 
+ # for a single frame: rotate the canonical vertices, then translate them
+ r = Rotation.from_rotvec(object_global_orient)
+ object_global_orient_rotmat = r.as_matrix()
+ cur_transformed_verts = np.matmul(
+     obj_verts, object_global_orient_rotmat
+ ) + object_transl[None, :]
+ ```
+ We use [mesh2sdf](https://github.com/wang-ps/mesh2sdf) to pre-process the object mesh into SDF values. The SDF values are stored in a 3D numpy array of shape `(sdf_res, sdf_res, sdf_res)`, where `sdf_res` is set to `128` in our experiments. Please check out [compute sdf](https://github.com/Meowuu7/QuasiSim/blob/main/utils/grab_preprocessing.py#L151) for our pre-processing function.
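+ For reference, a rough sketch of this step (assuming the `mesh2sdf` package's `compute` entry point; the normalization margin and the `fix`/`level` settings here are placeholders, and the exact values are in the linked `grab_preprocessing.py`):
+ ```python
+ import numpy as np
+ import trimesh
+ import mesh2sdf
+ 
+ mesh = trimesh.load("object.obj", force="mesh")
+ verts, faces = np.asarray(mesh.vertices), np.asarray(mesh.faces)
+ 
+ # mesh2sdf expects the mesh inside the unit cube [-1, 1]^3
+ center = (verts.max(0) + verts.min(0)) / 2.0
+ scale = 2.0 * 0.9 / (verts.max(0) - verts.min(0)).max()  # 0.9 margin is an assumption
+ verts = (verts - center) * scale
+ 
+ sdf_res = 128
+ sdf = mesh2sdf.compute(verts, faces, size=sdf_res, fix=True, level=2.0 / sdf_res)
+ print(sdf.shape)  # (sdf_res, sdf_res, sdf_res)
+ ```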
+ 
+ Currently, the demo only accepts input trajectories with 60 frames.
+ We provide an example [here](https://1drv.ms/u/s!AgSPtac7QUbHgVncbYUdKI1f5TvE?e=sRhirK). A quick way to sanity-check a file before uploading is sketched below.
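+ This check is illustrative (the file name is a placeholder); the expected shapes follow the format above:
+ ```python
+ import numpy as np
+ 
+ data = np.load("demo_input.npy", allow_pickle=True).item()
+ sv = data["sv_dict"]
+ 
+ seq_length = sv["rhand_verts"].shape[0]
+ assert seq_length == 60, f"the demo expects 60 frames, got {seq_length}"
+ assert sv["rhand_verts"].shape == (seq_length, 778, 3)
+ assert sv["object_global_orient"].shape == (seq_length, 3)
+ assert sv["object_transl"].shape == (seq_length, 3)
+ assert data["obj_sdf"].ndim == 3  # (sdf_res, sdf_res, sdf_res)
+ print("input looks well-formed")
+ ```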
 
 
 ## To run the demo,
+ 1. Upload a `numpy` file to the left box by dragging your file or clicking the box to open the file explorer.
 2. Click the `Submit` button to run the demo.
+ 3. The optimized trajectory of the point set will be output as a `.npy` file and can be downloaded from the right box.
 
+ Since the model currently runs on CPU, it is quite slow. For instance, it takes around 32h (yeah, hours...) to process the [example](https://1drv.ms/u/s!AgSPtac7QUbHgVoY8jPkPZfrDkJw?e=JYFi5a) mentioned above, which contains 60 frames. However, it takes only several minutes when running on a GPU! Therefore, we highly recommend checking out our [github repo](https://github.com/Meowuu7/QuasiSim), setting up an environment with GPU support, and running it locally.
 
 ## Output data format
+ The output is a `.npy` file containing the optimized trajectory of the point set sequence, organized as a `dict` in the following format:
 ```python
 {
+     "ts_to_hand_obj_verts": {
+         ts: (hand_points, obj_points) for ts in range(seq_length)
+         # hand_points: numpy.ndarray(number of points in the point set representation, 3)
+         # obj_points: numpy.ndarray(nn_vertices, 3)
+     }
 }
 ```
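+ A minimal sketch of reading the result back (the output file name is a placeholder for whatever you downloaded):
+ ```python
+ import numpy as np
+ 
+ out = np.load("optimized_point_set_traj.npy", allow_pickle=True).item()
+ ts_to_verts = out["ts_to_hand_obj_verts"]
+ 
+ for ts in sorted(ts_to_verts.keys()):
+     hand_points, obj_points = ts_to_verts[ts]
+     print(ts, hand_points.shape, obj_points.shape)
+ ```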
+ The corresponding output file of the [example](https://1drv.ms/u/s!AgSPtac7QUbHgVoY8jPkPZfrDkJw?e=JYFi5a) mentioned above can be downloaded [here](https://1drv.ms/u/s!AgSPtac7QUbHgVoY8jPkPZfrDkJw?e=JYFi5a).
 """