PB Unity committed on
Commit 1f08bc3
1 Parent(s): 70c3ec2

Update README.md

Files changed (1)
  1. README.md +22 -0
README.md CHANGED
---
pipeline_tag: object-detection
license: apache-2.0
library_name: unity-sentis
---

# Hand Landmark from Google Mediapipe validated for Unity Sentis
This is the [Hand Landmark model](https://developers.google.com/mediapipe/solutions/vision/hand_landmarker) from Google in the Sentis format.

The model detects 3D markers on a hand centered in an image. You could use these markers, for example, to control a Mesh, such as [this one](https://github.com/google/mediapipe/blob/master/mediapipe/modules/face_geometry/data/canonical_face_model.obj) in OBJ format.
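
MediaPipe's hand landmarker predicts 21 keypoints per hand, each with x, y and z coordinates. The exact tensor layout of the Sentis export is not documented here, so the flat 63-float array and the scaling in this C# sketch are assumptions to verify against the model's real output:

```csharp
using UnityEngine;

public static class HandLandmarks
{
    // MediaPipe's hand landmarker predicts 21 keypoints: the wrist plus four joints per finger.
    public const int NumLandmarks = 21;

    // Unpack a flat array of 63 floats (x, y, z per landmark) into Vector3s.
    // Whether values are in input pixels (0..224) or already normalized depends
    // on the export, so the divisor used here is an assumption.
    public static Vector3[] Unpack(float[] raw, float inputSize = 224f)
    {
        var points = new Vector3[NumLandmarks];
        for (int i = 0; i < NumLandmarks; i++)
        {
            points[i] = new Vector3(
                raw[i * 3 + 0] / inputSize,
                raw[i * 3 + 1] / inputSize,
                raw[i * 3 + 2] / inputSize);
        }
        return points;
    }
}
```

Index 0 is the wrist and each finger contributes four points, following MediaPipe's landmark numbering.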

**IMPORTANT:** The hand needs to be centered and cropped to fit the image. For images where the hand is not centered, or where there are multiple hands, you will need another model to detect the hands and crop them before feeding them into this model. For example, you could use [Blaze Palm](https://huggingface.co/unity/sentis-blaze-palm) to detect them (but that model only works with open palms).
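
If your frames are not already hand-centered, one simple CPU-side option is to cut a square region around the detection before resizing it to the model's input. The sketch below only illustrates that idea: the crop center and size would come from whatever detector you use, the source texture must be readable, and no padding or rotation is handled.

```csharp
using UnityEngine;

public static class HandCrop
{
    // Copy a square region of `source` centered on (centerX, centerY) into a new texture.
    // Assumes `size` is no larger than the source dimensions and that the source
    // texture is readable (Read/Write enabled in its import settings).
    public static Texture2D CropSquare(Texture2D source, int centerX, int centerY, int size)
    {
        int x = Mathf.Clamp(centerX - size / 2, 0, source.width - size);
        int y = Mathf.Clamp(centerY - size / 2, 0, source.height - size);

        Color[] pixels = source.GetPixels(x, y, size, size);
        var crop = new Texture2D(size, size, TextureFormat.RGBA32, false);
        crop.SetPixels(pixels);
        crop.Apply();
        return crop;
    }
}
```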

## How to Use
* Create a new scene in Unity 2023
* Put the hand_landmark.sentis file in the `Assets/StreamingAssets` folder
* Put a video in the `Assets/StreamingAssets` folder and set the `videoName` variable to the video name
* Create a RawImage and place it in your scene. Link to this image in the `previewUI` field. (A minimal driver script along these lines is sketched below.)
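
The steps above assume a small script roughly like the one below. This is only a sketch against a Sentis 1.x-style API (`ModelLoader`, `WorkerFactory`, `TextureConverter`); exact calls differ between Sentis versions, the class name and input size here are made up, and the sample code that accompanies the model is the authoritative reference.

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;
using Unity.Sentis;

public class HandLandmarkRunner : MonoBehaviour
{
    public string videoName = "hands.mp4"; // video file placed in Assets/StreamingAssets
    public RawImage previewUI;             // RawImage in the scene used as the preview

    const int InputSize = 224;             // assumed input resolution of hand_landmark

    IWorker worker;
    VideoPlayer video;

    void Start()
    {
        // Load the .sentis file from StreamingAssets and create a worker to run it.
        var model = ModelLoader.Load(Application.streamingAssetsPath + "/hand_landmark.sentis");
        worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, model);

        // Play the video off-screen; frames are read via videoPlayer.texture.
        video = gameObject.AddComponent<VideoPlayer>();
        video.renderMode = VideoRenderMode.APIOnly;
        video.source = VideoSource.Url;
        video.url = Application.streamingAssetsPath + "/" + videoName;
        video.isLooping = true;
        video.Play();
    }

    void Update()
    {
        Texture frame = video.texture;
        if (frame == null) return;

        previewUI.texture = frame;

        // Convert the current frame to a float tensor and run the model.
        using TensorFloat input = TextureConverter.ToTensor(frame, InputSize, InputSize, 3);
        worker.Execute(input);

        // Read the landmark output back to the CPU (layout depends on the export).
        var landmarks = worker.PeekOutput() as TensorFloat;
        landmarks.MakeReadable();
        float[] values = landmarks.ToReadOnlyArray();
        // ... draw markers or drive a mesh from `values` here ...
    }

    void OnDestroy()
    {
        worker?.Dispose();
    }
}
```

The blocking readback in `MakeReadable` stalls the pipeline every frame, which is fine for a quick test but worth replacing with an asynchronous readback in a real application.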

## Preview
If you get it working, it should look like this (original image from pexels.com):
![image showing markers](hand_tracking_preview.png)

## License
All Google Mediapipe models are open source under the Apache 2.0 license. The accompanying C# source code we provide can be used in your applications for commercial purposes.