( Python \ C# \ Unity \ MediaPipe )
Python:
The first and most important piece of this project was a Python script that calls MediaPipe's hand-tracking API. The script connects the camera to Google's hand-tracking neural network, which returns a position for each landmark of the hand in a 2D frame. I then pass that data into another function that interprets and cleans it up so it can be sent straight into Unity.
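Below is a minimal sketch of what that capture-and-send loop might look like, assuming a single tracked hand and landmarks streamed to Unity as a flattened comma-separated UDP message; the port number (5052) and the message format are illustrative assumptions, not details taken from the project.

```python
import socket
import cv2
import mediapipe as mp

# Hypothetical address/format -- the real project may use a different port or encoding.
UNITY_ADDR = ("127.0.0.1", 5052)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

hands = mp.solutions.hands.Hands(max_num_hands=1,
                                 min_detection_confidence=0.7,
                                 min_tracking_confidence=0.5)

def clean_landmarks(hand_landmarks):
    """Flatten the 21 normalized (x, y) landmark pairs into one comma-separated line."""
    coords = []
    for lm in hand_landmarks.landmark:
        coords += [f"{lm.x:.4f}", f"{lm.y:.4f}"]
    return ",".join(coords)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB input; OpenCV captures frames in BGR.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        msg = clean_landmarks(results.multi_hand_landmarks[0])
        sock.sendto(msg.encode("utf-8"), UNITY_ADDR)
cap.release()
```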
The Experience and Environment:
I decided to use Unity because I can leverage its built-in tools and the Asset Store to combine visuals and tools that elevate the whole experience. Using the animation controller, game object colliders, and rigid body physics, I can create a realistic environment, glowing flame projectiles, animal animations with a ghost shader, and a hand wireframe.
The Glue:
Finally, the glue of this project is a script within Unity that reads the data streamed to a local port and interprets it. Since the data comes in as a 2D array, it simply has to be translated back into 2D positions. However, I still had to lift those positions into 3D. There are two simple tricks for this (both sketched below): 1. estimate depth by checking the distance between the points and scaling the node sizes accordingly; 2. estimate the angle of the hand from the normalized vector between the nodes that connect one finger to another.
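The actual in-Unity receiver is a C# script; for consistency with the sender above, here is the same parsing and interpretation logic sketched in Python. The landmark indices (wrist = 0, index knuckle = 5, pinky knuckle = 17) follow MediaPipe's standard hand model, while the port and the DEPTH_SCALE constant are hypothetical tuning values of my own.

```python
import math
import socket

WRIST, INDEX_MCP, PINKY_MCP = 0, 5, 17   # MediaPipe hand-landmark indices
DEPTH_SCALE = 0.2                        # hypothetical tuning constant

def parse_message(msg: str):
    """Rebuild the flat comma-separated stream into a list of (x, y) pairs."""
    vals = [float(v) for v in msg.split(",")]
    return list(zip(vals[0::2], vals[1::2]))

def estimate_depth(points):
    """Trick 1: the hand looks smaller the farther away it is, so depth can be
    approximated as inversely proportional to a stable landmark distance."""
    (x0, y0), (x1, y1) = points[WRIST], points[INDEX_MCP]
    size = math.hypot(x1 - x0, y1 - y0)
    return DEPTH_SCALE / max(size, 1e-6)

def estimate_angle(points):
    """Trick 2: the hand's angle follows the normalized vector across the
    knuckles, from the index knuckle to the pinky knuckle."""
    (x0, y0), (x1, y1) = points[INDEX_MCP], points[PINKY_MCP]
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy) or 1e-6
    return math.atan2(dy / length, dx / length)  # radians

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("127.0.0.1", 5052))
while True:
    data, _ = sock.recvfrom(4096)
    pts = parse_message(data.decode("utf-8"))
    z = estimate_depth(pts)
    angle = estimate_angle(pts)
    positions_3d = [(x, y, z) for x, y in pts]  # lift each node into 3D
```

The depth trick works because the true distance between the wrist and a knuckle is roughly fixed, so its apparent on-screen length is a usable cue for how far the hand sits from the camera.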