Once More Unto the Breach!
We found ourselves closer to the deadline with a better understanding of the challenge ahead of us, but unfortunately no closer to a viable solution.
Since our app also required some form of interaction, in parallel with our hand-tracking research we continued working on the screen touch interface and evaluating the available third-party controllers.
At that point, hand tracking was still the expected outcome. Since none of the free solutions worked for our use case and implementing a custom one was too risky, we decided to give a paid solution a try.
The product was called ManoMotion. It offered hand tracking with gestures that seemed on par with Google MediaPipe, and it had a Unity SDK and public samples available that we could evaluate.
The integration was easy. At first sight, it looked like it could work. But then we remembered the degraded tracking in low-light conditions from the previous attempts.
We contacted ManoMotion and described our case. They responded that they had worked on a similar case before and could adjust the algorithm to the specific conditions at the venue.
We decided to build some prototypes. The accuracy of the basic SDK (not the Pro version, no custom modifications) was around 70% in well-lit conditions, which wasn't perfect but was promising. ManoMotion later sent a demo based on footage shot in on-site conditions, and it achieved similar results in those production conditions, which was really impressive.
But we wanted to figure out how to improve the base accuracy, as we were still comparing the results to our Magic Leap One experience. That tracking wasn't perfect either: the hand needed to remain within the camera's field of view, which took some getting used to. Still, it seemed to work better.
The reason was the distance from the hand to the camera. With the handheld mask, the distance was shorter, so the effective tracking area was smaller.
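The geometry behind this is simple: for a camera with horizontal field of view θ, the visible width at distance d is 2·d·tan(θ/2), so halving the hand-to-camera distance halves the area in which the hand can be tracked. A minimal sketch of that relationship (the 60° FOV and the two distances are illustrative assumptions, not measured values from our devices):

```python
import math

def visible_width(distance_m: float, fov_deg: float) -> float:
    """Width of the camera's view frustum at the given distance."""
    return 2 * distance_m * math.tan(math.radians(fov_deg) / 2)

# Assumed 60-degree horizontal FOV, for illustration only.
FOV_DEG = 60

# Headset-style setup: hand roughly at arm's length from the camera.
headset_width = visible_width(0.60, FOV_DEG)   # ~0.69 m of trackable width

# Handheld mask: camera much closer to the hand.
handheld_width = visible_width(0.30, FOV_DEG)  # ~0.35 m of trackable width
```

With the camera twice as close, the user has half the horizontal room before their hand leaves the frame, which is why the handheld setup felt so much more constrained in practice.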
This meant that creating an intuitive hand-tracking experience for mobile required much more effort, as the users needed to be made aware of the limitations without breaking the immersion.
Hand tracking was still that magical element that everyone involved wanted to include in the experience. But it became clear that we wouldn't be able to make the app controllable in that way in the amount of time we had.
The user experience just wouldn't meet the project's quality standards.