Motion Gestures provides powerful embedded AI-based gesture recognition software for different sensors. Unlike conventional solutions, our platform utilizes advanced machine learning algorithms and does not require any training data collection or programming. As a result, gesture software development time and costs are reduced by 10x while gesture recognition accuracy is increased to nearly 100%. We currently support touch, motion (i.e., IMU), and vision sensors. In the camera category, we support RGB (i.e., color), NIR (i.e., Near Infrared), and Depth (i.e., 3D) cameras.
Motion Gestures' software can be used to add a sophisticated gesture-based user interface to any product using touch, motion, or vision sensors. We support applications in all major verticals involving any type of gesture, whether static or dynamic. The software's capabilities can be scaled according to the deployment hardware. For camera-based systems, sophisticated hand tracking and gesture recognition are available using a 21-joint skeleton that provides positional coordinates of all joints in real time. A free SDK is available for evaluation upon request. For demos of our technology, please visit our
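To make the hand-tracking output concrete, the sketch below shows one way an application might represent and consume a 21-joint hand skeleton with per-joint 3D coordinates delivered each frame. The structure names, joint indexing, and field layout are illustrative assumptions, not the actual Motion Gestures SDK API.

```c
/*
 * Illustrative sketch only: these types and names are hypothetical and
 * do NOT reflect the real Motion Gestures SDK. They show how 21-joint
 * hand-skeleton data with 3D joint positions might be handled per frame.
 */
#include <stdio.h>

#define HAND_JOINT_COUNT 21   /* e.g. wrist + 4 joints per finger x 5 fingers */

typedef struct {
    float x, y, z;            /* positional coordinates of one joint */
} JointPosition;

typedef struct {
    JointPosition joints[HAND_JOINT_COUNT];
    unsigned long timestamp_ms;   /* capture time of this frame */
} HandSkeletonFrame;

/* Example consumer: print the index fingertip position from a frame.
 * Using index 8 for the index fingertip follows a common 21-joint
 * hand-model convention; treat it as an assumption here. */
static void print_index_fingertip(const HandSkeletonFrame *frame)
{
    const JointPosition *tip = &frame->joints[8];
    printf("index fingertip: (%.2f, %.2f, %.2f) at %lu ms\n",
           tip->x, tip->y, tip->z, frame->timestamp_ms);
}

int main(void)
{
    /* Dummy frame standing in for data produced by the tracking engine. */
    HandSkeletonFrame frame = { .timestamp_ms = 1000 };
    frame.joints[8] = (JointPosition){ 0.12f, 0.34f, 0.56f };

    print_index_fingertip(&frame);
    return 0;
}
```

In a real deployment the frames would come from the vendor's SDK rather than being filled in by hand, but the per-frame, per-joint access pattern shown here is what real-time joint coordinates make possible.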
Gesture recognition software is available for all Synopsys ARC® processors.
Learn more about how Motion Gestures and Synopsys work together.
Gesture Recognition in Minutes
Motion Gestures' platform utilizes sophisticated AI algorithms and eliminates the need for programming and training data collection when building gesture recognition software, reducing development time and costs by more than 10x while increasing recognition accuracy to nearly 100%. This demo shows Motion Gestures' software for touch sensors running on the Synopsys ARC EM Software Development Platform.