Sigma R&D has won first prize in a gesture challenge by showing just how much more talent -- like sign language translation and lightsaber fun -- can be unlocked in a Kinect. Normally the Microsoft device can only track body and full mitt movements, but the research company was able to follow individual fingers using a Kinect or similar sensor plus its custom software, turning a user's hand into a more finely tuned controller. To prove it, the company handed a test subject a virtual lightsaber, tracking his swordsmanship perfectly and using an extended thumb to switch the blade on and off. The system even detected a passing gesture, seamlessly transferring the virtual weapon. The same tech was also used to read sign language, displaying the intended letters on screen for a quick translation. An SDK is due in the fall, and we can't wait to finally get our hands on a Jedi weapon that isn't dangerous or plasticky. To see it for yourself, check out the videos after the break.