Skinput

Skinput is an input technology that uses bio-acoustic sensing to localize finger taps on the skin. When coupled with a pico-projector, the system can provide a direct-manipulation graphical user interface on the body. The technology was developed by Chris Harrison, Desney Tan, and Dan Morris at Microsoft Research's Computational User Experiences Group. Skinput represents one way to decouple input from electronic devices, allowing devices to become smaller without simultaneously shrinking the surface area on which input can be performed. While other systems, such as SixthSense, have attempted this with computer vision, Skinput uses acoustics, taking advantage of the human body's natural sound-conductive properties (e.g., bone conduction). This allows the body to be appropriated as an input surface without the skin having to be invasively instrumented with sensors, tracking markers, or other items.

Microsoft has not commented on the future of the project, other than to say it is in active development. It has been reported that the technology may not appear in commercial devices for at least two years.


Operation

Skinput has been publicly demonstrated as an armband that sits on the biceps. This prototype contains ten small cantilevered piezo elements configured to be highly resonant, sensitive to frequencies between 25 and 78 Hz. This configuration acts like a mechanical Fast Fourier transform and provides extreme out-of-band noise suppression, allowing the system to function even while the user is in motion. From the upper arm, the sensors can localize finger taps delivered to any part of the arm, all the way down to the fingertips, with accuracies in excess of 90% (as high as 96% for five input locations). Classification is driven by a support vector machine using a series of time-independent acoustic features that act like a fingerprint. Like speech recognition systems, the Skinput recognition engine must be trained on the "sound" of each input location before use. After training, locations can be bound to interactive functions, such as pausing/playing a song, increasing/decreasing music volume, speed dial, and menu navigation.
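
As a rough illustration of the recognition pipeline described above, the sketch below trains a support vector machine on simple per-channel acoustic features. It is a minimal sketch assuming NumPy and scikit-learn; the specific features, window format, and kernel are illustrative assumptions, not the ones used in the actual Skinput prototype.

    import numpy as np
    from sklearn.svm import SVC

    def extract_features(window):
        """Reduce one multi-channel sensor window (channels x samples)
        to a fixed-length acoustic "fingerprint". These features are
        illustrative stand-ins for the prototype's feature set."""
        amplitude = np.abs(window).mean(axis=1)        # mean amplitude per channel
        energy = (window ** 2).sum(axis=1)             # total energy per channel
        ratios = amplitude / (amplitude.sum() + 1e-9)  # normalized channel ratios
        return np.concatenate([amplitude, energy, ratios])

    def train(windows, locations):
        """Training phase: the user taps each location several times
        while the system records labeled sensor windows."""
        X = np.stack([extract_features(w) for w in windows])
        clf = SVC(kernel="rbf")  # support vector machine classifier
        clf.fit(X, locations)
        return clf

    def classify(clf, window):
        """Recognition phase: map a detected tap window to the most
        likely input location, which can then trigger a bound action."""
        return int(clf.predict(extract_features(window)[None, :])[0])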

With the addition of a pico-projector to the armband, Skinput allows users to interact with a graphical user interface displayed directly on the skin. This enables several interaction modalities, including button-based hierarchical navigation, list-based sliding navigation (similar to that used on iPods/smartphones/MIDs), text/number entry (e.g., a telephone number keypad), and gaming (e.g., Tetris, Frogger).
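
To make the binding step concrete, the fragment below sketches a simple dispatch table from classified tap locations to interactive functions; the location indices and handler functions are hypothetical, invented purely for illustration.

    def toggle_playback():
        print("music: play/pause")

    def volume_up():
        print("music: volume +")

    # Hypothetical bindings from classified tap locations (the integer
    # labels produced by the classifier sketched above) to actions.
    BINDINGS = {
        0: toggle_playback,  # e.g., a tap near the wrist
        1: volume_up,        # e.g., a tap on the forearm
    }

    def on_tap(location):
        handler = BINDINGS.get(location)
        if handler is not None:
            handler()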

Demonstration

Although an internal Microsoft Research project, Skinput has been demonstrated publicly several times. The first public appearance was at Microsoft's TechFest 2010, where the recognition model was trained live on stage during the presentation, followed by an interactive walkthrough of a simple mobile application with four modes: music player, email inbox, Tetris, and voicemail. A similar live demonstration was given at the ACM CHI 2010 conference, where the academic paper received a "Best Paper" award and attendees were allowed to try the system. Numerous media outlets have covered the technology, several featuring live demos.



External links

  • ACM Digital Library - Skinput: appropriating the body as an input surface
