Near-Field Touch Interface Using Time-of-Flight Camera


Case description

https://www.fujipress.jp/jrm/rb/robot002800050759/
JRM Vol.28 No.5, pp. 759-775, doi: 10.20965/jrm.2016.p0759 (2016)
Near-Field Touch Interface Using Time-of-Flight Camera
Lixing Zhang*,** and Takafumi Matsumaru**
*Department of Automation, Shanghai Jiaotong University
800 Dongchuan Rd., Minhang District, Shanghai 200240, China
**Graduate School of Information, Production and Systems, Waseda University
2-7 Hibikino, Wakamatsu-ku, Kitakyushu, Fukuoka 808-0135, Japan
Received: August 7, 2015 / Accepted: June 22, 2016 / Published: October 20, 2016
Keywords: time-of-flight camera, human-computer interaction, touch interface, projector-sensor system
Abstract: The purpose of this study is to realize a near-field touch interface that is compact, flexible, and highly accurate. We applied a 3-dimensional image sensor (a time-of-flight camera) to provide the basic functions of conventional touch interfaces, such as clicking, dragging, and sliding, and designed a complete projector-sensor system around it. Unlike conventional touch interfaces, such as those on tablet PCs, the system senses the 3-dimensional positions of fingertips and the 3-dimensional directions of fingers. It does not require a physical touch screen; a mobile projector provides the display instead. The system remains compact, with a working distance as short as about 30 cm. Our methods address the shadow and reflection problems of the time-of-flight camera and provide robust detection results. In tests, the approach achieved a high success rate (98.4%) in touch/hover detection and a small standard error (2.21 mm) in position detection, averaged over different participants, which is the best performance we have achieved. Applications such as a virtual keyboard and a virtual joystick have also been realized on the proposed projector-sensor system.
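The abstract does not spell out the detection pipeline, but one common way to classify touch versus hover with a time-of-flight camera is to fit a plane to the projection surface and threshold the fingertip's distance to that plane. The sketch below is a minimal illustration of that idea only; the plane-fitting approach, the 10 mm threshold, and all function names are assumptions for illustration and are not taken from the paper.

import numpy as np

# Illustrative threshold (assumed, not from the paper): a fingertip closer to
# the projection surface than TOUCH_MM is treated as a touch, otherwise a hover.
TOUCH_MM = 10.0

def fit_surface_plane(surface_points):
    """Least-squares plane fit to 3-D points (in mm) sampled on the projection surface.

    Returns (normal, d) for the plane normal . x + d = 0, with a unit normal.
    """
    centroid = surface_points.mean(axis=0)
    # The right singular vector with the smallest singular value of the
    # centred points is the plane normal.
    _, _, vt = np.linalg.svd(surface_points - centroid)
    normal = vt[-1]
    d = -normal.dot(centroid)
    return normal, d

def classify_fingertip(fingertip_xyz, normal, d):
    """Return ('touch' | 'hover', distance_mm) for one detected fingertip point."""
    distance = abs(normal.dot(fingertip_xyz) + d)
    return ("touch" if distance < TOUCH_MM else "hover"), distance

if __name__ == "__main__":
    # Synthetic example: a roughly planar surface about 300 mm from the camera,
    # with 1 mm of sensor noise, and a fingertip hovering 5 mm above it.
    rng = np.random.default_rng(0)
    xs, ys = np.meshgrid(np.linspace(0, 300, 20), np.linspace(0, 200, 20))
    surface = np.column_stack([xs.ravel(), ys.ravel(),
                               rng.normal(300.0, 1.0, xs.size)])
    normal, d = fit_surface_plane(surface)

    state, dist = classify_fingertip(np.array([150.0, 100.0, 295.0]), normal, d)
    print(state, round(dist, 1), "mm")  # expected: touch, roughly 5 mm

In practice the plane fit would be done once per calibration (or updated slowly), and the per-frame work reduces to segmenting the hand, locating fingertips, and evaluating their distances to the fitted plane.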