Virtual Prototyping
Using the Virtual Data-driven Measurement to Support the Prototyping of Hand Gesture Recognition Interface with Distance Sensor
Chengshuo Xia*, Ayane Saito*, Yuta Sugiura (* these authors contributed equally)

[Reference]
Chengshuo Xia*, Ayane Saito*, Yuta Sugiura, "Using the Virtual Data-driven Measurement to Support the Prototyping of Hand Gesture Recognition Interface with Distance Sensor," Sensors and Actuators A: Physical. (* these authors contributed equally) [DOI]


The performance of a fixed-sensor-based hand gesture recognition system is typically influenced by the position and number of sensors. The traditional development approach to such systems follows a process of sensor pre-deployment, data collection, and model training; determining how many sensors to place, and where, is therefore time-consuming and expensive, and the resulting system offers little flexibility for secondary development. In this paper, we present a new development flow to assist in prototyping distance-sensor-based gesture recognition interfaces. The designed system simulates the position and number of sensors used to recognize gestures. Using a reconstructed hand motion, the virtual distance sensor generates simulated signals, which are used to train a convolutional neural network model. In a real-world setting, the sensor system only needs to be adapted via transfer learning to recognize gestures with the same sensor layout. The proposed method thus indicates a sensor configuration and provides a classifier trained on virtual distance data, which can effectively reduce development cost. We evaluated two prototype interfaces built with the proposed method using distance sensors, and demonstrated that the system effectively provided deployment recommendations and that models trained on virtual measurement data could effectively recognize real gestures.
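As a minimal illustration of the virtual measurement step described above (a sketch under our own assumptions, not the paper's implementation), a virtual distance sensor can be modeled as reporting, for each frame of a reconstructed hand motion, the distance from a fixed sensor position to the nearest hand keypoint. The keypoint format, sensor position, and range clipping below are illustrative assumptions:

```python
import numpy as np

def virtual_distance_signal(hand_points, sensor_pos, max_range=0.5):
    """Simulate one virtual distance sensor over a hand motion.

    hand_points: (T, K, 3) array of K hand keypoints over T frames (metres).
    sensor_pos:  (3,) position of the virtual sensor.
    max_range:   readings beyond this are clipped (sensor out of range).
    Returns a (T,) signal: nearest-keypoint distance per frame.
    """
    # Per-frame, per-keypoint Euclidean distance to the sensor: (T, K)
    d = np.linalg.norm(hand_points - sensor_pos, axis=-1)
    # The sensor sees the closest part of the hand, clipped to its range
    return np.minimum(d.min(axis=1), max_range)

# Toy motion: 2 frames, 2 keypoints; virtual sensor at the origin
hand = np.array([[[0.1, 0.0, 0.0], [0.3, 0.0, 0.0]],
                 [[0.2, 0.0, 0.0], [0.9, 0.0, 0.0]]])
sig = virtual_distance_signal(hand, np.zeros(3))
# sig -> array([0.1, 0.2])
```

Sweeping `sensor_pos` over candidate placements and generating such signals for each gesture is, conceptually, how a simulated layout can be scored before any physical sensor is deployed.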