Tuesday, March 9, 2010

An architecture for gesture-based control of mobile robots

Authors:
Iba, S.
Weghe, J. M. V.
Paredis, C. J. J.
Khosla, P. K.

Summary:
Goal: a gesture recognition system for interacting with robots; more precisely, gesture spotting with an HMM.
Hardware used: CyberGlove + 6-DOF location sensor.
Feature set: the 18 glove sensor readings are reduced to a 10-dimensional vector, and augmenting each dimension with its derivative brings the vector to 20 dimensions. This 20-dimensional vector is quantized into a codeword packed into a 32-bit integer, which is then fed to the HMM for recognition. The HMM is trained on 5000 postures drawn from the full hand-posture space, restricting the observation sequence to a better-quantized symbol set. A wait state is included to provide a way to reject invalid gestures.
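
The notes don't spell out the exact encoding, so here is only a minimal sketch of how such a pipeline could look, assuming a linear projection to 10 dimensions, finite differences for the derivatives, and per-dimension thresholding packed bit-by-bit into an integer codeword. The projection matrix, thresholds, and frame rate below are illustrative assumptions, not the paper's actual values:

    import numpy as np

    # Hypothetical projection from the 18 CyberGlove sensors to 10 dimensions
    # (the paper's actual reduction step is not detailed in these notes).
    PROJECTION = np.random.RandomState(0).randn(10, 18)

    def make_codeword(raw_t, raw_prev, dt=1.0 / 30, thresholds=None):
        """Reduce one frame of 18 sensor readings to an integer codeword.

        raw_t, raw_prev: length-18 arrays of glove readings at times t and t-1.
        Returns an int whose low 20 bits encode the thresholded feature vector.
        """
        x_t = PROJECTION @ np.asarray(raw_t)        # 10-dim reduced posture
        x_prev = PROJECTION @ np.asarray(raw_prev)
        dx_t = (x_t - x_prev) / dt                  # 10-dim derivative
        feat = np.concatenate([x_t, dx_t])          # 20-dim feature vector

        if thresholds is None:
            thresholds = np.zeros(20)               # placeholder thresholds

        # Pack one bit per dimension into a single integer codeword.
        codeword = 0
        for i, (f, th) in enumerate(zip(feat, thresholds)):
            if f > th:
                codeword |= 1 << i
        return codeword

In a discrete-output HMM, an integer codeword like this would serve as the observation symbol at each frame.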

Gesture set: opening, opened, closing, pointing, waving left, and waving right. The same gestures carry different semantics in the local and global modes of robot control (see the sketch below).
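
As a rough illustration of the mode-dependent semantics, the same recognized gesture could be dispatched to different commands depending on the current control mode. The command strings and the dispatch function here are hypothetical, not the paper's actual mappings:

    # Hypothetical gesture-to-command tables for the two control modes.
    COMMANDS = {
        "local": {
            "pointing":     "move in the pointed direction",
            "waving_left":  "turn left",
            "waving_right": "turn right",
            "opening":      "speed up",
            "closing":      "stop",
        },
        "global": {
            "pointing":     "select a goal for the robot",
            "waving_left":  "switch to previous robot",
            "waving_right": "switch to next robot",
            "opening":      "resume",
            "closing":      "halt",
        },
    }

    def dispatch(gesture: str, mode: str) -> str:
        """Look up the command for a recognized gesture in the current mode."""
        return COMMANDS[mode].get(gesture, "ignore")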

Discussion:
An interesting extension to the HMM for rejecting invalid gestures. Are gestures better than joystick controls for interacting with robots?

Comments: Drew, Franck

1 comment:

  1. I would say gestures are more intuitive than joystick controls, but not necessarily better.