Use of human gestures for controlling a mobile robot via adaptive CMAC network and fuzzy logic controller

Dajun Zhou, Minghui Shi, Fei Chao, Chih-Min Lin, Longzhi Yang, Changjing Shang, Changle Zhou

Research output: Contribution to journal › Article › peer-review

28 Citations (Scopus)
2 Downloads (Pure)


Mobile robots equipped with manipulators are increasingly deployed in extreme and hostile environments to assist or even replace human operators in complex tasks. In addition to autonomous abilities, mobile robots need to support a human–robot interaction control mode that enables human users to easily control or collaborate with them. This paper proposes a system that uses human gestures to control an autonomous mobile robot integrating a manipulator and a video surveillance platform. A human user can control the mobile robot just as one drives an actual vehicle from its driving cab. The proposed system obtains the human's skeleton-joint information using a motion-sensing input device; this information is then recognized and interpreted into a set of control commands. The implementation, chosen in view of the available training data and the real-time performance requirement, combines an adaptive cerebellar model articulation controller (CMAC) neural network, a finite state machine, a fuzzy controller, and purpose-built gesture recognition and control-command generation modules. Together, these algorithms implement real-time steering and velocity control of the mobile robot. The experimental results demonstrate that the proposed approach can conveniently control a mobile robot using this virtual driving method, producing smooth manoeuvring trajectories at various speeds.
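To illustrate the kind of function approximator named in the abstract, the following is a minimal sketch of a one-dimensional CMAC (cerebellar model articulation controller) in Python. All class and parameter names here are illustrative assumptions, not the authors' implementation; the paper's adaptive CMAC operates on multi-joint gesture features rather than a scalar input.

```python
import numpy as np

class CMAC:
    """Minimal 1-D CMAC sketch (hypothetical, for illustration only).

    Several overlapping tilings map an input to one active memory cell
    each; the output is the sum of the active cells' weights, trained
    with a simple LMS update.
    """
    def __init__(self, n_tilings=8, n_cells=64, x_min=0.0, x_max=1.0, lr=0.1):
        self.n_tilings = n_tilings
        self.n_cells = n_cells
        self.x_min, self.x_max = x_min, x_max
        self.lr = lr
        self.w = np.zeros((n_tilings, n_cells))  # one weight row per tiling

    def _active_cells(self, x):
        # Normalise the input to [0, 1), then shift each tiling slightly
        # so nearby inputs share some (but not all) active cells.
        t = (x - self.x_min) / (self.x_max - self.x_min)
        cells = []
        for k in range(self.n_tilings):
            offset = k / (self.n_tilings * self.n_cells)
            cells.append(min(int((t + offset) * self.n_cells),
                             self.n_cells - 1))
        return cells

    def predict(self, x):
        # Output is the sum of the weights of the active cells.
        return sum(self.w[k, i] for k, i in enumerate(self._active_cells(x)))

    def train(self, x, target):
        # LMS update: spread the prediction error evenly over active cells.
        err = target - self.predict(x)
        for k, i in enumerate(self._active_cells(x)):
            self.w[k, i] += self.lr * err / self.n_tilings

# Toy usage: learn a mapping from a normalised gesture feature to a
# steering command in [-1, 1] (training pairs are invented).
cmac = CMAC()
for _ in range(200):
    for x, y in [(0.1, -1.0), (0.5, 0.0), (0.9, 1.0)]:
        cmac.train(x, y)
```

Because each update touches only the handful of cells activated by the input, training is fast and local, which is one reason CMAC networks suit the in-time performance requirement the abstract mentions.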
Original language: English
Pages (from-to): 218-231
Number of pages: 14
Early online date: 14 Dec 2017
Publication status: Published - 22 Mar 2018


