Vision-based mobile robots are subject to highly nonlinear dynamics and strict positioning requirements, which creates a demand for more powerful nonlinear approximators in the control and monitoring of mobile robots. This paper proposes a recurrent emotional cerebellar model articulation controller (RECMAC) neural network to meet this demand. In particular, the proposed network integrates a recurrent loop and an emotional learning mechanism into a cerebellar model articulation controller (CMAC), which serves as the main component of the controller module of a vision-based mobile robot. Briefly, the controller module consists of a sliding surface, the RECMAC, and a compensator controller. Incorporating the recurrent structure into a sliding-mode neural network controller allows the controller to retain the robot's previous states, improving its dynamic mapping ability. The convergence of the proposed system is guaranteed by Lyapunov stability theory. The proposed system was validated and evaluated in both simulation and a practical moving-target tracking task. The experiments demonstrate that the proposed system outperforms other popular neural-network-based control systems and is thus superior at approximating the highly nonlinear dynamics involved in controlling vision-based mobile robots.
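To make the idea of combining a recurrent loop and an emotion-like learning signal with a CMAC concrete, the following is a minimal illustrative sketch. It is not the paper's RECMAC formulation: the tile-coding layout, the feedback gain `recur_gain`, and the blend gains `k_err` and `k_out` in the emotion-like update (loosely in the style of brain-emotional-learning rules) are all assumptions made for illustration.

```python
import numpy as np

class RecurrentCMAC:
    """Illustrative sketch: a scalar tile-coding CMAC with a recurrent
    loop (previous output fed back into the input) and an emotion-like
    learning signal. All parameter names and gains are assumptions,
    not the paper's exact RECMAC design."""

    def __init__(self, n_tilings=4, n_tiles=16, lr=0.1, recur_gain=0.3):
        self.n_tilings = n_tilings
        self.n_tiles = n_tiles
        self.lr = lr
        self.recur_gain = recur_gain      # weight on the fed-back output
        self.w = np.zeros((n_tilings, n_tiles))
        self.prev_y = 0.0                 # recurrent state
        self._last_tiles = []

    def _active_tiles(self, x):
        # x is assumed normalized to [0, 1]; each tiling is shifted by a
        # small offset so receptive fields overlap (standard tile coding).
        x = min(max(x, 0.0), 1.0)
        idx = []
        for t in range(self.n_tilings):
            offset = t / (self.n_tilings * self.n_tiles)
            idx.append(min(int((x + offset) * self.n_tiles), self.n_tiles - 1))
        return idx

    def predict(self, x):
        # Recurrent loop: mix the previous output back into the input,
        # so the mapping depends on the network's past state.
        z = (1 - self.recur_gain) * x + self.recur_gain * self.prev_y
        self._last_tiles = self._active_tiles(z)
        y = sum(self.w[t, i] for t, i in enumerate(self._last_tiles))
        self.prev_y = y
        return y

    def update(self, x, target, k_err=1.0, k_out=0.02):
        y = self.predict(x)
        err = target - y
        # Emotion-like signal: a weighted blend of tracking error and
        # controller output; the gains k_err and k_out are illustrative.
        emotion = k_err * err + k_out * y
        for t, i in enumerate(self._last_tiles):
            self.w[t, i] += self.lr * emotion / self.n_tilings
        return err
```

Repeatedly calling `update` on a fixed input and target drives the tracking error toward a small residual; the `k_out` term biases the fixed point slightly away from zero error, which is why an emotional signal is typically paired with a compensator in a full control loop.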