One of the most important drawbacks of real robots is their lack of flexibility to changes in the environment or in the task. The aim of this project is to develop a method enabling a robot to coordinate its movements with the processes of its vision system autonomously. The common (static) calibration of the camera relative to the workspace is extended to the identification of parameters critical for the dynamic process of tracking maneuvering objects.
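One such dynamic parameter is the delay between what the camera observes and when that observation becomes available to the controller. As a minimal illustration of how such a parameter could be identified from the robot's own movements, the sketch below searches for the latency that best aligns commanded motion with the motion reported by the vision system. All names and the correlation-search approach are illustrative assumptions, not the procedure described in the publications.

```python
import numpy as np

def estimate_vision_latency(cmd_times, cmd_positions, img_times, img_positions,
                            candidate_delays):
    """Pick the delay that best aligns the observed image motion with the
    commanded robot motion (simple search over candidate delays)."""
    best_delay, best_err = None, np.inf
    for d in candidate_delays:
        # If the vision system lags by d, the image measurement at time t
        # corresponds to the commanded position at time t - d.
        resampled = np.interp(img_times - d, cmd_times, cmd_positions)
        err = np.mean((resampled - img_positions) ** 2)
        if err < best_err:
            best_delay, best_err = d, err
    return best_delay

# Synthetic check: image measurements lag the command by 80 ms.
t = np.linspace(0.0, 5.0, 500)
cmd = np.sin(t)                       # commanded 1-D trajectory
img = np.sin(t - 0.08)                # what the (delayed) vision system reports
delays = np.arange(0.0, 0.2, 0.005)
print(estimate_vision_latency(t, cmd, t, img, delays))   # ~0.08
```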
The coordination is performed in several steps involving vision processing, motion planning, and visuo-motor coordination. In particular, the robot synchronizes its visual and motor processes and resolves the speed/accuracy trade-off.
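To give a concrete sense of what such synchronization involves, the toy sketch below predicts where a tracked target will be once the delayed visual measurement has been acted upon, and picks a movement duration as a crude speed/accuracy trade-off. The constant-velocity prediction, the linear duration rule, and all names are assumptions for illustration only, not the method developed in this project.

```python
import numpy as np

def predict_target(p_prev, p_now, dt_vision, latency):
    """Constant-velocity prediction of the target position one
    vision-to-action latency into the future."""
    velocity = (p_now - p_prev) / dt_vision
    return p_now + velocity * latency

def choose_duration(distance, t_min=0.3, t_max=1.5, gain=2.0):
    """Crude speed/accuracy trade-off: short movements are executed fast,
    long ones get more time (and hence more visual corrections)."""
    return float(np.clip(gain * distance, t_min, t_max))

# Example: target observed at 25 Hz, vision-to-action latency of 120 ms.
p_prev = np.array([0.40, 0.10])
p_now = np.array([0.42, 0.11])
goal = predict_target(p_prev, p_now, dt_vision=0.04, latency=0.12)
duration = choose_duration(np.linalg.norm(goal))
print(goal, duration)
```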
The algorithms have been tested with a manipulator tracking maneuvering targets. The robot repeats specified movements for about one hour. After that, it can perform smooth and fast reaching movements and easily drop small objects into the wagon of a model train moving on an arbitrary trajectory. The procedures are efficient and fast, and can be used for the automatic calibration of almost any robot controlled by a vision system.
Related publications
- E. Burdet and J. Mueller (1996) A Robot Learning Reaching Motions (Video), IEEE International Conference on Robotics and Automation, Minneapolis, USA
- E. Burdet and R. Koeppe (1996) A Method for Expecting the Features of Objects and Enabling Real-time Vision, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Osaka, Japan
- J. Luthiger and E. Burdet (1999) A Modular and Sensor-oriented Motion Planner, Robotica 17: 87-95
- E. Burdet and J. Luthiger (1999) Learning the Coordination of Robot Movements with Vision Processes, Robotica 17: 563-570
- E. Burdet and M. Nuttin (1999) Learning Complex Tasks Using a Stepwise Approach, Journal of Intelligent and Robotic Systems 24: 43-68