by Ron Daniel
Force-reflecting teleoperation and haptic interfaces present a number of mechatronic design challenges. Some have solutions; some remain open. This article describes some of the challenges faced and solved at Oxford University.
Remote manipulation of hazardous environments, or human-machine interaction with virtual environments, requires devices and systems that can provide a human with a rich set of sensory inputs. Developing such interfaces poses a number of mechatronic design challenges, along with unresolved problems in control and sensor integration. Haptics is the science of human touch, and a device giving the impression of touch when running a computer simulator is called a haptic interface. Haptic feedback systems are becoming more important now that Virtual Reality simulators are able to present accurate renditions of object contact. Applications range from improving design simulation in CAD through to improving diagnostic tools for medical scanning.
Mechatronic design concentrates on the investigation of methods to integrate mechanical, electronic and computer systems to achieve a solution that is not easily envisaged when the problem is considered as, say, a purely mechanical one. For example, techniques developed for controller design can be inverted to inform mechanical design; mechanical design principles can be inverted to generate high-performance computer algorithms; and differential mechanical drives can be used to overcome electronic resolution limits on encoder interfaces. Each of these examples has yielded a novel solution to a particular problem presented by the design of force feedback devices.
The Oxford University Robotics Research Group has been developing a mechatronic approach to the design of contact interface systems for a number of years. Mechanical principles of parallel actuation, together with powerful algorithms themselves based on principles of equivalence in mechanisms, led us to suggest a powered Stewart Platform as an input device, now marketed by AEA Technology as the BSP or Bilateral Stewart Platform. This is a parallel kinematic device that delivers a high bandwidth force signal to the user and is able to convey the sense of touch that complex assembly tasks require when performed by a remote robot under human control. The problems solved to achieve this include:
The Real Time Forward Kinematic Solution for Parallel Mechanisms
The robotic interface used to transmit commands to the remote manipulator has a parallel kinematic structure. For such mechanisms, generating Cartesian position from joint measurements is difficult and not easily done in real time. A new algorithm has been developed, based on the directional derivatives of the singular values of a related mechanism, that is robust to singularities in the real mechanism and can be solved iteratively in 10 microseconds on a 200 MHz PC. The algorithm was integrated into a real-time control platform developed at Oxford under the operating system QNX. This controller is based on a Virtual Machine for control that is itself robust and easy to program.
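The flavour of an iterative forward-kinematic solution can be sketched with a toy planar example: a point "platform" attached by two legs to fixed base anchors, with the platform position recovered from measured leg lengths by Newton iteration. This is the standard Newton scheme on a made-up mechanism, not the singular-value-based Oxford algorithm itself; the anchor positions and lengths are invented for illustration.

```python
import numpy as np

# Toy planar parallel mechanism: platform point p = (x, y) joined by
# two legs to base anchors. Given measured leg lengths, recover p by
# Newton iteration on the leg-length residual (illustrative only; the
# singular-value-based algorithm in the text is not reproduced here).
ANCHORS = np.array([[0.0, 0.0], [1.0, 0.0]])  # assumed base geometry

def leg_lengths(p):
    return np.linalg.norm(p - ANCHORS, axis=1)

def forward_kinematics(lengths, p0, iters=50, tol=1e-12):
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        diff = p - ANCHORS                    # anchor -> platform vectors
        norms = np.linalg.norm(diff, axis=1)
        residual = norms - lengths            # per-leg length error
        if np.max(np.abs(residual)) < tol:
            break
        J = diff / norms[:, None]             # Jacobian rows are unit leg vectors
        p = p - np.linalg.solve(J, residual)  # Newton step
    return p

p_true = np.array([0.3, 0.8])
p_est = forward_kinematics(leg_lengths(p_true), p0=[0.5, 0.5])
```

The iteration converges in a handful of steps from a reasonable initial guess; near a singular configuration the Jacobian becomes ill-conditioned, which is exactly the difficulty the singular-value-based formulation is designed to handle robustly.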
The Non-Minimum Phase Characterisation of Robot Drives
Geared electric robots have complex contact dynamics when pushing against a stiff environment. Many remote tasks present such stiff environments and lead to partial locking of the gears as torque is applied from the motor. We discovered that an almost stationary geared drive presents multiple paths for transmitting torque, the paths depending on the internal dynamics of the reduction gear train. Such multiple paths can generate non-minimum phase zeros in the linearised contact dynamics that rapidly destabilise any remote force controller. We also discovered that replacing ordinary geared drives with differential drives removes these zeros. This suggests that robots designed for force servoing, such as for assembly, should have redundant actuation. Further to this, redundant actuation means that a mechanism can carry out a contact task while maintaining internal motion. This has important implications for the resolution of dynamic signals such as joint velocity when derived from encoders. Much higher control bandwidths at low speed can, in principle, be attained by following this path.
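The destabilising effect of a right-half-plane zero can be seen in a small simulation. The plant below is an invented example with the NMP structure described above, not a model of any specific gearbox: its step response initially moves the wrong way before settling, which is the behaviour that rapidly erodes the stability margins of a tight force loop.

```python
import numpy as np

# Illustrative non-minimum phase plant G(s) = (1 - s) / (s + 1)^2,
# with a right-half-plane zero at s = +1 of the kind that dual torque
# paths in a near-stationary geared drive can introduce.
A = np.array([[0.0, 1.0], [-1.0, -2.0]])   # denominator (s + 1)^2
B = np.array([0.0, 1.0])
C = np.array([1.0, -1.0])                  # numerator 1 - s

def step_response(dt=1e-3, t_end=10.0):
    n = int(t_end / dt)
    x = np.zeros(2)
    y = np.empty(n)
    for k in range(n):                     # forward-Euler simulation
        y[k] = C @ x
        x = x + dt * (A @ x + B)           # unit step input u = 1
    return y

y = step_response()
# The output first undershoots (moves opposite to its final value)
# before settling at the DC gain of 1 -- the classic NMP signature
# that limits achievable force-loop bandwidth.
```

Analytically, y(t) = 1 - e^(-t) - 2t·e^(-t), which dips to about -0.21 at t = 0.5 s before recovering; no amount of gain tuning removes the undershoot, which is why eliminating the zero mechanically, via differential drives, is so attractive.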
The Specification of Achievable Teleoperation Performance
Until recently, little was known about how the physics of a teleoperator limits the achievable performance of the complete system. Important questions that must be answerable include: what is the highest force reflection ratio that can be achieved with a particular system? To achieve a specified performance, what should be the main characteristics of the mechanisms used in the system? These questions have been addressed and partially answered by appeal to simple physical laws that place bounds on what can be achieved using computer control. For example, we have shown that it is particularly important to get the right distribution of mass within the mechanism and to ensure that the right sensors are available given the mass distribution.
The Design of Force Reflection Controllers
If it is known that a given performance specification is achievable with a particular system, it is still necessary for tuning rules to be available for setting up the system for a specific task. We have developed simple rules for carrying out this necessary task that relate the characteristic modes of a teleoperation system to the choice of filter bandwidths in the controller.
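The shape of such a tuning rule can be illustrated numerically. The sketch below uses a generic rule of thumb, placing the force-filter bandwidth a safe factor below the lowest structural mode of a simple two-mass drive model; the masses, stiffness, and safety factor are assumed example values, not the Oxford rules themselves.

```python
import numpy as np

# Illustrative tuning sketch (assumed values, not the published rules):
# locate the lowest structural mode of a two-mass/one-spring drive
# model, then place the force-filter bandwidth well below it.
m_motor, m_load = 0.5, 2.0     # kg, assumed reflected inertias
k_shaft = 8.0e3                # N/m, assumed transmission stiffness

# Resonance of the two-mass system (rad/s):
# w = sqrt(k * (1/m_motor + 1/m_load))
w_mode = np.sqrt(k_shaft * (1.0 / m_motor + 1.0 / m_load))

SAFETY = 3.0                   # keep the filter a factor below the mode
w_filter = w_mode / SAFETY
print(f"lowest mode: {w_mode:.1f} rad/s, filter cutoff: {w_filter:.1f} rad/s")
```

With these numbers the mode sits near 141 rad/s and the filter near 47 rad/s; the point of the rule is that the filter bandwidth is dictated by the characteristic modes of the mechanism, not chosen freely.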
The Integration of Computer Vision and Impact Control
Data fusion is an important task when there are many sensors being used to monitor contact. Unfortunately there has been little well-founded work on achieving the fusion of contact and visual information. We have developed a new approach that transforms uncertainty in a vision sensor into a probability of impact and expresses the cost of impact as a functional based on this uncertainty. Variational techniques may then be used to generate a predictive signal that specifies the optimal kinematic path in real-time up to the point of contact. Such control can be used to assist an operator trying to achieve remote contact during a tricky remote assembly task. It can also be integrated into a predictive dynamic controller to optimise speed of contact.
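The core idea of turning vision uncertainty into an impact probability can be sketched simply. Assuming a Gaussian error on the vision-derived distance to contact, the probability that the true surface has already been reached is a Gaussian tail integral, and the approach velocity can be throttled as that probability grows. The function names, the velocity rule, and the thresholds below are invented for illustration; the published variational formulation is richer than this.

```python
from math import erf, sqrt

# Sketch: vision gives a distance-to-contact estimate d with Gaussian
# standard deviation sigma; P(impact) is the probability mass below
# zero distance (i.e. the surface may already have been reached).
def impact_probability(d, sigma):
    return 0.5 * (1.0 - erf(d / (sigma * sqrt(2.0))))

def safe_velocity(d, sigma, v_max=0.2, p_limit=0.01):
    """Illustrative rule: slow the approach as P(impact) nears a limit."""
    p = impact_probability(d, sigma)
    if p >= p_limit:
        return 0.0                      # too risky -- stop and probe gently
    return v_max * (1.0 - p / p_limit)  # back off smoothly as risk grows
```

At the estimated surface (d = 0) the impact probability is exactly 0.5, and at three standard deviations of clearance it is about 0.1%; a predictive controller can use such a signal to shape the optimal approach path up to contact.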
The Multiple Camera Tracking of Man-Made Objects
Remote systems normally have viewing systems based on multiple cameras. We have developed a multiple camera object pose tracker that is able to reconstruct a virtual view of a scene from a position not covered by the viewing cameras. Such a system compensates for errors in any model that might exist of the remote environment and can be integrated with the dynamic information being generated by the robot joint-sensors to achieve accurate object tracking for tasks such as assembly.
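The geometric core that any multi-camera tracker builds on is triangulation: recovering a 3D point from its projections in two or more calibrated views. The sketch below is the standard linear (DLT) two-view triangulation with invented camera matrices, not the Oxford pose tracker itself.

```python
import numpy as np

# Minimal two-view triangulation via the direct linear transform:
# each pixel observation contributes two linear constraints on the
# homogeneous 3D point, solved in least squares by SVD.
def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its pixel projections in two cameras."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                  # null-space vector = homogeneous point
    return X[:3] / X[3]         # dehomogenise

# Two example cameras (assumed): one at the origin, one shifted along x
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

X_true = np.array([0.2, -0.1, 4.0])
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

Once points (or whole object poses) are reconstructed in this way, a virtual view from an uncovered position is a matter of re-projecting the reconstruction through a synthetic camera, and the estimates can be fused with robot joint-sensor data for accurate tracking.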
The above is a taste of the types of problems that remote handling can present. Our current research is aimed at haptic, rather than force-reflecting, interfaces and extends the above work to diagnostic aids for surgeons. Here the problem is to build a real-time interface to a finite-element model of human flesh so that realistic forces can be reproduced within a simulation of a surgical or diagnostic intervention. Alongside the usual problems of large-deformation finite elements, there is the question of how such a model can be interfaced with a high-speed local model of interaction. The key is minimisation of latency in deep-contact simulation. This is an active area of research that we aim to report on in the near future.
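One generic way to bridge a slow finite-element solver and a fast haptic loop is an intermediate local model: the FEM solve periodically refits a linearised contact model (a rest position and a stiffness), and the kilohertz haptic loop renders force from that local model between updates. The class below is a minimal sketch of that scheme with assumed values, not the specific Oxford interface.

```python
# Sketch of a low-latency local contact model sitting between a slow
# FEM solver and a fast haptic loop (generic scheme, assumed values).
class LocalContactModel:
    def __init__(self, rest=0.0, stiffness=0.0):
        self.rest = rest              # contact surface position (m)
        self.stiffness = stiffness    # linearised stiffness (N/m)

    def update_from_fem(self, rest, stiffness):
        """Called at the slow FEM rate (e.g. tens of Hz)."""
        self.rest = rest
        self.stiffness = stiffness

    def force(self, x):
        """Called at the haptic rate (e.g. 1 kHz): spring force on
        penetration, zero when the probe is clear of the surface."""
        penetration = self.rest - x
        return self.stiffness * penetration if penetration > 0.0 else 0.0

model = LocalContactModel()
model.update_from_fem(rest=0.0, stiffness=500.0)  # assumed tissue stiffness
f = model.force(-0.002)   # 2 mm penetration of the local surface
```

Because the haptic loop only ever touches the local model, its latency is independent of the FEM solve time; the deep-contact accuracy then depends on how often and how well the FEM loop refits the local parameters.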
Ron Daniel Department of Engineering Science, Oxford University
Tel: +44 1865 273153