Pretouch Sensing for Robotic Grasping
Robotic grasping of unknown objects is widely recognized as a challenging problem, largely because imperfect robot perception leaves the object shape uncertain. Vision and depth sensors are commonly used to sense objects before grasping, but they suffer from shortcomings such as occlusion, inaccurate readings, and failures caused by reflection and transparency. A grasp planned solely from an incomplete object shape can fail regardless of the planning algorithm. Tactile exploration, on the other hand, is widely used to acquire local geometric information about the object and is not subject to occlusion. However, because touch sensing relies on physical contact between the manipulator and the object, it tends to unintentionally displace objects, particularly light ones. This dissertation considers "pretouch", a sensing modality intermediate in range between long-range depth sensing and tactile sensing. Pretouch is potentially beneficial for robotic grasping because it provides reliable geometric information in the last centimeter before contact. The dissertation first presents a novel pretouch technique, "seashell effect pretouch", which is effective for a set of materials that other pretouch techniques fail to sense. This modality is inspired by the phenomenon of "hearing the sea" when a seashell is held to the ear; in particular, the observation that the "sound of the sea" changes as the distance from the seashell to the ear varies. This occurs because environmental noise is amplified the most (attenuated the least) at the cavity's resonant frequency, which shifts as the cavity approaches an object. To turn the familiar seashell effect into a pretouch sensor, I study the underlying acoustic principle: the change in acoustic radiation impedance caused by an object near the opening of the cavity.
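The principle above suggests a simple detection scheme: track the dominant peak in the spectrum of the ambient noise picked up inside the cavity, and watch it shift as the cavity nears a surface. A minimal sketch of the peak-tracking step follows; the band limits, sample rate, and 6 kHz test resonance are illustrative assumptions, not values from the dissertation.

```python
import numpy as np

def resonant_peak_hz(signal, fs, fmin=2000.0, fmax=12000.0):
    """Estimate the dominant cavity resonance in an ambient-noise
    recording by locating the largest peak in its magnitude spectrum.
    fmin/fmax bound the search to a plausible resonance band
    (illustrative values, not from the dissertation)."""
    windowed = signal * np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= fmin) & (freqs <= fmax)
    return freqs[band][np.argmax(spectrum[band])]

# Synthetic check: white noise with a strongly amplified 6 kHz band,
# standing in for noise shaped by the cavity resonance.
fs = 44100
rng = np.random.default_rng(0)
noise = rng.standard_normal(fs)
t = np.arange(fs) / fs
resonant_band = 20.0 * np.sin(2 * np.pi * 6000.0 * t)
peak = resonant_peak_hz(noise + resonant_band, fs)  # close to 6000 Hz
```

A real sensor would run this over short sliding windows and report an object when the peak drifts from its free-space baseline.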
The sensor design, including the acoustic properties, hardware/software design, and signal processing, is discussed in detail. The resulting implementation is fully integrated into the finger of a Willow Garage PR2 robot. The sensor detects resonance frequency shifts in the spectrum of ambient sound, which occur when the finger approaches an object. The performance of the proposed sensor is characterized and evaluated in terms of sensing range, accuracy, and material selectivity; the sensor reliably detects the presence of an object within 5 mm. In addition, a new infrared optical pretouch sensor can be developed with minimal modification to the proposed sensor system design. The first application explored is detecting extremely compliant objects during grasp execution. In a pre-grasp execution experiment, the seashell effect pretouch sensor's ability to detect compliant objects is compared with that of a pressure sensor; the results suggest advantages of seashell pretouch over tactile sensing. The second application is pretouch-assisted grasp planning. When the pretouch sensor senses the object during a series of probing motions, it yields points obtained by recording the position of the robot's end effector; these additional points augment the point cloud from the depth sensor. This method compensates for object shape information that would otherwise be incomplete due to depth sensor failure or occlusion. Furthermore, a unified probabilistic framework is proposed to (1) identify shape uncertainty for the target object and (2) automatically explore the uncertain areas to reduce that uncertainty, resulting in a grasp with higher confidence. Initially, the robot is provided only with the incomplete object shape data acquired from a Kinect depth sensor; it does not have a model of the object.
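The point-cloud augmentation step described above can be sketched very simply: each pretouch detection contributes one surface point, computed from the recorded end-effector position plus a fixed sensor offset, and these points are stacked onto the depth-sensor cloud. The pose representation, the 5 mm offset vector, and the array layout below are all illustrative assumptions.

```python
import numpy as np

def augment_cloud(depth_cloud, probe_positions, sensor_offset):
    """Append pretouch detection points to a depth-sensor point cloud.

    depth_cloud:     (N, 3) array of points from the depth sensor
    probe_positions: list of (3,) end-effector positions recorded at
                     the moment of each pretouch detection
    sensor_offset:   (3,) fixed offset from the end-effector frame to
                     the sensed surface point (illustrative)
    """
    if not probe_positions:
        return depth_cloud
    sensed = np.asarray(probe_positions) + sensor_offset
    return np.vstack([depth_cloud, sensed])

# Toy usage: a partial cloud plus one probe detection.
cloud = np.zeros((100, 3))                    # stand-in for a Kinect cloud
probes = [np.array([0.40, 0.00, 0.10])]       # end-effector position at detection
offset = np.array([0.0, 0.0, 0.005])          # assumed 5 mm sensing offset
augmented = augment_cloud(cloud, probes, offset)
```

The grasp planner then operates on `augmented` exactly as it would on a raw depth cloud, which is what lets the extra points fill in occluded or transparent regions.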
Next, combining the Kinect point cloud with prior probability distributions for occlusion and transparency, the robot makes inferences about unobserved portions of the object. Operating on the inferred object shape, an iterative grasp replanning algorithm decides whether further exploration is required and where in the scene to explore. The information gathered by each exploration action is added directly to the environment representation in real time and is therefore considered in the next grasp planning iteration. Experimental results show that the robot grasps partially transparent objects with a high success rate of 96%. Finally, I propose to augment streaming point cloud data with seashell effect pretouch information, inspired by the use case of haptic rendering in a telerobotic grasping scenario. The non-contact seashell effect pretouch sensor, fixed to the robot end effector, senses physical geometry within the vicinity of the sensor. The point cloud representation of an unknown environment, which may be sparse or poorly visible, is thus enhanced through telerobotic exploration and sensing in real time. Furthermore, real-time haptic rendering algorithms are applied to the augmented point clouds to create haptic virtual fixtures and to provide haptic force feedback to the operator. This method gives the teleoperator critical geometric information about the grasp target while preventing the robot end effector from colliding with the environment. After pretouch exploration, the augmented virtual environment represents more complete object shapes, which helps the operator align the slave robot's gripper with the target object for grasping.
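One common way to render forces from a point cloud, which could serve as a stand-in for the virtual fixtures described above, is a proximity-based repulsive field: cloud points inside an influence radius push the tool away, keeping the end effector off nearby surfaces. The influence distance and stiffness gain below are illustrative, not values from the dissertation.

```python
import numpy as np

def fixture_force(tool_pos, cloud, d0=0.02, k=50.0):
    """Proximity-based repulsive force from an augmented point cloud.

    tool_pos: (3,) haptic tool / end-effector position
    cloud:    (N, 3) augmented point cloud
    d0:       influence distance in meters (illustrative)
    k:        stiffness gain in N/m (illustrative)

    Each cloud point closer than d0 contributes a force along the
    point-to-tool direction, proportional to its penetration of the
    influence shell; contributions are summed.
    """
    diffs = tool_pos - cloud
    dists = np.linalg.norm(diffs, axis=1)
    near = dists < d0
    if not near.any():
        return np.zeros(3)
    dirs = diffs[near] / dists[near, None]
    return (k * (d0 - dists[near])[:, None] * dirs).sum(axis=0)

# Toy usage: one cloud point, tool hovering 1 cm above it.
cloud = np.array([[0.0, 0.0, 0.0]])
f_near = fixture_force(np.array([0.0, 0.0, 0.01]), cloud)  # pushed upward
f_far = fixture_force(np.array([0.0, 0.0, 0.10]), cloud)   # outside influence
```

In a teleoperation loop this force would be sent to the operator's haptic device each servo cycle, so newly added pretouch points immediately become "feelable" obstacles.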