Method and apparatus for control of robotic grip or for...

Image analysis – Applications – Range or distance measuring

Reexamination Certificate

Details

Classification codes: C382S153000, C356S370000
Document type: Reexamination Certificate
Status: active
Patent number: 06249591

ABSTRACT:

TECHNICAL FIELD
This invention relates to an optical sensing system and more particularly to an optical sensing system that can sense proximity, surface contact, and lateral movement of one object relative to another object.
BACKGROUND ART
Distance from a known object to a target object can be measured in many non-contacting ways. RADAR uses radio waves, SONAR uses acoustical waves, and LIDAR uses light waves; all three of these distance-determining techniques rely on time-of-flight measurements. Other techniques include the use of structured light, interferometry, and various vision systems. Conventional force and pressure sensors can also be used, as can measurement of the back-pressure from a gas jet impinging on the surface of the object being sensed.
One structured-light method for determining distance between two non-contacting objects, i.e., proximity between two objects, is described in U.S. Pat. No. 4,479,053, entitled “Focal Plane Array Optical Proximity Sensor,” to Johnston. An optical system mounted in a first object senses the relative position of a surface of a second object. This optical proximity sensor works by utilizing the intersection of an illuminated light field with a light-detecting field-of-view to define a detection volume. Light reflected from the surface of an object in the detection volume is sensed by a photodiode and indicates that an object is within the volume defined by the intersecting fields. The light emitters in Johnston are light-emitting diodes, and the light detectors are photodiodes. There is a photodiode for each light-emitting diode, and each photodiode is located on the image plane of a lens whose conjugate object plane intersects the aforementioned intersection volume. By appropriately positioning each light-emitting diode/photodiode pair on its respective focal plane, and appropriately positioning the lens assemblies with respect to each other, any desired volume can be defined in which an object is to be detected.
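The detection principle can be pictured with a little geometry: the emitter's illumination cone and the detector's field-of-view cross at a known depth, and a reflecting surface is reported only while it lies within the depth range where the two fields overlap. The Python sketch below illustrates this idea with an idealized two-dimensional model; the baseline, tilt angles, and half-angles are illustrative assumptions, not values from the Johnston patent.

```python
# Hypothetical sketch of the intersecting-field idea behind a Johnston-style
# proximity sensor: an emitter cone and a detector field-of-view cross at a
# known depth, and a reflecting surface is "detected" only while it lies
# inside the depth range where the two fields overlap.  All geometry here
# (baseline, tilts, half-angles) is illustrative, not taken from the patent.
import math

def overlap_depth_range(baseline, emit_tilt, emit_half, det_tilt, det_half):
    """Return (z_near, z_far) of the emitter/detector overlap along the axis.

    Angles are in radians, measured from the surface normal; the emitter and
    detector are tilted toward each other across the given baseline.
    """
    def crossing(angle_a, angle_b):
        denom = math.tan(angle_a) + math.tan(angle_b)
        return baseline / denom if denom > 0 else math.inf

    z_near = crossing(emit_tilt + emit_half, det_tilt + det_half)
    z_far = crossing(emit_tilt - emit_half, det_tilt - det_half)
    return z_near, z_far

def surface_in_detection_volume(z_surface, *field_params):
    z_near, z_far = overlap_depth_range(*field_params)
    return z_near <= z_surface <= z_far

# Example: 10 mm baseline, both fields tilted 20 degrees with 5-degree half-angles.
params = (10.0, math.radians(20), math.radians(5), math.radians(20), math.radians(5))
print(overlap_depth_range(*params))            # approximate depth window, in mm
print(surface_in_detection_volume(13.0, *params))
```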
Another method using structured light for determining the distance between two non-contacting objects is described in U.S. Pat. No. 4,687,325, entitled “Three Dimensional Range Camera,” to Corby, Jr. A pattern generator and projector on the first object produce a 1×N array of time/space-coded light rays whose intensities can be varied with time, and project P sequential presentations of different subsets of the light rays onto the second object, where P = 1 + log_b N, b is the number of brightness levels, and N is the number of rays. The light rays are projected onto the second object along a direction that is not coaxial with the optical axis of an imaging device. A linear sensor such as a line-scan camera images the points of light where rays are incident on the surface of the second object and generates one-dimensional scan signals that have ray-associated peaks at locations corresponding to the imaged light. A high-speed range processor analyzes the one-dimensional waveforms to uniquely identify all rays, determines depth from the displacement of the ray peaks from their zero-height reference-plane peaks, and provides output range data. To create two-dimensional range maps, a means such as a rotating mirror is provided for orthogonally sweeping the 1×N coded light rays by steps over a rectangular plane.
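Two quantities in the description above lend themselves to a short worked example: the number of sequential presentations, P = 1 + log_b N, and the conversion of a ray peak's displacement from its reference-plane position into depth. The Python sketch below assumes a simplified triangulation model (rays arriving at a fixed projection angle from the viewing axis, with a known pixel pitch); those particulars are illustrative assumptions rather than details of the Corby patent.

```python
# Minimal sketch of P = 1 + log_b(N) and of a simplified triangulation that
# converts a ray peak's displacement from its reference-plane position into
# height.  The triangulation model (projection angle, pixel pitch) is an
# assumption for illustration only.
import math

def num_presentations(num_rays, brightness_levels):
    """P = 1 + log_b(N), rounded up when N is not an exact power of b."""
    exponent, capacity = 0, 1
    while capacity < num_rays:
        capacity *= brightness_levels
        exponent += 1
    return 1 + exponent

def depth_from_displacement(displacement_px, pixel_pitch_mm, projection_angle_rad):
    """Height of a surface point above the zero-height reference plane.

    Assumes the imaging axis is normal to the reference plane and the rays
    arrive at `projection_angle_rad` from that normal, so a height change h
    shifts the imaged spot laterally by h * tan(angle).
    """
    lateral_shift_mm = displacement_px * pixel_pitch_mm
    return lateral_shift_mm / math.tan(projection_angle_rad)

print(num_presentations(num_rays=64, brightness_levels=2))   # -> 7 presentations
print(depth_from_displacement(12, 0.05, math.radians(30)))   # height in mm (about 1.04)
```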
While these structured-light methods work well for their intended purposes, they are not able to sense contact with another object. Force and pressure sensors, by contrast, can measure contact between two objects but cannot sense proximity unless there is actual contact. Moreover, proximity sensors and contact sensors are not necessarily well suited to sensing lateral displacement between the sensor on the first object and the target surface on the second object when the sensor-to-target separation is small.
What is needed is a method and system for sensing close proximity of one object to another, touch or contact between two objects, and lateral movement or “slip” between two objects in contact or nearly in contact with one another.
SUMMARY OF THE INVENTION
The invention is a sensing device and method that can determine the proximity and lateral movement of a second object relative to a first object by illuminating a surface of the second object to create light contrasts that correspond to unique structural or printed features of the surface of that second object. Sensors mounted with the illumination device on or in the first object then compare the light contrast data of the illuminated surface of the second object to some reference data to determine relative proximity and lateral movement information.
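As an illustration of how such a comparison against reference data could yield lateral-movement information, the following Python sketch matches a current image of the surface against a reference image over a small window of trial shifts and reports the shift that minimizes the difference. The brute-force sum-of-absolute-differences search and the NumPy-based implementation are assumptions made for clarity; the patent does not prescribe this particular computation.

```python
# Illustrative estimate of lateral movement from two images of the surface's
# light contrasts: try all shifts within a small window and keep the one
# whose overlapping regions differ the least.
import numpy as np

def estimate_shift(reference, current, max_shift=4):
    """Return the (row, col) displacement of the pattern in `current` vs `reference`."""
    best_shift, best_score = (0, 0), np.inf
    rows, cols = reference.shape
    for dr in range(-max_shift, max_shift + 1):
        for dc in range(-max_shift, max_shift + 1):
            # Compare the regions that overlap when `current` is the reference
            # pattern moved by (dr, dc) pixels.
            ref = reference[max(-dr, 0):rows + min(-dr, 0), max(-dc, 0):cols + min(-dc, 0)]
            cur = current[max(dr, 0):rows + min(dr, 0), max(dc, 0):cols + min(dc, 0)]
            score = np.mean(np.abs(ref.astype(float) - cur.astype(float)))
            if score < best_score:
                best_score, best_shift = score, (dr, dc)
    return best_shift

# Example: a synthetic contrast pattern whose content moves two pixels to the right.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(32, 32))
shifted = np.roll(frame, shift=2, axis=1)
print(estimate_shift(frame, shifted))  # -> (0, 2)
```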
Key components of the invention are illumination, imaging optics, and a sensor array. Illumination is necessary to create light contrasts on the second object of interest that are indicative of structural or printed features of the surface of that object. The imaging optics collect and focus the reflected light onto the sensor array, which integrates the collected power over sampling intervals of time and converts the integrated power samples into electrical signals.
Light contrasts can be used as landmarks for position determination since the surfaces of most objects have unique optical or structural features or printed features that interact with light to produce unique contrast patterns. To create light contrasts, conventional light sources such as light-emitting diodes may be used. There are different techniques for illuminating the object of interest that can be used to maximize contrast values. In some instances, often when a surface exhibits glossy or mirror-like properties, light should be delivered from a direction normal to the object of interest. In other instances, often when a surface exhibits diffuse light-scattering properties, light impacting the object of interest at an angle helps to create the best light contrasts.
An imaging lens or series of imaging lenses is used to image light contrasts exhibited at the surface of the second object onto the sensor array by collecting light that is reflected or scattered from a surface portion of the second object. The magnification of the imaging lens(es) is chosen to scale the feature sizes appropriately so that the sensor array can adequately resolve their uniqueness.
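A minimal example of this magnification choice follows, assuming a Nyquist-style criterion that the smallest surface feature of interest should span at least two sensor pixels; the feature size and pixel pitch used below are illustrative, not values from the patent.

```python
# Illustrative magnification calculation: scale the smallest surface feature
# so that it spans at least `pixels_per_feature` sensor pixels.  Numbers are
# assumptions for the sake of the example.
def required_magnification(feature_size_um, pixel_pitch_um, pixels_per_feature=2.0):
    """Magnification needed for one feature to cover `pixels_per_feature` pixels."""
    return (pixels_per_feature * pixel_pitch_um) / feature_size_um

# Example: 40 um surface features imaged onto a sensor with 60 um pixels.
print(required_magnification(feature_size_um=40.0, pixel_pitch_um=60.0))  # -> 3.0x
```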
In addition, telecentric lenses can be used so that small proximity changes near a contact point will not change the effective magnification of the lens. A variable iris may also be used with the imaging lens to enable varying the numerical aperture and thereby varying the depth-of-field over which contrasts remain high near the plane of contact. Similarly, a zoom lens may be used to enable accurate proximity measurements at much greater distances away from a plane of contact.
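The depth-of-field trade-off behind the variable iris can be sketched with a commonly used machine-vision approximation, DOF ≈ λ/NA² + p/(M·NA), where NA is the object-side numerical aperture, p the pixel pitch, and M the magnification. The formula and the numbers below are illustrative assumptions, not values taken from the patent.

```python
# Rough sketch of why a variable iris changes the depth-of-field over which
# surface contrasts stay sharp near the plane of contact.  The approximation
# DOF ~ lambda / NA^2 + pixel_pitch / (M * NA) and the numbers used here are
# illustrative assumptions, not values from the patent.
def approx_depth_of_field(wavelength_um, numerical_aperture, pixel_pitch_um, magnification):
    wave_term = wavelength_um / numerical_aperture**2
    geometric_term = pixel_pitch_um / (magnification * numerical_aperture)
    return wave_term + geometric_term  # in micrometres

# Stopping the iris down (smaller NA) extends the in-focus range near contact.
for na in (0.10, 0.05, 0.025):
    print(na, round(approx_depth_of_field(0.63, na, 60.0, 3.0), 1))
```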
The sensor is an array of discrete optically sensitive elements. The spacing of the elements affects the resolution of the images that the sensor array can produce at any fixed imaging magnification. The sensor array can be a charge-coupled device, an amorphous silicon photodiode array, or an array of active pixels.
The typical sensing device consists of an illumination source, imaging optics, and a sensor array, usually mounted or secured to a base in or on the first object; the base may also contain processing electronics and other structural features.
The imaging optics are attached such that the image plane is at the sensor array when the object plane is tangent to the contacting surfaces of the first and second objects. The sensor array creates an electronic image when its pixel sensing elements receive light imaged from the plane of contact. The illumination source is located such that light can impact the surface of a second object, the object of interest, at the desired angle, when the surface of that object of interest is in proximity to contact or actually in contact with the first object. The illumination source may be fi
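Because the optics are focused on the plane of contact, image contrast should rise as the surface of the second object approaches that plane, so a sharpness metric tracked over successive frames can serve as a coarse proximity cue. The Python sketch below uses a mean-squared-gradient measure as that metric; this particular measure is an assumption for illustration and is not prescribed by the patent.

```python
# Illustrative focus-based proximity cue: with the optics focused on the
# plane of contact, image sharpness increases as the surface nears that
# plane.  The mean-squared-gradient metric is an assumed choice.
import numpy as np

def sharpness(image):
    """Mean squared intensity gradient; larger values indicate better focus."""
    img = image.astype(float)
    gy, gx = np.gradient(img)
    return float(np.mean(gx**2 + gy**2))

def approaching_contact(previous_frame, current_frame, tolerance=1e-6):
    """True if the surface appears to be moving toward the focused contact plane."""
    return sharpness(current_frame) > sharpness(previous_frame) + tolerance
```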
