Classification: Optics: measuring and testing – Position or displacement – Position transverse to viewing axis
Type: Reexamination Certificate
Filed: 2001-01-17
Issued: 2004-08-10
Examiner: Russell Adams (Department: 2851)
Additional classes: C356S622000, C356S623000, C348S140000, C348S164000, C348S169000
Status: active
Patent number: 06775014
ABSTRACT:
DEFINITION OF TERMS
As used in the context of discussing the present invention and prior art which relates to the present invention, the term “target” refers to either an object or a person (as those terms are defined immediately below), or any portion of an object or a person. The term “object” refers to an inanimate object. The term “person” refers to a human being.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates generally to target tracking and gesture recognition. In particular, the present invention relates to a system and method for determining the location of a target relative to a projection screen in a room or small area for the purpose of altering images projected onto the projection screen, such as providing a pointer or highlighted area as controlled by the location of the target.
2. Description of the Related Art
One approach for determining the location of a target involves stereoscopic images, which are three-dimensional images based on two slightly different two-dimensional images. For example, U.S. Pat. No. 5,383,013, issued to Cox (hereinafter, “Cox”), discloses a method in which corresponding features in the left and right images of a stereoscopic view of a scene are identified. The disparity between those corresponding features in the left and right images is first determined. The disparity, together with the known separation distance of the pair of cameras, provides a measurement of the distance of the target from the cameras. One disadvantage of the approach disclosed in Cox is that two cameras must be used, which adds significantly to the cost and complexity of the system. Moreover, Cox is computationally intensive, as image features must be cross-correlated in order to determine the disparity.
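For reference, the standard disparity-to-depth relation that stereoscopic approaches of this kind rely on is Z = f·B/d, where f is the focal length (in pixels), B the separation of the cameras, and d the measured disparity. The following is a minimal Python sketch of this general relation, not Cox's specific algorithm; all numeric values are assumed example figures.

# Illustrative pinhole-stereo depth relation; all numbers are assumed
# example values, not figures from Cox.

def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    # Distance of a scene point from the camera pair, given the pixel
    # disparity between its projections in the left and right images.
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# Example: 800 px focal length, cameras 0.12 m apart, 16 px disparity.
print(depth_from_disparity(800.0, 0.12, 16.0))  # prints 6.0 (metres)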
Another approach for determining the location of a person is disclosed in U.S. Pat. No. 5,465,144, issued to Parker et al. (hereinafter, “Parker”). Parker discloses a method for tracking a person with a camera in which the person wears an infrared beacon and the beacon is tracked. In addition to the drawback of requiring the person to wear an active device, this system has problems when the person is not facing the camera, since the worn beacon can be hidden from the camera's view.
Another approach for determining the location of a target is presented in Leibe et al., “The Perceptive Workbench: Towards Spontaneous and Natural Interaction in Semi-Immersive Virtual Environments,” December 2000, found at http://citeseer.nj.nec.com/265507.html (hereinafter “Leibe”). Leibe discloses a system in which multiple near-infrared light sources are arranged below a desk. A camera with a filter that blocks out all visible light is also located below the desk. The underside of the desk is illuminated by the near-infrared light sources. Everything close to the desk surface reflects this light and can be seen by the camera under the display surface. Using a combination of intensity thresholding and background subtraction, interesting regions of the camera image are extracted and analyzed. One disadvantage of the approach disclosed in Leibe is that only the location of targets that are close to the desk surface can be determined.
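For readers unfamiliar with the extraction step described above, the following is a minimal Python sketch of generic intensity thresholding combined with background subtraction; the threshold value and array types are assumptions for illustration and do not represent Leibe's actual implementation.

import numpy as np

def extract_foreground(frame, background, threshold=30):
    # Binary mask of pixels whose intensity differs from the stored
    # background image by more than `threshold` levels. `frame` and
    # `background` are 2-D uint8 arrays from the infrared camera; the
    # threshold of 30 is an assumed value, not Leibe's.
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

# Usage: capture one frame of the empty scene as `background`, then apply
# the mask to each new frame to isolate objects near the desk surface.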
Leibe also discloses a system and method for determining the location of part of a person's arm. Light sources are arranged above the desk, illuminating the desk surface. The camera with the infrared filter is still located beneath the desk. A person stands in front of the desk and moves her/his arm over the desk, casting a shadow on the desk surface. The camera sees all the near-infrared light from the light sources, except the region that is blocked by the person's arm. Leibe then uses intensity thresholding and background subtraction to distinguish the person's arm from the background in the images recorded by the camera. One disadvantage of the approach disclosed in Leibe is that it is assumed that the arm's shadow always touches the image border. Hence, the middle of the area where the arm's shadow touches the image border is treated as an approximation for the origin of the arm, and the point that is farthest away from the shoulder is treated as the fingertip. Leibe is limited to the situation where only part of a person's arm needs to be tracked.
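A rough Python sketch of the border-contact heuristic described above is given below; it assumes a binary shadow mask produced by a step like the previous sketch, and the scanning logic is an illustration of the idea rather than Leibe's code.

import numpy as np

def arm_origin_and_fingertip(mask):
    # `mask` is a 2-D boolean array marking the arm's shadow. The shadow
    # is assumed to touch the image border; the centroid of the border
    # contact approximates the arm origin, and the shadow pixel farthest
    # from that origin approximates the fingertip.
    h, w = mask.shape
    ys, xs = np.nonzero(mask)
    on_border = (ys == 0) | (ys == h - 1) | (xs == 0) | (xs == w - 1)
    if not on_border.any():
        return None, None  # the border-contact assumption is violated
    origin_y = ys[on_border].mean()
    origin_x = xs[on_border].mean()
    d2 = (ys - origin_y) ** 2 + (xs - origin_x) ** 2
    i = int(np.argmax(d2))
    return (origin_y, origin_x), (int(ys[i]), int(xs[i]))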
SUMMARY OF THE INVENTION
In accordance with the present invention, a robust system and method for determining the location of a target in a room or small area using inexpensive infrared technology is provided. In the system two light sources are arranged to illuminate a projection surface. A target is located between the light sources and the projection surface. The light sources shining on the target cast shadows on the projection surface, one shadow for each light source. An imaging device placed behind the projection surface detects the shadows.
In one embodiment of the present invention, the light sources are infrared light sources, and the imaging device is sensitive to infrared light but insensitive to visible light. The light sources in one embodiment are distinguished from one another by illuminating the projection surface during alternate frames. The alternation frequency may be adjusted so that each source illuminates the surface for more than one frame at a time. Alternatively, the sources may be distinguished by intensity or polarization rather than by being switched on alternately.
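As a simple illustration of the alternate-frame scheme, the frames captured by the imaging device could be demultiplexed by parity; the even/odd convention below is an assumption for illustration only.

def split_by_source(frames):
    # Illustrative demultiplexing of the alternating-illumination scheme:
    # even-numbered frames are assumed to be lit by the first source and
    # odd-numbered frames by the second (the parity convention is an
    # assumption). Returns the two per-source frame sequences.
    first = frames[0::2]
    second = frames[1::2]
    return first, second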
The target's location can be inferred from the size and the location of the different shadows cast by the individual light sources. One example of determining the target's location is determining the distance between the target and the projection surface. Another example is determining the target's height. Where the target is a person, a further example is determining the center of the person's head or the location of the person's extremities, such as the distance of the person's fingertip from the screen.
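One illustrative way such an inference could be carried out, reduced to two dimensions for clarity, is to intersect the line from each light source to its shadow: the target must lie on both lines. The coordinate convention and the 2-D simplification below are assumptions for illustration and are not necessarily the computation used by the invention.

def locate_target_2d(light1, shadow1_y, light2, shadow2_y):
    # Illustrative 2-D triangulation in a vertical plane. The projection
    # surface is the line x = 0; `light1` and `light2` are the (x, y)
    # positions of the two light sources, and `shadow1_y` / `shadow2_y`
    # are the heights at which each casts the target's shadow on the
    # surface. The target lies on the line from each source to its
    # shadow, so its position is the intersection of the two lines.
    (x1, y1), (x2, y2) = light1, light2
    d1 = (0.0 - x1, shadow1_y - y1)   # direction: source 1 -> its shadow
    d2 = (0.0 - x2, shadow2_y - y2)   # direction: source 2 -> its shadow
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("parallel sight lines: no parallax between sources")
    t = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
    x = x1 + t * d1[0]                # distance of the target from the surface
    y = y1 + t * d1[1]                # height of the target
    return x, y

# Example geometry: sources at (2, 2) and (2, 0) casting shadows at heights
# 0 and 2 place the target 1 unit from the surface at a height of 1 unit.
print(locate_target_2d((2.0, 2.0), 0.0, (2.0, 0.0), 2.0))  # (1.0, 1.0)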
The location of the target can be used to alter what appears on the projection surface. For example, a person's finger location can be used to move a cursor or pointer, or to indicate an area that should be highlighted. The distance of a person's finger from the screen might be used to trigger a function similar to a mouse button click.
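As a concrete illustration of the last sentence, a simple distance threshold could emulate a button press; the 5 cm value and the function shape below are arbitrary assumptions, not parameters from the patent.

CLICK_DISTANCE_M = 0.05  # assumed threshold (5 cm); not specified by the patent

def update_pointer(finger_xy, finger_distance_m, was_pressed):
    # Map the fingertip's on-screen position to the cursor position and
    # treat a fingertip closer to the screen than the threshold as a
    # pressed button; a press edge is reported as a single click.
    pressed = finger_distance_m < CLICK_DISTANCE_M
    click = pressed and not was_pressed
    return finger_xy, pressed, click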
REFERENCES:
patent: 4468694 (1984-08-01), Edgar
patent: 5023709 (1991-06-01), Kita et al.
patent: 5045843 (1991-09-01), Hansen
patent: 5239373 (1993-08-01), Tang et al.
patent: 5383013 (1995-01-01), Cox
patent: 5434617 (1995-07-01), Bianchi
patent: 5465144 (1995-11-01), Parker et al.
patent: 5572251 (1996-11-01), Ogawa
patent: 6339748 (2002-01-01), Hiramatsu
patent: 6341016 (2002-01-01), Malione
patent: 6359612 (2002-03-01), Peter et al.
Leibe et al.; The Perceptive Workbench: Towards Spontaneous and Natural Interaction in Semi-Immersive Virtual Environments; http://citeseer.nj.nec.com/265507.html; Mar. 2000; IEEE Virtual Reality Conference (as provided by the applicant).*
Shanon X. Ju, Michael J. Black, Scott Minneman, Don Kimber; Analysis of Gesture and Action in Technical Talks for Video Indexing; 1997 IEEE Computer Society Conference on Computer Vision and Pattern Recognition; San Juan, Puerto Rico; pp. 595-601.
John C. Tang, Scott L. Minneman; Videowhiteboard: Video Shadows to Support Remote Collaboration; Association for Computing Machinery; 1991; pp. 315-322.
Scott Elrod, Richard Bruce, Rich Gold, David Goldberg, Frank Halasz, William Janssen, David Lee, Kim McCall, Elin Pedersen, Ken Pier, John Tang and Brent Welch; Liveboard: A Large Interactive Display Supporting Group Meetings, Presentations and Remote Collaboration; Association for Computing Machinery; May 1992; pp. 599-607.
Bastian Leibe, Thad Starner, William Ribarsky, Zachary Wartell, David Krum, Brad Singletary and Larry Hodges; The Perceptive Workbench: Towards Spontaneous and Natural Interaction in Semi-Immersive Virtual Environments; http://citeseer.nj.nec.com/265507.html; IEEE Virtual Reality Conference, New Brunswick, N.J.; Mar. 2000.
Darrell, T.; Gordon, G.; Harville, M. and Woodfill, J., “Integrated Person Tracking Using Ster
Inventors: Jonathan Foote; Don Kimber
Examiner: Russell Adams
Agent: Fliesler Meyer LLP
Assignee: Fuji Xerox Co., Ltd.
Andrew Sever