Method of recognizing and tracking a spatial point

Data processing: measuring, calibrating, or testing – Measurement system – Orientation or position

Reexamination Certificate

Details

C356S614000, C382S103000

active

07739074

ABSTRACT:
The present invention relates to a method of recognizing and tracking a spatial point, and more particularly to a method that uses a point light source and a spatial point recognition device to measure the coordinates of the point light source and the coordinates of the device's convergent point, based on the parallax principle of human eyes, so as to recognize the position of a spatial point. Further, the spatial point recognition device is capable of moving the convergent point so that the coordinates of the convergent point are superimposed onto the coordinates of the point light source, so as to track the spatial point automatically. The device can also receive the coordinates of a new convergent point and reset the position of the convergent point accordingly.
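The binocular-parallax principle the abstract invokes can be illustrated with a standard stereo triangulation sketch. This is not the patented method; the function name, the rectified two-sensor geometry, and the parameters (baseline `b`, focal length `f`) are assumptions made for illustration only.

```python
# Illustrative sketch of depth-from-parallax (classical stereo triangulation),
# the geometric principle the abstract attributes to human eyes. Two
# horizontally separated, axis-aligned sensors with baseline b and focal
# length f observe the same point light source at image x-coordinates
# xl and xr (same row y). The disparity d = xl - xr gives depth, and the
# lateral/vertical coordinates follow by similar triangles.

def triangulate(xl, xr, y, f, b):
    """Return (X, Y, Z) of a point seen at (xl, y) and (xr, y),
    with the origin midway between the two sensors."""
    d = xl - xr                  # disparity (parallax) in image units
    if d <= 0:
        raise ValueError("point must lie in front of both sensors")
    Z = f * b / d                # depth from parallax
    X = xl * Z / f - b / 2       # lateral offset, referenced to the midpoint
    Y = y * Z / f                # vertical offset
    return X, Y, Z
```

For example, with f = 1.0 and b = 0.1, a point on the optical midline at depth 2.0 projects to xl = 0.025 and xr = -0.025, and `triangulate(0.025, -0.025, 0.0, 1.0, 0.1)` recovers (0.0, 0.0, 2.0). Tracking, in the abstract's sense, would then amount to steering the convergent point toward the recovered coordinates.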

REFERENCES:
patent: 2008/0259355 (2008-10-01), Lin

Profile ID: LFUS-PAI-O-4152911
