Method for the perspective display of a part of a topographic map



Details

Classification: 364/443, 340/729, 340/995, G06F 15/50
Type: Patent
Status: active
Patent number: 051618869

ABSTRACT:
A method and device for the perspective display of a portion of topographic coordinate information of points on the surface of the earth proximate to the current position of a vehicle travelling thereon, utilizes a coordinate transformation to generate a perspective image from an apparent point of view above and behind the vehicle in a viewing direction, which, with the direction of travel of the vehicle, defines an imaginary plane which is perpendicular to the surface of the earth.
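
The abstract describes a coordinate transformation that projects nearby terrain points into a perspective image as seen from a viewpoint above and behind the vehicle, looking along its direction of travel so that the viewing direction and travel direction lie in a common vertical plane. The Python sketch below illustrates one way such a projection can be set up; it is not the implementation claimed in the patent, and the viewpoint setback, height, and focal length used here are illustrative assumptions.

```python
# A minimal sketch (not the patent's actual implementation) of the kind of
# transformation the abstract describes: terrain points near the vehicle are
# projected onto an image plane as seen from an eye point above and behind
# the vehicle, looking along the direction of travel. The setback, height,
# and focal length are illustrative assumptions.
import numpy as np

def perspective_view(points_enu, vehicle_pos, heading_rad,
                     eye_back=50.0, eye_up=30.0, focal=1.0):
    """Project terrain points (east, north, up) to 2-D image coordinates.

    points_enu : (N, 3) array of ground points near the vehicle
    vehicle_pos: (3,) current vehicle position in the same frame
    heading_rad: direction of travel, measured from east toward north
    """
    # Unit vector along the direction of travel (horizontal plane).
    forward = np.array([np.cos(heading_rad), np.sin(heading_rad), 0.0])
    up = np.array([0.0, 0.0, 1.0])
    right = np.cross(forward, up)

    # Eye point above and behind the vehicle; aiming at a point ahead of it
    # keeps the viewing direction in the vertical plane spanned by the
    # travel direction and the vertical, as the abstract requires.
    eye = vehicle_pos - eye_back * forward + eye_up * up
    look_at = vehicle_pos + eye_back * forward
    view_dir = look_at - eye
    view_dir /= np.linalg.norm(view_dir)

    # Orthonormal camera basis: right, camera-up, viewing direction.
    cam_up = np.cross(right, view_dir)
    cam_up /= np.linalg.norm(cam_up)
    basis = np.stack([right, cam_up, view_dir])  # rows = camera axes

    # Transform points into camera coordinates, then apply the pinhole
    # perspective divide; points behind the eye are discarded.
    cam = (points_enu - eye) @ basis.T
    in_front = cam[:, 2] > 1e-6
    x = focal * cam[in_front, 0] / cam[in_front, 2]
    y = focal * cam[in_front, 1] / cam[in_front, 2]
    return np.column_stack([x, y])
```

Points farther ahead of the vehicle map closer to the horizon line of the image, which is what gives the displayed map its perspective, "bird's-eye" appearance.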

REFERENCES:
patent: 4489389 (1984-12-01), Beckwith et al.
patent: 4737916 (1988-04-01), Ogawa et al.
patent: 4744033 (1988-05-01), Ogawa et al.
patent: 4796189 (1989-01-01), Nakayama et al.
patent: 4896154 (1990-01-01), Factor et al.
patent: 4903211 (1990-02-01), Ando
patent: 4937570 (1990-06-01), Matsukawa et al.
patent: 4951213 (1990-08-01), Baxter et al.
"IRIS Series 2000: A Technical Overview", Silicon Graphics Computer Systems, date unknown.
"Fast Forward Engineering Workstations: The Iris Series 2000", Silicon Graphics Computer Systems, date unknown.
