Method and apparatus for identifying features of an image on a video display

Communications: electrical – Land vehicle alarms or indicators – Internal alarm or indicator responsive to a condition of the...


DETAILS:
Classifications: 340/703, 340/709, 340/799; G09G 1/16
Type: Patent
Status: active
Patent number: 4847604

ABSTRACT:
A computer graphic interface allows a user to obtain descriptive information about a feature of a displayed image by pointing to the location of the feature. Conversely, the user may enter descriptive textual information, and the locations of responsive features are indicated. Data processing and memory storage requirements are minimized by encoding information about the image as a pixel bit map together with a color map, in which the color map addresses or indices are correlated with addresses or pointers to strings of descriptive information. Each color map address corresponds to a predefined set of features and descriptive information about those features. Since the pixel bit map defines a color map address for each location on the image, suitable programming of the color map can ensure proper correlation of descriptive information with corresponding locations on the image. The correlation between color map addresses and the descriptive information about the features is represented most compactly by arranging or sorting the entries in the color map so that each predefined feature corresponds to a continuous range of color map addresses. Therefore, for a specified color map address, the corresponding set of features and their pointers can be found by comparing the specified color map address to the limits of the color map address ranges for the various features.
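The address-range lookup described above can be illustrated with a short sketch. This is a minimal illustration, not code from the patent: the tiny image, the feature names, and all identifiers below (PIXEL_MAP, FEATURE_RANGES, features_at, locate) are assumptions chosen for the example. Each pixel stores a color map address; the color map entries are sorted so that each feature owns a contiguous address range, and a point query reduces to comparing the pixel's address against the range limits.

    # Illustrative sketch of the color-map address-range scheme.
    # All names and data here are assumptions, not from the patent.
    from bisect import bisect_right

    # Pixel bit map: each pixel stores a color-map address (index).
    # A tiny 4x4 image whose pixels reference addresses 0-5.
    PIXEL_MAP = [
        [0, 0, 1, 1],
        [0, 2, 3, 1],
        [4, 2, 3, 5],
        [4, 4, 5, 5],
    ]

    # Color-map entries sorted so each predefined feature corresponds
    # to a contiguous range of addresses: (start, end_exclusive, text).
    FEATURE_RANGES = [
        (0, 2, "sky"),
        (2, 4, "building"),
        (4, 6, "road"),
    ]

    # Precomputed range starts, used as the limits to compare against.
    _STARTS = [start for start, _, _ in FEATURE_RANGES]

    def features_at(x, y):
        """Return the descriptive text for the feature at pixel (x, y)
        by comparing its color-map address to the range limits."""
        address = PIXEL_MAP[y][x]
        i = bisect_right(_STARTS, address) - 1
        start, end, text = FEATURE_RANGES[i]
        return text if start <= address < end else None

    def locate(text):
        """Inverse query: given descriptive text, return the pixel
        locations whose addresses fall in that feature's range."""
        for start, end, desc in FEATURE_RANGES:
            if desc == text:
                return [(x, y)
                        for y, row in enumerate(PIXEL_MAP)
                        for x, address in enumerate(row)
                        if start <= address < end]
        return []

    print(features_at(1, 1))  # -> 'building'
    print(locate("road"))     # -> [(0, 2), (3, 2), (0, 3), (1, 3), (2, 3), (3, 3)]

Because each feature maps to one contiguous range, the correlation costs only a pair of limits per feature rather than a separate pointer for every color map entry, which is the storage saving the abstract claims.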

REFERENCES:
patent: 4074254 (1978-02-01), Belser et al.
patent: 4200867 (1980-04-01), Hill
patent: 4203107 (1980-05-01), Lovercheck
patent: 4249172 (1981-02-01), Watkins et al.
patent: 4303912 (1981-12-01), Stafford et al.
patent: 4395707 (1983-07-01), Satrapa
patent: 4414636 (1983-11-01), Ueda et al.
patent: 4439759 (1984-03-01), Fleming et al.
patent: 4441104 (1984-04-01), Finney, II
patent: 4451824 (1984-05-01), Thayer et al.
patent: 4471465 (1984-09-01), Mayer et al.
patent: 4481529 (1984-11-01), Kerling
patent: 4484187 (1984-11-01), Brown et al.
patent: 4488245 (1984-12-01), Dalke et al.
patent: 4517654 (1985-05-01), Carmean
patent: 4520454 (1985-05-01), Dufour et al.
patent: 4521014 (1985-06-01), Sitrick
patent: 4524421 (1985-06-01), Searby et al.
patent: 4570217 (1986-02-01), Allen et al.
patent: 4574277 (1986-03-01), Krause et al.
patent: 4580134 (1986-04-01), Campbell et al.
patent: 4583186 (1986-04-01), Davis et al.
patent: 4586036 (1986-04-01), Thomason et al.
patent: 4600918 (1986-07-01), Belisomi et al.
patent: 4616220 (1986-10-01), Grunewald et al.
patent: 4620289 (1986-10-01), Chauvel
patent: 4648028 (1987-03-01), DeKlotz et al.
patent: 4648046 (1987-03-01), Copenhaver et al.
patent: 4648050 (1987-03-01), Yamagami
patent: 4673930 (1987-06-01), Bujalski et al.
patent: 4675666 (1987-06-01), Peterson
patent: 4710806 (1987-12-01), Iwai et al.
Alan Borning, "ThingLab: A Constraint-Oriented Simulation Laboratory," Stanford Report STAN-CS-79-746, ch. 2, pp. 14-37 (1979).
Steve Ciarcia, "High-Resolution Sprite-Oriented Color Graphics," Byte, pp. 57-70, 72, 76, 78, 80 (Aug. 1982).

Profile ID: LFUS-PAI-O-439553
