Method and system for processing and rendering object oriented n

Image analysis – Image segmentation

Patent


Details

Classification: 382/176, 358/540, 358/462; G06K 9/34
Type: Patent
Status: active
Number: 059664622

ABSTRACT:
A system and method processes object oriented image data by first parsing it into non-neutral image data and neutral image data. A second parser circuit then parses the neutral image data into black, grey, and white image data, which a neutral processing circuit processes. The system also classifies object oriented image data to be rendered by an object oriented rendering system: the image data to be rendered is received, classified, and assigned one of a plurality of possible first level object types, and is then assigned one of a plurality of possible second level object types related to the first level object type already assigned.
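The two-stage neutral parse and the two-level object classification described in the abstract can be sketched as follows. This is a hypothetical illustration only: the chroma/luminance model, all thresholds, and the object-type names are assumptions for the sake of the example, not details taken from the patent.

```python
# Hypothetical sketch of the two-stage neutral parse: a first parser
# separates neutral from non-neutral data, and a second parser splits
# the neutral data into black, grey, and white. Thresholds are invented.

def parse_pixel(r, g, b, chroma_max=12, black_max=40, white_min=215):
    """Classify one RGB pixel as non-neutral, black, grey, or white."""
    chroma = max(r, g, b) - min(r, g, b)
    # First parser: only low-chroma pixels are treated as neutral.
    if chroma > chroma_max:
        return "non-neutral"
    # Second parser: split neutral pixels by luminance.
    luminance = (r + g + b) // 3
    if luminance <= black_max:
        return "black"
    if luminance >= white_min:
        return "white"
    return "grey"

# Hypothetical two-level object taxonomy: each first level type has its
# own set of related second level types (names invented for illustration).
SECOND_LEVEL = {
    "text": ["black text", "colour text"],
    "graphics": ["line art", "fill"],
    "image": ["contone", "halftone"],
}

def assign_types(first_level, index=0):
    """Assign a first level type and a related second level type."""
    return first_level, SECOND_LEVEL[first_level][index]
```

For example, `parse_pixel(0, 0, 0)` yields `"black"`, while a saturated red pixel such as `parse_pixel(200, 30, 30)` yields `"non-neutral"` because its chroma exceeds the neutral threshold.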

REFERENCES:
patent: 5767978 (1998-06-01), Revankar et al.
patent: 5850474 (1998-12-01), Fan et al.


Profile ID: LFUS-PAI-O-659972
