Image type classification using color discreteness features

Image analysis – Pattern recognition – Classification

Reexamination Certificate


Details

Classification: C382S190000

Type: Reexamination Certificate

Status: active

Patent number: 06996277

ABSTRACT:
A method and system for classifying images as either natural pictures or synthetic graphics is provided. In embodiments of the invention, a picture/graphic classification method and system uses one-dimensional, two-dimensional, or three-dimensional color discreteness features to classify an image as a natural picture or a synthetic graphic. In another embodiment of the invention, a picture/graphic combination classification method and system uses one-dimensional, two-dimensional, and/or three-dimensional color discreteness features in combination to classify images between the natural picture and synthetic graphic classes.
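The patent text above does not include code, but the idea behind a one-dimensional color discreteness feature can be sketched as the total variation of a smoothed, normalized color-channel histogram: synthetic graphics tend to use few distinct colors and so produce spiky histograms (high discreteness), while natural pictures produce smoother histograms (low discreteness). The following is a minimal illustrative sketch; the function name, parameters, and smoothing choice are assumptions, not the patent's actual implementation.

```python
import numpy as np

def color_discreteness_1d(channel, bins=256, smooth=3):
    """Hypothetical sketch of a 1-D color discreteness feature.

    Builds a normalized histogram of one color channel, applies a
    simple moving-average smoothing, and returns the sum of absolute
    differences between adjacent bins (total variation). Higher
    values suggest a spikier, more "graphic-like" color distribution.
    """
    hist, _ = np.histogram(channel, bins=bins, range=(0, bins))
    hist = hist.astype(float) / max(hist.sum(), 1)  # normalize to a distribution
    if smooth > 1:
        kernel = np.ones(smooth) / smooth           # moving-average kernel
        hist = np.convolve(hist, kernel, mode="same")
    return float(np.abs(np.diff(hist)).sum())

# Demo: a channel with few distinct values (graphic-like) vs. a
# smoothly distributed channel (picture-like).
rng = np.random.default_rng(0)
graphic = np.repeat([10, 200, 10, 200], 2500)        # only two distinct values
picture = np.clip(rng.normal(128, 40, 10000), 0, 255)
print(color_discreteness_1d(graphic), color_discreteness_1d(picture))
```

A classifier along the lines of the abstract would compute such features per channel (and analogously over 2-D or 3-D color histograms) and threshold or combine them to separate the picture and graphic classes.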

REFERENCES:
patent: 4685143 (1987-08-01), Choate
patent: 5063604 (1991-11-01), Weiman
patent: 5101440 (1992-03-01), Watanabe et al.
patent: 5264946 (1993-11-01), Takakura et al.
patent: 5309228 (1994-05-01), Nakamura
patent: 5311336 (1994-05-01), Kurita et al.
patent: 5416890 (1995-05-01), Beretta
patent: 5629989 (1997-05-01), Osada
patent: 5640492 (1997-06-01), Cortes et al.
patent: 5767978 (1998-06-01), Revankar et al.
patent: 5778156 (1998-07-01), Schweid et al.
patent: 5867593 (1999-02-01), Fukuda et al.
patent: 5917963 (1999-06-01), Miyake
patent: 6151410 (2000-11-01), Kuwata et al.
patent: 6351558 (2002-02-01), Kuwata
patent: 6430222 (2002-08-01), Okada
patent: 6647131 (2003-11-01), Bradski
patent: 6766053 (2004-07-01), Fan et al.
patent: 6771813 (2004-08-01), Katsuyama
patent: 6888962 (2005-05-01), Sonoda et al.
patent: 2001/0052971 (2001-12-01), Tsuchiya et al.
patent: 2002/0031268 (2002-03-01), Prabhakar et al.
patent: 2002/0067857 (2002-06-01), Hartman et al.
patent: 2002/0131495 (2002-09-01), Prakash et al.
patent: 2002/0146173 (2002-10-01), Herley
patent: 2003/0063803 (2003-04-01), Lin et al.
patent: 11-055540 (1999-02-01), None
patent: 11-066301 (1999-03-01), None
Arrowsmith et al., Hybrid Neural Network System for Texture Analysis, 7th Int. Conf. on Image Processing and Its Applications, vol. 1, Jul. 13, 1999, pp. 339-343.
Athitsos et al., Distinguishing Photographs and Graphics on the World Wide Web, Proc. IEEE Workshop on Content-Based Access of Image and Video Libraries, Jun. 20, 1997, pp. 10-17.
Berry et al., A Comparative Study of Matrix Measures for Maximum Likelihood Texture Classification, IEEE Trans. on Systems, Man and Cybernetics, vol. 21, No. 1, Jan. 1991, pp. 252-261.
Lee et al., Texture Image Segmentation Using Structural Artificial Neural Network, SPIE vol. 3185, pp. 58-65.
Mogi, A Hybrid Compression Method based on Region Segmentation for Synthetic and Natural Compound Images, IEEE 0-7803-5467-2, pp. 777-781.
Shafarenko et al., Histogram Based Segmentation in a Perceptually Uniform Color Space, IEEE 1057-7149/98, pp. 1354-1358.
Schettini et al., Color Image Classification Using Tree Classifier, ITIM, IAMI, The Seventh Imaging Conference, Color Science, Systems, and Applications, Nov. 1999, pp. 269-272.
