Analyzing an image showing editing marks to obtain category of e

Image analysis: editing, error checking, or correction


Details

382/276; 395/140; G06T 3/00

Patent

active

056596391

ABSTRACT:
Input image data define an input image set that shows a graphical feature and editing marks indicating an editing operation to be performed on the graphical feature. The input image data are used to obtain operation category data indicating whether the editing operation would translate the graphical feature so that it is centered at a different position within the input image set. The operation category data are used to obtain output image data defining an output image that includes an edited version of the input image set. The output image shows the graphical feature centered at a different position only if the operation category data so indicate. The input image set can include an original image showing the graphical feature and an overlay image showing the editing marks. The editing marks can form a node-link structure with the graphical feature. If the structure is a directed graph, it can indicate an editing operation that would translate the graphical feature to be centered at a different position, such as a simple translation to a new position, a translation with scaling or rotation, or a replacement operation. If the structure is an undirected graph, it can indicate an editing operation that would not translate the graphical feature, such as a delete operation or a scale or rotate operation. A rectangle with a dot inside it can indicate scaling or rotation. A cross can indicate deletion.
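The abstract describes a two-level categorization: first, whether the editing marks form a directed or an undirected node-link structure (directed means the operation translates the graphical feature to a new center; undirected means it does not), and second, for undirected structures, which mark shape selects which in-place operation. The sketch below is a hypothetical illustration of that decision logic only, not the patented implementation; the class names, mark labels, and category names are all invented for this example.

```python
from dataclasses import dataclass
from enum import Enum, auto

class OpCategory(Enum):
    TRANSLATE = auto()      # feature is re-centered at a different position
    NO_TRANSLATE = auto()   # feature stays centered where it is

@dataclass
class NodeLinkStructure:
    # True if the marks and feature form a directed graph (e.g. an arrow
    # from the feature to a destination); False for an undirected graph.
    directed: bool

def categorize(structure: NodeLinkStructure) -> OpCategory:
    # Per the abstract: a directed graph indicates an operation that would
    # translate the feature (simple translation, translation with scaling
    # or rotation, or replacement); an undirected graph indicates one that
    # would not (delete, scale, or rotate in place).
    return OpCategory.TRANSLATE if structure.directed else OpCategory.NO_TRANSLATE

def classify_undirected_mark(mark: str) -> str:
    # Within the non-translating category, the abstract gives two examples:
    # a rectangle with a dot inside indicates scaling or rotation, and a
    # cross indicates deletion. The string labels here are illustrative.
    if mark == "rectangle_with_dot":
        return "scale_or_rotate"
    if mark == "cross":
        return "delete"
    raise ValueError(f"unrecognized editing mark: {mark}")
```

For example, an arrow drawn from a shape to an empty region would be a directed structure and categorize as `TRANSLATE`, while a cross drawn over the shape would be undirected and classify as a delete.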

REFERENCES:
patent: 4475239 (1984-10-01), van Raamsdonk
patent: 4542378 (1985-09-01), Suganuma et al.
patent: 5012521 (1991-04-01), Endo et al.
patent: 5123057 (1992-06-01), Verly et al.
patent: 5134669 (1992-07-01), Keogh et al.
patent: 5251290 (1993-10-01), Pabon
patent: 5270806 (1993-12-01), Venable et al.
patent: 5274468 (1993-12-01), Ojha
patent: 5287439 (1994-02-01), Koga et al.
patent: 5337161 (1994-08-01), Hube
patent: 5345543 (1994-09-01), Capps et al.
patent: 5363211 (1994-11-01), Hasebe et al.
patent: 5404439 (1995-04-01), Moran et al.
patent: 5438430 (1995-08-01), Mackinlay et al.
patent: 5455898 (1995-10-01), Mahoney et al.
patent: 5465167 (1995-11-01), Cooper et al.
patent: 5490246 (1996-02-01), Brotsky et al.
patent: 5513271 (1996-04-01), Rao et al.
patent: 5522022 (1996-05-01), Rao et al.
patent: 5537491 (1996-07-01), Mahoney et al.
U.S. application No. 08/039,553 entitled "Editing Text in an Image" to Bagley et al., filed Mar. 29, 1993 (to be issued on Aug. 20, 1996 as U.S. Pat. No. 5,548,700).
U.S. application No. 08/158,063 entitled "Using a Category to Analyze an Image Showing a Graphical Representation" to Mahoney et al., filed Nov. 24, 1993 (issued on Aug. 6, 1996 as U.S. Pat. No. 5,544,267; patent copy not yet available).
U.S. application No. 07/933,422 entitled "Automatically Changing Text Characteristics by Repositioning Word Images" to Stuart K. Card, filed Aug. 21, 1992.
U.S. application No. 07/933,426 entitled "Automatic Image Creation by Merging Text Image and Form Image" to Robertson, filed Aug. 21, 1992.
U.S. application No. 08/157,790 entitled "Using an Image Showing a Perimeter Relationship Representation to Obtain Data Indicating a Relationship Among Distinctions" (as amended) to Mahoney, filed Nov. 24, 1993.
U.S. application No. 08/543,232 entitled "Data Access Based on Human-Produced Images" to Johnson, filed Oct. 13, 1995.
U.S. application No. 08/394,919 entitled "Generalized Wiping as a User Interface for Object-Based Graphical Display" to Thomas P. Moran, filed Feb. 27, 1995.
U.S. application No. 08/503,746 entitled "Analyzing an Image Showing a Parallel Length Graph" to Rao et al., filed Jul. 18, 1995.
Fukada, Y., "Primary Algorithm for the Understanding of Logic Circuit Diagrams," Proceedings of the 6th International Conference on Pattern Recognition, Munich, West Germany, Oct. 19-22, 1982, New York: IEEE, 1982, pp. 706-709.
Nagura, M., and Suenaga, Y., "A Facsimile-Based Graphics Editing System by Auxiliary Mark Recognition," IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE, 1983, pp. 433-441.
Communication dated Oct. 5, 1995 and European Search Report, Application No. EP 94 30 8666.
Suenaga, Y., "Some Techniques for Document and Image Preparation," Systems and Computers in Japan, vol. 17, No. 3, 1986, pp. 35-46.
Parvin, B., et al., "A Dynamic System for Object Description and Correspondence," Computer Vision and Pattern Recognition, 1991.
Suenaga, Y., "A Facsimile Based Text Editor Using Handwritten Mark Recognition," IJCAI-79, Proceedings of the Sixth International Joint Conference on Artificial Intelligence, Tokyo, Aug. 20-23, 1979, vol. 2, pp. 856-858.
Suenaga, Y., and Nagura, M., "A Facsimile Based Manuscript Layout and Editing System by Auxiliary Mark Recognition," 5th International Conference on Pattern Recognition, vol. 2, IEEE, 1980, pp. 856-858.
Ricoh Imagio MF530 Series General Catalog, Ricoh K. K., 1992, with English translation of cover page and pp. 1-2 and 23-24.
The Apple Catalog, Fall 1993, pp. 1 and 4-5.
Helm, R., Marriott, K., and Odersky, M., "Building Visual Language Parsers," in Proceedings of CHI '91 (New Orleans, Louisiana, Apr. 29-May 2, 1991), ACM, New York, 1991, pp. 105-112.
