Method of acoustically expressing image information

Patent


Details

395/336, 395/978, G06F 15/40

Patent

active

057154120

ABSTRACT:
The present invention relates to a method of generating sounds through a graphical user interface (GUI) used on information processing apparatuses and computers and, more particularly, to a method of acoustically expressing image information so that visually disabled people who have difficulty using a GUI can operate GUI-equipped apparatuses, and to an apparatus for carrying out the method. A specific sound is assigned beforehand to an object displayed in a drawing space, such as a display screen, and the object is expressed acoustically by having a sound generating device emit that sound for a period of time corresponding to the length of the object's outline, the distance the object travels, or the ratio of change in the object's size.
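The mapping the abstract describes — a pre-assigned sound played for a duration proportional to an object's outline length — can be sketched as follows. This is an illustrative sketch only, not the patented implementation; the function names, the dictionary layout, and the `SECONDS_PER_UNIT` scale factor are all assumptions introduced here for clarity.

```python
import math

# Assumed scale factor: 10 ms of sound per pixel of outline length.
SECONDS_PER_UNIT = 0.01

def outline_length(points):
    """Perimeter of a closed polygon given as a list of (x, y) vertices."""
    total = 0.0
    # Pair each vertex with the next, wrapping around to close the outline.
    for (x0, y0), (x1, y1) in zip(points, points[1:] + points[:1]):
        total += math.hypot(x1 - x0, y1 - y0)
    return total

def sound_duration(obj):
    """Seconds for which the object's assigned sound would be generated,
    proportional to the length of the object's outline."""
    return outline_length(obj["outline"]) * SECONDS_PER_UNIT

# Example: a 100x50 rectangle has a perimeter of 300 units,
# so its assigned tone would play for 3.0 seconds.
rect = {"sound": "tone_a",
        "outline": [(0, 0), (100, 0), (100, 50), (0, 50)]}
print(sound_duration(rect))  # 3.0
```

The same scheme extends to the other two quantities the abstract names: substitute the distance travelled by the object, or its ratio of size change, for the outline length when computing the duration.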

REFERENCES:
patent: 4567610 (1986-01-01), McConnell
patent: 4859996 (1989-08-01), Adler et al.
patent: 5241671 (1993-08-01), Reed et al.
patent: 5395243 (1995-03-01), Lubin et al.
patent: 5412738 (1995-05-01), Brunelli et al.
patent: 5436637 (1995-07-01), Gayraud et al.
patent: 5488686 (1996-01-01), Murphy et al.
patent: 5521981 (1996-05-01), Gehring
patent: 5539869 (1996-07-01), Spoto et al.

