User interface for automated optical inspection systems

Computer graphics processing and selective visual display system – Display driving control circuitry – Controlling the condition of display elements

Reexamination Certificate


Details

Classification codes: C345S215000, C382S145000, C382S147000, C382S148000, C382S149000, C707S793000

Type: Reexamination Certificate

Status: active

Patent number: 06597381

ABSTRACT:

TECHNICAL FIELD OF THE INVENTION
This invention relates generally to user interfaces for computer systems and, more particularly, to a user interface system and method for automated optical inspection systems.
BACKGROUND OF THE INVENTION
The current generation of computer vision systems and, more specifically, computer vision systems used for automated optical inspection (“AOI”), places a significant burden on the operators of these systems, who must develop programs that enable the system to classify an image into one or more disjoint or non-disjoint classes. As a result, the user interface elements associated with these systems present their users with options for programming the system or modifying its existing program, which makes current AOI systems exceedingly difficult to use. To increase ease of use, such systems incorporate a graphical user interface (“GUI”—pronounced “gooey”). GUIs have become more prevalent in such systems with the increasingly widespread availability of powerful microprocessors.
A GUI is a type of display format that enables a user to operate a computer controlled system by pointing to pictorial representations, such as “windows” and “icons” (bitmaps), on a display device. A window is a rectangle displayed on a screen that affords the user a workspace within a program. In a typical operation, the user may move the window about on the screen, change its size or shape, enlarge it to fill the screen, close it entirely, or change how much of its contents are displayed. To aid the user in the manipulation of its contents, a window will typically include a number of user interface components, such as buttons, menus, sliders, and the like. Outside the window, the screen can display other screen objects, such as other windows, or related computer system representations, such as disk drive icons or toolbars.
To navigate within a GUI, most systems employ a screen cursor or pointer, typically displayed as a small arrow icon (bitmap), which allows the user to select individual points on the screen. In operation, the screen cursor is moved to a desired screen location in response to the movement of a pointing device (e.g., a mouse) by the user. Besides effecting cursor movement, most pointing devices include one or more switches or “mouse buttons” for specifying additional user input or “user events” by “clicking” on (selecting) objects in the display device. Because many user choices may be entered with a pointing device (e.g., by selecting screen objects) instead of through the keyboard, the user's need to memorize special commands is lessened.
GUIs feature a menu bar, for instance, running across the top of the screen, which serves to group or categorize commands available to the user. Clicking on an item on the menu bar typically causes a “pulldown menu” to appear. This second or “sub-menu” also includes a number of items, each of which is associated with a desired action, including the display of still more menus. To select a desired action, the user usually clicks the corresponding menu item with the screen or mouse pointer. For some menu items, particularly those nested several layers deep, a keyboard equivalent or “hotkey” may be available. The conventional graphical user interface described above significantly reduces the amount of information that a user must recall in order to effectively use the computer controlled system.
Current AOI systems incorporate graphical user interfaces through which system users interact with and modify the AOI program. However, current graphical user interfaces for AOI systems present information to the user in a static manner, and the user cannot (or finds it difficult to) interact with or explore the data presented in greater depth. The information displayed to the user by the GUI in current AOI systems is generally not linked to other pieces of information on the same interface in a way that lets the user manipulate that information to obtain further detail. For example, the user cannot filter the data or provide information to the system via the GUI.
One existing method for modifying an AOI program is through the use of automated threshold modification functions. These functions modify the behavior of an AOI system program by analyzing the AOI algorithm performance with different threshold values with respect to a set of test images. However, this method is merely an extension of the basic AOI programming methodology and, as such, still has the same efficiency and effectiveness problems normally associated with those systems. The operator is using a programming tool rather than a user interface tool and is imparting programming information rather than information that can be used to distinguish between possible class memberships of the image under inspection.
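The threshold-modification approach described above can be sketched roughly as a sweep over candidate threshold values, scoring each against a labeled set of test images. This is an illustrative reconstruction, not code from the patent; the names (`TestImage`, `tune_threshold`, the scoring callable) are assumptions.

```python
# Sketch of automated threshold modification: evaluate an AOI scoring
# algorithm at several candidate thresholds over labeled test images
# and keep the threshold with the fewest misclassifications.
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class TestImage:
    pixels: object        # image data (placeholder)
    is_defective: bool    # ground-truth label for this test image

def tune_threshold(
    score: Callable[[object], float],   # AOI score: higher = more defect-like
    images: Sequence[TestImage],
    candidates: Sequence[float],
) -> float:
    """Return the candidate threshold that misclassifies the fewest images."""
    def errors(t: float) -> int:
        # An image is called defective when its score meets the threshold.
        return sum((score(im.pixels) >= t) != im.is_defective for im in images)
    return min(candidates, key=errors)
```

As the passage notes, this remains a programming tool: the operator supplies threshold values and test sets rather than class-membership knowledge about the image under inspection.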
Current user interfaces, therefore, do not allow information to be imparted to the AOI system through the user's interaction with the information displayed in the interface. The control elements in such a graphical user interface cannot be manipulated and simultaneously updated because they are not linked in the underlying programming. Current AOI system GUIs only display information to the operator/user; the user cannot make queries or organize the information for viewing in different ways. Additionally, because the user cannot interact with the system, the user cannot increase the performance of the AOI system by imparting information learned about the production process back to the AOI system.
As a result, currently existing AOI system GUIs do not provide the user the capability to fine-tune and troubleshoot false calls and defect occurrences when diagnosing a problem on the line. For example, in order to reduce the scope of a specific inspection algorithm of the AOI system, current systems require modifying the algorithm for the entire component class (or classes) the algorithm examines, instead of allowing modification on a component-by-component basis. The user is therefore unable to go into the system and quickly determine which process areas may be suspicious. Heavy reliance on process engineers is thus required in prior art AOI systems lacking a more interactive user interface.
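The class-wide versus component-by-component distinction drawn above can be made concrete with a small sketch. This is an illustrative assumption, not the patent's implementation: per-component overrides take precedence over a single threshold shared by the whole component class.

```python
# Class-wide thresholds force one value on every component of a class;
# per-component overrides let the user tune a single suspicious part.
class_thresholds = {"capacitor": 0.7, "resistor": 0.6}   # one value per class
component_overrides = {"C17": 0.85}                       # hypothetical per-part tuning

def effective_threshold(component_id: str, component_class: str) -> float:
    """Prefer a component-specific override; fall back to the class value."""
    return component_overrides.get(component_id, class_thresholds[component_class])
```

In a system limited to class-wide modification, loosening the check for one troublesome component would loosen it for every component of that class, which is the inflexibility the passage criticizes.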
Similarly, prior art AOI system GUIs do not provide the capability to filter defect classifications by different parameters (for example decision confidence values) that are linked in such a way as to allow narrowing of the data presented to point to a specific line problem or defect. A user of such a prior art system cannot easily determine and corroborate the reliability of classification decisions made by the system. Additionally, using current AOI system GUIs, a user cannot compare a current board defect to, for example, a prior closest example of that defect. Such a comparison allows a user to quickly and with more confidence conceptualize the decision as to whether a defect has been correctly classified. Such a comparison further provides for increased confidence in the continuity of interpretation of defects between different users. By comparing the image of a current defective board to images of prior boards and defects and, more particularly, to the closest identified defect to the current board, the user can perform a selection task instead of a recognition task. The user is thus not forced to look at an image and determine the existence of a defect without a basis for comparison. Instead, the user decides whether the current board defect is similar to a prior detected defect of high-confidence.
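The two missing capabilities described above, filtering defect records by decision-confidence and retrieving the closest prior example for side-by-side comparison, can be sketched as follows. The record fields and the Euclidean feature distance are assumptions standing in for whatever similarity measure a real AOI system would use.

```python
# Sketch of confidence filtering and closest-prior-defect retrieval,
# turning the user's recognition task into a selection task.
import math
from dataclasses import dataclass
from typing import Sequence

@dataclass
class DefectRecord:
    defect_id: str
    classification: str
    confidence: float          # system's decision confidence, 0..1
    features: tuple            # e.g. (size, intensity) -- illustrative

def filter_by_confidence(records: Sequence[DefectRecord],
                         lo: float = 0.0, hi: float = 1.0):
    """Narrow the displayed data to records within a confidence band."""
    return [r for r in records if lo <= r.confidence <= hi]

def closest_prior(current: DefectRecord,
                  history: Sequence[DefectRecord]) -> DefectRecord:
    """Return the most similar prior defect for side-by-side comparison."""
    return min(history, key=lambda r: math.dist(current.features, r.features))
```

Shown the closest prior defect, the user decides whether the current image matches a known high-confidence defect rather than judging the image in isolation.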
Presently existing AOI system GUIs also do not give a user the capability to impart user corroboration of process performance back to a system capable of learning. In this regard, we make reference to pending related U.S. patent application Ser. No. 09/935,531, filed on Sep. 22, 1997, which discloses an AOI algorithm capable of learning through both automated and manual input as to the confidence level to impart to a detected defect. If an AOI system had the underlying capability to learn from user input, current graphical user interfa

