Facilitating comparisons between simulated and actual...

Data processing: measuring, calibrating, or testing – Measurement system in a specific environment – Electrical signal parameter measurement system

Reexamination Certificate


Details

Classification: C702S068000, C702S069000, C702S073000, C702S074000

Type: Reexamination Certificate

Status: active

Patent number: 06684169

ABSTRACT:

FIELD OF THE INVENTION
This invention relates generally to simulating and/or testing electronic devices, and, more particularly, to facilitating comparisons between the behavior of a device predicted by a simulation and the behavior of a corresponding physical device exercised on a tester.
BACKGROUND OF THE INVENTION
When a designer develops a new electronic device, the designer generally concurrently develops a software model for the new device. The software model simulates the behavior of a physical device by producing output data in response to input data analogously to the way in which the physical device produces output signals in response to input signals.
A software model, once developed, can be made available to aid in developing a test program. A test program is a software program that runs on a tester for testing actual, physical devices. A conventional process for developing a test program is shown generally in FIG. 1. An engineer acquires the device model (step 110) and generates “test patterns,” i.e., sequences of data that correspond to input signals that are to be applied to inputs of an actual device under test (DUT) during operation of the test program on a tester (step 112). The engineer applies the test patterns to the device model. In response, the device model generates response data (step 114). Using the response data, as well as knowledge about the device and about the target tester, the engineer generates “expect data,” i.e., expected values of output signals from the DUT in response to the test patterns during the execution of the test program.
To implement a test program for use with a particular target tester, a test engineer translates the simulated test patterns and expect data into software instructions that are compatible with the target tester (step 116). The test engineer stores the software instructions in the test program (step 118) and debugs the test program by running it on the tester to test actual, physical devices (step 120). The test engineer may alternatively debug the test program in a software environment that simulates both the tester and the device under test. Under control of the test program, the tester (or simulated tester) applies stimuli to inputs of a DUT (or simulated DUT), and captures responses from outputs of the DUT (or simulated DUT). The tester compares the captured responses with expect data stored in the test program. If the responses match the expect data, the test program passes. Otherwise, the test program fails.
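The pass/fail comparison at the end of this flow amounts to checking captured responses against stored expect data. A minimal sketch, with names of my own choosing rather than the patent's:

```python
def run_test(captured_responses, expect_data):
    """Compare responses captured from the DUT against the expect data
    stored in the test program. Returns (passed, failing_indices)."""
    failures = [
        i
        for i, (got, want) in enumerate(zip(captured_responses, expect_data))
        if got != want
    ]
    return (len(failures) == 0, failures)

# A mismatch at cycle 2 causes the test program to fail.
print(run_test([0, 0, 1, 1], [0, 0, 0, 1]))  # (False, [2])
print(run_test([0, 0, 0, 1], [0, 0, 0, 1]))  # (True, [])
```

Recording *which* cycles failed, rather than a bare pass/fail bit, is exactly the kind of information the troubleshooting process described below tends to lack.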
Test programs commonly fail when first run, even when testing known-good devices. The reasons for this are varied, and include inaccurate data used to generate device models, failure to account for normal semiconductor process variations, and other sources. To address these problems, test engineers customarily modify the test programs (step 122), test the changes, and repeat as needed to yield more reliable test programs. Eventually, the test program passes and is deployed to a testing facility for performing volume production testing.
Test engineers generally develop and debug test programs with only limited access to testers and with only limited numbers of actual devices. Consequently, problems with test programs often do not arise until after the test programs have been deployed. The personnel at the testing facilities generally lack expertise in test program development and in the test programs for specific devices. When a test program fails while testing a device that is believed to be good, the testing facility may halt production until the source of the problem is identified and the problem is solved. The testing facility generally turns to the test engineer responsible for the test program for assistance. Messages from the testing facility usually take the form of emails, faxes, or telephone calls, which describe the nature of the failure. The test engineer receives the messages and prepares responses for the testing facility. In preparing responses, the test engineer may consult with device designers and examine simulations of the device. The test engineer's responses generally take the form of emails or faxes, which include suggestions for modifying the test program. They may also include revised software, transmitted electronically or via a physical storage medium.
This method of troubleshooting test programs involves several drawbacks. For example, the test development facility may be physically distant from the testing facility, making direct communication difficult. Different time zones, languages, and areas of expertise can significantly impair the ability to develop speedy solutions to test program failures. It often happens that a test engineer flies to Asia on a lengthy support mission, only to discover that the problem is one that could have been solved easily without the trip if better information had been available. Perhaps more significantly, test program failures can bring a production line to a complete halt until the failures are resolved. The opportunity cost of halting a production line can quickly rise to an excessive level.
SUMMARY OF THE INVENTION
With the foregoing background in mind, it is an object of the invention to facilitate understanding of the behavior of an electronic device exercised by an automatic test system.
To achieve the foregoing object, as well as other objectives and advantages, a system for examining the behavior of a device exercised by an automatic test system includes a first collection of data for storing simulation values and a second collection of data for storing actual waveform values. The simulation values include test patterns representing inputs of a device model and response data representing outputs of the device model. They may also include expect data representing expected values of the response data. The actual waveform values include sampled inputs and/or outputs of an actual device under test (DUT) acquired using an electronic test system. When a tester exercises the DUT with test patterns that correspond to the test patterns stored in the first collection of data, the DUT generates output that can be compared directly with the response and/or expect data stored in the first collection of data. Software is included for aligning the data stored in the first and second collections of data, so that simulated and actual values can be readily compared. Analysis tools are included for examining similarities and differences between simulated and actual behavior. These may include a graphical display for visually representing simulated and actual values. The graphical display enables a user of the system to identify instances in which the actual behavior of a DUT deviates from the simulated behavior.
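The two central operations in this summary are aligning the two data collections and then reporting where actual behavior deviates from simulated behavior. The sketch below illustrates one plausible approach, aligning by the time offset that minimizes mismatches; the patent does not specify this method, and all names here are assumptions of mine.

```python
def align(simulated, actual):
    """Find the offset of `actual` within `simulated` that minimizes
    mismatches (one plausible alignment strategy, assumed here), then
    return the two sequences trimmed to their overlap."""
    best_offset, best_errors = 0, None
    n = len(actual)
    for offset in range(len(simulated) - n + 1):
        errors = sum(
            1 for s, a in zip(simulated[offset:offset + n], actual) if s != a
        )
        if best_errors is None or errors < best_errors:
            best_offset, best_errors = offset, errors
    return simulated[best_offset:best_offset + n], actual

def differences(simulated, actual):
    """Aligned cycle indices where actual DUT behavior deviates from
    the simulation; an empty list means the two agree."""
    sim, act = align(simulated, actual)
    return [i for i, (s, a) in enumerate(zip(sim, act)) if s != a]

simulated = [0, 0, 1, 0, 1, 1, 0, 1]
print(differences(simulated, [1, 0, 1, 1]))  # [] -- aligns cleanly, no deviation
print(differences(simulated, [1, 0, 0, 1]))  # [2] -- deviation at aligned cycle 2
```

A graphical front end, as the summary suggests, would plot the aligned simulated and actual traces together and highlight the indices returned by `differences`.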


REFERENCES:
patent: 5371851 (1994-12-01), Pieper et al.
patent: 5838948 (1998-11-01), Bunza
patent: 6236956 (2001-05-01), Manhooth et al.
patent: 6263301 (2001-07-01), Cox et al.
patent: WO 00/45188 (2000-08-01), None
Val Garuts and Jim Tallman, On-board digital processing refines scope measurements, Mar. 13, 1980, Tektronix Inc., Beaverton, Ore, pp. 105-114.*
Image Solutions V7.0, “VX Software” Chapter 1: Simulation for Test Engineers; Chapter 5: DataScope, Teradyne, Inc.
Web pages captured Aug. 7, 2001 from www.ims.com and www.virtualtest.com/aboutvt.html, ©1999-2001 Integrated Measurement Systems™, Inc. 9525 S.W. Gemini Drive, Beaverton, OR. 97008 USA.


Profile ID: LFUS-PAI-O-3243921
