Data processing: artificial intelligence – Knowledge processing system
Reexamination Certificate
2000-09-22
2004-11-16
Starks, Jr., Wilbert L. (Department: 2121)
C716S030000
active
06820068
BACKGROUND OF THE INVENTION
1. Field of the Invention
This invention pertains generally to system verification. More particularly, the invention is a system and method for verifying system behaviors using constraint-calculus analysis.
2. The Prior Art
Digital (discrete-time) systems are well known in the art. In general, the design and verification of digital systems involve the complex interrelationship among logical, arithmetic and timing dependencies in the system. In particular, the input and output dependencies of such systems are of special interest.
The current methods for verifying discrete-time systems include simulation, emulation, debugging and formal verification. Simulation is a software solution in which an algorithm is programmed to carry out the functions of the discrete-time system. Using the simulation software, a user is able to construct a software version of the digital system under consideration. The simulation software normally provides an output signal, typically in the form of a graphical waveform. Upon providing an input to the simulator, the user is able to observe the resulting output in the waveform produced by the simulator.
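As a minimal sketch of this simulation approach (the two-tap filter modeled here and all function and variable names are illustrative assumptions, not part of any claimed design), a software model of a simple discrete-time system can be driven with an input sequence and its output sequence observed directly:

# Hypothetical illustration of simulation-based verification:
# a software model of a simple discrete-time system (a two-tap
# moving-average filter) is driven with an input sequence and
# the resulting output sequence is observed by the user.

def moving_average(inputs):
    """Software model of the discrete-time system under consideration."""
    previous = 0
    outputs = []
    for sample in inputs:
        outputs.append((sample + previous) / 2)  # y[n] = (x[n] + x[n-1]) / 2
        previous = sample
    return outputs

if __name__ == "__main__":
    stimulus = [0, 4, 8, 8, 0]
    response = moving_average(stimulus)
    # In place of a graphical waveform, the input/output dependency
    # is inspected directly, one time step per line.
    for n, (x, y) in enumerate(zip(stimulus, response)):
        print(f"t={n}  input={x}  output={y}")

A single run of this kind exercises only the one stimulus supplied, which is the sense in which simulation is not exhaustive.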
Emulation is a hardware solution for verifying the designs of digital systems. Emulation commonly involves the use of FPGA (Field-Programmable Gate Array) devices, programmable logic chips capable of supporting thousands of gates. The user programs the connections between the logic gates of the FPGA. Using FPGA devices, the user is able to prototype the digital system under consideration and verify the input/output dependencies of the system.
Debugging generally involves manually implementing and running the system design. If any problems arise during execution, the user reverts to a previous step of the design process to ascertain the source of the problem. Once the user has made the necessary changes to the system design, the user again attempts to execute the design to ascertain whether there are any additional problems, and the process is repeated.
There are several disadvantages associated with these present verification techniques. First, simulation, emulation and debugging methods are not exhaustive. That is, these prior-art methods are, in general, not capable of completely verifying that a system design meets the behavioral requirements expected of it. Second, while the above methods provide the user with input and output dependencies, they fail to provide the reason a particular output is produced by a particular input, thereby inhibiting a straightforward determination of the cause of an anomalous output.
Yet another method for verifying system behavior is formal verification. Formal verification involves modeling the system under consideration as a finite-state machine. A finite-state machine includes a set of possible states with defined transitions from state to state. However, present methods of formal verification have a limited ability to handle logical, arithmetic, and timing dependencies, which are often intertwined. For example, a complete functional specification of many digital-signal-processing functions—such as infinite-impulse-response filters—involves all three types of dependencies.
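As a minimal sketch of this finite-state-machine view (the mod-4 counter, its input names and the exploration routine are illustrative assumptions, not a description of any particular verification tool), a system can be modeled as a set of states with defined transitions and its reachable state space enumerated exhaustively:

# Hypothetical illustration of the finite-state-machine view used in
# formal verification: a set of states with defined transitions, explored
# exhaustively to determine every state the system can reach.

from collections import deque

# States and transitions of a small mod-4 counter with a reset input.
# transitions[state][input] -> next state
transitions = {
    0: {"inc": 1, "reset": 0},
    1: {"inc": 2, "reset": 0},
    2: {"inc": 3, "reset": 0},
    3: {"inc": 0, "reset": 0},
}

def reachable_states(initial):
    """Exhaustively enumerate every state reachable from the initial state."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        for nxt in transitions[state].values():
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

print(reachable_states(0))  # {0, 1, 2, 3} -- already 2**n states for an n-bit counter

Even this toy model makes the scaling problem visible: an n-bit register contributes 2**n states to be enumerated.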
Another drawback of finite-state machines is that the more complex the system, or the data in the system, the more complex the finite-state machine needed to model it becomes. For example, a 10-bit system would require a thousand (2^10 = 1,024) states, and a 20-bit system would require a million (2^20 = 1,048,576) states. Such complex systems result in cumbersome state-machine methods for formal verification.
Accordingly, there is a need for a system and method for verifying system behaviors which provides comprehensive coverage of system behavior without relying upon cumbersome state-machine modeling. The present invention satisfies these needs, as well as others, and generally overcomes the deficiencies found in the background art.
An object of the invention is to provide a system and method for verifying system behaviors which overcomes the deficiencies of the prior art.
Another object of the invention is to provide a system and method for verifying system behaviors using constraint-calculus analysis.
Further objects and advantages of the invention will be brought out in the following portions of the specification, wherein the detailed description is for the purpose of fully disclosing the preferred embodiment of the invention without placing limitations thereon.
BRIEF DESCRIPTION OF THE INVENTION
The present invention is a system and method for verifying system behaviors (such as the behavior of discrete-time systems, for example) using behavior-constraint-calculus analysis. The invention may be used in such application areas as electronics, digital logic, signal processing, protocols, hardware/software codesign, computer security, mechanical engineering, biology, economics and finance, among others.
In the present invention, system behaviors are defined as sequences of states which are either allowed (permitted/possible) or disallowed (prohibited/impossible). Behavioral constraints are defined as the product (conjunction) of Boolean formulas distributed over time which are satisfied only by disallowed behaviors. As opposed to finite-state machines, which indicate possible or allowed behavior such as possible state sequences, behavioral constraints indicate what cannot happen (i.e., disallowed behavior).
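As a minimal sketch of this idea (the signal names req and ack, the dictionary encoding of a state, and the window-checking helper are all illustrative assumptions), a behavioral constraint can be represented as Boolean formulas applied at successive time steps, and a behavior is disallowed exactly when some stretch of it satisfies every formula in the conjunction:

# Hypothetical sketch: a behavior is a sequence of states (here, dicts of
# signal values) and a behavioral constraint is a conjunction of Boolean
# formulas distributed over successive time steps.  A behavior that
# satisfies the constraint at some point in time is disallowed.

# Constraint: "req is asserted and ack is NOT asserted in the next state"
# -- i.e. a pattern that must never occur.
constraint = [
    lambda s: s["req"],          # formula applied at time t
    lambda s: not s["ack"],      # formula applied at time t + 1
]

def is_disallowed(behavior, constraint):
    """Return True if some window of the behavior satisfies the constraint."""
    width = len(constraint)
    for t in range(len(behavior) - width + 1):
        if all(f(behavior[t + i]) for i, f in enumerate(constraint)):
            return True
    return False

trace = [
    {"req": True,  "ack": False},
    {"req": False, "ack": True},
    {"req": True,  "ack": False},
    {"req": False, "ack": False},   # ack missing one step after req: disallowed
]
print(is_disallowed(trace, constraint))  # True

The constraint says nothing about what the system must do; it only names a pattern that must never appear.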
There are three principal advantages to dealing with disallowed patterns of behavior: (1) many complex logical, arithmetic and timing dependencies are more conveniently/succinctly expressed in terms of disallowed behaviors, (2) the state-space explosion of conventional state-machine approaches is largely avoided and (3) there are powerful verification methods that are suited only to disallowed patterns of behavior.
The present invention may be used to verify a portion of the system or the system in its entirety. A user of the system first generates a constraint graph based on a specification defining the system under test. Alternatively, a software application may be used in conjunction with the present invention to generate, from a specification in some other format, the constraint graph used by the invention as described herein.
Constraint theory and analysis are described in the following articles by the applicant, which are incorporated herein by reference: THE THEORY OF CONSTRAINTS, Frederick C. Furtek, The Charles Stark Draper Laboratory, Report CSDL-P-1564, May 4, 1982; and VERIFYING ONGOING BEHAVIOR: A CASE STUDY, Frederick C. Furtek, The Charles Stark Draper Laboratory, Report CSDL-P-1563, September 1982, presented at COMPSAC 82, IEEE, Chicago, 8-12 Nov. 1982, pp. 634-639.
In general, the invention accepts a first “known” constraint graph and a second “conjectured” constraint graph. The invention transforms the second graph in such a way that the resultant “verified” graph “accepts” only those paths that are implied by the first graph and appear in the second graph. It will be appreciated that while the present illustrative embodiment describes the invention receiving input in the form of constraint graphs, the invention may also receive other input signals such as “regular” expressions ([P][~Q] for “Prohibited: P followed by NOT Q in the next state”, for example) which may be converted to constraint graphs and processed as described herein.
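As a minimal sketch of accepting such an expression as input (the bracketed one-signal-per-step syntax, the parser and its names are illustrative assumptions, not the conversion performed by the invention), a prohibited pattern such as [P][~Q] can be read into the per-time-step Boolean formulas of the earlier sketch:

# Hypothetical sketch of converting a "regular"-expression-style prohibited
# pattern such as "[P][~Q]" into per-time-step Boolean formulas.  The syntax
# handled here (one bracketed, optionally negated signal name per time step)
# is an illustrative assumption, not the patent's input format.

import re

def parse_prohibited_pattern(pattern):
    """Turn '[P][~Q]' into a list of predicates over state dictionaries."""
    formulas = []
    for negated, name in re.findall(r"\[(~?)(\w+)\]", pattern):
        if negated:
            formulas.append(lambda s, name=name: not s[name])
        else:
            formulas.append(lambda s, name=name: s[name])
    return formulas

constraint = parse_prohibited_pattern("[P][~Q]")
# "Prohibited: P followed by NOT Q in the next state"
print(constraint[0]({"P": True, "Q": False}))   # True  (P holds at time t)
print(constraint[1]({"P": True, "Q": False}))   # True  (~Q holds at time t+1)

A fuller front end would handle richer Boolean formulas at each step; the point here is only that a textual prohibited pattern carries the same information as a chain of labeled steps.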
By way of example and not of limitation, the constraint graphs of the present invention will be represented using conventional finite-state acceptor graphs. More particularly, each constraint graph includes a set of initial nodes (nodes with no incoming arcs), a set of terminal nodes (nodes with no outgoing arcs), and zero or more internal or intermediary nodes. Arcs labeled with Boolean formulas connect the nodes of the constraint graph. Each constraint graph will have at least one path from an initial node to a terminal node. For exam
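As a minimal sketch of this acceptor-graph representation (the node names, the tuple encoding of arcs and the example pattern are illustrative assumptions), a constraint graph can be stored as a set of arcs labeled with Boolean formulas, with initial and terminal nodes identified structurally and a state sequence treated as disallowed when it traces a path from an initial node to a terminal node:

# Hypothetical sketch of a constraint graph stored as Boolean-labeled arcs.
# Initial nodes have no incoming arcs, terminal nodes have no outgoing arcs,
# and a state sequence that drives the graph from an initial node to a
# terminal node exhibits the disallowed pattern.

# Arcs for the pattern "[P][~Q]": (source node, Boolean formula, target node)
arcs = [
    ("n0", lambda s: s["P"],     "n1"),
    ("n1", lambda s: not s["Q"], "n2"),
]

sources = {src for src, _, _ in arcs}
targets = {dst for _, _, dst in arcs}
initial_nodes  = sources - targets          # nodes with no incoming arcs
terminal_nodes = targets - sources          # nodes with no outgoing arcs

def accepts(states):
    """True if the state sequence traces a path from an initial to a terminal node."""
    current = set(initial_nodes)
    for state in states:
        current = {dst for src, formula, dst in arcs
                   if src in current and formula(state)}
    return bool(current & terminal_nodes)

trace = [{"P": True, "Q": True}, {"P": False, "Q": False}]
print(accepts(trace))   # True: P at time t, NOT Q at time t+1 -- disallowed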