Mock object generation by symbolic execution

Error detection/correction and fault detection/recovery – Data processing system error or fault handling – Reliability and availability

Reexamination Certificate

Details

C714S038110

Reexamination Certificate

active

07496791

ABSTRACT:
A system for testing programs using a digital processor and programs in computer memory. A mock behavior generator identifies an interface indicated for mock behavior. The interface is identified as an input parameter of a parameterized unit test. The mock behavior generator creates a symbolic object with stubs to receive calls and mock behavior that returns symbolic values upon receiving a call to a stub. A symbolic executor symbolically executes the parameterized unit test to obtain path constraints for an implementation under test; at least one path constraint includes a symbolic value returned in response to a call to a stub. A constraint solver provides solutions for the paths, including concrete values assigned to the returned symbols. The mock behavior generator then creates mock objects that return those concrete values when the implementation under test is executed.
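
The workflow described in the abstract can be illustrated with a small, hypothetical Python sketch. The code below is not the patented implementation: the names (MockPricing, unit_price, checkout) are invented for illustration, the path constraints that a symbolic executor would derive from the branch in checkout are listed by hand, and a toy enumerate-and-check routine stands in for the constraint solver. It shows only the flow of symbolic return value, path constraint, solved concrete value, and generated mock object.

    class MockPricing:
        """Generated mock object: its stub replays a concrete, solver-chosen value."""
        def __init__(self, price):
            self._price = price

        def unit_price(self):
            return self._price


    def checkout(pricing, quantity):
        """Implementation under test: branches on the value returned by the stub."""
        price = pricing.unit_price()
        if price >= 100:
            return quantity * price * 0.9   # bulk-discount path
        return quantity * price             # regular path


    # Path constraints over the symbolic return value "price" (one per branch of
    # checkout). A real symbolic executor would derive these automatically; here
    # they are written out by hand for this toy example.
    path_constraints = [
        ("price >= 100", lambda p: p >= 100),
        ("price < 100",  lambda p: p < 100),
    ]


    def solve(predicate, candidates=range(0, 1000)):
        """Toy constraint solver: return any concrete value satisfying the path."""
        for value in candidates:
            if predicate(value):
                return value
        return None


    if __name__ == "__main__":
        for label, predicate in path_constraints:
            concrete = solve(predicate)
            mock = MockPricing(concrete)          # mock built from the solution
            result = checkout(mock, quantity=3)   # concrete re-execution of the IUT
            print(f"path [{label}]: unit_price={concrete}, checkout={result}")

Running the sketch exercises both branches of checkout: one generated mock returns a value satisfying price >= 100, the other a value satisfying price < 100, so each concrete execution covers a distinct path.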

REFERENCES:
patent: 5784553 (1998-07-01), Kolawa et al.
patent: 6185729 (2001-02-01), Watanabe et al.
patent: 6530039 (2003-03-01), Yang
patent: 6957422 (2005-10-01), Hunt
patent: 7089542 (2006-08-01), Brand et al.
patent: 7389223 (2008-06-01), Atkin et al.
patent: 2003/0097650 (2003-05-01), Bahrs et al.
patent: 2004/0117772 (2004-06-01), Brand et al.
patent: 2004/0243951 (2004-12-01), Hall
patent: 2005/0050391 (2005-03-01), Grieskamp et al.
patent: 2005/0120274 (2005-06-01), Haghighat et al.
patent: 2005/0204201 (2005-09-01), Meenakshisundaram et al.
patent: 2005/0223362 (2005-10-01), Whitlock et al.
patent: 2006/0085156 (2006-04-01), Kolawa et al.
patent: 2006/0253739 (2006-11-01), Godefroid et al.
Veanes et al., “On-The-Fly Testing of Reactive Systems,” Microsoft Research Technical Report MSR-TR-2005-05, Jan. 2005, 16 pages.
U.S. Appl. No. 11/197,912, filed Aug. 4, 2005, Tillmann et al.
U.S. Appl. No. 11/198,569, filed Aug. 4, 2005, Tillmann et al.
U.S. Appl. No. 11/323,032, filed Dec. 30, 2005, Tillmann et al.
jtest User's Guide, Version 5.1, Parasoft Corporation, Jun. 2004, 251 pages.
Ambert et al., “BZ-TT: A Tool-Set for Test Generation from Z and B using Constraint Logic Programming,” Formal Approaches to Testing of Software, FATES 2002 workshop of CONCUR '02, INRIA Report, Aug. 2002, pp. 105-119.
Ball, “Formalizing Counterexample-driven Refinement with Weakest Preconditions,” Proceedings of 2004 Marktoberdorf Summer School, Dec. 10, 2004, 19 pages.
Barnett et al., “The Spec# Programming System: An Overview,” Construction and Analysis of Safe, Secure, and Interoperable Smart Devices: International Workshop, CASSIS 2004, vol. 3362 of LNCS, 2005, pp. 49-69.
Barnett et al., “99.44% pure: Useful Abstractions in Specifications,” Conference Proceedings ICIS report NIII-R0426, University of Nijmegen, 2004, pp. 11-19.
Bernot et al., “Software testing based on formal specifications: a theory and a tool,” Softw. Eng. J., 6(6):387-405, 1991.
Bidoit et al., “Algebraic system specification and development,” Springer-Verlag, Chapter 1, 1991, 12 pages.
Bierman et al., “MJ: An imperative core calculus for Java and Java with effects,” University of Cambridge Computer Laboratory, Technical Report 563, 2003, 53 pages.
Boyapati et al., “Korat: Automated Testing Based on Java Predicates,” Proc. International Symposium on Software Testing and Analysis, 2002, pp. 123-133.
Brucker et al., “Symbolic Test Case Generation for Primitive Recursive Functions,” FATES, vol. 3395 of Lecture Notes in Computer Science, Springer, 2004, pp. 16-32.
Bush et al., “A static analyzer for finding dynamic programming errors,” Softw. Pract. Exper., 30(7):775-802, 2000.
Colby et al., “Automatically Closing Open Reactive Programs,” Proceedings of 1998 ACM SIGPLAN Conference on Programming Language Design and Implementation, Jun. 1998, 14 pages.
Csallner et al., “Check 'n' Crash: Combining Static Checking and Testing,” 27th International Conference on Software Engineering, May 2005, pp. 422-431.
Csallner et al., “JCrasher: an automatic robustness tester for Java,” Software—Practice & Experience 2004, Dec. 18, 2003, pp. 1025-1051.
Detlefs et al., “Simplify: A Theorem Prover for Program Checking,” Hewlett Packard Systems Research Center, 2003, 121 pages.
Dick et al., “Automating the Generation and Sequencing of Test Cases from Model-Based Specifications,” Industrial Strength Formal Methods, Formal Methods Europe (FME '93), Proceedings, vol. 670 of LNCS, Springer, 1993, pp. 268-284.
Doong et al., “The ASTOOT Approach to Testing Object-Oriented Programs,” ACM Trans. Softw. Eng. Methodol., 3(2):101-130, 1994.
Flanagan et al., “Extended Static Checking for Java,” Proc. ACM SIGPLAN 2002 Conference on Programming Language Design and Implementation, ACM Press, 2002, pp. 234-245.
Henkel et al., “Discovering Algebraic Specifications from Java Classes,” Proc. 17th European Conference on Object-Oriented Programming, 2003, pp. 431-456.
JCrasher documents, http://www.cc.gatech.edu/˜csallnch/jcrasher, 11 pages, downloaded Aug. 4, 2004.
Jalote, “Testing the Completeness of Specifications,” IEEE Trans. Softw. Eng., 15(5):526-531, 1989.
Jeffries et al., “Extreme Programming Installed,” Chapters 13, 14, and 29, Addison Wesley, Oct. 2000, 30 pages.
King, “Symbolic Execution and Program Testing,” Commun. ACM, 19(7):385-394, 1976.
Lahiri et al., “An Efficient Decision Procedure for UTVPI Constraints,” Technical Report MSR-TR-2005-67, Jun. 15, 2005, 18 pages.
Lahiri et al., “An Efficient Nelson-Oppen Decision Procedure for Difference Constraints over Rationals,” Technical Report MSR-TR-2005-61, May 26, 2005, 16 pages.
Lahiri et al., “Predicate Abstraction via Symbolic Decision Procedures,” Technical Report MSR-TR-2005-53, May 26, 2005, 19 pages.
Leino et al., “A two-tier technique for supporting quantifiers in a lazily proof-explicating theorem prover,” TACAS 2005, Oct. 2004, 13 pages.
Loeckx et al., “The Foundations of Program Verification, 2nd Edition,” Chapter 6, Wiley, 1987, 23 pages.
Mackinnon et al., “Endo-Testing: Unit Testing with Mock Objects,” eXtreme Programming and Flexible Processes in Software Engineering—XP2000, 2000, 9 pages.
Marinov et al., “TestEra: A Novel Framework for Automated Testing of Java Programs,” Proc. 16th IEEE International Conference on Automated Software Engineering, 2001, pp. 22-31.
Newkirk et al., “Test-Driven Development in Microsoft .NET,” Chapters 2, 5, 7, 8, 9, 10, and Appendix A, Microsoft Press, Apr. 2004, 163 pages.
NUnit. http://www.nunit.org/, downloaded Aug. 4, 2005, 41 pages.
Testing, http://msdn.microsoft.com/library/en-us/vsent7/html/vxoriTestingOptimizing.asp?frame=true, downloaded Aug. 4, 2005, 1 page.
Testing, Verification and Measurement—Home, http://research.microsoft.com/tvm/, downloaded Aug. 4, 2005, 4 pages.
Tillmann et al., “Unit Tests Reloaded: Parameterized Unit Testing with Symbolic Execution,” MSR-TR-2005-153, Nov. 2005, 17 pages.
Unit Testing, http://msdn.microsoft.com/library/en-us/vsent7/html/vxconunittesting.asp?frame=true, downloaded Aug. 4, 2005, 1 page.
Visual Studio Team System: Visual Studio 2005 Team System Home, http://lab.msdn.microsoft.com/teamsystem/, downloaded Aug. 4, 2005, 4 pages.
Visual Studio 2005 Team System Modeling Strategy and FAQ, http://msdn.microsoft.com/library/en-us/dnvs05/html/vstmodel.asp?frame=true, downloaded Aug. 4, 2005, 11 pages.
Visser et al., “Test Input Generation with Java PathFinder,” Proc. 2004 ACM SIGSOFT International Symposium on Software Testing and Analysis, 2004, pp. 97-107.
Xie et al., “Symstra: A Framework for Generating Object-Oriented Unit Tests Using Symbolic Execution,” TACAS, vol. 3440 of Lecture Notes in Computer Science, Springer, 2005, pp. 365-381.
Yorsh et al., “A Combination Method for Generating Interpolants,”
