Method of comparison for computer systems and apparatus...

Data processing: measuring, calibrating, or testing – Measurement system – Performance or efficiency evaluation

Reexamination Certificate


Details

Classification: C707S793000
Type: Reexamination Certificate
Status: active
Patent number: 06453269

ABSTRACT:

FIELD OF THE INVENTION
The present invention is related generally to software and computer programs. More specifically, the present invention is related to software for sizing and specifying database management system hardware.
BACKGROUND OF THE INVENTION
Businesses and other organizations implementing computer systems, and specifically database management systems (DBMS), and relational database management systems (RDBMS), have naturally been interested in obtaining some measure of the performance of systems they are considering, to enable comparison shopping among competing systems. This interest extends both to the hardware systems available for their database system, for example, its speed and capacity, and to the commercial software systems available to run on the hardware systems. The desire to have some objective measure of the performance of a system, and how this system performs relative to competing systems, is natural in view of the competing claims made by different hardware and software vendors. Not only is the conflicting “puffing” of sales representatives not helpful to the purchasing decision, but even seemingly objective measures of a system's capabilities may be influenced by the tests that a vendor uses to demonstrate their system. In other words, vendors will tend to use demonstration or evaluation criteria that emphasize their product's strong points, and downplay or minimize areas in which their system is weaker than their competitors'.
Several benchmark standards have been proposed, in order to provide a relatively level playing field with respect to the evaluation of different systems. Typically, benchmarking of database systems involves the construction of a hypothetical or example database. Predefined functions such as queries and/or updates, typically specified in the benchmark in SQL, are executed on this database using the hardware or software system being considered, and the database system must provide accurate results. If the database system gives proper output to the queries submitted, and updates the database accurately, the speed, throughput, or efficiency of the system may be analyzed. Two early benchmarks for DBMS, introduced in the 1980s, were the Wisconsin benchmark and the Debit-Credit benchmark. Subsequently, a benchmarking system named TP1 was created, which tested the ability of a DBMS to handle database functions related to cashing a check. One shortcoming of some of these early, relatively simple benchmarking systems was that it was possible for an unscrupulous vendor to ‘cheat’ by making insignificant changes to a hardware or software product that, while not improving the system as a whole for most real-world applications, would bring greatly improved performance on the benchmark test.
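The pattern described above, executing predefined transactions against an example database, verifying correctness, and then measuring throughput, can be sketched in miniature. The following Python fragment is purely illustrative (it uses SQLite and a toy debit/credit transaction in the spirit of the Debit-Credit and TP1 benchmarks mentioned above); real benchmark schemas, transaction profiles, and scaling rules are far more elaborate.

```python
import sqlite3
import time

# Build a small hypothetical database (illustrative only; real
# benchmarks specify much larger, carefully scaled schemas).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [(i, 100.0) for i in range(1000)])
conn.commit()

def debit_credit(conn, src, dst, amount):
    """A predefined transaction: debit one account, credit another."""
    conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?",
                 (amount, src))
    conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?",
                 (amount, dst))
    conn.commit()

# Time a fixed number of transaction runs.
runs = 500
start = time.perf_counter()
for i in range(runs):
    debit_credit(conn, i % 1000, (i + 1) % 1000, 1.0)
elapsed = time.perf_counter() - start

# The system must give accurate results before throughput counts:
# total balance must be unchanged by the transfers.
(total,) = conn.execute("SELECT SUM(balance) FROM accounts").fetchone()
assert abs(total - 100.0 * 1000) < 1e-6

print(f"{runs / elapsed:.0f} transactions/second")
```

Note that the correctness check precedes any throughput claim, mirroring the requirement in the text that the database system "must provide accurate results" before its speed is analyzed.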
Criticisms of these early systems led to a multivendor effort to create a benchmarking system that would fairly test systems, and on which it would not be possible to cheat. These efforts led to the formation of the Transaction Processing Council, or TPC. One way that the TPC reduced a vendor's opportunity to tune its system to perform well on a specific benchmark test was to provide for random variation in certain data to be inserted into the database or in queries issued to the system. Early standards were named TPC-A, TPC-B, and TPC-C. The council has published several ‘transaction processing’ and ‘decision support’ benchmarking standards. Transaction processing benchmark specifications analyze the ability of a given system to handle transactions related to, for example, individual customers' accounts, or other OLTP (on-line transaction processing) functions. Decision support benchmarking tests a system's ability to rapidly analyze entire stores of data, and return averages, or transactions that have a parameter within a certain range of values. For example, a decision support benchmark database query may ask what the impact on revenue would be if sales in a certain range received a percentage discount.
More recent standards promulgated by the TPC include TPC-D, TPC-H, and TPC-R. The TPC results for various vendors' systems are made publicly available at the Transaction Processing Council's web site. The TPC-C benchmark, for example, has two chief parameters: tpmC (“transactions per minute (C)”) and $/tpmC (“price per transaction per minute (C)”). The tpmC metric provides a rough measure of “business throughput,” representing the number of orders processed on a database system per minute. The $/tpmC metric represents the cost of the system for each transaction per minute; it is derived by dividing the price of the entire system, not merely the server, by the tpmC that the system delivered in the benchmark evaluation. $/tpmC thus provides a measure of the “bang for the buck,” or in other words, the cost of the system adjusted for differences in speed between systems.
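The $/tpmC arithmetic is simple enough to sketch directly. The figures below are invented for illustration (real results are published on the TPC web site); the point is that the numerator is the price of the entire system, not just the server.

```python
# Price/performance comparison of two hypothetical TPC-C results.
# All figures are made up for illustration.
systems = {
    "System A": {"total_price": 1_200_000.0, "tpmC": 40_000.0},
    "System B": {"total_price": 2_000_000.0, "tpmC": 75_000.0},
}

for name, r in systems.items():
    # $/tpmC = total system price (hardware, software, support,
    # not merely the server) divided by benchmark throughput.
    price_per_tpmC = r["total_price"] / r["tpmC"]
    print(f"{name}: {r['tpmC']:.0f} tpmC at ${price_per_tpmC:.2f}/tpmC")
```

Here the hypothetical System B is more expensive in absolute terms but cheaper per unit of throughput ($26.67/tpmC versus $30.00/tpmC), which is exactly the kind of speed-adjusted comparison the metric is meant to enable.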
To simulate the business activity of processing an order, the following transactions are simulated under the TPC-C benchmark: New-Order, Payment, Order-Status, Delivery, and Stock-Level. Transactions per minute (tpmC) measures the number of New-Order transactions that may be processed per minute by the computer being considered.
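A notable detail here is that although five transaction types run concurrently, only completed New-Order transactions are counted toward tpmC. The sketch below illustrates that idea; the transaction weights are illustrative stand-ins, not the official TPC-C mix requirements.

```python
import random

# The five TPC-C transaction types named in the text.
TRANSACTIONS = ["New-Order", "Payment", "Order-Status",
                "Delivery", "Stock-Level"]
WEIGHTS = [0.45, 0.43, 0.04, 0.04, 0.04]  # illustrative mix only

random.seed(0)
minutes = 1.0
# Simulate a minute's worth of completed transactions of mixed types.
completed = random.choices(TRANSACTIONS, weights=WEIGHTS, k=10_000)

# tpmC counts only the New-Order transactions completed per minute,
# even though the other four types consume system capacity too.
tpmC = completed.count("New-Order") / minutes
print(f"tpmC = {tpmC:.0f}")
```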
While the TPC benchmark results provide a valuable resource for consideration of the performance of various systems, an extensive number of different systems, and different hardware and software combinations, are available and included on the TPC site. These results are voluminous, and not readily scanned by human beings to make accurate or good overall judgments about what computer system will deliver the best performance for a given price range, or deliver a desired level of performance for the best price. It is desirable, therefore, to provide a convenient environment for rapid consideration of benchmark performance across systems in order to estimate the relative performance and value of various computer systems that a system planner may be considering.
SUMMARY OF THE INVENTION
The instant invention provides an environment for the consideration of various hardware and software combinations for a DBMS, and provides for the accurate quantitative comparison of competing systems according to their benchmark performance.
In a preferred embodiment, the statistical data used in the comparison method is that published by the Transaction Processing Council. Other statistical compilations of server performance may also be used, including those by other organizations, other TPC performance criteria, proprietary statistics, statistics furnished by vendors in promoting certain equipment, etc.
In one illustrative embodiment of the present invention, a system planner is presented with an option to select a configuration for a baseline system. This is the system against which a target system's performance will be compared. The system planner first selects from a choice of operating systems. These may include common network operating systems such as Unix, Windows NT, Novell NetWare, IBM AS/xxx, etc. In the event that the system planner wishes to consider operating systems that are not specifically presented, an option may be presented for a general analysis leaving the operating system unspecified.
The system planner may also select a Database Management System (or DBMS) that is or may be used with the platform configuration selected. The system planner is preferably presented with common database software such as DB2, Informix, Oracle, SQL Server, Sybase, etc. In the event the system planner wishes to consider database software that is not specifically presented, an option may be provided for a general analysis leaving the database software unspecified. This scenario may occur, for example, when proprietary or ‘in-house’ database software is used, where performance data is not likely to be available.
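The selection steps described in the two paragraphs above (choose an operating system, choose a DBMS, or leave either unspecified for a general analysis) amount to filtering a table of published benchmark results. The sketch below shows one way this could work; the records, field names, and `select_systems` helper are all hypothetical, not taken from the patent.

```python
# Hypothetical benchmark records, standing in for published results.
results = [
    {"os": "Windows NT", "dbms": "SQL Server", "tpmC": 40_000},
    {"os": "Unix",       "dbms": "Oracle",     "tpmC": 75_000},
    {"os": "Unix",       "dbms": "DB2",        "tpmC": 61_000},
]

def select_systems(results, os=None, dbms=None):
    """Return results matching the planner's choices.

    Leaving a choice as None acts as the 'unspecified' option in
    the text: that criterion is ignored in the general analysis.
    """
    return [r for r in results
            if (os is None or r["os"] == os)
            and (dbms is None or r["dbms"] == dbms)]

# Planner picks Unix but leaves the DBMS unspecified.
baseline_candidates = select_systems(results, os="Unix")
print([r["dbms"] for r in baseline_candidates])  # → ['Oracle', 'DB2']
```

The same filter would then be applied a second time with the target system's choices, giving the two candidate pools whose benchmark figures are compared.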
After selection of a baseline system, the system planner is presented with the configuration options for a ‘target’ system. The system planner then selects the parameters of the system that will be objecti
