Error detection/correction and fault detection/recovery – Data processing system error or fault handling – Reliability and availability
Reexamination Certificate
1999-06-22
2002-07-09
Baderman, Scott T. (Department: 2184)
C709S224000
active
06418544
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a computer system, and deals more particularly with a method, system, and computer readable code for improving stress testing of Web servers. An altered form of client cache is used, enabling more realistic and representative client requests to be issued during the testing process.
2. Description of the Related Art
Use of the Internet and World Wide Web has skyrocketed in recent years. The Internet is a vast collection of computing resources, interconnected as a network, from sites around the world. It is used every day by millions of people. The World Wide Web (referred to herein as the “Web”) is that portion of the Internet which uses the HyperText Transfer Protocol (“HTTP”) as a protocol for exchanging messages. (Alternatively, the “HTTPS” protocol can be used, where this protocol is a security-enhanced version of HTTP.)
A user of the Internet typically accesses and uses the Internet by establishing a network connection through the services of an Internet Service Provider (ISP). An ISP provides computer users the ability to dial a telephone number using their computer modem (or other connection facility, such as satellite transmission), thereby establishing a connection to a remote computer owned or managed by the ISP. This remote computer then makes services available to the user's computer. Typical services include: providing a search facility to search throughout the interconnected computers of the Internet for items of interest to the user; a browse capability, for displaying information located with the search facility; and an electronic mail facility, with which the user can send and receive mail messages from other computer users.
The user working in a Web environment will have software running on his computer to allow him to create and send requests for information, and to see the results. These functions are typically combined in what is referred to as a “Web browser”, or “browser”. After the user has created his request using the browser, the request message is sent out into the Internet for processing. The target of the request message is one of the interconnected computers in the Internet network. That computer will receive the message, attempt to find the data satisfying the user's request, and return the located information to the browser software running on the user's computer.
This is an example of a client-server model of computing, where the machine at which the user requests information is referred to as the client, and the computer that locates the information and returns it to the client is the server. In the Web environment, the server is referred to as a “Web server”. The client-server model may be extended to what is referred to as a “three-tier architecture” or a “multi-tier architecture”. An extended architecture of this type places the Web server in an intermediate tier, where the added tier(s) typically represent data repositories of information that may be accessed by the Web server as part of the task of processing the client's request.
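The request/response exchange described above can be sketched in a few lines. The following is a minimal illustration (not taken from the patent) using Python's standard library: a toy server is started in-process, and a client connection plays the role of the browser sending a request and reading back the located information. The document body and port handling are assumptions made for the example.

```python
import http.client
import http.server
import threading

class DocHandler(http.server.BaseHTTPRequestHandler):
    """A toy Web server that returns one fixed document for any GET request."""
    def do_GET(self):
        body = b"<html>hello</html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example's output quiet

# Bind to an ephemeral port so the example does not collide with anything.
server = http.server.HTTPServer(("127.0.0.1", 0), DocHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client half: send the request and read back the located information.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/index.html")
response = conn.getresponse()
document = response.read()
conn.close()
server.shutdown()
```

In a multi-tier deployment the handler would consult a back-end data repository instead of returning a constant, but the client-facing exchange is the same.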
Because Web applications typically have a human user waiting for the response to a client request, responses from the Web server must be returned very quickly, or the user will become dissatisfied with the service. Usage volumes for a server may be very large: a particular server may receive thousands, or even millions, of client requests in a day's time. These requests must all be handled with acceptable response times, or the users may switch to a competitor's application services.
Verifying that a server, and an application that will run on the server, can handle its expected traffic is a normal part of a stress testing process. Stress testing aims to uncover performance problems before a system goes into actual use, and is performed using simulated traffic. In this manner, any performance problems that are detected from the simulated traffic load can be addressed before any “real” users are impacted. To maximize the usefulness of the stress testing, the tests that are conducted need to be as realistic as possible. In the Web server environment, this means accurately predicting and simulating the number of requests that must be serviced, the type of requests (and mix of request types) that are received, the number of different clients sending requests, etc.
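One piece of the realism described above is reproducing the mix of request types. A simple way to do this is to draw simulated requests from a weighted distribution; the sketch below uses Python's standard library for this. The URLs and weights are illustrative assumptions, not figures from the patent.

```python
import random

# Illustrative request mix: 70% page fetches, 20% image fetches,
# 10% form submissions. These URLs and weights are made up for the example.
REQUEST_MIX = [
    ("GET /index.html", 0.7),
    ("GET /logo.gif",   0.2),
    ("POST /search",    0.1),
]

def sample_requests(n, mix=REQUEST_MIX, seed=1):
    """Draw n simulated client requests according to the configured mix.

    A fixed seed keeps a stress run reproducible, which helps when
    comparing server performance before and after a change.
    """
    rng = random.Random(seed)
    kinds   = [kind for kind, _ in mix]
    weights = [weight for _, weight in mix]
    return rng.choices(kinds, weights=weights, k=n)
```

A real stress driver would also spread these requests across many simulated clients and pace them over time, but the weighted draw is the core of matching the predicted traffic profile.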
The requests received at a server typically originate from a client's browser. (Requests from other sources are outside the scope of the present discussion.) Browsers often make use of a client-side cache, where a local copy of Web documents may be stored after retrieving the document from a server. A browser using the cache checks for a user-requested document in this client-side cache, before requesting it from the server. Browsers implementing the Hypertext Transfer Protocol version 1.1 (“HTTP/1.1”) use an expiration mechanism and a validation mechanism with the client-side cache. These mechanisms are described in detail in sections 13.2 (“Expiration Model”) and 13.3 (“Validation Model”) of the HTTP specification, respectively, and are introduced in section 13 (“Caching in HTTP”). (The HTTP specification is available on the Web at http://info.internet.isi.edu/in-notes/rfc/files/rfc2068.txt.) The expiration mechanism provides that when an unexpired copy of the document is available in the cache, the response time to the user can be minimized by using this cached copy, thereby avoiding a network round trip to the server. When a copy of a document is in the cache, but it is unclear whether this version remains valid, the validation mechanism provides for reducing the network bandwidth by sending a conditional request to the server. A conditional request identifies the version of a document stored at the client by sending a “cache validator” to the server, which is a value the server uses to determine the validity of the client's document. If the server determines that this version is still valid, it responds with a short message to indicate this to the client's browser; the browser will then retrieve the locally-stored copy. Otherwise, when the client's stored copy is no longer valid, the server responds with a fresh copy. The browser uses this returned copy in response to the user's request, and will typically store this copy into the client-side cache.
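The validation exchange just described can be modeled in a few lines. The sketch below is an illustration of the HTTP/1.1 validation model using an entity tag as the cache validator; the function names and the in-memory cache are assumptions made for the example, not part of the patent or of any real browser's implementation.

```python
def serve_conditional(server_etag, server_body, if_none_match=None):
    """Server side of the validation model: when the client's validator
    (carried in an If-None-Match header) still matches, a short
    304 Not Modified reply with no body suffices; otherwise the server
    returns a fresh copy of the document."""
    if if_none_match is not None and if_none_match == server_etag:
        return 304, b""
    return 200, server_body

cache = {}  # url -> (validator, body): a simple client-side cache

def fetch(url, server_etag, server_body):
    """Client side: consult the cache first, and send a conditional
    request whenever a possibly-stale copy is present."""
    validator, body = cache.get(url, (None, None))
    status, fresh = serve_conditional(server_etag, server_body, validator)
    if status == 304:
        return body  # server confirmed our stored copy is still valid
    cache[url] = (server_etag, fresh)  # store the fresh copy, as browsers do
    return fresh
```

The bandwidth saving comes from the 304 path: only the validator travels to the server and only a short status reply comes back, rather than the whole document.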
To provide a meaningful stress test of a Web server, it is necessary to simulate the traffic generated by a single browser as realistically as possible, and to simulate a realistic number of browsers, as previously mentioned. A very large number of browsers (perhaps thousands) may need to be simulated for some environments. Typically, a single client machine will be used to simulate multiple browsers, to limit the number of client machines that are required. For each simulated browser, a number of system resources are required on the client machine on which the browser operates. This often implies that trade-offs in the testing are required for system resources which are in limited supply. When caching browsers are simulated, the client-side cache is one such resource. An actual client cache for a single client can consume a very large amount of storage, on the order of hundreds of thousands (or even millions) of bytes. When simulating caching browsers, an upper bound may be placed on the number of simulated browsers in order to limit the cache storage requirements, but this will reduce the effectiveness of the testing. In particular, imposing a limit on the number of client browsers that can be simulated during a stress test may greatly reduce the ability for the test to provide realistic, representative traffic and to therefore provide useful results. As an alternative to limiting the number of simulated browsers, additional storage resources may be added, but this often greatly increases the expense (and perhaps the complexity) of the testing environment.
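The patent's title ("client meta-cache") suggests one way to attack this storage trade-off: keep only the per-document cache validator rather than the document body, since the validator is all a simulated browser needs in order to put a realistic conditional request on the wire. The sketch below illustrates that idea only; it is not the patented method, and the sizes, header choice, and host name are assumptions made for the example.

```python
# A full client-side cache holds the validator AND the whole document body:
full_cache = {"/index.html": ('"v1"', b"x" * 100_000)}  # ~100 KB per entry

# A "meta-cache" keeps only the validator -- a few bytes per entry -- so one
# client machine can afford to simulate many more caching browsers.
meta_cache = {"/index.html": '"v1"'}

def build_request(url, meta):
    """Build the request a caching browser would send, using only the
    meta-cache. (If-None-Match is a standard HTTP/1.1 header; the host
    name is a placeholder.)"""
    headers = {"Host": "testserver.example"}
    validator = meta.get(url)
    if validator is not None:
        headers["If-None-Match"] = validator  # conditional request
    return "GET", url, headers
```

From the server's point of view, the traffic generated this way is indistinguishable from that of a browser holding a full cache, which is what makes the stress test representative while keeping the per-browser footprint small.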
Several prior art test tools are known, each implementing a different approach for dealing with client-side caches.
Richard Elderkin Nesbitt
Robert John Schroder, II
Kevin Edward Vaughan
A. Bruce Clay
Marcia L. Doubet