Collaborative server processing of content and...

Electrical computers and digital processing systems: support – Computer virus detection by cryptography

Reexamination Certificate


Details

C713S167000, C713S152000, C709S201000, C709S205000

Reexamination Certificate

active

06275937

ABSTRACT:

TECHNICAL FIELD
This invention relates in general to improved data processing systems. In a particular aspect, the present invention relates to a collaborative method of server processing of data objects in a server network. In a more particular aspect, the present invention relates to a collaborative method of virus checking data objects in a network of servers, based on meta-information associated with each object.
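The following fragment is offered purely as an informal sketch of that general idea, not as the patented implementation: it assumes each data object carries a meta-information dictionary, and a server consults that meta-information before scanning, scans only when no trusted verdict has been recorded, and records its own verdict so that other servers in the network can reuse it. The helper names and metadata keys (scan_with_local_engine, "scan-verdict", and so on) are hypothetical.

    # Illustrative sketch only: collaborative virus checking driven by meta-information.
    # The function names and metadata keys below are hypothetical, not the patent's API.
    import hashlib

    def scan_with_local_engine(data: bytes) -> bool:
        """Stand-in for a real virus-scanning engine; returns True if the data looks clean."""
        return b"EICAR" not in data          # placeholder heuristic, for illustration only

    def collaborative_check(data: bytes, meta: dict) -> bool:
        """Check an object, reusing any scan verdict already recorded in its meta-information."""
        digest = hashlib.sha256(data).hexdigest()
        # If another server already scanned this exact object, trust its recorded verdict.
        if meta.get("scanned-digest") == digest and "scan-verdict" in meta:
            return meta["scan-verdict"] == "clean"
        # Otherwise scan locally and record the result for the next server in the network.
        clean = scan_with_local_engine(data)
        meta["scanned-digest"] = digest
        meta["scan-verdict"] = "clean" if clean else "infected"
        return clean

In such a scheme, an object already checked by one server in the collaborating network would not need to be re-scanned by the next.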
GLOSSARY OF TERMS
While dictionary meanings are also implied by certain terms used herein, the following glossary of terms may be useful.
Client
A client is a computer system that issues commands to a server, which performs the task associated with each command.
Hypertext Markup Language (HTML)
HTML is the language used by Web servers to create and connect hypertext documents that are viewed by Web clients.
Hypertext Transfer Protocol (HTTP)
HTTP is an example of a stateless protocol, which means that every request from a client to a server is treated independently; the server keeps no record of previous connections. At the beginning of a Universal Resource Locator (URL), “http:” indicates that the requesting client and the target server should communicate using the HTTP protocol regarding the specified resource. For the latest specification, see RFC 2068, R. Fielding et al., “Hypertext Transfer Protocol—HTTP/1.1,” January 1997, obtainable via URL: http://www.cis.ohio-state.edu/htbin/rfc/rfc2068.html, or refer to D. E. Comer, Internetworking with TCP/IP: Principles, Protocols, and Architecture, Prentice Hall, Englewood Cliffs, N.J., 1988, for details on retrieving Requests For Comments (RFCs) using electronic mail and the File Transfer Protocol (FTP).
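By way of illustration only, the short Python sketch below uses the standard http.client module to issue one such self-contained request; the host and path are borrowed from the hypothetical URL example given later in this glossary, and nothing about the exchange persists on the server once the response has been returned.

    # Sketch of a single, stateless HTTP request/response exchange.
    # Host and path come from the illustrative URL used later in this glossary.
    import http.client

    conn = http.client.HTTPConnection("www.philipyu.com", 80)
    conn.request("GET", "/table.html")        # one independent request
    resp = conn.getresponse()                 # one independent response
    print(resp.status, resp.reason)           # e.g. 200 OK
    conn.close()                              # the server keeps no record of this connection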
HTTP Daemon (HTTPD)
A web server having Hypertext Transfer Protocol and Common Gateway Interface capability. The HTTPD is typically supported by an access agent which provides the hardware connections to machines on the intranet and access to the Internet.
Internet
The network of networks and gateways that use the Transmission Control Protocol/Internet Protocol (TCP/IP) suite of protocols.
Internet Browser or Web Browser
A graphical interface tool that runs Internet protocols such as HTTP, and displays results on a customer's screen. The browser can act as an Internet tour guide, complete with pictorial desktops, directories and search tools used when a user “surfs” the Internet. In this application, the Web browser is a client service which communicates with the World Wide Web.
Meta-Information and Meta Data
Any information or data associated with a given object. For example, in HTTP, information can be associated with both HTTP requests and responses by including it as a field in the transaction's HTTP header; e.g., the length of a returned data object can be specified in the HTTP response header's “Content-Length:” field.
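For example, the following sketch (assuming Python's standard urllib is available) retrieves a data object and reads meta-information from the response header; Content-Length is the HTTP/1.1 header field carrying the length of the returned object, and the URL is the same hypothetical example used in the URL entry below.

    # Sketch: reading meta-information carried in HTTP response header fields.
    from urllib.request import urlopen

    with urlopen("http://www.philipyu.com:80/table.html") as resp:
        print("Content-Length:", resp.getheader("Content-Length"))  # length of the object
        print("Content-Type:  ", resp.getheader("Content-Type"))    # type of the object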
MIME (Multipurpose Internet Mail Extensions)
A technique for sending arbitrary data through electronic mail on the Internet. For details, see N. Borenstein et al., “MIME (Multipurpose Internet Mail Extensions) Part One: Mechanisms for Specifying and Describing the Format of Internet Message Bodies,” RFC 1521, Sep. 23, 1993.
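As an informal illustration of wrapping arbitrary data for transmission by electronic mail, the sketch below uses Python's standard email package to attach a small binary payload as a MIME part; the addresses and payload are hypothetical.

    # Sketch: wrapping arbitrary binary data in a MIME message (hypothetical addresses/payload).
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "sender@example.com"
    msg["To"] = "recipient@example.com"
    msg["Subject"] = "Data object attached"
    msg.set_content("The requested data object is attached.")
    msg.add_attachment(b"\x00\x01binary-data", maintype="application",
                       subtype="octet-stream", filename="object.bin")
    print(msg.as_string()[:200])   # the MIME headers describe the attached data's format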
Multicasting
The transmission of a data stream, such as a multimedia stream, to multiple recipients at once (e.g., pushing video data out over the Internet).
PICS (Platform for Internet Content Selection)
PICS specifies a method of sending meta-information concerning electronic content. It is a Web Consortium Protocol Recommendation (for details, see J. Miller et al., eds., “PICS Label Distribution Label Syntax and Communication Protocols,” retrievable via URL: http://www.w3.org/pub/WWW/TR/REC-PICS-labels.html).
Server
Any computer that performs a task at the command of another computer is a server. A Web server typically supports one or more clients.
Universal Resource Locator (URL)
A way to uniquely identify or address information on the Internet. A URL can be considered a Web document version of an e-mail address. URLs can be cumbersome if they belong to documents buried deep within other documents, but they can be reached via a hyperlink. An example of a URL is “http://www.philipyu.com:80/table.html”. A URL has four components. Starting from the left, the first component specifies the protocol, which is separated from the rest of the locator by a “:”. Next is the hostname or Internet Protocol (IP) address of the target host; this is delimited by a “//” on the left and on the right by a “/” or optionally a “:”. The port number is optional and is delimited on the left from the hostname by a “:” and on the right by a “/”. The fourth component is the actual file name or program name. In this example, the “.html” extension indicates that this is an HTML file.
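The four-way decomposition just described can be reproduced with Python's standard urllib.parse module; the sketch below splits the example URL into its protocol, host, port, and file name components.

    # Sketch: splitting the example URL into its four components.
    from urllib.parse import urlparse

    parts = urlparse("http://www.philipyu.com:80/table.html")
    print("protocol:", parts.scheme)      # "http"
    print("host:    ", parts.hostname)    # "www.philipyu.com"
    print("port:    ", parts.port)        # 80 (optional component)
    print("file:    ", parts.path)        # "/table.html", an HTML file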
World Wide Web (WWW or Web)
The Internet's application that lets people seeking information on the Internet switch from server to server and database to database by clicking on highlighted words or phrases of interest (hyperlinks). An Internet WWW server supports clients and provides information. The Web can be thought of as the Internet with all of its resources addressed as URLs, using HTML to display the information corresponding to those URLs and to provide a point-and-click interface to other URLs.
BACKGROUND OF THE INVENTION
The rapid increase in the popularity of the Internet has led to a corresponding increase in the need for processing network traffic. This processing includes both that which is mandatory (e.g., encrypted data that must be decrypted before it can be used), and that which is recommended (e.g., executable code retrieved from public Internet archives that should be checked for viruses before it is used).
Typically, when a user wants to process data retrieved from the Internet, they do so by installing and running a program that performs the desired processing on their client machine. Employees of a company, for example, might be asked or required to install and run a particular virus checking program (e.g., the AntiVirus™ product sold by International Business Machines Corporation (IBM)) on all data that they obtain from the Internet, in order to protect the company's intranet from infestation. Having individual users perform required processing like this is not a practical solution for several reasons:
Non-technical users may not be able to use the required software due to a lack of technical know-how.
Other users may not use the software simply because it is too much bother.
If a given type of processing has any idiosyncrasies (e.g., executables whose filenames end in “exebin” sometimes, rather than simply “exe”), all users will have to be trained to handle them.
Adding and changing the required processing can be difficult for users.
An organization has no way to check that all users are performing the required processing, and thus it is difficult to implement an enforceable processing policy.
Also, if three users pull in the same piece of data, each will have to check it, performing exactly the same processing on exactly the same piece of data.
In an attempt to centralize the processing of data retrieved from the Internet, intranets that connect to the Internet through a firewall can have the processing performed by the firewall itself (see, e.g., the Norton AntiVirus for Firewalls product sold by Symantec Corporation, which runs on PCs running the Windows NT 3.51 operating system sold by Microsoft). However, this solution creates other problems:
To perform the additional processing, the firewall must run an additional, more complex network-traffic filtering program, thereby increasing the chances that hackers will be able to find a way through the firewall.
Compare theorem 3 (“Exposed machines should run as few programs as possible; the ones that are run should be as small as possible”) of William R. Cheswick and Steven M. Bellovin, Firewalls and Internet Security: Repelling the Wily Hacker, Addison-Wesley Professional Computing Series, Reading, Mass., 1994, p. 7.
Since most firewalls are not meant to process assembled data streams, the speed of communication through the fire
