Method and system for integration of new media into...

Electrical computers and digital processing systems: multicomputer data transferring – Distributed data processing – Client/server

Reexamination Certificate


Details

C709S217000, C709S219000, C709S231000, C709S232000


active

06816883

ABSTRACT:

FIELD OF THE INVENTION
The present invention relates to computer application technology. In particular, it relates to an improved method and system for providing the end-user of a mainframe application, running a limited, character-oriented transfer protocol like the IBM 3270 protocol, with a combined rendering of non-character data, i.e., new media data, and traditional character data.
BACKGROUND OF THE INVENTION
The so-called new media data extends traditional computer data formats, such as EBCDIC-coded text files and DB2 records and tables, into data formats more natural for the interaction of humans and computers by incorporating images, motion pictures, voice, audio, and video.
This kind of data is becoming increasingly important in the information technology business. It accompanies the traditional computer data, and an end-user dealing with both types of data expects to view them at the same time on the same rendering device.
Within today's information technology environments we normally see a two-tiered or three-tiered infrastructure:
Tier 1 is represented by an “intelligent” PC (Personal Computer), NC (Network Computer), or workstation which is used to render the data produced by applications running on tier 2, the so-called application servers.
In most cases the tier 1 machines communicate with the tier 2 machines through so-called terminal emulators, such as a 3270 emulator attached to an S/390 application server running CICS/IMS applications, or a telnet emulator which connects to application servers using the UNIX operating system, like SAP R/3.
The application server in that case does not know that it is connected to an “intelligent PC” but simply sends ASCII or EBCDIC data down to that “terminal”. In all those cases all functionality resides on the tier 2 application server. That means the structured data is “pushed” from the host to the client and the rendering of this data is controlled by the host.
In the context of said new media, the term ‘rendering’ is to be understood as comprising the playback of image data, audio data, video data, or motion pictures with the respectively suited hardware and/or software arrangement.
The rendering of new media data, in contrast, is normally initiated from the client. A media renderer “pulls” the data from the application server, or at least initiates the “push” from the server. If a user, for example, wants to view a video, the client workstation passes the so-called Universal Resource Locator, further referred to herein as URL, and the corresponding meta data to the server. The server then streams that video to the client. The client workstation, however, has to obtain the URL first in order to initiate a “play” request to the server.
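The following minimal Python sketch illustrates this client-initiated “pull” paradigm. The host name, port number and the plain-text “PLAY” request line are hypothetical assumptions made purely for illustration; the patent does not prescribe any particular wire format.

    # Hypothetical illustration of the client-side "pull": the client already
    # holds the URL and asks a stream server to deliver the asset.
    import socket

    def request_play(stream_host, stream_port, media_url):
        """Send a play request for media_url and yield the streamed bytes."""
        with socket.create_connection((stream_host, stream_port)) as sock:
            # The "PLAY <url>" request line is an invented wire format.
            sock.sendall(("PLAY " + media_url + "\n").encode("ascii"))
            while True:
                chunk = sock.recv(4096)
                if not chunk:  # server closed the stream
                    break
                yield chunk

    # Example usage (hypothetical addresses and player hook):
    # for chunk in request_play("streamserver.example", 7000, "http://streamserver.example/video1.mpg"):
    #     feed_chunk_to_player(chunk)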
So the problem is to combine the two paradigms in such a way that the application server on tier 2 commands all the logic required to render all data on the same end-user workstation.
A prior-art way to solve said problem of combining traditional data and new media data is to “wrap” the traditional user interface on the client workstation by introducing program logic on the client side which performs the integration. In order to achieve that, a windows-oriented “wrapper” program must be written which envelops the mainframe application on the client side, i.e., which accesses the relevant data of each mainframe application ‘panel’ and feeds it to the ‘modern’ back-end program.
Since, however, a mainframe application usually provides the end-user with a very large number of panels for displaying and entering mainframe application data, such work is inherently complex, because it normally requires a change of programming paradigm from the traditional model to an object-oriented model, i.e., a business object model.
The most relevant obstacles, however, to achieving an efficient integration of new media data into those mainframe applications are:
1. In order to extend an existing character-oriented application with new media data, changes have to be applied both to the mainframe application and to the “wrapper” programs running on the client workstations.
2. Such a windows-oriented “wrapper” program has to be installed at multiple locations in the network, i.e., at each end-user location. The maintenance of such an end-user IT environment then requires a large amount of work, as maintenance has to be provided at those multiple, possibly thousands of, locations in the network.
The costs associated with such an approach can thus be tremendous.
SUMMARY OF THE INVENTION
It is thus an object of the present invention to provide a method and system for providing non-character data to a client computer which is coupled to an application server located in a network and which uses a character-oriented protocol in order to run said application, whereby the above-mentioned obstacles are removed.
The present invention provides a way which allows a host-initiated mechanism to render non-character data, such as the before-mentioned image, audio and video data, on a client workstation, while modifying only a single place of application code, namely within the host-based application, e.g., of the CICS or IMS type.
The example used below describes the process of rendering video data using the streaming technique, because this is the most comprehensive case. However, the invention works the same way for rendering image data or audio data using streaming or store-and-forward techniques.
This approach splits up into finding solutions for the following two separate problems:
1. The host application has somehow to pass the media URL to the client workstation, such that the workstation will be able to initiate a play request to the media delivery server, further referred to herein as Stream Server, based on the given URL.
2. Client workstations running CICS and IMS 3270 types of terminal emulation are mostly attached via the so-called SNA protocol, while playing media data requires TCP/IP sessions.
So there is an inherent network protocol and address mapping problem that has to be solved.
Briefly summarizing the basic concepts of the present invention, it is proposed to install an individually programmed program component, called the Server Media Resolution Service (SMRS), on the application server site, and a matching program component, called the Client Media Resolution Service (CMRS), which is a universal, standard component without any individual, application-specific features. The SMRS is told the client computer destination, looks up the requested media address, and feeds this meta information to the CMRS, which in turn manages the start of a client-side media renderer in order to render the new media data received from a datastore such as the File System. A practical example of the media renderer is a media player which requests a streamable asset, provided by a stream server (22), and renders it. Thus, when any change in the host application program is required, such changes have to be made in the host application only; the SMRS and the plurality of CMRS components may remain untouched, which reduces the system programmers' work significantly.
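A rough, hypothetical Python sketch of this hand-off is given below: the SMRS pushes the resolved media URL to the CMRS on the client, and the CMRS starts a local media renderer. The port number, the JSON message layout and the “mediaplayer” command are assumptions made for illustration and are not part of the patent disclosure.

    # Hypothetical SMRS -> CMRS hand-off; port, message format and renderer assumed.
    import json
    import socket
    import subprocess

    CMRS_PORT = 9876  # assumed well-known port of the client-side CMRS daemon

    def smrs_notify_client(client_ip, media_url):
        """SMRS side: send the resolved media URL to the CMRS on the given client."""
        meta = {"action": "render", "url": media_url}
        with socket.create_connection((client_ip, CMRS_PORT)) as sock:
            sock.sendall(json.dumps(meta).encode("utf-8"))

    def cmrs_serve_forever():
        """CMRS side: wait for meta information and start a local media renderer."""
        with socket.create_server(("", CMRS_PORT)) as server:
            while True:
                conn, _addr = server.accept()
                with conn:
                    meta = json.loads(conn.recv(4096).decode("utf-8"))
                    if meta.get("action") == "render":
                        # "mediaplayer" stands in for whatever renderer is installed
                        subprocess.Popen(["mediaplayer", meta["url"]])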
The following summarizing list of items shows the data and command flow and the components involved, according to a preferred aspect of the inventive method and system, respectively.
1) Said SMRS and CMRS components are installed on the application server and on the client, respectively. These components are preferably started on both systems at boot time. When a UNIX system environment is used on the application server, these components can be implemented as daemon processes.
One example of the address mapping in static configurations is that the Client Media Resolution Service tells the Server Media Resolution Service both addresses at startup time: the address of the terminal emulator session, e.g., in an SNA connection the PU/LU address pair of the system, as well as the TCP/IP address of the system. When dynamic configuration techniques are used, the address mapping is done in the Communication Server on the Application Server, and the corresponding addresses can be retrieved dynamically (see the sketch following this list for the static case).
2)
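The following Python sketch illustrates the static-configuration case described under item 1): at boot time the CMRS announces its terminal-session address (here an SNA PU/LU pair) together with its TCP/IP address to the SMRS, which keeps the mapping for later resolution. The record layout and the registration port are hypothetical assumptions for illustration only.

    # Hypothetical static address registration; record layout and port assumed.
    import json
    import socket

    SMRS_PORT = 9875  # assumed registration port of the server-side SMRS daemon

    # SMRS side: maps a terminal-session address (PU/LU pair) to a TCP/IP address.
    address_map = {}

    def smrs_store_registration(raw_record):
        """SMRS side: remember one client's address pair."""
        record = json.loads(raw_record.decode("utf-8"))
        address_map[(record["pu"], record["lu"])] = record["ip"]

    def cmrs_register_at_startup(smrs_host, pu, lu, own_ip):
        """CMRS side: announce both addresses to the SMRS once, at boot time."""
        record = {"pu": pu, "lu": lu, "ip": own_ip}
        with socket.create_connection((smrs_host, SMRS_PORT)) as sock:
            sock.sendall(json.dumps(record).encode("utf-8"))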
