Motion video signal processing for recording or reproducing – Local trick play processing – With randomly accessible medium
Reexamination Certificate
1999-10-25
2001-09-04
Garber, Wendy R. (Department: 2615)
Motion video signal processing for recording or reproducing
Local trick play processing
With randomly accessible medium
C386S349000
Reexamination Certificate
active
06285825
ABSTRACT:
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to an optical disc that records MPEG (Moving Picture Experts Group) streams in which video streams and audio streams have been multiplexed. The present invention also relates to a recording apparatus and a computer-readable storage medium storing a recording program for the optical disc.
2. Description of the Background Art
Many movie and home movie fans are not satisfied with merely viewing video images and want to freely edit the content of recorded images.
When editing images, a user may delete an unwanted section from an MPEG stream that has been obtained by multiplexing one or more video streams and audio streams. Users may also change the reproduction order of an edited MPEG stream as desired.
File systems that handle MPEG streams in the way a computer handles files have attracted increasing attention for their role in realizing the editing functions described above. The term “file system” is a general name for a data structure for managing the areas on a random-access storage medium, such as a hard disk drive or an optical disc. As one example, file systems standardized under ISO/IEC (International Organization for Standardization/International Electrotechnical Commission) 13346 are used to store MPEG streams in files.
In such a file system, the files that store MPEG streams are managed using management information called directory files and file entries. Of these, a file entry includes a separate allocation descriptor for each extent that composes a file. Each allocation descriptor includes a logical block number (LBN) showing the recording position of an extent in the file and an extent length showing the length of the extent. By updating the logical block numbers (LBN) and extent lengths, logical sectors on a disc medium can be set as “used” or “unused”. This enables the user to partially delete data in units of logical sectors.
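As a purely illustrative sketch of this mechanism (the class names, field names, and front-of-file deletion policy below are assumptions made for this example, not the structures defined by ISO/IEC 13346), a partial deletion in logical-sector units can be modeled as nothing more than a rewrite of the allocation descriptors:

from dataclasses import dataclass
from typing import List

LOGICAL_SECTOR_SIZE = 2048  # bytes per logical sector

@dataclass
class AllocationDescriptor:
    lbn: int            # logical block number where the extent starts
    extent_length: int  # length of the extent in bytes

@dataclass
class FileEntry:
    descriptors: List[AllocationDescriptor]

def delete_leading_sectors(entry: FileEntry, sectors_to_delete: int) -> None:
    """Partially delete a file from its start, in whole logical sectors,
    purely by rewriting the allocation descriptors; no stream data is read."""
    remaining = sectors_to_delete * LOGICAL_SECTOR_SIZE
    kept = []
    for desc in entry.descriptors:
        if remaining >= desc.extent_length:
            # The whole extent falls inside the deleted region: its sectors
            # simply become "unused" and the descriptor is dropped.
            remaining -= desc.extent_length
            continue
        if remaining > 0:
            # The extent is cut part-way through: advance its start LBN and
            # shorten its extent length by the deleted amount.
            desc = AllocationDescriptor(
                lbn=desc.lbn + remaining // LOGICAL_SECTOR_SIZE,
                extent_length=desc.extent_length - remaining,
            )
            remaining = 0
        kept.append(desc)
    entry.descriptors = kept

Because such a rewrite operates only on sector-sized units, it knows nothing about what the sectors contain, which is the source of the problem described next.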
When a user partially deletes an MPEG stream where the minimum deletable unit is one logical sector of 2,048 bytes, decoding may not be possible for the resulting video stream and/or audio stream.
This problem arises because the partial deletion is performed without regard to the actual amount of MPEG stream data stored in each logical sector. Under the DVD Standard, data is recorded as MPEG streams compressed according to the MPEG-2 Standard, and the data size of each pack to be recorded on a DVD is set equal to the logical sector size.
As a result, one pack in an MPEG stream is recorded in each logical sector. Here, a pack refers to a unit of data in an MPEG stream. Under MPEG, video streams and audio streams are divided into data divisions of a predetermined size. These data divisions are then converted into packets. A grouping of one or more packets is a pack. Packs are given time stamps for data transfer of the MPEG stream, making packs the unit used for data transfer. On a DVD, there is a one-to-one correspondence between packs and packets, so one packet exists within each pack. Video packs store divided data for three kinds of picture data, namely Intra (I), Predictive (P), and Bidirectionally Predictive (B) pictures. An I picture results from compression of an image using spatial frequency characteristics within the image, without referring to other images. A P picture results from compression of an image using correlation with preceding images. A B picture results from compression of an image using correlation with both preceding and succeeding images.
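As a rough, hypothetical model of the nesting just described (one packet per pack on a DVD, with video packets carrying divisions of I, P, or B picture data), the following sketch uses illustrative names only:

from dataclasses import dataclass
from enum import Enum
from typing import Optional

PACK_SIZE = 2048  # on a DVD, one pack occupies one 2,048-byte logical sector

class PictureType(Enum):
    I = "intra"                        # coded without reference to other pictures
    P = "predictive"                   # coded using correlation with preceding pictures
    B = "bidirectionally_predictive"   # uses preceding and succeeding pictures

@dataclass
class Packet:
    stream_kind: str                             # "video" or "audio"
    payload: bytes                               # one division of the elementary stream
    picture_type: Optional[PictureType] = None   # set only for video packets

@dataclass
class Pack:
    scr: int        # time stamp governing when the pack is transferred to the decoder
    packet: Packet  # on a DVD there is exactly one packet per pack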
When a partial deletion operation updates the management information, video packs that store one frame of picture data may be partially deleted. If B pictures or P pictures that refer to the partially deleted frame of picture data remain, decoding of such pictures will no longer be possible.
For audio, audio frame data for a plurality of frames is stored in one audio pack. Hereafter, the term “audio frame data” refers to the amount of audio data that is reproduced for one audio frame. This is generally called an “access unit”. For an MPEG stream, this is the minimum unit for both decoding and reproduction output.
To give specific examples, the Dolby AC-3 method uses a frame length of 32 msec for the encoded audio stream, MPEG audio uses a frame length of 24 msec, and LPCM (Linear Pulse Code Modulation) uses a frame length of approximately 1.67 msec (1/600 sec to be precise). Since the bitrate when decoding audio frame data for Dolby AC-3 is 192 kbps, the size of one set of audio frame data is 768 bytes (32 msec × 192 kbps).
When loading audio frame data into packs, the payload size of a pack is subject to a maximum of 2,016 bytes. For Dolby AC-3, this is 2.625 times the audio frame data size, a non-integer value. Since the payload size is a non-integer multiple of the audio frame data size, dividing the audio stream into units of the payload size of the packs and storing the data divisions in order in packs will result in certain sets of audio frame data extending over a boundary between audio packs.
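The mismatch can be checked with simple arithmetic. The sketch below (using the figures given above; the function itself is only illustrative and ignores pack and packet headers) divides a run of AC-3 audio frames at 2,016-byte payload boundaries and reports which frames end up extending over a boundary:

AUDIO_FRAME_BYTES = 192_000 * 32 // 1000 // 8   # 32 msec at 192 kbps = 768 bytes
PACK_PAYLOAD_BYTES = 2016                       # maximum audio payload of one pack

print(PACK_PAYLOAD_BYTES / AUDIO_FRAME_BYTES)   # 2.625 -- not an integer multiple

def straddling_frames(num_frames: int) -> list:
    """0-based indices of audio frames whose data crosses a payload boundary
    when the audio stream is cut into consecutive payload-sized divisions."""
    crossing = []
    for i in range(num_frames):
        start = i * AUDIO_FRAME_BYTES
        last = start + AUDIO_FRAME_BYTES - 1
        if start // PACK_PAYLOAD_BYTES != last // PACK_PAYLOAD_BYTES:
            crossing.append(i)
    return crossing

print(straddling_frames(8))   # [2, 5, 7]: roughly one frame in three is split

Exactly which frames are split in FIG. 1 also depends on the space consumed by pack and packet headers, but the underlying cause is the same non-integer ratio.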
The upper part of FIG. 1 shows example audio frames. In FIG. 1, each section between the “<” and “>” symbols is an audio frame, with the “<” symbol showing the presentation start time and the “>” symbol showing the presentation end time. This notation for audio frames is also used in the following drawings. The audio frame data that should be reproduced (presented) for an audio frame is inputted into a decoder before the presentation start time of the audio frame. This audio frame data should be taken out of the buffer by the decoder at the presentation start time.
The lower part of FIG. 1 shows an example of how the audio frame data to be reproduced in each audio frame is stored in audio packs. In this figure, the audio frame data to be reproduced for audio frames f81 and f82 is stored in audio pack A71, the audio frame data for audio frame f84 is stored in audio pack A72, and the audio frame data for audio frame f86 is stored in audio pack A73.
The audio frame data for audio frame f83 is divided between the audio pack A71 that comes first and the audio pack A72 that comes later. In the same way, the audio frame data for audio frame f85 is divided between the audio pack A72 that comes first and the audio pack A73 that comes later. The reason the audio frame data to be reproduced for one audio frame is divided and stored in two audio packs is that the boundaries between audio frames do not match the boundaries between packs. The reason that such boundaries do not match is that the data structure of packs under the MPEG standard is entirely unrelated to the data structure of audio streams.
If a partial deletion in logical sector (pack) units is performed by updating the file management information while a set of audio frame data extends over a pack boundary as shown in FIG. 1, any set of audio frame data extending over the pack boundary that marks a boundary of the partial deletion will be cut. As a result, one part of the audio frame data will be located in a pack that is managed as “unused” while the other part will be located in a pack that is managed as “used”. An example of a set of audio frame data that extends over a pack boundary is the audio frame data for audio frame f83 in FIG. 1.
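To make the consequence concrete, the following sketch (continuing the hypothetical numbers and header-free layout of the earlier example) marks the first pack payload as deleted and counts how many bytes of a straddling frame survive in the packs that remain managed as “used”:

AUDIO_FRAME_BYTES = 768      # one AC-3 audio frame: 32 msec at 192 kbps
PACK_PAYLOAD_BYTES = 2016    # maximum audio payload of one pack

def surviving_bytes(frame_index: int, packs_deleted: int) -> int:
    """Bytes of the given audio frame still held in packs managed as "used"
    after the first `packs_deleted` pack payloads are marked "unused"."""
    frame_start = frame_index * AUDIO_FRAME_BYTES
    frame_end = frame_start + AUDIO_FRAME_BYTES
    deleted_end = packs_deleted * PACK_PAYLOAD_BYTES
    return max(0, frame_end - max(frame_start, deleted_end))

# Frame 2 occupies stream bytes 1536-2303 and therefore straddles the
# payload boundary at byte 2016.  Deleting the first pack leaves only its tail.
print(surviving_bytes(2, 1))   # 288 of 768 bytes remain -> decoding is no longer guaranteed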
The MPEG standard stipulates that a continuous stream is reproduced from beginning to end and uses a model in which the unit for decoding is one set of audio frame data. Accordingly, an MPEG decoder performs decoding under the premise that the beginning and end of the continuous stream are boundaries of sets of audio frame data. As a result, there is no guarantee that a decoder will be able to correctly decode an audio stream that includes sets of audio frame data whose beginning or end is missing, since some of the audio frame data needed for decoding has been lost.
To ensure that an MPEG stream can be properly decoded after a partial deletion, it is necessary to first read the MPEG stream before the partial deletion, to sepa
Miwa Katsuhiko
Okada Tomoyuki
Tsuga Kazuhiro
Yagi Tomotaka
Boccio Vincent F.
Garber Wendy R.
Matsushita Electric Industrial Co., Ltd.
Wenderoth, Lind & Ponack, L.L.P.