Automatic exposure control system and method

Patent Number: 06836288
Type: Reexamination Certificate
Status: active
Filed: 1999-02-09
Issued: 2004-12-28
Examiner: Christensen, Andrew (Department: 2615)
Classification: Television – Camera, system and detail – Combined image signal generator and general image signal...
Other Classes: C348S297000, C348S221100, C348S362000, C348S065000
ABSTRACT:
BACKGROUND OF THE INVENTION
1. Field of the Invention
The field of the invention relates generally to exposure control systems for imaging devices and, more specifically, to exposure control systems of the type in which there is a need for a rapid and smooth response to exposure control commands.
2. Background of the Invention
In recent years, a need for miniature, lightweight video cameras has developed in the medical and industrial fields. These cameras typically comprise a camera head containing an imaging device, such as a charge-coupled device (CCD); a light source; a camera control unit containing control and video processing circuitry; and a display device, such as a computer monitor, television screen, or microscope. The camera head is attached to the camera control unit via a cable or through a wireless interface, thereby allowing the camera head to be inserted into and positioned at remote and/or confined locations. Once the camera head is positioned, light from the light source is used to illuminate the location of interest, typically after passage through the cable or wireless interface. The light reflects off the location of interest to form images of the desired location. The images are captured by the imaging device in the camera head and then displayed on the display device.
A medical application for such video cameras is endoscopy, in which an endoscope is passed through a small incision in the patient to permit viewing of the surgical site. The endoscope is optically coupled to the camera head. Images of the surgical site are captured by the imaging device in the camera head and displayed on the display device. Advantageously, the endoscope allows substantially non-invasive viewing of the surgical site.
Likewise, numerous industrial applications exist for such video cameras. In one such application, a video camera in combination with other tools allows work to be performed in areas that would otherwise not permit access. Examples include the use of miniature video cameras to view inaccessible piping networks situated behind drywall and the like, interior locations on industrial equipment, and underwater locations in sunken ships or the like that are inaccessible to divers.
Additional details on endoscopic video cameras are contained in U.S. Pat. Nos. 5,841,491; 5,696,553; 5,587,736; 5,428,386; and co-pending U.S. patent application Ser. Nos. 09/044,094; 08/606,220; and 08/589,875; each of which is owned by the assignee of the subject application, and each of which is hereby fully incorporated by reference herein as though set forth in full.
A characteristic of many of these applications is a diverse and rapidly changing scene of interest. In endoscopic applications, for example, as the surgeon manipulates the endoscope, the scene of interest may rapidly change to encompass one or more bodily fluids or structures, including blood, which is light absorptive; moist tissue, which is light reflective; and other diverse body structures such as cartilage, joints, and body organs. The bright light sources typically used in such applications, coupled with the diverse and rapidly changing reflective characteristics of elements within the field of view, give rise to an illumination level of reflected light which changes rapidly and over a broad range. The result is that image capture devices such as the CCD can easily become saturated and over-exposed. Exposure control systems are thus needed to avoid saturation of the image capture device, to avoid overexposure and underexposure conditions, to deal with the diverse and rapidly changing reflection characteristics of the elements in the scene of interest, and also to ensure that the image capture device and related components are operating in optimal or preferred ranges.
Unfortunately, current exposure control systems react too slowly to the changing reflection characteristics, and develop unstable brightness fluctuations or oscillations if configured to run more quickly. It is not uncommon for these systems to take up to several seconds to react to overexposure and underexposure conditions, during which time the image is lost and the scene is unviewable. This image loss makes current systems unsuitable for endoscopic applications, in which any image loss poses an unacceptable health risk to the patient, given that power tools, sharp surgical instruments, and electrosurgical devices can quickly damage healthy tissue if they are not continuously in view and controllable by the surgeon. Similar concerns are present in industrial applications, in which the power tools utilized by the operator may quickly damage the work site.
The problem is compounded due to the nature of current image capture devices, such as CCDs, in which there is an inherent delay between the detection of a condition requiring a change in the exposure level of the device and the responsiveness of the device to such a command. The problem can be explained with reference to FIG. 1, which illustrates a video camera system in which the image capture device is a CCD. The imaging system comprises sensor array 5, readout register 6, amplifier 7, video display device 8, and control device 1. Together, the sensor array 5 and readout register 6 comprise CCD or image sensor 9.
The sensor array 5 comprises a plurality of individual photo sites 14, typically arranged in rows and columns. Each site is configured to build up an accumulation of charge responsive to the incidence of illumination on the site. The geometry is such that the spatial distribution of charge build-up in the individual sites matches the spatial distribution of the intensity of light reflected from the scene of interest and defining the scene image. An image is captured when the charge is allowed to build up in the individual sites in the same spatial orientation as the spatial orientation of the reflected light defining the image.
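As a rough sketch of this charge-accumulation behavior (the array size, full-well limit, sensitivity, and scene values below are assumed, illustrative numbers, not figures from the patent), the photo sites 14 can be modeled as a two-dimensional array in which each element integrates charge in proportion to the light falling on it:

    import numpy as np

    # Illustrative sensor geometry and full-well limit (assumed values).
    ROWS, COLS = 480, 640
    FULL_WELL = 30000.0   # charge (arbitrary units) at which a photo site saturates

    def accumulate(illumination, integration_time_s, sensitivity=1.0):
        # Each photo site builds up charge in proportion to the light incident
        # on it over the integration time, clipping at the full-well limit.
        charge = illumination * sensitivity * integration_time_s
        return np.clip(charge, 0.0, FULL_WELL)

    # Example: a scene whose left half reflects twice as much light as its
    # right half; the brighter half saturates at the full-well limit.
    scene = np.ones((ROWS, COLS))
    scene[:, :COLS // 2] *= 2.0
    frame_charge = accumulate(scene * 1.0e6, integration_time_s=0.01667)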
Periodically, the accumulated charge in the individual sites is removed and stored in readout register 6. Then, the contents of the readout register 6 are serially output onto signal line 15 in a manner which is consistent with the protocol of display device 8. The signal on signal line 15 is then provided as an input to display device 8. The output on signal line 15 comprises the output of image sensor 9.
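The transfer-and-readout cycle described here can be sketched as follows; the function and variable names are illustrative, and the sketch ignores the amplifier 7 and the display protocol, showing only a line-by-line parallel transfer into a register followed by a serial shift-out:

    from collections import deque

    def read_out(frame_charge):
        # Shift the accumulated charge into a readout register one line at a
        # time, then clock the register out serially to form the
        # one-dimensional signal placed on the output line.
        readout_register = deque()
        serial_output = []
        for row in frame_charge:             # parallel transfer of one line
            readout_register.extend(row)
            while readout_register:          # serial shift-out of that line
                serial_output.append(readout_register.popleft())
        return serial_output

    # Example with a tiny 2x3 "sensor":
    print(read_out([[1, 2, 3], [4, 5, 6]]))  # -> [1, 2, 3, 4, 5, 6]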
In one implementation, a video frame must be presented to the display device 8 once every 1/60 second, or 16.67 milliseconds (ms). A video frame in this implementation consists of 262.5 lines of pixels, with a line being presented every 63.6 μs. According to this implementation, the accumulated charge in the individual sites 14 of sensor array 5 is removed to readout register 6, and the contents of the readout register 6 are serially output on signal line 15, once every 1/60 second.
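As a quick consistency check on these figures (the arithmetic below is implied by the text rather than stated in it), 262.5 lines at 63.6 μs per line works out to essentially the 1/60-second frame period quoted above:

    LINES_PER_FRAME = 262.5
    LINE_TIME_US = 63.6

    # 262.5 lines * 63.6 us/line = 16,695 us, i.e. roughly 16.67 ms (1/60 s).
    frame_time_ms = LINES_PER_FRAME * LINE_TIME_US / 1000.0
    print(frame_time_ms)         # -> 16.695
    print(1.0 / 60.0 * 1000.0)   # -> 16.666...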
Control device 1 controls the time period during which the individual sites 14 in sensor array 5 are allowed to accumulate charge during a frame. In one implementation, this is accomplished as follows. Control device 1 determines a control parameter equal to the number N of lines in a frame during which the individual sites are to be kept in a discharged state. It then sends over control line 2 a plurality of discharge pulses in which the number of pulses is N and the total duration of the pulses is N × 63.6 μs. The remaining portion of the frame, known as the integration time or integration time period, is the time period over which the individual sites are allowed to accumulate charge. Since the frame time is 16.67 ms and the time per line is 63.6 μs, the integration time in milliseconds per frame is 16.67 − N × 63.6 × 10⁻³.
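This relationship can be restated as a short sketch (the function name and the example value of N below are illustrative assumptions, not taken from the patent):

    FRAME_TIME_MS = 16.67     # one frame period, 1/60 second
    LINE_TIME_MS = 63.6e-3    # 63.6 microseconds per line, expressed in ms

    def integration_time_ms(n_discharge_lines):
        # Integration time remaining in a frame after the photo sites are held
        # in a discharged state for N line periods: 16.67 - N * 63.6e-3 ms.
        return FRAME_TIME_MS - n_discharge_lines * LINE_TIME_MS

    # Example: holding the sites discharged for 100 of the 262.5 lines leaves
    # about 16.67 - 6.36 = 10.31 ms of integration time.
    print(round(integration_time_ms(100), 2))   # -> 10.31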
The situation is illustrated in FIG. 2. FIG. 2(a) illustrates a timing pulse which, in accordance with the NTSC standard, occurs every 1/60 sec., in which each occurrence of the pulse defines a separate frame capture. These timing pulses define separate frame periods. Indicated in the figure is the capture of frames 1 …
Law Firm: Howrey Simon Arnold & White, LLP
Assignee: Linvatec Corporation
Inventor: Tran, Nhan