Method and apparatus for intelligent ranging via image...

Image analysis – Applications – Range or distance measuring


Details

Other classes: C382S194000, C382S299000, C382S260000, C348S622000, C359S385000
Type: Reexamination Certificate
Status: active
Patent number: 06711280

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a sensor system capable of measuring the relative position and attitude of moving and stationary objects. In particular, this sensor system can detect and track objects equipped with surfaces that act as retroreflectors in the visible and near-infrared part of the spectrum. Such surfaces are already available in the taillights of all cars, trucks, and motorcycles, as well as in roadway lane markers, and can be easily and cheaply added to objects in other environments, such as railroads, factories, and airports.
2. Description of the Prior Art
One of the emerging trends in technology development is the addition of autonomous capabilities to many new products in ground transportation (cars, buses, trucks, trains), in aviation (commercial aircraft, military drones) and in specialized applications (factory automation, airport service facilities). This trend is greatly facilitated by the miniaturization of electronic components, the rapidly decreasing cost of computing power, and the recent surge of technology transfer from military to commercial applications. These advances have not only made it technically feasible to build systems that would have been unthinkable a few years ago, but have also dramatically decreased the cost of their implementation, thus making them suitable for mass production and commercial deployment.
The motivation for this trend towards autonomous operation comes primarily from considerations of safety, comfort, and cost.
Safety is the main beneficiary in cases where unmanned drones (all-terrain vehicles, airplanes, helicopters) are used in hazardous environments. Examples of such applications include: searching for trapped people in burning buildings, collapsed structures, and spaces filled with poisonous gases; filming exclusive footage of natural disasters such as exploding volcanoes; and military operations for de-mining, reconnaissance, and surveillance behind enemy lines. The use of human-operated vehicles in these environments would endanger the health or even the lives of the operators, and would also impose minimum size restrictions that would make it impossible to explore small spaces where people may be trapped. Increased safety is also the main concern in on-board vehicle systems such as collision warning, collision avoidance, lane departure warning, and lane keeping. These systems warn the driver/operator with an audible and visible signal when the vehicle is about to collide with another object or to leave its current lane on the roadway. If so equipped, they also automatically actuate the brakes and/or steering to reduce speed or change course, thereby avoiding the collision or keeping the vehicle on its current course.
In applications such as adaptive cruise control, where the speed of the vehicle is automatically adjusted to follow the preceding vehicle at a safe distance, or vehicle following, where the vehicle's speed and direction are adjusted to follow the course of the preceding vehicle, the main consideration is the comfort and convenience of the driver/operator, with increased safety being a secondary but very important benefit.
Finally, significant cost savings motivate future applications such as electronic towing, highway platooning, automated airport vehicles, and automated manufacturing robots. In electronic towing, two or more commercial vehicles are operated in tandem, with the first vehicle being manually driven by a human operator, and the following vehicles being “electronically towed” without drivers, thereby reducing the number of drivers and the associated cost by 50% or more. In highway platooning, traffic is segmented into “platoons”, each composed of several cars that follow each other at very small distances of 1-2 m, driven not by their human occupants (who can resume manual operation once their car leaves the platoon), but by the on-board electronics that automate the steering, acceleration, and braking functions. This “automated highway system” has the potential to significantly increase the traffic throughput of existing highways at a mere fraction of the cost of building new highways able to handle the same additional traffic, while also improving the safety and comfort of the people who use the highway system for their transportation needs. While these applications may be several years away from actual implementation, the same technology can be used in the near term to automate airport vehicles that carry baggage and goods between terminals and airplanes, at a much lower cost than with human drivers. The same concept also applies to factory automation, where driverless vehicles can carry parts that are loaded and unloaded by automated robots.
These applications are currently in different stages of deployment. Collision warning, lane departure warning, and adaptive cruise control systems are already available as commercial products in high-end passenger cars and commercial trucks; unmanned drones are already used in military operations; and automated robots are already fully operational in many modern factories. Collision avoidance, lane keeping, vehicle following, and automated airport vehicles are still under development, but are approaching the point of commercial product release, while electronic towbars and automated highway systems are in the research stage, with several successful demonstrations already completed. The three major factors that differentiate these applications and influence the timeline of their deployment are: (1) whether their operation is autonomous or cooperative, (2) whether they operate in a controlled or uncontrolled environment, and (3) whether their role is passive or active. For example, collision warning systems are autonomous, because they rely only on measurements gathered by the host vehicle and do not require any special modifications to the surrounding cars and highway environment; they operate in the uncontrolled environment of public highways; and they passively warn the driver of an impending collision. Adaptive cruise control is also autonomous and operates in an uncontrolled environment, but it is an active system, since it actuates the throttle and brake to increase or decrease speed in order to maintain a safe distance from the preceding vehicle. Electronic towbar and automated highway systems are active (they actuate the steering in addition to the throttle and brake) and operate in an uncontrolled environment, but they are not autonomous since they rely on cooperation from their environment, namely from the preceding vehicle in the case of the electronic towbar, or from the other platoon members and the roadway infrastructure in the case of automated highways. Finally, airport and factory automation vehicles are active and cooperative systems, but they operate in a controlled environment where unexpected events can be kept to a minimum.
Despite their differences, all these applications share a common trait: they all need sensors that can provide accurate and reliable information about the surrounding environment. From collision warning to automated airport vehicles, and from adaptive cruise control to multi-car platooning, each of these systems depends critically on its “eyes”, namely the ranging sensors that “see” other cars on the highway or other robots and obstacles on the factory floor, and provide crucial information about how far each of these objects is, which direction it is coming from, and how fast it is approaching.
The currently available sensor technologies can be classified into five main categories: radar (microwave or millimeter-wave), computer vision, time-of-flight laser, sonar, and GPS. These are detailed below in order of increasing utility for the applications discussed above.
Sonar sensors emit acoustic pulses and measure the time it takes for the pulse to bounce off the target and return to the sensor, usually called the “time of flight”. Multiplying this time by the speed of sound yields the distance from the source to the target and back; half of that product gives the range to the target.
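
As a simple illustration of the time-of-flight calculation described above (not taken from the patent itself), the sketch below converts a measured round-trip time into a one-way range; the function name and the nominal 343 m/s speed of sound in air are assumptions chosen for the example.

```python
# Illustrative sketch of sonar time-of-flight ranging (hypothetical helper,
# not part of the patent): the pulse travels to the target and back, so the
# one-way range is half the product of round-trip time and the speed of sound.

SPEED_OF_SOUND_M_S = 343.0  # nominal speed of sound in air at about 20 C


def sonar_range_m(time_of_flight_s: float,
                  speed_of_sound_m_s: float = SPEED_OF_SOUND_M_S) -> float:
    """Return the one-way distance to the target, in meters."""
    return speed_of_sound_m_s * time_of_flight_s / 2.0


if __name__ == "__main__":
    # A round-trip time of 29 ms corresponds to roughly 5 m of range.
    print(f"{sonar_range_m(0.029):.2f} m")  # ~4.97 m
```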
