Vehicle navigational system and signal processing method for...

Television – Special applications – Navigation

Reexamination Certificate


Details

US classification: 348/119 (C348S119000)
Type: Reexamination Certificate
Status: active
Patent number: 06414712

ABSTRACT:

BACKGROUND OF THE INVENTION
The invention relates to a signal processing method for a vehicle navigational system.
Vehicle navigational systems based on various techniques, together with examples of their application, are described, for example, in the publication “Kraftfahrzeugtechnik”, 1993, section “Technology Dictionary”. A controlled communication scheme for vehicle convoys, for use with motor-vehicle navigational systems, is described in the publication “Automobiltechnische Zeitschrift” 90 (1988), pages 429-436.
A vehicle navigational system with sensor fusion of a radar sensor and a video camera is known from a publication by E. Young et al. in the IEE Colloquium on “Prometheus and Drive” (Digest No. 172), London, UK, Oct. 15, 1992. The radar sensor and video camera measurements are carried out independently of each other, and a common set of measurements with their probability distributions is subsequently created through data merging.
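As a rough, non-authoritative illustration of such data merging, the sketch below fuses two independent measurements of a single target parameter (for example an azimuth angle) under an assumed Gaussian error model; the function name and the numerical values are hypothetical and are not taken from the cited publication.

    def fuse_independent_measurements(mean_radar, var_radar, mean_camera, var_camera):
        """Inverse-variance weighted fusion of two independent measurements
        of the same target parameter (assumed Gaussian error model)."""
        fused_var = 1.0 / (1.0 / var_radar + 1.0 / var_camera)
        fused_mean = fused_var * (mean_radar / var_radar + mean_camera / var_camera)
        return fused_mean, fused_var

    # Hypothetical example: radar azimuth 12.0 deg (variance 4.0),
    # camera azimuth 11.2 deg (variance 1.0).
    azimuth, variance = fuse_independent_measurements(12.0, 4.0, 11.2, 1.0)
    print(azimuth, variance)  # fused estimate lies closer to the more precise camera value

The fused variance is smaller than either individual variance, which is the usual motivation for merging independent sensor measurements of the same target.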
When an image sensor arrangement, e.g. a video camera, with automatic image evaluation is used to detect and classify the form, color, direction, aspect angle, etc. of objects of interest, the processing expenditure in the associated image evaluation system is very high owing to the complexity of the observed scenery.
SUMMARY OF THE INVENTION
It is the object of the present invention to specify a signal processing method for a vehicle navigational system that supplies reliable environmental information with high efficiency.
This is achieved by a signal processing method for a vehicle navigational system having a radar arrangement and an image sensor arrangement that monitor essentially the same solid-angle regions, in which target parameters are determined separately from the signals received by the radar arrangement and by the image sensor arrangement, and in which the target parameters determined for corresponding solid-angle regions are linked through data merging. The method includes the steps of creating reduced data sets, and the target parameters derived from them, by suitably linking the data during the evaluation of the signals received from the radar and image sensor arrangements; specifying solid-angle sections for the image evaluation during the evaluation of the signals received from the radar arrangement; and evaluating the signals received from the image sensor arrangement solely within these specified solid-angle sections.
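As a minimal sketch of this gating step (not the implementation disclosed in the specification), the following code restricts the image evaluation to pixel regions derived from radar target azimuths; the linear camera model, the field-of-view and image-width parameters, and the angular margin are hypothetical assumptions introduced only for illustration.

    import numpy as np

    # Hypothetical camera parameters: horizontal field of view and image width.
    FOV_DEG = 60.0
    IMAGE_WIDTH = 640

    def azimuth_to_column(azimuth_deg):
        """Map an azimuth angle (degrees, 0 = optical axis) to a pixel column,
        using a simple linear model for illustration only."""
        col = int((azimuth_deg / FOV_DEG + 0.5) * IMAGE_WIDTH)
        return max(0, min(IMAGE_WIDTH - 1, col))

    def sections_from_radar(radar_azimuths_deg, margin_deg=2.0):
        """Convert radar target azimuths into the solid-angle (here: pixel-column)
        sections to which the image evaluation is restricted."""
        return [(azimuth_to_column(az - margin_deg), azimuth_to_column(az + margin_deg))
                for az in radar_azimuths_deg]

    def evaluate_only_sections(image, sections):
        """Crop the camera image to the radar-specified sections; only these
        crops would be passed on to the expensive object classification stage."""
        return [image[:, left:right + 1] for (left, right) in sections]

    # Usage: two radar targets at -10 deg and +5 deg azimuth.
    frame = np.zeros((480, IMAGE_WIDTH), dtype=np.uint8)
    crops = evaluate_only_sections(frame, sections_from_radar([-10.0, 5.0]))

Because only the small radar-specified crops reach the classification stage, most of the image data never passes through the full evaluation pipeline, which is the efficiency gain the method aims at.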
Advantageous embodiments and modifications of the invention will be described in the following description.


REFERENCES:
patent: 5296924 (1994-03-01), de Saint Blancard
patent: 5801970 (1998-09-01), Rowland et al.
patent: 5928299 (1999-07-01), Sekine et al.
patent: 2149257 (1985-06-01), None
patent: 2289816 (1995-11-01), None
patent: 5342499 (1993-12-01), None
patent: 07081604 (1995-03-01), None
patent: 07125567 (1995-05-01), None
R. L. Harvey et al.: “Biological Vision Models for Sensor Fusion”. In: First Conference on Control Applications, 1992, vol. 1, pp. 392-397.
E. Young et al.: “Improved Obstacle Detection by Sensor Fusion”. In: IEE Colloquium on “Prometheus and Drive”, London, UK, 1992, pp. 2/1-2/6.
