Sensor calibration apparatus, sensor calibration method,...

Data processing: measuring – calibrating – or testing – Calibration or correction system – Position measurement

Reexamination Certificate


Details

C702S094000, C382S154000, C382S285000, C345S419000


active

06792370

ABSTRACT:

FIELD OF THE INVENTION
The present invention relates to a technique for calibrating a position/posture sensor, and to a technique for obtaining the parameters used in transforming an output value of the position/posture sensor in order to measure the position and posture of an image sensing device using the position/posture sensor.
BACKGROUND OF THE INVENTION
In recent years, studies of mixed reality, which aims at the seamless joining of real and virtual spaces, have been conducted extensively. An image display apparatus that presents mixed reality is implemented by superimposing an image of a virtual space (e.g., a virtual object, text information, and the like rendered by computer graphics) onto an image of a real space captured by an image sensing device such as a video camera.
As applications of such an image display apparatus, new fields different from conventional virtual reality are expected, such as operation assistance that superimposes the state inside a patient's body onto the body surface, and a mixed reality game in which a player fights virtual enemies that swim about in the real space.
A common requirement of these applications is accurate alignment between the real and virtual spaces, and many efforts have conventionally been made in this respect.
The problem of alignment in mixed reality amounts to obtaining the three-dimensional (3D) position and posture of an image sensing device on a world coordinate system set on the real space (to be simply referred to as a world coordinate system hereinafter). As a method of solving this problem, it is common practice to use a 3D position/posture sensor such as a magnetic sensor, an ultrasonic sensor, or the like.
In general, the output value of a 3D position/posture sensor indicates the position and posture of a measurement point on a sensor coordinate system which is uniquely defined by the sensor, but is not that of the image sensing device on the world coordinate system. Taking the Polhemus FASTRAK (magnetic sensor) as an example, the position and posture of a receiver on a coordinate system defined by a transmitter are obtained as the sensor output. Therefore, the sensor output value cannot be directly used as the position and posture of the image sensing device on the world coordinate system, and must undergo some calibration processes. More specifically, coordinate transformation that transforms the position and posture of a measurement point into those of the image sensing device, and coordinate transformation that transforms the position and posture on the sensor coordinate system into those on the world coordinate system are required. In this specification, information used to transform the sensor output value into the position and posture of the image sensing device on the world coordinate system will be referred to as calibration information.
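Concretely, each "position and posture" above can be represented as a 4×4 homogeneous transformation matrix, and the calibration described here amounts to composing such matrices. As a minimal illustration (the rotation and translation values below are arbitrary, chosen for this sketch and not taken from the patent), such a matrix maps point coordinates from one coordinate system to another:

```python
import numpy as np

# A 90-degree rotation about the z-axis plus a translation,
# packed into one 4x4 homogeneous transformation matrix.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
M = np.eye(4)
M[:3, :3] = R
M[:3, 3] = [1.0, 0.0, 0.0]

# Transform a point given in the source frame into the target frame.
p_src = np.array([1.0, 0.0, 0.0, 1.0])  # homogeneous coordinates
p_dst = M @ p_src
```

Composing two such matrices by multiplication chains the two coordinate transformations, which is exactly what the calibration process described above requires.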
FIG. 1 is a block diagram showing the functional arrangement of a general image display apparatus which presents mixed reality.
A display screen 110 and a video camera 120 are fixed to a head-mount unit 100. When the user (not shown) wears the head-mount unit 100 so that the display screen 110 is located in front of the user's eyes, a scene in front of the user's eyes is captured by the video camera 120. Therefore, if the image captured by the video camera 120 is displayed on the display screen 110, the user observes, via the video camera 120 and the display screen 110, the scene in front of the eyes that he or she would observe with the naked eye without the head-mount unit 100.
A position/posture sensor 130 is a device for measuring the position and posture, on the sensor coordinate system, of a measurement point fixed to the head-mount unit 100, and comprises, e.g., the Polhemus FASTRAK, a magnetic sensor including a receiver 131, a transmitter 133, and a sensor controller 132. The receiver 131 is fixed to the head-mount unit 100 as the measurement point, and the sensor controller 132 measures and outputs the position and posture of the receiver 131 on the sensor coordinate system with reference to the position and posture of the transmitter 133.
On the other hand, an arithmetic processing unit 170 comprises a position/posture information transformer 140, a memory 150, and an image generator 160, and can be implemented by, e.g., a single general-purpose computer. The position/posture information transformer 140 transforms a measurement value input from the position/posture sensor 130 in accordance with calibration information held in the memory 150 so as to calculate the position and posture of the video camera 120 on the world coordinate system, and outputs the result as position/posture information. The image generator 160 generates a virtual image in accordance with the position/posture information input from the position/posture information transformer 140, superimposes that virtual image on the actual image captured by the video camera 120, and outputs the superimposed image. The display screen 110 receives the image from the image generator 160 and displays it. With this arrangement, the user (not shown) can experience the sensation that a virtual object is present in the real space in front of his or her eyes.
A method of calculating the position and posture of the video camera on the world coordinate system by the position/posture information transformer 140 will be described below using FIG. 2.
FIG. 2 is a view for explaining the method of calculating the position and posture of the video camera on the world coordinate system.
In FIG. 2, let M_TW be the position and posture of a sensor coordinate system 210 (a coordinate system having the position of the transmitter 133 as its origin) on a world coordinate system 200, M_ST be the position and posture of the measurement point (i.e., the receiver 131) of the position/posture sensor 130 on the sensor coordinate system 210, M_CS be the position and posture of the video camera 120 viewed from the measurement point of the position/posture sensor 130, and M_CW be the position and posture of the video camera 120 on the world coordinate system 200. In this specification, the position and posture of object B on coordinate system A are expressed by a 4×4 viewing transformation matrix M_BA from coordinate system A to coordinate system B (the local coordinate system with reference to object B).
At this time, M_CW can be given by:

M_CW = M_CS · M_ST · M_TW   (A)
In equation (A), M_ST is the input from the position/posture sensor 130 to the position/posture information transformer 140, M_CW is the output from the position/posture information transformer 140 to the image generator 160, and M_CS and M_TW correspond to the calibration information required to transform M_ST into M_CW. The position/posture information transformer 140 calculates M_CW based on equation (A), using M_ST input from the position/posture sensor 130 together with M_CS and M_TW held in the memory 150, and outputs it to the image generator 160.
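The calculation of equation (A) can be sketched as a chain of 4×4 matrix multiplications. In the sketch below, the rotations and translations are placeholder values chosen for illustration, not actual calibration data; `M_ST` stands in for a sensor reading that would arrive at run time:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous viewing-transformation matrix
    from a 3x3 rotation R and a 3-vector translation t."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = t
    return M

# Placeholder calibration information (identity rotations, simple offsets):
M_TW = make_transform(np.eye(3), [0.0, 1.5, 0.0])    # sensor frame on world frame
M_CS = make_transform(np.eye(3), [0.05, 0.0, 0.1])   # camera viewed from receiver

# Dummy sensor measurement standing in for a live reading:
M_ST = make_transform(np.eye(3), [0.2, 0.0, -0.3])

# Equation (A): M_CW = M_CS . M_ST . M_TW
M_CW = M_CS @ M_ST @ M_TW
```

With identity rotations, the translations simply accumulate, so the camera position term of M_CW is [0.25, 1.5, -0.2]; with real calibration data, each translation would also be rotated by the matrices to its left.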
In order to attain accurate alignment between the real and virtual spaces, accurate calibration information must be set in the memory 150 by some means. A virtual image that is accurately aligned with the real space can be displayed only when accurate calibration information is given.
Note that the holding form of the calibration information in the memory 150 is not limited to the viewing transformation matrix; any other form may be adopted as long as the information can define the position and posture of one coordinate system viewed from the other coordinate system. For example, the position and posture may be expressed by a total of six parameters: three parameters that describe the position, and three parameters that express the posture using Euler angles. Also, the posture may be expressed by four parameters: a three-valued vector that defines the rotation axis, and a rotation angle about that axis.
