Sudipta N. Sinha and Marc Pollefeys, Department of Computer Science, UNC Chapel Hill
Code Released!
Existing algorithms for automatic 3D reconstruction of dynamic scenes from multi-viewpoint video require calibrated and synchronized cameras. Our approach recovers all the necessary information by analyzing the motion of the silhouettes in the video, removing the need for specific calibration data or a pre-calibration phase. The first step consists of independently recovering the temporal offset and epipolar geometry between different camera pairs using an efficient RANSAC-based algorithm that randomly samples the 4D space of epipoles and finds corresponding extremal frontier points on the silhouettes. In the next stage, the calibration and synchronization of the complete camera network is recovered. For unsynchronized video streams, silhouettes interpolated based on sub-frame temporal offsets produce more accurate visual hulls. We demonstrate our approach on six different datasets acquired by computer vision researchers, containing 4 to 25 viewpoints with cameras in general configurations.
Posters: [cpvr04.ppt] [birs2006.ppt]
Talk: [icpr04.ppt]
Results
We provide code for computing the epipolar geometry (fundamental matrix) of a camera pair by analyzing the silhouettes of moving objects in video. The current version of the software assumes synchronized input. Future versions will allow the input video to be unsynchronized. We plan to release code for the full camera network calibration pipeline soon.
Download
How to Install
Setting up the Input
How to Run
Download
[silcalib-1.0.tar.gz]
>> tar -zxvf silcalib-1.0.tar.gz
The Linux binaries were built on the following platform:
>> uname -a
Linux vision07.cs.unc.edu 2.6.9-55.0.12.ELsmp #1 SMP Wed Oct 17 08:19:30 EDT 2007 i686 i686 i386 GNU/Linux
>> gcc -v
gcc version 3.4.6 20060404 (Red Hat 3.4.6-8)
The following libraries need to be installed.
- lapack 3.0.25.1
http://www.netlib.org/lapack/
- levmar-2.1.3 or higher
http://www.ics.forth.gr/~lourakis/levmar/
- glut
http://www.opengl.org/resources/libraries/glut/
- vxl (version 1.5.1)
http://vxl.sourceforge.net
Check the binaries' dependencies with ldd:
>> ldd computeSilEG
>> ldd computeSilTE
1. Create the dataset folder structure.
Let's call our dataset 'test1'.
>> cd ../SilCalib/data
>> mkdir test1
>> mkdir test1/input
>> mkdir test1/output
>> cd test1/input
2. Create a folder for each video sequence.
E.g., if there are 4 sequences, create folders c00, c01, c02 and c03.
3. Create a file test1/input/seq.conf containing the folder name for each video.
E.g., for the above case:
== seq.conf ==
c00
c01
c02
c03
==============
4. Each folder (e.g. c00) must contain gzipped pgm files (greyscale or binary images of silhouettes) named as:
fr0000.pgm.gz
fr0001.pgm.gz
.
.
fr0200.pgm.gz
NOTE: Our program expects the same number of images in each sequence.
5. Place a chosen frame from each image sequence in the test1/input folder and name it 0?.jpg. Thus for 4 sequences, there should be 4 files: 00.jpg, 01.jpg, 02.jpg and 03.jpg. These should be corresponding frames.
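The setup steps above can be condensed into one shell session. This is a sketch, not part of the release: the dataset name 'test1' and the 4-camera layout are just the example from this section, and it is written to run from SilCalib/data.

```shell
# Sketch of the 'Setting up the Input' steps for the example
# 4-camera dataset named 'test1'.
mkdir -p test1/input test1/output

# One folder per video sequence; seq.conf lists the folder names.
for c in c00 c01 c02 c03; do
    mkdir -p "test1/input/$c"
    echo "$c"
done > test1/input/seq.conf

cat test1/input/seq.conf
```

Place the gzipped silhouette frames (fr0000.pgm.gz, ...) inside each c?? folder afterwards.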
To process dataset 'test1':
>> cd ../SilCalib/bin
1. Preprocess the video to generate 'dat' files containing the dual of the convex hull of the silhouettes in each video.
(a) Batch Process:
>> silTE.sh ../data/test1 0 200 fr
<arg1> : dataset foldername
<arg2> : start index
<arg3> : end index
<arg4> : image name prefix
(b) Individual Script (interactive visualization mode):
>> computeSilTE.sh -i ../data/test1/input/c00 -s 9 -e 7499 -p fr -f pgm -u
-i <string> : image sequence foldername
-s <int> : start index
-e <int> : end index
-p <string> : image name prefix
-f <string> : image suffix (format)
-u : images are zipped.
(c) Individual Script (export dat file mode):
>> computeSilTE.sh -i ../data/test1/input/c00 -s 9 -e 7499 -p fr -f pgm -u -c testfile.dat
-c <string> : output dat filename
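To preprocess every sequence listed in seq.conf with the same flags, a small wrapper loop can help. This sketch is not part of the release: it only prints the computeSilTE.sh command for each folder (a dry run; pipe the lines to sh to actually execute them), it writes a stand-in seq.conf so it is self-contained, and the local ./test1 path and index range are just the example values from above.

```shell
# Hypothetical helper: print one computeSilTE.sh invocation per
# sequence folder named in seq.conf (dry run).
DATASET=./test1

# Stand-in seq.conf so the sketch is self-contained.
mkdir -p "$DATASET/input"
printf 'c00\nc01\nc02\nc03\n' > "$DATASET/input/seq.conf"

# Emit the command line for each sequence folder.
while read -r seq; do
    echo computeSilTE.sh -i "$DATASET/input/$seq" -s 0 -e 200 -p fr -f pgm -u
done < "$DATASET/input/seq.conf" > silte_commands.txt

cat silte_commands.txt
```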
2. Check that the 'dat' files were created (one for each video).
>> cd ../data/test1/output
Here you should find a seq??.dat for each sequence, which looks like the following:
===== seq00.dat ================
Numframes 7491
Numtheta 360
Image_Width 720
Image_Height 480
Image# 0
NumCHVerts 26
NumBoundaryPoints 0
356 69
8
285 80
11
.
.
.
================================
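The header fields of a dat file can be pulled out with a short awk script. The sketch below is only illustrative: the heredoc reproduces the header of the sample dump above rather than reading a real output file, and the field names are taken from that dump.

```shell
# Extract header fields from a seq??.dat file with awk.
# Stand-in file mimicking the header of the sample dump above.
cat > seq00.dat <<'EOF'
Numframes 7491
Numtheta 360
Image_Width 720
Image_Height 480
EOF

# Pick out frame count and image size from the key/value header lines.
awk '$1 == "Numframes"    { frames = $2 }
     $1 == "Image_Width"  { w = $2 }
     $1 == "Image_Height" { h = $2 }
     END { printf "frames=%s size=%sx%s\n", frames, w, h }' seq00.dat > dat_summary.txt

cat dat_summary.txt
```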
3. Run Epipolar Geometry Estimation code
(a) Run pairwise estimation individually.
Execute the script silPairEG.sh with 3 arguments: dataset name, first and second sequence index.
E.g.
>> silPairEG.sh ../data/test1 00 01
This generates 3 output files in the current location:
- test
- test.inl
- test.hs
================================= test ====================================
FMatrix
-0.0000021424  0.0000177074 -0.0101552914
-0.0000132953  0.0000128819  0.0325015120
 0.0136052903 -0.0479733899  1.0000000000
Epipole1
3398.6769711906 984.7122948355 1.0
Epipole2
2225.6945251689 664.6637406537 1.0
Res0 0.204745 Resf 0.201229 Score 3949.8 Inliers 321 Total 559
===========================================================================
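The summary line of the `test` file is easy to post-process. The sketch below writes a stand-in copy of that line (taken from the dump above) and uses awk to compute the inlier fraction; the field positions are assumed to follow that dump.

```shell
# Compute the inlier fraction from the summary line of a 'test' file.
# Stand-in line copied from the sample dump above.
echo 'Res0 0.204745 Resf 0.201229 Score 3949.8 Inliers 321 Total 559' > test.summary

# Field 8 is the inlier count, field 10 the total correspondence count.
awk '/Inliers/ { printf "inlier fraction = %.3f\n", $8 / $10 }' test.summary > frac.txt

cat frac.txt
```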
The test.inl file contains all the inlier correspondences:
================================= test.inl ====================================
321
309.000000 403.000000 329.000000 456.000000
364.000000 70.000000 391.000000 42.000000
363.000000 70.000000 388.000000 41.000000
.
.
===========================================================================
The test.hs file contains the original interest points and the re-projected Hartley-Sturm triangulated points in the two views.
================================= test.hs ====================================
363.9828 70.0569 364.0000 70.0000 391.0140 41.9590 391.0000 42.0000
363.0347 69.8851 363.0000 70.0000 387.9717 41.0827 388.0000 41.0000
362.0867 69.7129 362.0000 70.0000 384.9295 40.2063 385.0000 40.0000
.
.
==============================================================================
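Assuming each test.hs record holds, for both views, a re-projected point followed by the original point (which is how the dump above reads), the per-point residual is just the Euclidean distance between the two. A hedged awk sketch, using two stand-in records copied from that dump:

```shell
# Per-point re-projection residuals from test.hs-style records.
# Assumed record layout (8 numbers per match, per the dump above):
#   rx1 ry1 x1 y1 rx2 ry2 x2 y2
# where (rx,ry) is the Hartley-Sturm re-projected point and (x,y) the original.
cat > test.hs <<'EOF'
363.9828 70.0569 364.0000 70.0000 391.0140 41.9590 391.0000 42.0000
363.0347 69.8851 363.0000 70.0000 387.9717 41.0827 388.0000 41.0000
EOF

# One output line per match: residual in view 1, residual in view 2.
awk 'function d(ax, ay, bx, by) { return sqrt((ax-bx)^2 + (ay-by)^2) }
     { printf "%.3f %.3f\n", d($1,$2,$3,$4), d($5,$6,$7,$8) }' test.hs > residuals.txt

cat residuals.txt
```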
(b) Process all pairs
Run the script 'silEG.sh'. This will attempt to estimate the epipolar geometry for all pairs of cameras in the dataset.
>> silEG.sh ../data/test1
Output files will be created in ../data/test1/output/
For a 4-camera dataset, there will be 6 camera pairs, denoted by a.b where a < b.
E.g., with 4 cameras, the 6 pairs will be:
00.01
00.02
00.03
01.02
01.03
02.03
For each pair, we will see 3 files:
- F.a.b
- F.a.b.inl
- F.a.b.hs
For each seq??.dat file, you will see a seq??.dat.key file. These are subsets of the original dat files, pertaining only to the keyframes in the video.
NOTE: For statistics:
>> grep Score F.??.??
F.00.01:Res0 0.218158 Resf 0.214303 Score 10773.7 Inliers 393 Total 581
F.00.02:Res0 0.220191 Resf 0.203222 Score 1484.4 Inliers 198 Total 439
F.00.03:Res0 0.206237 Resf 0.205421 Score 6988.9 Inliers 335 Total 490
F.01.02:Res0 0.314871 Resf 0.307046 Score 521.1 Inliers 138 Total 450
F.01.03:Res0 0.236671 Resf 0.227118 Score 5412.7 Inliers 388 Total 616
F.02.03:Res0 0.254390 Resf 0.254390 Score 1703.0 Inliers 222 Total 361
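The grep output above can be summarized per pair. This sketch writes a stand-in copy of two of those lines and computes the inlier ratio for each; the field positions are assumed to match the lines shown above.

```shell
# Inlier ratio per camera pair from the 'grep Score F.??.??' output.
# Stand-in lines copied from the statistics above.
cat > scores.txt <<'EOF'
F.00.01:Res0 0.218158 Resf 0.214303 Score 10773.7 Inliers 393 Total 581
F.01.02:Res0 0.314871 Resf 0.307046 Score 521.1 Inliers 138 Total 450
EOF

# Split on ':' and spaces: field 1 is the pair name, field 9 the inlier
# count, field 11 the total correspondence count.
awk -F'[: ]+' '{ printf "%s: %.2f\n", $1, $9 / $11 }' scores.txt > ratios.txt

cat ratios.txt
```

Low ratios (e.g. F.01.02 above) flag pairs whose epipolar geometry may need a second look.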
4. Visualize the estimated epipolar geometry results using the 'matlab' script.
>> cd SilCalib/plot
>> cat README
Run MATLAB and follow the instructions in the README.
5. Run Projective Reconstruction Code
>> buildPCN.sh ../data/test1 720 480
Output files will be created in ../data/test1/output/
For each F.a.b, you will see an updated fundamental matrix: Ff.a.b
For each camera, you will see a "Pp.a" file.
E.g., for 4 cameras, you will see:
Pp.00 Pp.01 Pp.02 Pp.03
And three more files:
- measurements.proj
- projective.pts
- bundle.fmap
As the code runs, it reports the final re-projection error.
6. Run Self Calibration Code (MATLAB)
(a) Run script - selfcalibration
<matlab $> help selfcalibration
% selfCalibration(dataset, w, h);
%
% Matlab script that runs self-calibration on the projective camera matrices
% to compute the Euclidean projection matrices and writes them to a file.
%
(b) Run script - genpoints
<matlab $> help genpoints
% genpoints(dataset);
%
% Matlab script that prepares the input file containing 3d points and 2d image tracks.
% The generated file is called sba.pts.input and will be used as input to the
% Euclidean bundle adjustment.
(c) Run script - sba_residual
<matlab $> sba_residual('../data/test1', 1);
<matlab $> help sba_residual
% Syntax: Res = sba_residual(dataset, option)
% Input:
%   dataset : dataset name
%   option  : = 1 for initial residual, = 2 for final residual
% Output:
%   Plots the error distribution
%   Computes the residual error
7. Run Euclidean Bundle Adjustment (code derived from sba)
>> cd SilCalib/eucba
>> eucba <camera file> <points file>
e.g.
>> eucba data/test1/output/sba.cams.input data/test1/output/sba.pts.input
…..
Then copy the following files back into ../data/test1/output:
- sba.cams.output
- sba.pts.output
You can check the re-projection error with the following matlab script:
<matlab $> sba_residual('../data/test1', 1);
8. Retrieve Final Calibration (MATLAB)
<matlab $> help retrieve_calib
% retrieve_calib(dataset, w, h);
%
% Matlab script that converts the final calibration from the bundle adjustment
% output file format to the usual camera calibration format.
The final calibration is in ../data/test1/output/cameras.calib