Ahmet Cecen
This script showcases test cases for camera calibration and fundamental matrix estimation. The project consists of three main parts.
Part 1 - Camera Projection Matrix
The objective here is to compute a mapping from actual 3D world coordinates to the 2D coordinates in an image.
We can calculate the projection matrix from corresponding 2D and 3D points by solving a system of linear equations. This is implemented as discussed in the class notes, using SVD.
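A minimal sketch of that linear solve, assuming NumPy and at least six 3D-2D correspondences (the function name and array layout are illustrative, not the project's actual interface):

```python
import numpy as np

def compute_projection_matrix(pts3d, pts2d):
    """Estimate the 3x4 projection matrix M (up to scale) from n >= 6
    corresponding 3D points (n x 3) and 2D image points (n x 2)."""
    n = pts3d.shape[0]
    A = np.zeros((2 * n, 12))
    for i in range(n):
        X, Y, Z = pts3d[i]
        u, v = pts2d[i]
        # Each correspondence contributes two rows of the homogeneous
        # system A m = 0, where m is M flattened row-major.
        A[2 * i]     = [X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u]
        A[2 * i + 1] = [0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v]
    # The solution is the right singular vector associated with the
    # smallest singular value (last row of Vt).
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)
```

Since the system is homogeneous, the recovered matrix is only defined up to a scale factor (and sign).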
We can also find the camera center by factoring the projection matrix.
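One way to carry out that factoring, sketched in NumPy (assuming M = [Q | m4] with Q invertible; since M projects the center to the zero vector, C = -Q⁻¹ m4):

```python
import numpy as np

def compute_camera_center(M):
    """Recover the camera center C from the 3x4 projection matrix
    M = [Q | m4] via C = -Q^{-1} m4, which satisfies M [C; 1] = 0."""
    Q, m4 = M[:, :3], M[:, 3]
    return -np.linalg.inv(Q) @ m4
```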
Part 2 - Fundamental Matrix Estimation
This section showcases the mapping of points between two images of the same scene taken from different angles. The general idea is that the geometric relationship between the two views is consistent, so it is possible to determine, to within reasonable accuracy, where a point in one image can possibly lie in the other. The collection of possible locations forms a line, referred to as an epipolar line.
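To make the idea concrete, here is a small hypothetical example (NumPy, identity intrinsics assumed, so F = [t]ₓR for a known camera motion): given a fundamental matrix F and a point x in the first image, the epipolar line in the second image is l = F x, and the true corresponding point x' satisfies x'ᵀ l = 0.

```python
import numpy as np

def skew(t):
    """Cross-product matrix: skew(t) @ v == np.cross(t, v)."""
    return np.array([[0, -t[2], t[1]],
                     [t[2], 0, -t[0]],
                     [-t[1], t[0], 0]])

# Known relative camera motion (rotation about y, plus a translation).
th = 0.1
R = np.array([[np.cos(th), 0, np.sin(th)],
              [0, 1, 0],
              [-np.sin(th), 0, np.cos(th)]])
t = np.array([1.0, 0.1, 0.0])
F = skew(t) @ R                          # exact F for identity intrinsics

X = np.array([0.3, -0.2, 4.0])           # a 3D point seen by both cameras
xa = np.append(X[:2] / X[2], 1.0)        # projection in image a (homogeneous)
X2 = R @ X + t
xb = np.append(X2[:2] / X2[2], 1.0)      # projection in image b

line = F @ xa                            # epipolar line in image b
residual = xb @ line                     # ~0: x_b lies on the line
```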
It is possible to determine this mapping by solving a system of linear equations. Below are the results of the naive implementation.
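That linear solve can be sketched as the unnormalized eight-point algorithm (NumPy; the names are illustrative). Each correspondence (x, x') contributes one row of the constraint x'ᵀ F x = 0, and the rank-2 property of F is enforced afterward:

```python
import numpy as np

def estimate_fundamental_matrix(pts_a, pts_b):
    """Unnormalized eight-point algorithm: given n >= 8 matches
    (n x 2 arrays), solve A f = 0 by SVD and enforce rank 2 on F,
    with the convention x_b^T F x_a = 0."""
    n = pts_a.shape[0]
    A = np.zeros((n, 9))
    for i in range(n):
        u, v = pts_a[i]
        up, vp = pts_b[i]
        A[i] = [up*u, up*v, up, vp*u, vp*v, vp, u, v, 1]
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # A generic least-squares solution is full rank; project it onto the
    # rank-2 matrices by zeroing the smallest singular value.
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0
    return U @ np.diag(S) @ Vt
```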
As per the suggestion in the rubric, we can also normalize the coordinates of interest points in both images to overcome numerical stability issues.
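The normalization step might look like the following sketch (zero-mean translation plus scaling so the mean distance from the origin is √2 is one common choice; the exact scheme suggested in the rubric may differ). If Ta and Tb are the normalizing transforms for the two images and Fn is estimated from the normalized points, then F = Tbᵀ Fn Ta recovers the matrix in the original coordinates:

```python
import numpy as np

def normalize_points(pts):
    """Translate points (n x 2) to zero mean and scale them so the mean
    distance from the origin is sqrt(2). Also return the 3x3 transform T
    that applies the same mapping to homogeneous coordinates."""
    mean = pts.mean(axis=0)
    centered = pts - mean
    scale = np.sqrt(2) / np.mean(np.linalg.norm(centered, axis=1))
    T = np.array([[scale, 0.0, -scale * mean[0]],
                  [0.0, scale, -scale * mean[1]],
                  [0.0, 0.0, 1.0]])
    return centered * scale, T
```

Because (Tb x_b)ᵀ Fn (Ta x_a) = x_bᵀ (Tbᵀ Fn Ta) x_a, the de-normalization is a single matrix product.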
Notice that while both implementations are reasonably accurate, the normalized implementation fits the interest points slightly better. This is especially evident at the top-right interest point.
Part 3 - RANSAC
The final portion of this project demonstrates how to eliminate false feature matches between two images using RANSAC and the fundamental matrix. This is done by filtering out feature matches that do not lie close enough to the estimated epipolar lines.
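A compact sketch of that loop (NumPy; the sample size of 8, iteration count, and inlier threshold are illustrative parameters, not the project's tuned values). Each iteration fits F to a random 8-point sample with the eight-point algorithm and counts the matches whose point in the second image lies within a threshold distance of the epipolar line of its match:

```python
import numpy as np

def eight_point(pts_a, pts_b):
    """Linear estimate of F (convention x_b^T F x_a = 0), rank-2 enforced."""
    u, v = pts_a[:, 0], pts_a[:, 1]
    up, vp = pts_b[:, 0], pts_b[:, 1]
    A = np.column_stack([up*u, up*v, up, vp*u, vp*v, vp, u, v, np.ones(len(u))])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0
    return U @ np.diag(S) @ Vt

def ransac_fundamental(pts_a, pts_b, n_iters=2000, thresh=1.0, rng=None):
    """Keep the F (fit to random 8-point samples) with the most inliers,
    where an inlier's point in image b lies within `thresh` of the
    epipolar line of its match from image a."""
    rng = np.random.RandomState(0) if rng is None else rng
    n = pts_a.shape[0]
    ha = np.hstack([pts_a, np.ones((n, 1))])
    hb = np.hstack([pts_b, np.ones((n, 1))])
    best_F, best_inliers = None, np.zeros(n, dtype=bool)
    for _ in range(n_iters):
        sample = rng.choice(n, 8, replace=False)
        F = eight_point(pts_a[sample], pts_b[sample])
        lines = ha @ F.T                  # epipolar line l_i = F x_a,i
        # Point-to-line distance |x_b . l| / ||(l1, l2)|| for each match.
        d = np.abs(np.sum(hb * lines, axis=1)) / np.linalg.norm(lines[:, :2], axis=1)
        inliers = d < thresh
        if inliers.sum() > best_inliers.sum():
            best_F, best_inliers = F, inliers
    return best_F, best_inliers
```

In practice the threshold is expressed in the units of the image coordinates (e.g. pixels), and the final F is often refit to all inliers of the best sample.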
The naive implementation of RANSAC, as discussed in class, yields the following results for each image pair using the normalized estimator.
Mount Rushmore
Notre Dame
Gaudi
Woodruff Dorm