Project 2: Local Feature Matching

Matching features on Notre Dame using SIFT

This project centered on matching local features between two images of the same object taken from slightly different viewpoints. The work consisted of the following steps.

  1. Implement interest point detection
  2. Implement SIFT feature descriptor
  3. Implement feature matching

Interest Point Detection

The Harris corner detector was used to detect interest points. I found that a threshold of 1e-6 produced the best combination of high accuracy and a good number of interest points. The following steps were taken to implement the Harris corner detection algorithm.

  1. Create a Gaussian filter, used to remove noise from the image.
  2. Take the x and y gradients of the filter (derivative-of-Gaussian filters).
  3. Filter the image with these gradient filters to obtain Ix and Iy.
  4. Zero out the image borders so that border pixels don't become interest points.
  5. Compute the product images Ix^2, Iy^2, and IxIy, and blur each with a Gaussian g().
  6. Compute the Harris score: har = g(Ix^2)g(Iy^2) - [g(IxIy)]^2 - alpha[g(Ix^2) + g(Iy^2)]^2
  7. Threshold the score and add the surviving points to the result.
The code below shows how the product images and Harris score were computed and thresholded; a sketch of the earlier steps follows it.

% Harris corner detection code
% isubx, isuby are the x and y gradient images; gaussian_square is a Gaussian filter
gix_square = imfilter(isubx.*isubx, gaussian_square);   % g(Ix^2)
giy_square = imfilter(isuby.*isuby, gaussian_square);   % g(Iy^2)
gixy = imfilter(isubx.*isuby, gaussian_square);         % g(IxIy)

alpha = 0.04;
har = gix_square.*giy_square - gixy.^2 - alpha.*(gix_square + giy_square).^2;
threshold = har > 1e-6;
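
Steps 1 through 4 (building the filters, computing the gradient images, and keeping interest points away from the image borders) aren't shown above. The snippet below is a minimal sketch of how they might look, assuming an input image img and a feature_width of 16; the variable names and filter sizes are illustrative, not the exact ones from my implementation.

% Sketch (illustrative): derivative-of-Gaussian gradients and border suppression
gaussian = fspecial('gaussian', [5 5], 0.7);       % small Gaussian to remove noise
[gx, gy] = gradient(gaussian);                     % x and y gradients of the filter

isubx = imfilter(img, gx);                         % Ix: x gradient of the image
isuby = imfilter(img, gy);                         % Iy: y gradient of the image

gaussian_square = fspecial('gaussian', [9 9], 2);  % larger Gaussian for the product images

% after computing har (see above), zero out a border so interest points never
% fall where the descriptor window would leave the image
border = feature_width;
har(1:border, :) = 0;        har(end-border+1:end, :) = 0;
har(:, 1:border) = 0;        har(:, end-border+1:end) = 0;

% convert the thresholded response into interest point coordinates
[y, x] = find(har > 1e-6);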

Feature Description

I first started off with normalized patches in order to test match_features, but this resulted in only around 50% accuracy on the Notre Dame image pair. I then implemented a SIFT descriptor, which significantly improved the accuracy to about 81% on the Notre Dame image pair. Initially I ignored interest points whose window could go out of range, but found that too few interest points were making it through the pipeline. One of the values I tuned was the sigma of the Gaussian applied to the image; I found that values between 0.6 and 0.8 worked best. The code below shows how the descriptor window is extracted and how the 4x4 grid of cells is binned into 8 orientations.


% 16x16 window centered on the interest point (feature_width = 16)
x_min = interest_point_x - feature_width / 2 + 1;
x_max = interest_point_x + feature_width / 2;
y_min = interest_point_y - feature_width / 2 + 1;
y_max = interest_point_y + feature_width / 2;

magnitude_window = magnitude(y_min:y_max, x_min:x_max);
direction_window = direction(y_min:y_max, x_min:x_max);

% construct the 4x4 grid of cells, each binned into 8 orientations
cellSize = feature_width / 4;
for j = 0:3
    for k = 0:3
        cell_dir = direction_window(j*cellSize+1:(j+1)*cellSize, k*cellSize+1:(k+1)*cellSize);
        cell_mag = magnitude_window(j*cellSize+1:(j+1)*cellSize, k*cellSize+1:(k+1)*cellSize);

        % 8 gradient orientations: sum the magnitudes that fall into each bin
        for orientation = 1:8
            bin_sum = sum(cell_mag(cell_dir == orientation));
            index = 8 * (j * 4 + k) + orientation;
            f(1, index) = bin_sum;
        end
    end
end
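
The magnitude and direction images used above come from the image gradients, and the finished descriptor is normalized before matching. The snippet below is a minimal sketch of those surrounding steps, assuming the blurred image and 8-bin orientation quantization described above; the names (img, angles, etc.) are illustrative rather than taken from my implementation.

% Sketch (illustrative): gradient magnitude/orientation and descriptor normalization
blurred = imfilter(img, fspecial('gaussian', [5 5], 0.7));  % sigma in the 0.6-0.8 range
[gx, gy] = imgradientxy(blurred);

magnitude = sqrt(gx.^2 + gy.^2);

% quantize gradient angles into 8 orientation bins (1..8)
angles = atan2(gy, gx);                         % range [-pi, pi]
direction = ceil((angles + pi) / (2*pi) * 8);
direction(direction == 0) = 1;                  % guard the angle == -pi case

% after filling f for one interest point, normalize it to unit length
f = f ./ norm(f);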

Feature Matching

The final step in the pipeline is to match the features from the previous step. This is done by computing the Euclidean distance between every pair of features from the two images and using the nearest neighbor distance ratio (NNDR) to find the best matches. A threshold is then applied to keep only the most distinctive matches; I found that 0.7 worked best. The following piece of code highlights the NNDR computation.


distance = pdist2(features1, features2, 'euclidean');
threshold = 0.7;

% Sort distances along each row: column 1 is the nearest neighbor,
% column 2 the second nearest
[sorted, index] = sort(distance, 2);
nndr = sorted(:, 1) ./ sorted(:, 2);             % nearest neighbor distance ratio
matchesInFeature1 = find(nndr < threshold);      % indices into features1
matchesInFeature2 = index(nndr < threshold, 1);  % matched indices into features2
matches = [matchesInFeature1 matchesInFeature2];
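
Not shown above: the matches can also be ranked so that the most distinctive pairs come first, which helps when only the top matches are evaluated. A small sketch, using 1 - nndr as an illustrative confidence measure (this ranking is an assumption, not part of the code above).

% Rank matches by confidence (lower NNDR = more distinctive match)
confidences = 1 - nndr(matchesInFeature1);
[confidences, order] = sort(confidences, 'descend');
matches = matches(order, :);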

Results

Original Images

Matched Features

80% accuracy

Original Images

Matched Features

89% accuracy

Original Images

Matched Features

14% accuracy - This image pair had the worst accuracy of all the provided images. What makes it more difficult is the fact that the viewpoint change is combined with a difference in lighting. Matching interest points turned out to be really difficult for this pair.
