Table of Contents

Introduction

About Feature Extraction

Feature extraction is one of the critical steps in Visual SLAM. Specifically, it addresses the data association problem: deciding which observations in different frames correspond to the same 3D point in the scene.

About ORB feature extraction

ORB is short for Oriented FAST and Rotated BRIEF. As an OpenCV enthusiast, the most important thing about ORB is that it came from "OpenCV Labs". The algorithm was introduced by Ethan Rublee, Vincent Rabaud, Kurt Konolige and Gary R. Bradski in their 2011 paper "ORB: An efficient alternative to SIFT or SURF". As the title says, it is a good alternative to SIFT and SURF in computation cost, matching performance and, above all, patents.

Yes, SIFT and SURF were patented, and you were supposed to pay for their use (SIFT's patent has since expired, in March 2020). But ORB is not!

Demo mini project

ORB in OpenCV

In OpenCV (C++), we create an ORB object with the factory function:

Ptr<ORB> orb = ORB::create(500, 1.2f, 8, 31, 0, 2, ORB::HARRIS_SCORE, 31, 20);

It has a number of optional parameters. The most useful ones are nfeatures, the maximum number of features to retain (500 by default), and scoreType, which chooses between the Harris score and the FAST score for ranking features (Harris score by default). Another parameter, WTA_K, decides the number of points that produce each element of the oriented BRIEF descriptor. By default it is 2, i.e. two points are compared at a time; in that case, NORM_HAMMING distance is used for matching. If WTA_K is 3 or 4, so that 3 or 4 points produce each descriptor element, then the matching distance is NORM_HAMMING2.

Sample inputs

Download two sample images and rename them 1.jpg and 2.jpg.

Code

main.cpp

// main.cpp
#include <iostream>
#include <opencv2/core/core.hpp>
#include <opencv2/features2d/features2d.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/imgproc/imgproc.hpp>

using namespace std;
using namespace cv;

int main(int argc, char** argv) {
    if (argc != 3) {
        cout << "usage: feature_extraction img1 img2" << endl;
        return -1;
    }

    // Read the images and downsize
    Mat img_1 = imread(argv[1], IMREAD_COLOR);
    Mat img_2 = imread(argv[2], IMREAD_COLOR);
    if (img_1.empty() || img_2.empty()) {
        cout << "could not read the input images" << endl;
        return -1;
    }
    resize(img_1, img_1, Size(640, 480));
    resize(img_2, img_2, Size(640, 480));

    // Step 1: Initialize descriptors and key points of ORB
    vector<KeyPoint> keypoints_1, keypoints_2;
    Mat descriptors_1, descriptors_2;
    Ptr<ORB> orb = ORB::create(500, 1.2f, 8, 31, 0, 2, ORB::HARRIS_SCORE, 31, 20);
    // Parameters:
    // 500: nfeatures
    // 1.2f: scaleFactor
    // 8: nlevels
    // 31: edgeThreshold
    // 0: firstLevel
    // 2: WTA_K
    // ORB::HARRIS_SCORE: scoreType
    // 31: patchSize
    // 20: fastThreshold

    // Step 2: Detect Oriented FAST key points
    orb->detect(img_1, keypoints_1);
    orb->detect(img_2, keypoints_2);

    // Step 3: Compute BRIEF descriptors from the key points
    orb->compute(img_1, keypoints_1, descriptors_1);
    orb->compute(img_2, keypoints_2, descriptors_2);

    // Display the key points along with the image
    Mat outimg1;
    drawKeypoints(img_1, keypoints_1, outimg1, Scalar::all(-1), DrawMatchesFlags::DEFAULT);
    imshow("ORB key points", outimg1);
    waitKey(0);

    // Step 4: Match BRIEF descriptors based on Hamming distance
    vector<DMatch> matches;
    BFMatcher matcher(NORM_HAMMING);
    matcher.match(descriptors_1, descriptors_2, matches);
    // Number of matches is at most 500, which is nfeatures
    cout << "number of matches: " << matches.size() << endl;

    // Step 5: Filter out the mismatches
    double min_dist = 1000, max_dist = 0;
    // Find the minimum and maximum Hamming distances between matched descriptors
    for (size_t i = 0; i < matches.size(); i++) {
        double dist = matches[i].distance;
        if (dist < min_dist) min_dist = dist;
        if (dist > max_dist) max_dist = dist;
    }
    cout << "Max dist: " << max_dist << endl;
    cout << "Min dist: " << min_dist << endl;

    // Filtering rule: if the distance between matched descriptors is larger
    // than twice min_dist, discard the match. Since min_dist can be very
    // small, 40.0 is used as a lower bound on the threshold.
    vector<DMatch> valid_matches;
    for (size_t i = 0; i < matches.size(); i++) {
        if (matches[i].distance <= max(2 * min_dist, 40.0)) {
            valid_matches.push_back(matches[i]);
        }
    }

    // Last step: draw all matches and the valid matches
    Mat img_match;
    Mat img_validmatch;
    drawMatches(img_1, keypoints_1, img_2, keypoints_2, matches, img_match);
    drawMatches(img_1, keypoints_1, img_2, keypoints_2, valid_matches, img_validmatch);
    imshow("All matches", img_match);
    imshow("Valid matches", img_validmatch);
    waitKey(0);
    return 0;
}

CMakeLists.txt

cmake_minimum_required(VERSION 2.8)
project(feature_extraction)

set(CMAKE_BUILD_TYPE "Release")
set(CMAKE_CXX_STANDARD 14)

find_package(OpenCV REQUIRED)
include_directories(${OpenCV_INCLUDE_DIRS})

add_executable(feature_extraction main.cpp)
target_link_libraries(feature_extraction ${OpenCV_LIBS})
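With main.cpp and CMakeLists.txt in the same directory, a typical out-of-source build looks like this (assuming OpenCV is installed somewhere find_package can locate it, and that the two sample images sit next to the source files):

```shell
# Configure, build and run from a separate build directory
mkdir build && cd build
cmake ..
make
./feature_extraction ../1.jpg ../2.jpg
```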

Result

(Figure: all matches)
(Figure: valid matches)

Reference