Authors: A. Abbas ELMAS, Barış ATA

This repository contains the experimental framework, datasets, evaluation outputs, and interactive visualizations for a comprehensive evaluation of OpenCV feature detection, description, and matching algorithm combinations for UAV-based 3D reconstruction.
## Research Summary
- 784 algorithm combinations evaluated across 5 diverse datasets, with 89 top performers visualized
- Composite Unsupervised Efficiency Score (CUES): entropy, CRITIC, PCA, and variance-based weighting
- 6-phase comprehensive analysis: dataset-specific, algorithm-specific, component-based, cross-dataset, unified, and mobile-optimized
- Cross-dataset stability: 0.865 average correlation
- Cross-platform consistency: >0.994 correlation (desktop vs. mobile 1 vs. mobile 2)
- Best overall performer: ORB-ORB-HAM-BF (0.8821 on the Synthetic dataset)
- Best unified performer: ORB-BEBLID-HAM-BF (cross-dataset champion)
- Best on the UAV dataset: GFTT-DAISY-L2-BF (0.7299)
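CUES aggregates several unsupervised weighting schemes (entropy, CRITIC, PCA, variance). As an illustration of the entropy component alone — the metric values and the final aggregation below are invented for the example and are not the repository's exact formulation — a minimal NumPy sketch:

```python
import numpy as np

def entropy_weights(scores: np.ndarray) -> np.ndarray:
    """Entropy weighting over a (combinations x metrics) score matrix.

    Metrics whose values vary more across combinations carry more
    information and therefore receive larger weights.
    """
    # Normalize each metric column to proportions
    p = scores / scores.sum(axis=0, keepdims=True)
    # Shannon entropy per metric, scaled to [0, 1] by log(n)
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)
    entropy = -plogp.sum(axis=0) / np.log(scores.shape[0])
    divergence = 1.0 - entropy  # low entropy -> high information content
    return divergence / divergence.sum()

# Toy example: 4 combinations x 3 hypothetical metrics
scores = np.array([
    [0.90, 0.50, 0.2],
    [0.88, 0.52, 0.9],
    [0.91, 0.48, 0.1],
    [0.89, 0.51, 0.8],
])
weights = entropy_weights(scores)
cues = scores @ weights  # weighted composite score per combination
```

The third metric varies most across the toy combinations, so it receives by far the largest weight; the near-constant metrics contribute almost nothing.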
## Interactive Dashboards
Access the comprehensive analysis dashboards below; detailed dashboards with variants and mobile analyses are linked per dataset.
| Dataset | Main Page | Efficiency | Variants | Timing | Correlation | Heatmap | Violin |
|---|---|---|---|---|---|---|---|
| Synthetic | synthetic | Efficiency | 4, All4 | Timing | Correlation | Heatmap | Violin |
| Synthetic | Mobile 1 | Efficiency m1 | 4 m1, All4 m1 | Timing m1 | Correlation m1 | Heatmap m1 | Violin m1 |
| Synthetic | Mobile 2 | Efficiency m2 | 4 m2, All4 m2 | Timing m2 | Correlation m2 | Heatmap m2 | Violin m2 |
| Oxford | oxford | Efficiency | 9, All9, All | Timing | Correlation | Heatmap | Violin |
| Oxford | Mobile 1 | Efficiency m1 | 9 m1, All9 m1, All m1 | Timing m1 | Correlation m1 | Heatmap m1 | Violin m1 |
| Oxford | Mobile 2 | Efficiency m2 | 9 m2, All9 m2, All m2 | Timing m2 | Correlation m2 | Heatmap m2 | Violin m2 |
| Drone | drone | Efficiency | All, AllXY | Timing | Correlation | Heatmap | Violin |
| UAV | uav | Efficiency | All | Timing | Correlation | Heatmap | Violin |
| AirSim | airsim | Efficiency | All | Timing | Correlation | Heatmap | Violin |
## Feature Matching Visualizations
Interactive JPG and HTML visualizations are provided for 135+ top-performing algorithm combinations out of the 784 evaluated configurations. Each visualization includes match quality metrics, timing statistics, and toggleable information overlays.
### Top 5 Algorithm Combinations per Dataset
| Dataset | Rank | Algorithm Combination | CUES | Static | Interactive |
|---|---|---|---|---|---|
| Synthetic | 1 | ORB-ORB-HAM-BF | 0.8821 | JPG | HTML |
| | 2 | ORB-ORB-HAM-FLANN | 0.8789 | JPG | HTML |
| | 3 | ORB-TEBLID-HAM-BF | 0.8445 | JPG | HTML |
| | 4 | ORB-BOOST-HAM-BF | 0.8412 | JPG | HTML |
| | 5 | STAR-BRISK-HAM-BF | 0.8383 | JPG | HTML |
| Oxford | 1 | ORB-BEBLID-HAM-BF | 0.6219 | JPG | HTML |
| | 2 | ORB-TEBLID-HAM-BF | 0.6152 | JPG | HTML |
| | 3 | AKAZE-SIFT-L2-BF | 0.6078 | JPG | HTML |
| | 4 | ORB-TEBLID-HAM-FLANN | 0.6058 | JPG | HTML |
| | 5 | ORB-BEBLID-HAM-FLANN | 0.6027 | JPG | HTML |
| AirSim | 1 | ORB-BEBLID-HAM-BF | 0.7891 | JPG | HTML |
| | 2 | ORB-TEBLID-HAM-BF | 0.7782 | JPG | HTML |
| | 3 | ORB-BEBLID-HAM-FLANN | 0.7650 | JPG | HTML |
| | 4 | ORB-TEBLID-HAM-FLANN | 0.7635 | JPG | HTML |
| | 5 | KAZE-SIFT-L2-BF | 0.7614 | JPG | HTML |
| UAV | 1 | GFTT-DAISY-L2-BF | 0.7299 | JPG | HTML |
| | 2 | AGAST-SIFT-L2-BF | 0.7255 | JPG | HTML |
| | 3 | AKAZE-DAISY-L2-BF | 0.7189 | JPG | HTML |
| | 4 | FAST-SIFT-L2-BF | 0.7161 | JPG | HTML |
| | 5 | AGAST-DAISY-L2-BF | 0.7124 | JPG | HTML |
| Drone | 1 | ORB-BEBLID-HAM-BF | 0.8489 | JPG | HTML |
| | 2 | ORB-DAISY-L2-BF | 0.8286 | JPG | HTML |
| | 3 | ORB-TEBLID-HAM-BF | 0.8247 | JPG | HTML |
| | 4 | ORB-BEBLID-HAM-FLANN | 0.8209 | JPG | HTML |
| | 5 | GFTT-DAISY-L2-BF | 0.8155 | JPG | HTML |
### Drone Dataset Visualizations
| Algorithm Combination | CUES | Static | Interactive |
|---|---|---|---|
| ORB-BEBLID-HAM-BF | 0.8489 | JPG | HTML |
| ORB-DAISY-L2-BF | 0.8286 | JPG | HTML |
| ORB-TEBLID-HAM-BF | 0.8247 | JPG | HTML |
| ORB-BEBLID-HAM-FLANN | 0.8209 | JPG | HTML |
| GFTT-DAISY-L2-BF | 0.8155 | JPG | HTML |
| ORB-TEBLID-HAM-FLANN | 0.8100 | JPG | HTML |
| ORB-DAISY-L2-FLANN | 0.8053 | JPG | HTML |
| ORB-ORB-HAM-BF | 0.7979 | JPG | HTML |
| GFTT-DAISY-L2-FLANN | 0.7952 | JPG | HTML |
| ORB-ORB-HAM-FLANN | 0.7788 | JPG | HTML |
### UAV Dataset Visualizations
| Algorithm Combination | CUES | Static | Interactive |
|---|---|---|---|
| GFTT-DAISY-L2-BF | 0.7299 | JPG | HTML |
| AGAST-SIFT-L2-BF | 0.7255 | JPG | HTML |
| AKAZE-DAISY-L2-BF | 0.7189 | JPG | HTML |
| FAST-SIFT-L2-BF | 0.7161 | JPG | HTML |
| AGAST-DAISY-L2-BF | 0.7124 | JPG | HTML |
| BRISK-DAISY-L2-BF | 0.7122 | JPG | HTML |
| AGAST-VGG-L2-BF | 0.7079 | JPG | HTML |
| KAZE-SIFT-L2-BF | 0.7064 | JPG | HTML |
| AKAZE-DAISY-L2-FLANN | 0.7062 | JPG | HTML |
| GFTT_H-DAISY-L2-BF | 0.7041 | JPG | HTML |
### AirSim Dataset Visualizations
| Algorithm Combination | CUES | Static | Interactive |
|---|---|---|---|
| ORB-BEBLID-HAM-BF | 0.7891 | JPG | HTML |
| ORB-TEBLID-HAM-BF | 0.7782 | JPG | HTML |
| ORB-BEBLID-HAM-FLANN | 0.7650 | JPG | HTML |
| ORB-TEBLID-HAM-FLANN | 0.7635 | JPG | HTML |
| KAZE-SIFT-L2-BF | 0.7614 | JPG | HTML |
| GFTT-DAISY-L2-BF | 0.7436 | JPG | HTML |
| KAZE-SIFT-L2-FLANN | 0.7412 | JPG | HTML |
| KAZE-BEBLID-HAM-BF | 0.7386 | JPG | HTML |
| AGAST-SIFT-L2-BF | 0.7359 | JPG | HTML |
| AKAZE-DAISY-L2-BF | 0.7299 | JPG | HTML |
### Synthetic Dataset Visualizations (Rotation)
| Algorithm Combination | CUES | Static | Interactive |
|---|---|---|---|
| ORB-ORB-HAM-BF | 0.8821 | JPG | HTML |
| ORB-ORB-HAM-FLANN | 0.8789 | JPG | HTML |
| ORB-TEBLID-HAM-BF | 0.8445 | JPG | HTML |
| ORB-BOOST-HAM-BF | 0.8412 | JPG | HTML |
| STAR-BRISK-HAM-BF | 0.8383 | JPG | HTML |
| ORB-VGG-L2-BF | 0.8367 | JPG | HTML |
| ORB-VGG-L2-FLANN | 0.8362 | JPG | HTML |
| STAR-BRISK-HAM-FLANN | 0.8359 | JPG | HTML |
| ORB-ORB-L2-BF | 0.8343 | JPG | HTML |
| ORB-BEBLID-L2-BF | 0.8304 | JPG | HTML |
### Synthetic Dataset Visualizations (Scale)

### Synthetic Dataset Visualizations (Intensity)

### Oxford Affine Dataset Visualizations

- Bark (Viewpoint Change)
- Bikes (Blur)
- Boat (Zoom + Rotation)
- Graf (Viewpoint Change)
- Leuven (Illumination)
- Trees (Blur)
- UBC (JPEG Compression)
- Wall (Viewpoint Change)
## Cross-Dataset Performance Comparison

| Algorithm Combination | Drone | UAV | AirSim | Oxford | Synthetic | Average |
|---|---|---|---|---|---|---|
| ORB-BEBLID-HAM-BF | 0.849 | 0.704 | 0.789 | 0.622 | 0.825 | 0.758 |
| ORB-TEBLID-HAM-BF | 0.825 | 0.691 | 0.778 | 0.615 | 0.844 | 0.751 |
| ORB-BEBLID-HAM-FLANN | 0.821 | 0.679 | 0.765 | 0.603 | 0.818 | 0.737 |
| ORB-TEBLID-HAM-FLANN | 0.810 | 0.677 | 0.764 | 0.606 | 0.793 | 0.730 |
| AKAZE-SIFT-L2-BF | 0.717 | 0.703 | 0.681 | 0.608 | 0.809 | 0.704 |
| GFTT-DAISY-L2-BF | 0.816 | 0.730 | 0.744 | 0.504 | 0.728 | 0.704 |
## Component-Based Analysis

### Detector Rankings

| Rank | Detector | Mean CUES | Max CUES | Std | Evaluations |
|---|---|---|---|---|---|
| 1 | AKAZE | 0.5472 | 0.8092 | 0.1771 | 179 |
| 2 | ORB | 0.5297 | 0.8821 | 0.2168 | 163 |
| 3 | STAR | 0.5070 | 0.8383 | 0.1840 | 165 |
| 4 | AGAST | 0.4696 | 0.7638 | 0.1791 | 165 |
| 5 | BRISK | 0.4640 | 0.7855 | 0.1961 | 164 |
| 6 | KAZE | 0.4614 | 0.7614 | 0.1910 | 186 |
| 7 | FAST | 0.4339 | 0.7773 | 0.1841 | 164 |
| 8 | SIFT | 0.4237 | 0.8001 | 0.1803 | 149 |
| 9 | MSD | 0.4044 | 0.7283 | 0.2015 | 155 |
| 10 | GFTT | 0.4017 | 0.8155 | 0.2126 | 162 |
### Descriptor Rankings

| Rank | Descriptor | Mean CUES | Max CUES | Std | Evaluations |
|---|---|---|---|---|---|
| 1 | SIFT | 0.6354 | 0.8092 | 0.0966 | 129 |
| 2 | DAISY | 0.6213 | 0.8286 | 0.1089 | 140 |
| 3 | VGG | 0.6024 | 0.8367 | 0.1216 | 140 |
| 4 | KAZE | 0.6002 | 0.7259 | 0.1401 | 11 |
| 5 | AKAZE | 0.5894 | 0.8014 | 0.0944 | 30 |
| 6 | BEBLID | 0.5801 | 0.8489 | 0.1237 | 210 |
| 7 | TEBLID | 0.5535 | 0.8445 | 0.1446 | 210 |
| 8 | BOOST | 0.5337 | 0.8412 | 0.1434 | 210 |
| 9 | BRISK | 0.3677 | 0.8383 | 0.1855 | 206 |
| 10 | ORB | 0.3344 | 0.8821 | 0.1588 | 188 |
### Optimal Detector-Descriptor Combinations (Top 10)
| Rank | Combination | Mean CUES | Max CUES |
|---|---|---|---|
| 1 | AKAZE-SIFT | 0.6947 | 0.8092 |
| 2 | AGAST-SIFT | 0.6801 | 0.7507 |
| 3 | STAR-SIFT | 0.6787 | 0.7719 |
| 4 | ORB-DAISY | 0.6785 | 0.8286 |
| 5 | AKAZE-VGG | 0.6765 | 0.8090 |
| 6 | ORB-SIFT | 0.6758 | 0.7670 |
| 7 | AKAZE-DAISY | 0.6730 | 0.7315 |
| 8 | STAR-DAISY | 0.6651 | 0.7383 |
| 9 | BRISK-SIFT | 0.6643 | 0.7733 |
| 10 | BRISK-DAISY | 0.6637 | 0.7624 |