Stitch is an application for stitching aerial images. It should be considered a prototype, as it is still in its infancy.
The aim of stitching aerial images is to establish the current state of the terrain, which is useful in construction, agriculture, for the needs of the Real Estate Cadastre, and so on.
The stitching process consists of several stages:
- Feature detection
- Feature description
- Registration
- Alignment
- Blending
The first step of stitching is detecting features of interest, i.e. finding those pixels in an image that are most distinguishable from their neighbours. There are several well-known algorithms for this purpose, such as SIFT, SURF, and ORB. This application supports all of the named algorithms:
```clojure
(defn detect-keypoints
  "Detects keypoints in `mat` using the given FeatureDetector algorithm
  (e.g. FeatureDetector/SURF, FeatureDetector/SIFT, FeatureDetector/ORB)."
  [mat algo]
  (let [fd (FeatureDetector/create algo)
        keypoints (MatOfKeyPoint.)]
    (.detect fd mat keypoints)
    keypoints))
```
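A hypothetical usage, assuming the OpenCV 2.4.x Java bindings (where images are loaded via `Highgui/imread`) and an example image path:

```clojure
(import '[org.opencv.highgui Highgui]
        '[org.opencv.features2d FeatureDetector])

;; load an aerial photo (the path is only an example) and detect SURF keypoints
(def img (Highgui/imread "resources/aerial-1.jpg"))
(def kps (detect-keypoints img FeatureDetector/SURF))
(println "keypoints found:" (count (.toList kps)))
```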
After the features of interest have been detected, they need to be described so that they can later be matched across different images (this is often done in the same step as detection). Once you have the features and their descriptors, you can find the same features in other images and align them, stitch them, or do whatever else you need.
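As a minimal sketch of this step, descriptors can be computed for previously detected keypoints and then matched between two images. The helper names below are only illustrative; `calculate-homography` further down performs the same steps inline.

```clojure
(defn describe-keypoints
  "Computes SURF descriptors for keypoints already detected in `mat`."
  [mat keypoints]
  (let [extractor (DescriptorExtractor/create DescriptorExtractor/SURF)
        descriptors (Mat.)]
    (.compute extractor mat keypoints descriptors)
    descriptors))

(defn match-descriptors
  "Matches two descriptor sets with a brute-force matcher."
  [desc-a desc-b]
  (let [matcher (DescriptorMatcher/create DescriptorMatcher/BRUTEFORCE)
        matches (MatOfDMatch.)]
    (.match matcher desc-a desc-b matches)
    matches))
```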
Image registration matches features across a set of images, or uses direct alignment methods to search for image alignments that minimize the sum of absolute differences between overlapping pixels. Even after matching features across images there can still be outliers (false matches), which must be eliminated in order to stitch the images successfully. To estimate a robust model from the data, the RANSAC method is used; it estimates the homography matrix that relates points in one image to points in the other.
```clojure
(defn calculate-homography [img-a img-b]
  (let [;; supported detector/extractor pairs and matchers
        algos {:surf       {:extractor DescriptorExtractor/SURF  :detector FeatureDetector/SURF}
               :orb        {:extractor DescriptorExtractor/ORB   :detector FeatureDetector/ORB}
               :sift       {:extractor DescriptorExtractor/SIFT  :detector FeatureDetector/SIFT}
               :harris-orb {:extractor DescriptorExtractor/BRIEF :detector FeatureDetector/ORB}}
        matchers {:flann DescriptorMatcher/FLANNBASED, :brute DescriptorMatcher/BRUTEFORCE}
        algo (algos :surf)
        extractor (DescriptorExtractor/create (:extractor algo))
        matcher (DescriptorMatcher/create (matchers :flann))
        ;; detect keypoints on blurred grayscale copies of both images
        img-a-g (blur (convert-to-gray img-a) 5.0)
        img-b-g (blur (convert-to-gray img-b) 5.0)
        kp-a (detect-keypoints img-a-g (:detector algo))
        kp-b (detect-keypoints img-b-g (:detector algo))
        desc-a (Mat.)
        desc-b (Mat.)
        matches (MatOfDMatch.)]
    ;; describe the keypoints and match the two descriptor sets
    (.compute extractor img-a kp-a desc-a)
    (.compute extractor img-b kp-b desc-b)
    (.match matcher desc-a desc-b matches)
    (let [good (good-matches matches)                 ; keep only strong matches
          good-list (.toList good)
          kp-a-vec (kp-vec kp-a)
          kp-b-vec (kp-vec kp-b)
          good-matches-vec (map #(to-map %) good-list)
          ;; collect the matched point coordinates from both images
          img-points-list-a (map #(.pt (get kp-a-vec (:queryIdx %))) good-matches-vec)
          img-points-list-b (map #(.pt (get kp-b-vec (:trainIdx %))) good-matches-vec)
          matK1 (MatOfPoint2f.)
          matK2 (MatOfPoint2f.)]
      (.fromList matK1 img-points-list-a)
      (.fromList matK2 img-points-list-b)
      ;; RANSAC rejects the remaining outliers while estimating the homography
      (Calib3d/findHomography matK1 matK2 Calib3d/RANSAC 10))))
```
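A hypothetical call, assuming two overlapping aerial photos loaded with `Highgui/imread` (the paths are only examples):

```clojure
(def img-a (Highgui/imread "resources/aerial-1.jpg"))
(def img-b (Highgui/imread "resources/aerial-2.jpg"))

;; 3x3 homography relating points of img-a to the plane of img-b
(def h (calculate-homography img-a img-b))
(println (.dump h))   ; print the estimated matrix
```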
Once the homography matrix is found, the alignment step transforms one image to match the viewpoint of the image it is being composited with. In simple terms, alignment is a change of coordinate system: the image adopts a new coordinate system so that the output matches the required viewpoint.
```clojure
(defn warp-perspective
  "Warps `img` with the homography `h` into an image of size `s`."
  ([img h s] (warp-perspective img h s (Mat.)))
  ([img h s res]
   (Imgproc/warpPerspective img res h s)
   res))
```
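For example, warping one image with the homography estimated above (a sketch; `img-a` and `h` are taken from the earlier examples and the target size is arbitrary):

```clojure
;; warp img-a onto a 2000x2000 canvas using the homography h
(def warped (warp-perspective img-a h (Size. 2000 2000)))
```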
Blending adjusts the colours between images to compensate for exposure differences. This is often done by blurring the areas where the images intersect; a minimal sketch of such a blend is shown after the `stitch2` function below.
```clojure
(defn stitch2 [img-a img-b]
  (let [homography (calculate-homography img-a img-b)
        inverse-h (inverse homography)
        ;; estimate the output canvas size and build a translation homography
        ;; that keeps the warped image inside the canvas
        dimensions (find-dimensions inverse-h img-a)
        s (Size. (int (* (:max_y dimensions) S-SCALE)) (int (* (:max_x dimensions) S-SCALE)))
        move_h (create-move-homography inverse-h img-a)
        mod_inv_h (matrix-multiplication move_h inverse-h)
        res1 (Mat.)
        res2 (Mat.)]
    ;; warp both images onto the common canvas
    (warp-perspective img-a move_h s res1)
    (warp-perspective img-b mod_inv_h s res2)
    (let [result (Mat/zeros (.rows res1) (.cols res1) CvType/CV_8UC3)
          ;; mask of pixels not covered by the second warped image
          mask (bitwise_not (threshold res2 0 255))
          result1 (add res1 result mask)   ; copy img-a only where img-b is absent
          final-img (add res2 result1)]    ; then add img-b on top
      final-img)))
```
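The composition above uses a hard binary mask, so a seam can remain visible where exposures differ. Below is a minimal sketch of smoothing the overlap with a weighted average; `blend-overlap` and the `bitwise_and` helper are assumptions made for illustration (analogous to the `bitwise_not` and `threshold` wrappers used in `stitch2`), while `Core/addWeighted` and `Mat.copyTo` are standard OpenCV calls.

```clojure
(defn blend-overlap
  "Averages the exposures of the two warped images over their overlap and
  copies the result into the composed image. Illustrative only."
  [res1 res2 composed]
  (let [overlap (bitwise_and (threshold res1 0 255)   ; pixels covered by img-a
                             (threshold res2 0 255))  ; and by img-b
        blended (Mat.)]
    (Core/addWeighted res1 0.5 res2 0.5 0.0 blended)  ; 50/50 exposure average
    (.copyTo blended composed overlap)                ; overwrite only the overlap
    composed))
```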
In the future, Stitch will first group images into several clusters based on latitude and longitude to create intermediate results, which will then be stitched into the final image. Support for image georeferencing is also planned, so that the resulting images can be used in GIS software such as QGIS, GeoServer, and so on.
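One possible shape of that clustering step, purely as an illustration (the `:lat`/`:lon` keys and the cell size are assumptions, not existing code):

```clojure
(defn cluster-by-location
  "Groups image descriptions into grid cells of roughly `cell-deg` degrees."
  [images cell-deg]
  (group-by (fn [{:keys [lat lon]}]
              [(long (Math/floor (/ lat cell-deg)))
               (long (Math/floor (/ lon cell-deg)))])
            images))

;; (cluster-by-location [{:path "a.jpg" :lat 44.81 :lon 20.46} ...] 0.01)
```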