I have made some improvements since the previous article:
Detecting similar videos in Java and OpenCV rev.1 - Qiita
In addition to the histogram, image features are now compared as well. There are various feature algorithms; here I use the one called "AKAZE". I don't know the details, but it seems to have been added in OpenCV 3, and it looked good enough (a rough choice).
Convert to grayscale before processing.
// Detect AKAZE key points on a grayscale copy of the resized frame,
// then compute their descriptors (OpenCV 3 Java bindings)
FeatureDetector detector = FeatureDetector.create(FeatureDetector.AKAZE);
DescriptorExtractor extractor = DescriptorExtractor.create(DescriptorExtractor.AKAZE);

Mat gray = new Mat();
Imgproc.cvtColor(resizedFrame, gray, Imgproc.COLOR_RGB2GRAY);

MatOfKeyPoint keyPoints = new MatOfKeyPoint();
detector.detect(gray, keyPoints);
Mat desc = new Mat();
extractor.compute(gray, keyPoints, desc);
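The descriptor Mat computed for each sampled frame is then kept on the video object. That bookkeeping is not shown here, but judging from the getFeatureImg() accessor used below, it presumably looks something like the following sketch; the list variable, the video variable, and the setter name are my guesses, not the project's actual code.

// Hypothetical bookkeeping, inferred from the getFeatureImg() accessor used later:
// keep one descriptor Mat per sampled frame so two videos can be compared frame by frame.
List<Mat> featureImgs = new ArrayList<>();   // one entry per sampled frame
featureImgs.add(desc);                       // descriptors of the current frame
video.setFeatureImg(featureImgs);            // assumed setter on the video model class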
Next, a DescriptorMatcher is used to compare the descriptor Mats obtained above. Several matching methods can be chosen here as well, but since I'm not sure about this either, I went with BRUTEFORCE.
// Match the descriptors of the i-th sampled frame of each video and collect the match distances
DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE);
MatOfDMatch matches = new MatOfDMatch();
matcher.match(video1.getFeatureImg().get(i - 1), video2.getFeatureImg().get(i - 1), matches);

List<Double> distanceList = new ArrayList<>();
for (DMatch dMatch : matches.toList()) {
    distanceList.add(Double.valueOf(dMatch.distance));
}
This time I decided to treat the result of the video comparison as the average of these distances. Since it is a "distance", the smaller the value, the more "similar" the videos can be judged to be.
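The averaging step itself is not shown in the snippets above; as a minimal sketch, assuming the distanceList built in the previous block, it could look like this:

// Minimal sketch: reduce the match distances of one frame pair to a single score.
// A smaller average distance means a closer match.
double averageDistance = distanceList.stream()
        .mapToDouble(Double::doubleValue)
        .average()
        .orElse(Double.MAX_VALUE);   // no matches at all: treat as "not similar"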
To reduce processing time, pairs whose playback times differ by more than 10% are not compared at all.
long playtime1 = video1.getPlayTime();
long playtime2 = video2.getPlayTime();
long playtimeDiff = Math.abs(playtime1 - playtime2);
// Cast to double: with plain long division the ratio would be truncated to 0 for any difference under 100%
if ((double) playtimeDiff / playtime1 > 0.1) {
    return null; // skip this pair
}
A side effect of this cutoff is that "inclusion" cases, where one video is contained somewhere inside a much longer one, cannot be detected, but with the current mechanism that is unavoidable.
I also added a context menu so that the files can be opened right after checking the results. Here is how to add a context menu to a TableView.
ContextMenu menu = new ContextMenu();
MenuItem mi = new MenuItem("Open in explorer");
mi.setOnAction(event -> {
    TableItem item = table.getSelectionModel().getSelectedItem();
    // Windows only: "explorer /select," opens Explorer with the file pre-selected
    String command1 = "explorer /select," + item.getOrg().getVideo1().getFilename();
    String command2 = "explorer /select," + item.getOrg().getVideo2().getFilename();
    try {
        Runtime.getRuntime().exec(command1);
        Runtime.getRuntime().exec(command2);
    } catch (IOException e) {
        e.printStackTrace();
    }
});
menu.getItems().add(mi);
table.setContextMenu(menu);
The exception handling is admittedly rough ...
Histogram comparison alone detected "similar videos" better than I expected. I think it is accurate enough to find copies of the original data that were re-encoded with a different frame rate or resolution. Going much beyond that would probably call for something like deep learning.
The next challenge is on the non-functional side.
Execution time was about 90 seconds for 70 files on a Core i5-4690 with 4 GB of memory allocated. Watching the run in jconsole, memory usage becomes severe as the number of files increases, so there seems to be room for improvement.
I am currently trying to cache results and to write the histograms and feature data out to files instead of holding them in memory, but that is not working well yet.
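For reference, here is a rough sketch of the kind of on-disk cache I have in mind; the writeMat helper and its file format are only an illustration, not working code from the project. It assumes a CV_32F Mat such as a histogram:

// Illustration only: dump a CV_32F Mat (e.g. a histogram) to a file so it
// does not have to stay in memory. Rows, cols and type are written first
// so the Mat could be reconstructed when reading the file back.
static void writeMat(Mat mat, Path path) throws IOException {
    float[] buf = new float[(int) mat.total() * mat.channels()];
    mat.get(0, 0, buf);                       // copy the Mat contents into a Java array
    try (DataOutputStream out = new DataOutputStream(
            new BufferedOutputStream(Files.newOutputStream(path)))) {
        out.writeInt(mat.rows());
        out.writeInt(mat.cols());
        out.writeInt(mat.type());
        for (float v : buf) {
            out.writeFloat(v);
        }
    }
}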
I want to improve it.
The end.