Using the elevation PNG image created in "Read the data of Shizuoka point cloud DB with Java and generate aerial photograph and elevation PNG", I try to detect tree canopies following the method described in "Creation of tree height data using aerial laser survey data", introduced by the National Land Research Institute.
The elevation PNG created in "Read the data of Shizuoka Point Cloud DB with Java and generate aerial photograph and elevation PNG" appears to correspond to a DSM (digital surface model), so the terrain (ground) component has to be extracted from it. According to the "Aerial Laser Surveying Mechanism" page on the Geospatial Information Authority of Japan website, in forest areas the laser may be blocked by trees and fail to reach the ground, or it may pass through gaps and reach it. For this reason, I set a 2 m mesh over the elevation PNG and regarded the lowest elevation point within each mesh as the elevation of the ground surface. I created the following DemMeshImterpolationProcesser class and generated an elevation PNG equivalent to a DEM. ImageProcessingObservable is an Observable interface used to report progress.
import java.awt.image.BufferedImage;
import java.io.File;
import java.util.Observable;
import javax.imageio.ImageIO;

public class DemMeshImterpolationProcesser extends Observable implements ImageProcessingObservable {

    private double[][] dem;
    private Cell[][] cell;
    private BufferedImage image;
    private String message;

    public DemMeshImterpolationProcesser(BufferedImage img) {
        dem = ElevationPngUtil.imgToDouble(img);
        image = new BufferedImage(img.getWidth(), img.getHeight(), BufferedImage.TYPE_INT_RGB);
    }

    public BufferedImage getImage() {
        return image;
    }

    // Fills each mesh cell with the RGB value corresponding to its minimum elevation.
    public void drawMinH() {
        for (int i = 0; i < cell.length; i++) {
            for (int j = 0; j < cell[i].length; j++) {
                cell[i][j].setRGB(ElevationPngUtil.getRGB(cell[i][j].getMinH()));
            }
        }
    }

    // Divides the elevation grid into size x size pixel cells (the mesh).
    public void createImageCell(int size) {
        int ww = dem.length;
        int hh = dem[0].length;
        cell = new Cell[ww / size][hh / size];
        int x = 0;
        int y = 0;
        for (int i = 0; i < cell.length; i++) {
            for (int j = 0; j < cell[i].length; j++) {
                cell[i][j] = new Cell();
                cell[i][j].x = new int[size];
                cell[i][j].y = new int[size];
                for (int m = 0; m < size; m++) {
                    cell[i][j].y[m] = y;
                    y = (y + 1) % hh;
                }
                for (int m = 0; m < size; m++) {
                    cell[i][j].x[m] = x + m;
                }
                if (y == 0) x = x + size;
            }
        }
    }

    public void output(File f) throws Exception {
        ImageIO.write(image, "png", f);
        update("End output");
    }

    protected void update(String mes) {
        message = mes;
        setChanged();
        notifyObservers();
    }

    @Override
    public String progress() {
        return message;
    }

    // Holds the pixel coordinates belonging to one mesh cell.
    protected class Cell {
        int[] x;
        int[] y;

        // Returns the minimum valid elevation within the cell, or 0 if the cell has no valid pixel.
        double getMinH() {
            double min = Double.MAX_VALUE;
            for (int i = 0; i < x.length; i++) {
                for (int j = 0; j < y.length; j++) {
                    if (Double.isNaN(dem[x[i]][y[j]]) || dem[x[i]][y[j]] < 0) continue;
                    min = Math.min(min, dem[x[i]][y[j]]);
                }
            }
            if (min == Double.MAX_VALUE) {
                return 0;
            } else {
                return min;
            }
        }

        void setRGB(int rgb) {
            for (int i = 0; i < x.length; i++) {
                for (int j = 0; j < y.length; j++) {
                    image.setRGB(x[i], y[j], rgb);
                }
            }
        }
    }
}
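For reference, a minimal sketch of how this class might be driven is shown below. The file names and the mesh size in pixels are assumptions: the number of pixels per 2 m mesh depends on the ground resolution of the elevation PNG, which is not stated here.

// Usage sketch (assumed file names; mesh size in pixels depends on the PNG's ground resolution).
BufferedImage dsm = ImageIO.read(new File("dsm_elevation.png"));
DemMeshImterpolationProcesser proc = new DemMeshImterpolationProcesser(dsm);
proc.createImageCell(4);   // e.g. 4 pixels per 2 m mesh, assuming 0.5 m per pixel
proc.drawMinH();           // lowest point in each mesh becomes the ground (DEM) elevation
proc.output(new File("dem_elevation.png"));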
This is the DEM elevation PNG generated above. (The following is a reduced-size JPG.) After generating the elevation PNG, it was smoothed with a 5x5 Gaussian filter.
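The smoothing code is not shown in this post; the following is a minimal sketch of a 5x5 Gaussian filter applied to the elevation grid, using a binomial approximation of the kernel (weights 1-4-6-4-1) and skipping no-data (NaN) pixels. The kernel choice and no-data handling are assumptions.

// 5x5 Gaussian smoothing sketch (binomial kernel approximation; assumption, not the post's code).
static double[][] gaussian5x5(double[][] z) {
    int[] k = {1, 4, 6, 4, 1};                 // 1D binomial weights; 2D kernel is the outer product
    int w = z.length, h = z[0].length;
    double[][] out = new double[w][h];
    for (int x = 0; x < w; x++) {
        for (int y = 0; y < h; y++) {
            if (Double.isNaN(z[x][y])) { out[x][y] = Double.NaN; continue; }
            double sum = 0, wsum = 0;
            for (int dx = -2; dx <= 2; dx++) {
                for (int dy = -2; dy <= 2; dy++) {
                    int xx = x + dx, yy = y + dy;
                    if (xx < 0 || yy < 0 || xx >= w || yy >= h) continue;
                    if (Double.isNaN(z[xx][yy])) continue;   // skip no-data neighbours
                    double wgt = k[dx + 2] * k[dy + 2];
                    sum += z[xx][yy] * wgt;
                    wsum += wgt;
                }
            }
            out[x][y] = sum / wsum;              // normalize by the weights actually used
        }
    }
    return out;
}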
Since the elevation PNG created last time has many pixels with no data, I generated a TIN and interpolated the missing values. "Tinfour" was used to create the TIN and perform the interpolation; it is licensed under the Apache License, Version 2.0.
/**
 * TIN generation (img, list, index, and tin are class fields; Vertex and
 * IncrementalTin are Tinfour classes)
 */
public void process() {
    for (int i = 0; i < img.getWidth(); i++) {
        for (int j = 0; j < img.getHeight(); j++) {
            int rgb = img.getRGB(i, j);
            double hh = ElevationPngUtil.getZ(rgb);
            if (Double.isNaN(hh)) continue;          // skip no-data pixels
            Vertex v = new Vertex(i, j, hh, index++);
            list.add(v);
        }
    }
    tin = new IncrementalTin();
    tin.add(list, null);
}

/**
 * Interpolation by TIN
 * @return interpolated elevation image
 */
public BufferedImage interpolation() {
    BufferedImage ret = new BufferedImage(img.getWidth(), img.getHeight(), BufferedImage.TYPE_INT_RGB);
    setInitVal(ret);
    TriangularFacetInterpolator tfi = new TriangularFacetInterpolator(tin);
    VertexValuatorDefault vvd = new VertexValuatorDefault();
    double n = img.getWidth() * img.getHeight();
    for (int i = 0; i < img.getWidth(); i++) {
        for (int j = 0; j < img.getHeight(); j++) {
            double hh = tfi.interpolate(i, j, vvd);
            if (hh > 0) ret.setRGB(i, j, ElevationPngUtil.getRGB(hh));
        }
    }
    return ret;
}
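A rough sketch of how these two methods might be used together is shown below. The surrounding class is not shown in this post, so the wrapper class name TinInterpolationProcesser and its constructor are hypothetical.

// Hypothetical driver; TinInterpolationProcesser stands in for the class
// that holds the process() and interpolation() methods shown above.
BufferedImage src = ImageIO.read(new File("dsm_elevation.png"));
TinInterpolationProcesser tp = new TinInterpolationProcesser(src);
tp.process();                               // build the TIN from all valid pixels
BufferedImage filled = tp.interpolation();  // fill no-data pixels from the TIN
ImageIO.write(filled, "png", new File("dsm_elevation_filled.png"));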
The DSM elevation PNG interpolated with the TIN is as follows. (The following is a reduced-size JPG.)
A height model of trees and other above-ground features (DCHM, digital canopy height model) is obtained by subtracting the DEM from the DSM. The DCHM elevation PNG created by subtracting the DEM elevation data from the DSM elevation data is as follows. (The following is a reduced-size JPG.)
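The subtraction itself is a per-pixel operation; a minimal sketch is shown below. It uses the ElevationPngUtil helpers that appear in the code above; the no-data handling and the clamping of negative heights to zero are assumptions.

// Sketch: DCHM = DSM - DEM, computed per pixel.
static BufferedImage subtract(BufferedImage dsm, BufferedImage dem) {
    BufferedImage dchm = new BufferedImage(dsm.getWidth(), dsm.getHeight(), BufferedImage.TYPE_INT_RGB);
    for (int i = 0; i < dsm.getWidth(); i++) {
        for (int j = 0; j < dsm.getHeight(); j++) {
            double zs = ElevationPngUtil.getZ(dsm.getRGB(i, j));   // surface elevation
            double zg = ElevationPngUtil.getZ(dem.getRGB(i, j));   // ground elevation
            if (Double.isNaN(zs) || Double.isNaN(zg)) continue;    // leave no-data pixels unset
            double h = Math.max(zs - zg, 0);                       // negative heights clamped to 0 (assumption)
            dchm.setRGB(i, j, ElevationPngUtil.getRGB(h));
        }
    }
    return dchm;
}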
The relief map (tree height map) created from the DCHM elevation PNG is as follows.
"Aerial Laser Surveying Mechanism" introduces a method for detecting the maximum value point with a 3m mesh. This time, we set a search circle with a radius of 2.0 m and detected the vertex position with the following code.
// Scans elevation thresholds from highH down to lowH; a pixel at or above the
// current threshold becomes a peak if it is the maximum within its search circle.
private void findPeaks() {
    for (double z = highH; z >= lowH; z = z - 1) {
        double dv = (++vv) / nn * 100;   // progress percentage
        for (int i = 0; i < width; i++) {
            for (int j = 0; j < height; j++) {
                if (ck[i][j] != 0) continue;   // already assigned to a detected peak
                if (val[i][j] >= z) {
                    Ellipse2D e = new Ellipse2D.Double(i - this.radius, j - this.radius,
                            this.radius * 2, this.radius * 2);
                    if (isHiMax(e, val[i][j])) {
                        setPeak(e);
                        points.add(new Point3d(e.getCenterX(), e.getCenterY(), val[i][j]));
                    }
                }
            }
        }
    }
}

// Returns true if v is the maximum value within the bounding box of the search circle.
private boolean isHiMax(Ellipse2D e, double v) {
    int xs = (int) Math.max(e.getX(), 0);
    int ys = (int) Math.max(e.getY(), 0);
    int xe = (int) Math.min(e.getX() + e.getWidth(), width);
    int ye = (int) Math.min(e.getY() + e.getHeight(), height);
    for (int i = xs; i < xe; i++) {
        for (int j = ys; j < ye; j++) {
            if (val[i][j] > v) return false;
        }
    }
    return true;
}

// Marks all pixels inside the search circle as belonging to the detected peak.
private void setPeak(Ellipse2D e) {
    int xs = (int) Math.max(e.getX(), 0);
    int ys = (int) Math.max(e.getY(), 0);
    int xe = (int) Math.min(e.getX() + e.getWidth(), width);
    int ye = (int) Math.min(e.getY() + e.getHeight(), height);
    for (int i = xs; i < xe; i++) {
        for (int j = ys; j < ye; j++) {
            if (ck[i][j] != 0) continue;
            if (e.contains(i, j)) ck[i][j] = 1;
        }
    }
}
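The fields referenced above are not shown in the post; roughly, the surrounding class is assumed to hold something like the following (field names taken from the code, types and initial values are my assumptions):

// Assumed surrounding fields for the peak detector (sketch, not from the post).
private double[][] val;            // DCHM heights per pixel
private int[][] ck;                // 0 = unassigned, nonzero = claimed by a peak
private int width, height;         // image size in pixels
private double radius = 2.0;       // search circle radius (2.0 m in the post; pixel scale assumed)
private double highH, lowH;        // scan from the highest to the lowest tree height
private double vv, nn;             // progress counters
private List<Point3d> points = new ArrayList<>();   // detected peaks (x, y, height)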
The DCHM peak points (presumed standing-tree positions) detected above are as follows. 12,392 peak points were detected within the image range (674 m x 600 m).
Voronoi division was then performed with "Tinfour", using the DCHM peak points as tree positions.
public List<ThiessenPolygon> process(){
BoundedVoronoiBuildOptions options = new BoundedVoronoiBuildOptions();
options.enableAutomaticColorAssignment(true);
BoundedVoronoiDiagram diagram = new BoundedVoronoiDiagram(vertList, options);
List<ThiessenPolygon> polyList=diagram.getPolygons();
return polyList;
}
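For reference, a sketch of how the Point3d peaks collected by findPeaks() might be converted into the vertex list used above. It assumes Point3d exposes public x/y/z fields (as javax.vecmath.Point3d does) and that the resulting list is the vertList field read by process().

// Sketch: build the Tinfour vertex list from the detected peak points (assumption).
List<Vertex> vertList = new ArrayList<>();
int id = 0;
for (Point3d p : points) {
    vertList.add(new Vertex(p.x, p.y, p.z, id++));   // x, y in pixels, z = tree height
}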
The Voronoi division result is shown below. When magnified, the divisions appear to roughly follow the canopy shadows and the peaks and regions visible in the aerial photograph.
To confirm whether the peaks were detected properly, I visualized the DCHM pixels estimated to be canopy with Jzy3d. I checked several places, and the peaks were detected as shown in the figure below.
I tried analyzing tree heights and related quantities from the Shizuoka point cloud DB data with Java, and the detection worked reasonably well. Since there is no verification data, this remains at the level of play, and broad-leaved trees and similar cases are difficult, but I felt that counting trees and detecting canopies is feasible for cedar (sugi) plantations and the like. In addition, "Creation of tree height data using aerial laser survey data" states: "Tree height data from aerial laser surveying, except for data on evergreen coniferous forests with high point cloud density, generally do not match the measured values of individual trees. However, if the average value is taken over a range of about 30 m square, the correlation becomes better."
If survey and measurement results are made more widely available, as with the Shizuoka Point Cloud DB, they will be useful for many purposes.