Create a 3D model (PLY format) from the Hyogo prefecture numerical topographic map DSM

1. Overview

Last weekend, Hyogo Prefecture released a 1 m mesh 3D data set covering the entire prefecture. The published data consists of DSM, DEM, and CS relief maps, and the license is CC 4.0: "It is data that can be used secondarily for any purpose, so please use it for various purposes."

- Hyogo Prefecture HP: Japan's first release of "high-precision 3D data covering an entire prefecture"
- [G Spatial Information Center: Hyogo Prefecture Whole Area Numerical Topographic Map Portal (2010-2018)](https://www.geospatial.jp/ckan/dataset/2010-2018-hyogo-geo-potal)

Publicly available elevation data released so far includes the 5 m mesh digital elevation model (DEM) from the Geospatial Information Authority of Japan (GSI) Basic Map Information site. The Hyogo Prefecture whole-area data, however, has 25 times that data density, and in addition to the DEM it also includes a DSM that covers buildings and forests. I immediately wanted to use it to create 3D models of cities and forests.

snapshot00.png

2. What to make this time

The DSM published by Hyogo Prefecture is a data set consisting of x and y coordinates in plane rectangular coordinate system 5 plus elevation values. To build a 3D model I also want color information. So, first I create a tool that obtains the color of each mesh point from the GSI tiles: [Latest nationwide photos (seamless)](https://maps.gsi.go.jp/development/ichiran.html#seamlessphoto) image tiles and converts the DSM into XYZ RGB format data. Next, I create a tool that generates faces from the XYZ RGB vertex data and outputs a 3D model in PLY format. For the PLY format specification, I referred to an external reference site.
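For illustration, here is what a single record looks like before and after conversion (the values and the whitespace separator are made up for this example; the converter below handles both space- and comma-separated input):

# one DSM record: x y (plane rectangular coordinates, metres) and elevation z (metres)
78000.00 -145000.00 35.12

# the same record after conversion: x y z r g b
78000.00 -145000.00 35.12 142 138 129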

3. Implementation of the DSM -> XYZ RGB conversion tool

(1) Process flow

The processing flow is as follows.

  1. Scan the x and y coordinates of the DSM (plane rectangular coordinate system 5) and obtain the extent of the DSM point cloud.
  2. Convert the DSM point cloud extent to a rectangular area in latitude and longitude.
  3. Convert the latitude/longitude rectangle to pixel coordinates and determine which tiles need to be acquired (a small sketch of this calculation follows the list).
  4. Download the image tiles covering the DSM extent and stitch them together.
  5. Convert each DSM point's coordinates to pixel coordinates and pick up the color of the corresponding pixel from the stitched aerial photograph.
  6. Write the result to a text file in the format "x y z r g b".
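To make steps 3 and 4 concrete, here is a minimal sketch of the pixel and tile arithmetic, using the same Web Mercator formulas as the GSITileReader class below (TileMathSketch, the sample point, and zoom level 18 are only illustrative):

public class TileMathSketch {
	// inverse hyperbolic tangent, as used by GSITileReader.lonlatToPixel below
	static double atanh(double v){ return 0.5*Math.log((1.0+v)/(1.0-v)); }

	public static void main(String[] args){
		double lon=135.19, lat=34.69;          // hypothetical point near Kobe
		double L=85.05112877980659;            // Web Mercator latitude limit
		int zoom=18;                           // zoom level used later by XYZRgbCreator
		long px=(long)(Math.pow(2, zoom+7)*(lon/180.0+1.0));
		long py=(long)((Math.pow(2, zoom+7)/Math.PI)*(-atanh(Math.sin(Math.toRadians(lat)))+atanh(Math.sin(Math.toRadians(L)))));
		System.out.println("pixel coordinates: "+px+", "+py);
		System.out.println("tile coordinates : "+(px/256)+", "+(py/256));   // 256 px per tile
		// tile URL: https://cyberjapandata.gsi.go.jp/xyz/seamlessphoto/{zoom}/{tileX}/{tileY}.jpg
	}
}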

(2) Implementation of the coordinate conversion class (plane rectangular coordinate system <-> latitude / longitude)

First, implement a class that converts x and y coordinates in the plane rectangular coordinate system to latitude and longitude. I implemented the calculation described in the Bulletin of the Geospatial Information Authority of Japan, "A simpler calculation method for coordinate conversion between latitude/longitude and plane rectangular coordinates in the Gauss-Krüger projection", and created the following class (a short usage sketch follows the listing).

LonLatXY.java


import java.awt.geom.Point2D;

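// Converts between coordinates in the Japan Plane Rectangular Coordinate System and
// longitude/latitude, following the GSI "simpler calculation method" for the
// Gauss-Krüger projection. The num argument is the coordinate system (zone) number, 1-19.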
public class LonLatXY {
	private static final double a=6378137;
	private static final double rf=298.257222101;
	private static final double m0=0.9999;
	private static final double s2r=Math.PI/648000;
	private static final double n=0.5/(rf-0.5);
	private static final double n15=1.5*n;
	private static final double anh=0.5*a/(1+n);
	private static final double nsq=n*n;
	private static final double e2n=2*Math.sqrt(n)/(1+n);
	private static final double ra=2*anh*m0*(1+nsq/4+nsq*nsq/64);
	private static int jt=5;
	private static int jt2=2*jt;
	private static double ep=1.0;
	private static double[] e=getE();
	private static final double[] phi0=new double[]{0,33,33,36,33,36,36,36,36,36,40,44,44,44,26,26,26,26,20,26};
	private static final double[] lmbd0=new double[]{0,7770,7860,7930,8010,8060,8160,8230,8310,8390,8450,8415,8535,8655,8520,7650,7440,7860,8160,9240};
	private static double[] alp=getAlp();
	private static double[] beta=getBeta();
	private static double[] dlt=getDlt();


	private static double[] getAlp(){
		double[] alp=new double[6];
		alp[1]=(1.0/2.0+(-2.0/3.0+(5.0/16.0+(41.0/180.0-127.0/288.0*n)*n)*n)*n)*n;
		alp[2]=(13.0/48.0+(-3.0/5.0+(557.0/1440.0+281.0/630.0*n)*n)*n)*nsq;
		alp[3]=(61.0/240.0+(-103.0/140.0+15061.0/26880.0*n)*n)*n*nsq;
		alp[4]=(49561.0/161280.0-179.0/168.0*n)*nsq*nsq;
		alp[5]=34729.0/80640.0*n*nsq*nsq;
		return alp;
	}

	private static double[] getBeta(){
		double[] beta=new double[6];
		beta[1]=(1.0/2.0+(-2.0/3.0+(37.0/96.0+(-1.0/360.0-81.0/512.0*n)*n)*n)*n)*n;
		beta[2]=(1.0/48.0+(1.0/15.0+(-437.0/1440.0+46.0/105.0*n)*n)*n)*nsq;
		beta[3]=(17.0/480.0+(-37.0/840.0-209.0/4480.0*n)*n)*n*nsq;
		beta[4]=(4397.0/161280.0-11.0/504.0*n)*nsq*nsq;
		beta[5]=4583.0/161280.0*n*nsq*nsq;
		return beta;
	}

	private static double[] getDlt(){
		double[] dlt=new double[7];
		dlt[1]=(2.0+(-2.0/3.0+(-2.0+(116.0/45.0+(26.0/45.0-2854.0/675.0*n)*n)*n)*n)*n)*n;
		dlt[2]=(7.0/3.0+(-8.0/5.0+(-227.0/45.0+(2704.0/315.0+2323.0/945.0*n)*n)*n)*n)*nsq;
		dlt[3]=(56.0/15.0+(-136.0/35.0+(-1262.0/105.0+73814.0/2835.0*n)*n)*n)*n*nsq;
		dlt[4]=(4279.0/630.0+(-332.0/35.0-399572.0/14175.0*n)*n)*nsq*nsq;
		dlt[5]=(4174.0/315.0-144838.0/6237.0*n)*n*nsq*nsq;
		dlt[6]=601676.0/22275.0*nsq*nsq*nsq;
		return dlt;
	}

	private static double[] getE(){
		double[] e=new double[jt2+2];
		for(int k=1;k<=jt;k++){
			ep*=e[k]=n15/k-n;
			e[k+jt]=n15/(k+jt)-n;
		}
		return e;
	}

	public static Point2D xyToLonLat(int num,double xx,double yy){
		double x=yy;
		double y=xx;
		double xi=(x+m0*Merid(2*phi0[num]*3600*s2r))/ra;
		double xip=xi;
		double eta=y/ra;
		double etap=eta;
		double sgmp=1;
		double taup=0;
		for(int j=beta.length-1;j>0;j--){
			double besin=beta[j]*Math.sin(2*j*xi);
			double becos=beta[j]*Math.cos(2*j*xi);
			xip -=besin*Math.cosh(2*j*eta);
			etap -=becos*Math.sinh(2*j*eta);
			sgmp -=2*j*becos*Math.cosh(2*j*eta);
			taup +=2*j*besin*Math.sinh(2*j*eta);
		}
		double sxip=Math.sin(xip);
		double cxip=Math.cos(xip);
		double shetap=Math.sinh(etap);
		double chetap=Math.cosh(etap);
		double chi=Math.asin(sxip/chetap);
		double phi=chi;
		for(int j=dlt.length-1;j>=0;j--){
			phi +=dlt[j]*Math.sin(2*j*chi);
		}
		double nphi=(1-n)/(1+n)*Math.tan(phi);

		double lmbd=lmbd0[num]*60+Math.atan2(shetap, cxip)/s2r;
		double lat=phi/s2r/3600;
		double lon=lmbd/3600;
		return new Point2D.Double(lon,lat);
	}

	public static Point2D lonlatToXY(int num,double lon,double lat){
		double phirad=Math.toRadians(lat);
		double lmbddeg=Math.floor(lon);
		double lmbdmin=Math.floor(60.0*(lon-lmbddeg));
		double lmbdsec=lmbddeg*3600.0+lmbdmin*60.0+(lon-lmbddeg-lmbdmin/60)*3600.0;

		double sphi=Math.sin(phirad);
		double nphi=(1-n)/(1+n)*Math.tan(phirad);
		double dlmbd=(lmbdsec-lmbd0[num]*60.0)*s2r;
		double sdlmbd=Math.sin(dlmbd);
		double cdlmbd=Math.cos(dlmbd);
		double tchi=Math.sinh(atanh(sphi)-e2n*atanh(e2n*sphi));
		double cchi=Math.sqrt(1+tchi*tchi);
		double xip=Math.atan2(tchi, cdlmbd);
		double xi=xip;
		double etap=atanh(sdlmbd/cchi);
		double eta=etap;
		double sgm=1;
		double tau=0;
		for(int j=alp.length-1;j>0;j--){
			double alsin=alp[j]*Math.sin(2*j*xip);
			double alcos=alp[j]*Math.cos(2*j*xip);
			xi +=alsin*Math.cosh(2*j*etap);
			eta +=alcos*Math.sinh(2*j*etap);
			sgm +=2*j*alcos*Math.cosh(2*j*etap);
			tau +=2*j*alsin*Math.sinh(2*j*etap);
		}
		double x=ra*xi-m0*Merid(2*phi0[num]*3600*s2r);
		double y=ra*eta;
		return new Point2D.Double(x,y);
	}

	private static double Merid(double phi2) {
			double dc=2.0*Math.cos(phi2);
			double[] s=new double[jt2+2];
			double[] t=new double[jt2+2];
			s[1]=Math.sin(phi2);
			for(int i=1;i<=jt2;i++){
				s[i+1]=dc*s[i]-s[i-1];
				t[i]=(1.0/i-4.0*i)*s[i];
			}
			double sum=0.0;
			double c1=ep;
			int j=jt;
			while(j>0){
				double c2=phi2;
				double c3=2.0;
				int l=j;
				int m=0;
				while(l>0){
					c2 +=(c3/=e[l--])*t[++m]+(c3*=e[2*j-l])*t[++m];
				}
				sum +=c1*c1*c2 ; c1/=e[j--];
			}
			return anh*(sum+phi2);
	}

	private static double atanh(double v){
		return 0.5*Math.log((1.0+v)/(1.0-v));
	}
}
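As a quick sanity check, here is a minimal usage sketch (LonLatXYDemo and the zone-5 coordinates are hypothetical; as the code above is written, xyToLonLat takes the map x / easting first and the map y / northing second, while lonlatToXY returns them in the opposite order):

import java.awt.geom.Point2D;

public class LonLatXYDemo {
	public static void main(String[] args){
		// hypothetical zone-5 point: easting 78 km, northing -145 km (roughly the Kobe area)
		Point2D lonlat=LonLatXY.xyToLonLat(5, 78000.0, -145000.0);
		System.out.println("lon="+lonlat.getX()+" lat="+lonlat.getY());

		// round trip back to plane coordinates; lonlatToXY returns (northing, easting)
		Point2D xy=LonLatXY.lonlatToXY(5, lonlat.getX(), lonlat.getY());
		System.out.println("x="+xy.getX()+" y="+xy.getY());
	}
}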

(3) Implementation of aerial photograph tile image acquisition class

I created the following class to download tile images from the GSI tile server and stitch them together (a short usage sketch follows the listing).

GSITileReader.java


import java.awt.Graphics2D;
import java.awt.Shape;
import java.awt.geom.AffineTransform;
import java.awt.geom.GeneralPath;
import java.awt.geom.Point2D;
import java.awt.geom.Rectangle2D;
import java.awt.image.BufferedImage;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.net.URL;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import javax.imageio.ImageIO;

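// Downloads GSI "seamlessphoto" aerial image tiles covering a plane rectangular extent,
// stitches them into one image, and resamples the result to the requested
// metres-per-pixel resolution in plane rectangular coordinates.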
public class GSITileReader {
	private static final String base_url="https://cyberjapandata.gsi.go.jp/xyz/seamlessphoto/";
	private static final double L=85.05112877980659;

	public static BufferedImage getGSIImage(Rectangle2D plane,int map_num,int zoom,double scale)throws IOException{
		Shape sp=getLonLatShapeAtPlane(map_num, plane.getX(), plane.getY(), plane.getWidth(), plane.getHeight());
		Rectangle2D rect=sp.getBounds2D();
		long[] topLeft=lonlatToPixel(zoom,rect.getX(),rect.getY());
		long[] bottomRight=lonlatToPixel(zoom,rect.getX()+rect.getWidth(),rect.getY()+rect.getHeight());
		Set<Long> x=new HashSet<Long>();
		Set<Long> y=new HashSet<Long>();
		for(long i=topLeft[1];i<=bottomRight[1];i++){
			x.add((long)Math.ceil(i/256));
		}
		for(long i=bottomRight[2];i<=topLeft[2];i++){
			y.add((long)Math.ceil(i/256));
		}
		Long[] xx=x.toArray(new Long[x.size()]);
		Long[] yy=y.toArray(new Long[y.size()]);
		Arrays.sort(xx);
		Arrays.sort(yy);
		BufferedImage im=new BufferedImage(xx.length*256,yy.length*256,BufferedImage.TYPE_INT_RGB);
		Graphics2D g=im.createGraphics();
		for(int i=0;i<xx.length;i++){
			for(int j=0;j<yy.length;j++){
				try{
					String url=base_url+Integer.toString(zoom)+"/"+Long.toString(xx[i])+"/"+Long.toString(yy[j])+".jpg";
					BufferedImage tmp=ImageIO.read(new URL(url));
					g.drawImage(tmp, i*256, j*256, null);
				}catch(IOException e){
					e.printStackTrace();
				}
			}
		}
		g.dispose();
		im=subImage(im,plane,(long)xx[0],(long)yy[0],map_num,zoom,scale);
		return im;
	}

	private static BufferedImage subImage(BufferedImage src,Rectangle2D rect,long minX,long minY,int num,int zoom,double scale){
		double xx=rect.getX();
		double yy=rect.getY();
		double ww=Math.abs(rect.getWidth());
		double hh=Math.abs(rect.getHeight());
		minX=minX*256;
		minY=minY*256;
		BufferedImage dst=new BufferedImage((int)(ww/scale),(int)(hh/scale),BufferedImage.TYPE_INT_RGB);
		System.out.println(dst.getWidth()+"/"+dst.getHeight());
		System.out.println(minX+"/"+minY);
		for(int i=0;i<dst.getWidth();i++){
			for(int j=0;j<dst.getHeight();j++){
				double x=xx+scale*i;
				double y=yy-scale*j;
				Point2D p=LonLatXY.xyToLonLat(num, x, y);
				long[] pc=lonlatToPixel(zoom, p.getX(), p.getY());
				int px=(int)(pc[1]-minX);
				int py=(int)(pc[2]-minY);
				int color=src.getRGB(px, py);
				dst.setRGB(i, j, color);
			}
		}
		return dst;
	}

	public static long[] lonlatToPixel(int zoom,double lon,double lat){
		long x=(long)(Math.pow(2, zoom+7)*(lon/180.0+1.0));
		long y=(long)((Math.pow(2, zoom+7)/Math.PI)*(-atanh(Math.sin(Math.toRadians(lat)))+atanh(Math.sin(Math.toRadians(L)))));
		return new long[]{(long)zoom,x,y};
	}

	private static double atanh(double v){
		return 0.5*Math.log((1.0+v)/(1.0-v));
	}

	public static AffineTransform createTfwTransform(Rectangle2D rectXY,BufferedImage img){
		double sx=rectXY.getWidth()/img.getWidth();
		double sy=rectXY.getHeight()/img.getHeight();
		double x=rectXY.getX();
		double y=rectXY.getY()+rectXY.getHeight();
		AffineTransform af=new AffineTransform(new double[]{sx,0,0,-sy,x,y});
        	return af;
	}

	public static Shape getLonLatShapeAtPlane(int num,double x,double y,double w,double h){
		Point2D p1=LonLatXY.xyToLonLat(num, x, y);
		Point2D p2=LonLatXY.xyToLonLat(num, x+w, y);
		Point2D p3=LonLatXY.xyToLonLat(num, x+w, y+h);
		Point2D p4=LonLatXY.xyToLonLat(num, x, y+h);
		GeneralPath gp=new GeneralPath();
		gp.moveTo(p1.getX(),p1.getY());
		gp.lineTo(p2.getX(), p2.getY());
		gp.lineTo(p3.getX(), p3.getY());
		gp.lineTo(p4.getX(), p4.getY());
		gp.closePath();
		return gp;
	}

	public static void outTfw(AffineTransform af,File out)throws IOException{
		BufferedWriter bw=new BufferedWriter(new FileWriter(out));
		bw.write(af.getScaleX()+"\n");
		bw.write(af.getShearY()+"\n");
		bw.write(af.getShearX()+"\n");
		bw.write(af.getScaleY()+"\n");
		bw.write(af.getTranslateX()+"\n");
		bw.write(af.getTranslateY()+"\n");
		bw.close();
	}
}
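A minimal usage sketch (GSITileReaderDemo, the extent, and the output file name are hypothetical; zone 5, zoom 18 and 0.5 m per pixel match what XYZRgbCreator uses below):

import java.awt.geom.Rectangle2D;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class GSITileReaderDemo {
	public static void main(String[] args) throws IOException {
		// hypothetical 500 m x 500 m zone-5 extent, built the same way as
		// XYZRgbCreator.getBounds: (xmin, ymax, width, negative height)
		Rectangle2D area=new Rectangle2D.Double(78000, -145000, 500, -500);
		BufferedImage img=GSITileReader.getGSIImage(area, 5, 18, 0.5);
		ImageIO.write(img, "jpg", new File("area.jpg"));
	}
}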

(4) XYZ RGB output class

Finally, I created the tool that converts the DSM to XYZ RGB, including the file input / output. This class looks up the pixel corresponding to each DSM coordinate in the acquired aerial photograph image and writes the result in XYZ RGB format. For good measure, it also supports plane rectangular coordinate systems other than system 5.

XYZRgbCreator.java


import java.awt.BorderLayout;
import java.awt.Color;
import java.awt.Font;
import java.awt.datatransfer.DataFlavor;
import java.awt.datatransfer.Transferable;
import java.awt.datatransfer.UnsupportedFlavorException;
import java.awt.event.WindowAdapter;
import java.awt.event.WindowEvent;
import java.awt.geom.AffineTransform;
import java.awt.geom.Point2D;
import java.awt.geom.Rectangle2D;
import java.awt.image.BufferedImage;
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.util.List;

import javax.imageio.ImageIO;
import javax.swing.JComboBox;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.JOptionPane;
import javax.swing.JPanel;
import javax.swing.SwingUtilities;
import javax.swing.TransferHandler;
import javax.swing.UIManager;

public class XYZRgbCreator {

	private JFrame frame;
	private JComboBox<String> zone;

	public XYZRgbCreator(){
		frame=new JFrame();
		frame.setTitle("XYZRGB");
		frame.setDefaultCloseOperation(JFrame.DO_NOTHING_ON_CLOSE);
		try {
			UIManager.setLookAndFeel("com.sun.java.swing.plaf.nimbus.NimbusLookAndFeel");
			SwingUtilities.updateComponentTreeUI(frame);
		}catch(Exception e){
			try {
				UIManager.setLookAndFeel("com.sun.java.swing.plaf.windows.WindowsLookAndFeel");
				SwingUtilities.updateComponentTreeUI(frame);
			}catch(Exception ee){
				ee.printStackTrace();
			}
		}
		WindowAdapter wa=new WindowAdapter(){
			@Override
			public void windowClosing(WindowEvent e) {
				close();
			}
		};
		frame.addWindowListener(wa);
		frame.getContentPane().setLayout(new BorderLayout());
		zone=new JComboBox<String>(new String[]{
				null,"Plane rectangular system 01","Plane rectangular system 02","Plane rectangular system 03","Plane rectangular system 04","Plane rectangular system 05","Plane rectangular system 06","Plane rectangular system 07","Plane rectangular system 08",
				"Plane rectangular system 09","Plane rectangular system 10","Plane rectangular system 11","Plane rectangular system 12","Plane rectangular system 13","Plane rectangular system 14","Plane rectangular system 15","Plane rectangular system 16",
				"Plane rectangular system 17","Plane rectangular system 18","Plane rectangular system 19"
			});
		zone.setSelectedIndex(5);
		frame.getContentPane().add(zone,BorderLayout.NORTH);
		JLabel label=new JLabel("DSM > XYZRGB");
		label.setFont(new Font(Font.SANS_SERIF,Font.BOLD,24));
		label.setVerticalAlignment(JLabel.CENTER);
		label.setHorizontalAlignment(JLabel.CENTER);
		JPanel jp=new JPanel(new BorderLayout());
		jp.add(label,BorderLayout.CENTER);
		frame.getContentPane().add(jp,BorderLayout.CENTER);
		frame.setSize(480,480);
		frame.setResizable(false);
		DropFileHandler handler=new DropFileHandler();
		label.setTransferHandler(handler);
		jp.setTransferHandler(handler);
	}

	private void close(){
		int id=JOptionPane.showConfirmDialog(frame, "Exit?", "Info", JOptionPane.YES_NO_OPTION,JOptionPane.INFORMATION_MESSAGE);
		if(id==JOptionPane.YES_OPTION){
			frame.setVisible(false);
			System.exit(0);
		}
	}

	private class DropFileHandler extends TransferHandler {
		private static final long serialVersionUID = 1L;
		@Override
		public boolean canImport(TransferSupport support) {
			if (!support.isDrop()) {
				return false;
			}
			if (!support.isDataFlavorSupported(DataFlavor.javaFileListFlavor)) {
				return false;
			}
			return true;
		}

		@SuppressWarnings("unchecked")
		@Override
		public boolean importData(TransferSupport support) {
			if (!canImport(support)) {
		        return false;
		    }
			Transferable t = support.getTransferable();
			try {
				List<File> files = (List<File>) t.getTransferData(DataFlavor.javaFileListFlavor);
				for (File file : files){
					if(file.isDirectory())continue;
					try{
						process(file);
					}catch(IOException e){
						e.printStackTrace();
					}
				}
			} catch (UnsupportedFlavorException | IOException e) {
				e.printStackTrace();
			}
			return true;
		}
	}

	private void process(File f) throws IOException{
		if(!(f.getName().endsWith(".txt")||f.getName().endsWith(".xyz")))return;
		frame.repaint();
		int bwz=zone.getSelectedIndex();
		Rectangle2D rectXY=getBounds(f);
		BufferedImage img=GSITileReader.getGSIImage(rectXY,bwz,18,0.5);
		String name=f.getName().substring(0,f.getName().lastIndexOf("."));
		File dir=f.getParentFile();
		ImageIO.write(img, "jpg", new File(dir.getAbsolutePath()+"/"+name+".jpg"));
		AffineTransform af=new AffineTransform(new double[]{
				0.5,0,0,-0.5,rectXY.getX(),rectXY.getY()});
		GSITileReader.outTfw(af, new File(dir.getAbsolutePath()+"/"+name+".jgw"));
		try{
			af=af.createInverse();
			BufferedReader br=new BufferedReader(new FileReader(f));
			File out=new File(dir.getAbsolutePath()+"/"+name+"_color.txt");
			BufferedWriter bw=new BufferedWriter(new FileWriter(out));
			String line=null;
			String str=null;
			Rectangle2D imgRect=new Rectangle2D.Double(0, 0, img.getWidth(),img.getHeight());
			while((line=br.readLine())!=null){
				line=line.trim().replaceAll("[,\\s]+"," ");	// normalize separators to single spaces
				String[] sp=line.split(" ");
				double x=Double.parseDouble(sp[0]);
				double y=Double.parseDouble(sp[1]);
				Point2D p=af.transform(new Point2D.Double(x,y), new Point2D.Double());
				int xx=(int)Math.floor(p.getX());
				int yy=(int)Math.floor(p.getY());
				if(imgRect.contains(xx, yy)){
					int col=img.getRGB(xx, yy);
					Color color=new Color(col);
					str=line+" "+Integer.toString(color.getRed())+" "+Integer.toString(color.getGreen())+" "+Integer.toString(color.getBlue())+"\n";
				}else{
					str=line+" 0 0 0\n";
				}
				bw.write(str);
				bw.flush();
			}
			br.close();
			bw.close();
		}catch(Exception e){
			e.printStackTrace();
		}
	}

	private Rectangle2D getBounds(File f)throws IOException{
		double xmin=Double.MAX_VALUE;
		double xmax=-Double.MAX_VALUE;
		double ymin=Double.MAX_VALUE;
		double ymax=-Double.MAX_VALUE;
		BufferedReader br=new BufferedReader(new FileReader(f));
		String line=null;
		while((line=br.readLine())!=null){
			line=line.trim().replaceAll("[,\\s]+"," ");	// normalize separators to single spaces
			String[] sp=line.split(" ");
			double x=Double.parseDouble(sp[0]);
			double y=Double.parseDouble(sp[1]);
			xmin=Math.min(x, xmin);
			xmax=Math.max(x, xmax);
			ymin=Math.min(y, ymin);
			ymax=Math.max(y, ymax);
		}
		br.close();
		Rectangle2D ret=new Rectangle2D.Double(xmin,ymax,xmax-xmin+1,-(ymax-ymin)-1);
		return ret;
	}

	public static void main(String[] args){
		XYZRgbCreator gp=new XYZRgbCreator();
		gp.frame.setLocationRelativeTo(null);
		gp.frame.setVisible(true);
	}
}

(5) Deliverables

When you run XYZRgbCreator, the window shown below is displayed. Drop a Hyogo Prefecture whole-area numerical topographic map DSM file onto the window and processing starts, and the XYZ RGB file is output. Ideally the tool would display progress information, but I omitted that this time.

999.jpg

4. Implementation of the XYZ RGB -> PLY conversion tool

Next, implement a tool that converts the XYZ RGB file into a PLY file. The DSM model is a surface model, a 3D structure in which only the surface is defined. I therefore figured that a 3D model could be generated by running a 2D [Delaunay triangulation](https://ja.wikipedia.org/wiki/%E3%83%89%E3%83%AD%E3%83%8D%E3%83%BC%E5%9B%B3) on the points and turning the resulting triangles into faces. For the Delaunay triangulation I used the 2D Delaunay triangulation library "TINFOUR". I checked the required information on mvnrepository (tinfour) and added it to the dependencies in the pom file, as sketched below.
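For reference, a hedged sketch of the pom.xml dependency (these are the coordinates I believe mvnrepository lists for the Tinfour core module; check there for the current version number):

<!-- assumed artifact coordinates and version; verify on mvnrepository before use -->
<dependency>
	<groupId>org.tinfour</groupId>
	<artifactId>TinfourCore</artifactId>
	<version>2.1.7</version>
</dependency>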

(1) Process flow

The processing flow is as follows.

  1. Create a vertex list from the XYZ information of the XYZ RGB file and a color information list from the RGB information.
  2. Pass the vertex list to an IncrementalTin instance and perform a 2D Delaunay triangulation.
  3. Output the required header items according to the PLY specification (an example header follows this list).
  4. Output vertex and color information.
  5. Output Face information.
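For reference, the header written by writePLY looks like the following (the vertex and face counts are placeholders; the real counts come from the vertex list and from tin.countTriangles()):

ply
format ascii 1.0
element vertex 250000
property float x
property float y
property float z
property uchar red
property uchar green
property uchar blue
element face 499000
property list uchar int vertex_index
end_header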

(2) Implementation of PlyCreator class

Implement the PlyCreator class that performs the above process. To keep things simple, I implemented it as a command line tool.

PlyCreator.java


import java.awt.Color;
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.List;
import java.util.Observable;
import java.util.function.Consumer;

import org.tinfour.common.SimpleTriangle;
import org.tinfour.common.Vertex;
import org.tinfour.standard.IncrementalTin;
import org.tinfour.utils.TriangleCollector;

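// Reads an XYZ RGB point file, builds a 2D Delaunay TIN with Tinfour's IncrementalTin,
// and writes the triangulated surface as an ASCII PLY model.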
public class PlyCreator extends Observable{
	private List<Vertex> vertexs;
	private List<Color> colors;
	private IncrementalTin tin;

	public PlyCreator(){}

	private void outVertex(BufferedWriter bw)throws IOException{
		System.out.println("Vertex output");
		for(Vertex v : vertexs){
			StringBuffer buf=new StringBuffer();
			buf.append(Float.toString((float)v.getX())+" ");
			buf.append(Float.toString((float)v.getY())+" ");
			buf.append(Float.toString((float)v.getZ())+" ");
			Color c=colors.get(v.getIndex());
			buf.append(Integer.toString(c.getRed())+" ");
			buf.append(Integer.toString(c.getGreen())+" ");
			buf.append(Integer.toString(c.getBlue())+"\n");
			writeBytes(bw,buf.toString());
		}
	}

	private void outFace(BufferedWriter bw)throws IOException{
		System.out.println("Face output");
		Consumer<SimpleTriangle> cons=new Consumer<SimpleTriangle>() {
		    @Override
		    public void accept(SimpleTriangle arg){
				StringBuffer buf=new StringBuffer();
				buf.append("3");
				buf.append(" "+Integer.toString(arg.getVertexA().getIndex()));
				buf.append(" "+Integer.toString(arg.getVertexB().getIndex()));
				buf.append(" "+Integer.toString(arg.getVertexC().getIndex()));
				buf.append("\n");
				try{
					writeBytes(bw,buf.toString());
				}catch(IOException e){
					e.printStackTrace();
				}
		    }
		};
		TriangleCollector.visitSimpleTriangles(tin, cons);
	}

	private void writeBytes(BufferedWriter bw,String str)throws IOException{
		bw.write(str,0,str.length());
	}

	public void writePLY(File f)throws IOException{
		System.out.println("PLY output");
		Charset charset = Charset.forName("US-ASCII");
		BufferedWriter bw = Files.newBufferedWriter(f.toPath(),charset);
		writeBytes(bw,"ply\n");
		writeBytes(bw,"format ascii 1.0\n");
		writeBytes(bw,"element vertex "+Integer.toString(vertexs.size())+"\n");
		writeBytes(bw,"property float x\n");
		writeBytes(bw,"property float y\n");
		writeBytes(bw,"property float z\n");
		writeBytes(bw,"property uchar red\n");
		writeBytes(bw,"property uchar green\n");
		writeBytes(bw,"property uchar blue\n");
		writeBytes(bw,"element face "+Integer.toString(tin.countTriangles().getCount())+"\n");
		writeBytes(bw,"property list uchar int vertex_index\n");
		writeBytes(bw,"end_header\n");
		outVertex(bw);
		outFace(bw);
		bw.close();
	}

	public void readXYZGRB(File f,String separator)throws IOException{
		System.out.println("File reading");
		BufferedReader br=new BufferedReader(new FileReader(f));
		String line=null;
		vertexs=new ArrayList<Vertex>();
		colors=new ArrayList<Color>();
		int id=0;
		while((line=br.readLine())!=null){
			String[] sp=line.split(separator);
			Vertex v=new Vertex(
				Double.parseDouble(sp[0]),
				Double.parseDouble(sp[1]),
				Double.parseDouble(sp[2]),
				id++
			);
			vertexs.add(v);
			Color c=new Color(
				Integer.parseInt(sp[3]),
				Integer.parseInt(sp[4]),
				Integer.parseInt(sp[5])
			);
			colors.add(c);
		}
		br.close();
		System.out.println("TIN processing");
		tin=new IncrementalTin();
		tin.add(vertexs, null);
	}

	public static void main(String[] args){
		 PlyCreator pc=new  PlyCreator();
		 File in=new File(args[0]);
		 File out=new File(args[1]);
		 try{
			 pc.readXYZGRB(in, "[,\\s]+");	// regex separator: accepts space- or comma-separated input
			 pc.writePLY(out);
		 }catch(Exception e){
			 e.printStackTrace();
		 }
	}
}

(3) Deliverables

When the PlyCreator class is run with the input file (XYZ RGB file) and the output file (PLY file) as arguments, a 3D model in PLY format is generated.

898.jpg

If you load the PLY file into a 3D viewer such as MeshLab, the created model is displayed. The image below is a model generated from the DSM around JR Sannomiya Station.

snapshot02.png

5. Summary

I sampled color information from GSI aerial photographs, converted the Hyogo Prefecture whole-area numerical topographic map DSM into an XYZ RGB file, and generated a model in PLY format. The urban model may look a little rough, but the image below, generated from the DSM of a forested area in Sayo Town, shows that with a 1 m mesh you can obtain a terrain model that captures differences in the canopy from one tree species to another.

snapshot00.png

The Hyogo Prefecture homepage says, "We are soliciting ideas and proposals for utilization in order to combine this data and other data with cutting-edge technology and to promote efforts to solve regional issues through collaboration among the private sector, industry, academia, and government." Now that the DSM and DEM for the whole of Hyogo Prefecture are public, the whole-area numerical topographic map can be used for all kinds of purposes. Outside Hyogo, Shizuoka Prefecture publishes point cloud data on its "Shizuoka Point Cloud Database"; it would be interesting to see other prefectures move to publish this kind of public survey data as well.
