There are many ways to read a file in Java, so I measured which one is the fastest. For this test, each approach reads all lines of the file.
1. Files.readAllLines
Reads all lines of a file using readAllLines of java.nio.file.Files, introduced in Java 7. Internally it simply creates a BufferedReader, reads the file line by line into a List, and returns it.
try {
    List<String> lines = Files.readAllLines(path, StandardCharsets.UTF_8);
} catch (IOException e) {
    e.printStackTrace();
}
2. BufferedReader + FileReader
Wraps a FileReader in a BufferedReader and reads line by line. Note that FileReader uses the platform default encoding.
try (BufferedReader reader = new BufferedReader(new FileReader(file))) {
    String str;
    List<String> lines = new ArrayList<>();
    while ((str = reader.readLine()) != null) {
        lines.add(str);
    }
} catch (IOException e) {
    e.printStackTrace();
}
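If you are on Java 11 or later, FileReader gained a constructor that takes an explicit Charset, which removes the platform-default-encoding pitfall while keeping this style. A hedged sketch (the class name and temp-file setup are my own illustration; the FileReader(File, Charset) constructor itself is a real Java 11+ API):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class FileReaderCharsetDemo {
    public static void main(String[] args) throws IOException {
        // Self-contained setup: a small temporary file with sample content
        Path path = Files.createTempFile("demo", ".txt");
        Files.write(path, List.of("hello", "world"), StandardCharsets.UTF_8);

        // Since Java 11, FileReader accepts an explicit Charset,
        // avoiding the platform-default-encoding pitfall
        List<String> lines = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(
                new FileReader(path.toFile(), StandardCharsets.UTF_8))) {
            String str;
            while ((str = reader.readLine()) != null) {
                lines.add(str);
            }
        }
        System.out.println(lines.size()); // 2
        System.out.println(lines.get(0)); // hello

        Files.delete(path);
    }
}
```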
3. BufferedReader + InputStreamReader + FileInputStream
Same as above, but specifies the character encoding explicitly via InputStreamReader.
try (BufferedReader reader = new BufferedReader(
        new InputStreamReader(new FileInputStream(file), StandardCharsets.UTF_8))) {
    String str;
    List<String> lines = new ArrayList<>();
    while ((str = reader.readLine()) != null) {
        lines.add(str);
    }
} catch (IOException e) {
    e.printStackTrace();
}
4. Files.lines
Reads the file as a Stream<String> using Files.lines, introduced in Java 8, and collects it into a List.
try {
    List<String> lines = Files.lines(path, StandardCharsets.UTF_8).collect(Collectors.toList());
} catch (IOException e) {
    e.printStackTrace();
}
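One caveat with Files.lines: the returned Stream holds an open file handle, so it should be closed, ideally with try-with-resources. A self-contained sketch of the safer form (the class name and temp-file setup are illustrative):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class FilesLinesDemo {
    public static void main(String[] args) throws IOException {
        // Self-contained setup: a small temporary file
        Path path = Files.createTempFile("demo", ".txt");
        Files.write(path, List.of("one", "two", "three"), StandardCharsets.UTF_8);

        // The stream holds an open file handle, so close it via try-with-resources
        List<String> lines;
        try (Stream<String> stream = Files.lines(path, StandardCharsets.UTF_8)) {
            lines = stream.collect(Collectors.toList());
        }
        System.out.println(lines.size()); // 3

        Files.delete(path);
    }
}
```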
I ran the measurement with the following source. TestFile is a 200,000-line text file.
import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

public class ReadSpeedTest {

    private static final String PATH = "src/test/java/com/test/common/TestFile";

    public static void main(String[] args) {
        Path path = Paths.get(PATH);
        File file = path.toFile();

        // 1. Files.readAllLines measurement
        Runnable test1 = () -> {
            try {
                List<String> lines = Files.readAllLines(path, StandardCharsets.UTF_8);
            } catch (IOException e) {
                e.printStackTrace();
            }
        };

        // 2. BufferedReader + FileReader measurement
        Runnable test2 = () -> {
            try (BufferedReader reader = new BufferedReader(new FileReader(file))) {
                String str;
                List<String> lines = new ArrayList<>();
                while ((str = reader.readLine()) != null) {
                    lines.add(str);
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        };

        // 3. BufferedReader + InputStreamReader + FileInputStream measurement
        Runnable test3 = () -> {
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(new FileInputStream(file), StandardCharsets.UTF_8))) {
                String str;
                List<String> lines = new ArrayList<>();
                while ((str = reader.readLine()) != null) {
                    lines.add(str);
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        };

        // 4. Files.lines measurement
        Runnable test4 = () -> {
            try {
                List<String> lines = Files.lines(path, StandardCharsets.UTF_8).collect(Collectors.toList());
            } catch (IOException e) {
                e.printStackTrace();
            }
        };

        calcFuncTime(test1);
        calcFuncTime(test2);
        calcFuncTime(test3);
        calcFuncTime(test4);
    }

    /**
     * Measures the execution time of the given task in nanoseconds.
     * @param runnable the task to measure
     */
    private static void calcFuncTime(Runnable runnable) {
        long start = System.nanoTime();
        runnable.run();
        long end = System.nanoTime();
        System.out.println(end - start);
    }
}
Execution result (in nanoseconds, in the order the tests were run):
test1 (Files.readAllLines): 101206619
test2 (BufferedReader + FileReader): 178792154
test3 (BufferedReader + InputStreamReader + FileInputStream): 53933069
test4 (Files.lines): 102386121
Looking at the results, the slowest was the second approach (FileReader) and the fastest was the third (FileInputStream). Still, the difference is only about 0.1 seconds over 200,000 lines, so in most cases it is hardly worth worrying about. Personally, I don't recommend FileReader, since it cannot specify a character encoding and was also the slowest. The FileInputStream approach is the fastest but also the most verbose, so the methods using Files are probably the best choice overall. That's all.
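One caveat about the numbers above: a single System.nanoTime measurement includes JIT compilation and OS file-cache effects, so the first run tends to be slower than steady state. A common mitigation is to warm up and average several runs; a minimal sketch (averageNanos is a hypothetical helper name, and the counting loop is a stand-in for the file-reading task):

```java
public class BenchHarness {
    // Hypothetical helper: runs the task several times after a warm-up
    // and reports the average, reducing JIT and cache noise in nanoTime timings
    private static long averageNanos(Runnable task, int warmup, int runs) {
        for (int i = 0; i < warmup; i++) {
            task.run();
        }
        long total = 0;
        for (int i = 0; i < runs; i++) {
            long start = System.nanoTime();
            task.run();
            total += System.nanoTime() - start;
        }
        return total / runs;
    }

    public static void main(String[] args) {
        // Stand-in workload; in the article's setup this would be test1..test4
        long avg = averageNanos(() -> {
            long sum = 0;
            for (int i = 0; i < 1_000_000; i++) sum += i;
            if (sum < 0) System.out.println(sum); // keep the work from being optimized away
        }, 5, 10);
        System.out.println(avg > 0);
    }
}
```

For serious measurements, a dedicated harness such as JMH is the more reliable option.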