I am trying to write a JSON file using this code:
    File f = new File("words_3.json");
    if (!f.exists()) {
        f.createNewFile();
    }
    if (fileWriter == null)
        fileWriter = new BufferedWriter(new FileWriter(f));
    while (scanner.hasNext()) {
        String text = scanner.nextLine();
        fileWriter.append(text);
        System.out.println("writing : " + text);
    }
The System.out.println() statement shows all the text in the terminal. But when I check the output file, I see that only about 1300 lines have been written, while more than 2000 lines are available.
The data that you're writing into an output stream isn't guaranteed to reach its destination immediately.
BufferedWriter is a so-called high-level stream: it decorates an underlying stream that deals with a particular destination of data, like FileWriter (and there could be a few more streams in between them), by buffering the text output and providing the convenience method newLine(). BufferedWriter maintains a buffer (an array of characters) with a default size of 8192. When the buffer gets full, it hands its contents to the underlying low-level stream, in this case to a FileWriter, which takes care of encoding the characters into bytes. When that's done, the JVM hands the data to the operating system via a FileOutputStream (because under the hood, character streams are built on top of byte streams). So the data written to the buffer appears in the file in chunks.
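This buffering is easy to observe. In the sketch below (the temp file and the short string are just for the demo), text written through a BufferedWriter does not reach the file until the stream is closed:

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class BufferDemo {
    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("buffer-demo", ".txt");
        BufferedWriter writer = new BufferedWriter(new FileWriter(tmp.toFile()));

        writer.write("hello"); // 5 chars, far below the 8192-char buffer

        // Nothing has reached the file yet; the text sits in the buffer.
        System.out.println("before close: " + Files.size(tmp) + " bytes"); // 0 bytes

        writer.close(); // flushes the buffer, then releases the file handle
        System.out.println("after close:  " + Files.size(tmp) + " bytes"); // 5 bytes

        Files.delete(tmp);
    }
}
```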
The Javadoc for close() says: "Closes the stream, flushing it first." I.e., before releasing the resource, close() invokes flush(), which forces the cached data to be passed on to its destination. If no exception occurs, everything that was written into the stream is guaranteed to reach the destination when the stream is closed.
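Applied to the code from the question, the straightforward fix is a try-with-resources statement, which guarantees that close() (and with it the final flush()) runs even if an exception is thrown. A minimal sketch, with a sample Scanner standing in for whatever input source the question actually reads from:

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Scanner;

public class WriteAllLines {
    public static void main(String[] args) throws IOException {
        // Sample input; in the question this would be the existing scanner.
        Scanner scanner = new Scanner("line one\nline two\nline three");
        Path out = Path.of("words_3.json");

        // FileWriter creates the file itself, so the exists()/createNewFile()
        // check from the question is not needed. try-with-resources closes the
        // writer at the end of the block, flushing the last partial buffer.
        try (BufferedWriter fileWriter = new BufferedWriter(new FileWriter(out.toFile()))) {
            while (scanner.hasNextLine()) {
                String text = scanner.nextLine();
                fileWriter.append(text);
                // nextLine() strips the line terminator, so write one back
                fileWriter.newLine();
            }
        }
        System.out.println("lines written: " + Files.readAllLines(out).size()); // 3
    }
}
```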
You can also call flush() in your code, but it has to be applied with great caution. It is mainly useful when you deal with large amounts of critical data that is valuable even when only partially written (so that in case of an exception you lose less information). Misusing flush() can significantly reduce performance.
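A sketch of that pattern, flushing after every N lines (the interval of 100 is an arbitrary choice), using a StringWriter as a stand-in destination so the effect is visible without touching the disk:

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.StringWriter;

public class FlushDemo {
    public static void main(String[] args) throws IOException {
        StringWriter sink = new StringWriter(); // stand-in for a FileWriter
        BufferedWriter writer = new BufferedWriter(sink);
        int flushEvery = 100; // arbitrary checkpoint interval

        for (int i = 0; i < 250; i++) {
            writer.write("record " + i);
            writer.newLine();
            if ((i + 1) % flushEvery == 0) {
                // Checkpoint: push everything buffered so far downstream,
                // so a crash loses at most flushEvery lines.
                writer.flush();
            }
        }
        writer.close(); // final flush covers the remaining 50 lines
    }
}
```

When the destination is a file, each flush() costs at least one system call, so flushing too often throws away exactly the benefit the buffer exists to provide.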