Read a set of lines from a file using BufferedReader
I have a large CSV file of around 1.6 GB. I am trying to read the file, create a JSON array from its lines, and send it to other consumer processes.
I have the following code:
while (consumeover) {
    try (BufferedReader br = new BufferedReader(
            new FileReader("/my/path/largefile"), 65536)) {
        for (String line; (line = br.readLine()) != null;) {
            String[] dataRow = line.split("\\|");
            // create JSON array
            // add each dataRow element to the array
        }
    }
}
What happens is that the code above reads the entire file and builds one JSON array, which throws an OutOfMemoryError. I want to read a set of lines, say 1000, each time I create a JSON array. How do I set the next read position to i + 1000? Because the file is so large, Java runs out of memory as the data arrays are created.
Solution 1:[1]
The simple solution is to output each line as you get it (rather than saving it in an array and then sending/writing it), or to output every 1000 lines as you collect them. This way you only read the file in one pass. The fewer lines you hold in memory, the less memory you use.
Note: the only way to start reading from line N is to read the first N lines and ignore them. This becomes increasingly expensive as the file gets larger.
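To illustrate why seeking by line number is linear, here is a minimal sketch of "read N lines and ignore them" (the helper name and class are mine, not from the answer):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class LineSkipper {
    // Skip the first n lines. Every skipped line must still be read and
    // decoded, so reaching line N always costs O(N) work.
    static void skipLines(BufferedReader br, long n) throws IOException {
        for (long i = 0; i < n && br.readLine() != null; i++) {
            // intentionally discard the line
        }
    }

    public static void main(String[] args) throws IOException {
        BufferedReader br = new BufferedReader(new StringReader("l1\nl2\nl3\nl4"));
        skipLines(br, 2);
        System.out.println(br.readLine()); // prints l3
    }
}
```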
Say you have a method which translates a line of CSV into JSON.
try (BufferedReader br = new BufferedReader(new FileReader(infile));
     PrintWriter bw = new PrintWriter(new FileWriter(outfile))) {
    for (String line; (line = br.readLine()) != null;) {
        String json = process(line);
        bw.println(json);
    }
}
This will only need enough memory for one line of CSV and one line of JSON, no matter how big the file is.
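If you do want one JSON array per batch of 1000 lines rather than per line, a minimal sketch is below. The batch size, the naive field quoting, and the `toJsonArray` helper are my assumptions, carried over from the pipe-splitting in the question; real code should use a JSON library such as Jackson.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.PrintWriter;
import java.io.StringReader;
import java.io.StringWriter;
import java.util.ArrayList;
import java.util.List;

public class BatchedCsvToJson {
    // Emit one JSON array per batch of batchSize lines, so at most one
    // batch is held in memory at a time.
    static void convert(BufferedReader br, PrintWriter bw, int batchSize) throws IOException {
        List<String> batch = new ArrayList<>(batchSize);
        for (String line; (line = br.readLine()) != null;) {
            batch.add(toJsonArray(line));
            if (batch.size() == batchSize) {
                bw.println("[" + String.join(",", batch) + "]"); // flush full batch
                batch.clear();                                   // release the lines
            }
        }
        if (!batch.isEmpty()) {
            bw.println("[" + String.join(",", batch) + "]");     // final partial batch
        }
        bw.flush();
    }

    // Hypothetical helper: split one row on '|' and quote each field.
    static String toJsonArray(String line) {
        String[] fields = line.split("\\|", -1);
        StringBuilder sb = new StringBuilder("[");
        for (int i = 0; i < fields.length; i++) {
            if (i > 0) sb.append(',');
            sb.append('"').append(fields[i].replace("\"", "\\\"")).append('"');
        }
        return sb.append(']').toString();
    }

    public static void main(String[] args) throws IOException {
        StringWriter out = new StringWriter();
        convert(new BufferedReader(new StringReader("a|b\nc|d\ne|f")),
                new PrintWriter(out), 2);
        System.out.print(out); // two batches: one of 2 rows, one of the remaining row
    }
}
```

The memory footprint is bounded by one batch instead of the whole file, which is the same principle as the line-by-line version above.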
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow