Mark Needham

Thoughts on Software Development

Java: Incrementally read/stream a CSV file

with 3 comments

I’ve been doing some work that involves reading in CSV files, for which I’ve been using OpenCSV. My initial approach was to read through the file line by line, parse the contents, and save them into a list of maps.

This works when the contents of the file fit into memory, but it’s problematic for larger files, where I needed to stream the file and process each line individually rather than processing them all once the file had been loaded.
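To make the problem concrete, the eager approach looks roughly like this — a sketch using only the standard library with a naive split (so none of OpenCSV’s quoting-aware parsing), just to show the shape of the code:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.regex.Pattern;

public class EagerCsv {
    // Reads the whole file into memory: fine for small files, problematic
    // once the list of maps no longer fits on the heap.
    public static List<Map<String, Object>> readAll(Reader reader, char separator) throws IOException {
        BufferedReader in = new BufferedReader(reader);
        String regex = Pattern.quote(String.valueOf(separator));

        // First line holds the field names
        String[] fields = in.readLine().split(regex);

        List<Map<String, Object>> rows = new ArrayList<Map<String, Object>>();
        String line;
        while ((line = in.readLine()) != null) {
            String[] data = line.split(regex);
            Map<String, Object> properties = new HashMap<String, Object>();
            for (int i = 0; i < data.length; i++) {
                properties.put(fields[i], data[i]);
            }
            rows.add(properties);
        }
        return rows;
    }
}
```

Every row lives in `rows` at once, which is exactly what goes wrong for large files.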

I initially wrote a variation on totallylazy’s Strings#lines to do this, and while I was able to stream the file, I made a mistake somewhere which meant the number of maps on the heap kept increasing.

After spending a few hours trying to fix this, Michael suggested that it’d be easier to use an iterator instead, and I ended up with the following code:

import au.com.bytecode.opencsv.CSVReader;

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

public class ParseCSVFile {
    public static void main(String[] args) throws IOException
    {
        final CSVReader csvReader = new CSVReader( new BufferedReader( new FileReader( "/path/to/file.csv" ) ), '\t' );
        final String[] fields = csvReader.readNext();
 
        Iterator<Map<String, Object>> lazilyLoadedFile = new Iterator<Map<String, Object>>()
        {
            String[] data = csvReader.readNext();
 
            @Override
            public boolean hasNext()
            {
                return data != null;
            }
 
            @Override
            public Map<String, Object> next()
            {
                final Map<String, Object> properties = new HashMap<String, Object>();
                for ( int i = 0; i < data.length; i++ )
                {
                    properties.put(fields[i], data[i]);
                }
 
                try
                {
                    data = csvReader.readNext();
                }
                catch ( IOException e )
                {
                    data = null;
                }
 
                return properties;
            }
 
            @Override
            public void remove()
            {
                throw new UnsupportedOperationException();
            }
        };
    }	
}

Although this code works, it’s not the most readable function I’ve ever written, so any suggestions on how to do this in a cleaner way are welcome.
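One way it could be tidied up — a sketch using only the standard library in place of OpenCSV (so the same naive-split caveat applies), and with the class names being my own invention — is to hide the anonymous class behind an Iterable, so the file can be consumed with a for-each loop and the reader gets closed once the file is exhausted:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;
import java.util.NoSuchElementException;
import java.util.regex.Pattern;

public class LazyCsv implements Iterable<Map<String, Object>> {
    private final BufferedReader in;
    private final String separator;

    public LazyCsv(Reader reader, char separator) {
        this.in = new BufferedReader(reader);
        this.separator = Pattern.quote(String.valueOf(separator));
    }

    @Override
    public Iterator<Map<String, Object>> iterator() {
        try {
            // First line holds the field names
            final String[] fields = in.readLine().split(separator);

            return new Iterator<Map<String, Object>>() {
                String line = in.readLine(); // read one line ahead

                @Override
                public boolean hasNext() {
                    return line != null;
                }

                @Override
                public Map<String, Object> next() {
                    if (line == null) {
                        throw new NoSuchElementException();
                    }
                    String[] data = line.split(separator);
                    Map<String, Object> properties = new HashMap<String, Object>();
                    for (int i = 0; i < data.length; i++) {
                        properties.put(fields[i], data[i]);
                    }
                    try {
                        line = in.readLine();
                        if (line == null) {
                            in.close(); // close once the file is exhausted
                        }
                    } catch (IOException e) {
                        line = null;
                    }
                    return properties;
                }

                @Override
                public void remove() {
                    throw new UnsupportedOperationException();
                }
            };
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}
```

Only the current line is ever held in memory, and the caller just writes `for (Map<String, Object> row : csv) { ... }`. Closing on exhaustion still doesn’t help a caller who abandons the loop halfway through, though — that needs an explicit close.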

Written by Mark Needham

October 14th, 2013 at 7:27 am

Posted in Java

  • wilfredspringer

    Just a couple of thoughts: your reader never seems to close the file. And if you’d try, then you would notice that the Iterator approach actually makes it quite hard to do. When would you do it? When hasNext() returns false?

    I wrote a CSV parser in Scala, also because of issues with OpenCSV. (Memory usage is just one of them. There are many other issues, like MacOS Excel CSV output support, and supporting cells with line breaks, to name a few.) In my Scala version, I implemented Traversable instead, since with Traversable you know when you’re at the end of the file and when you need to close it. An Iteratee might have been the better approach, since then you can terminate halfway through the file and still make sure resources are closed correctly.

  • http://www.markhneedham.com/blog Mark Needham

    @wilfredspringer:disqus good point…I was actually thinking about that before I posted it so I had a look at the Open CSV examples and they don’t seem to close the reader anywhere for some reason.

    What happens if you don’t? Would it get closed when the object gets garbage collected or would it be a memory leak?

  • wilfredspringer

    You would eventually run out of file handles. It might be that there is a finalizer for a FileInputStream in some JDKs, but I doubt it, and I wouldn’t rely on it.