Spring Batch 2.0 – Part II – Flat File To Database

Part I of my Spring Batch blog ran through an example of a basic Spring Batch job. Now let's put together one that reads 200,000 rows from a flat file and inserts them into the database. The entire process took around 1 minute and 10 seconds to execute, which is good time for Java-based batch processing. In all fairness I must point out that this is relatively fast because I am using a local HSQLDB database and no processing logic is performed during the run.

The input is a comma-separated file with the following column layout: receiptDate, memberName, checkNumber, checkDate, paymentType, depositAmount, paymentAmount, comments.
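For illustration, here is how a row in that layout (with made-up values; the actual data file is not reproduced here) breaks into its eight columns:

```java
public class LedgerLineDemo {
    public static void main(String[] args) {
        // A hypothetical input row in the documented column order
        String line = "2010-01-15,John Doe,1001,2010-01-14,CHECK,0.00,250.00,January dues";
        String[] cols = line.split(",");
        System.out.println(cols.length); // prints 8
        System.out.println(cols[1]);     // prints John Doe (memberName)
    }
}
```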

The DDL for the database table:
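The original DDL is not reproduced here; a minimal HSQLDB-compatible sketch, with assumed column names and types mirroring the file layout, might look like:

```sql
-- Hypothetical reconstruction; the actual column names/types may differ
CREATE TABLE LEDGER (
    ID              INTEGER IDENTITY PRIMARY KEY,
    RECEIPT_DATE    DATE,
    MEMBER_NAME     VARCHAR(100),
    CHECK_NUMBER    VARCHAR(20),
    CHECK_DATE      DATE,
    PAYMENT_TYPE    VARCHAR(20),
    DEPOSIT_AMOUNT  DECIMAL(10,2),
    PAYMENT_AMOUNT  DECIMAL(10,2),
    COMMENTS        VARCHAR(255)
);
```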

Here is the Spring application context XML file…
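The full context file is not shown here; a sketch of the job definition (bean 5 in the list below), using the Spring Batch 2.0 XML namespace, could look like the following. The bean ids, listener refs, and commit interval are assumptions:

```xml
<!-- Hypothetical job definition; ids and listener beans are assumed -->
<batch:job id="ledgerJob">
    <batch:step id="loadLedgerStep">
        <batch:tasklet>
            <batch:chunk reader="ledgerReader" writer="ledgerWriter"
                         commit-interval="1000"/>
        </batch:tasklet>
        <batch:listeners>
            <batch:listener ref="stepListener"/>
        </batch:listeners>
    </batch:step>
    <batch:listeners>
        <batch:listener ref="jobListener"/>
    </batch:listeners>
</batch:job>
```

The commit-interval sets the chunk size: items are accumulated from the reader and handed to the writer in batches, so a larger interval reduces transaction overhead when loading 200,000 rows.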

  • 1 through 4 are the same as in the previous blog post.
  • 5 – Defines the job and its steps, and registers a job listener and a step listener.
  • 6 – The reader used to read the flat file with comma-separated columns. The FlatFileItemReader reads each row from the flat file and passes it on to the writer, which persists it to the database.
  • 8 – The item writer (not shown here) is configured using annotations; the class is LedgerWriter.
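A sketch of the reader bean (6 above), wiring a DelimitedLineTokenizer with the file's column names to the field-set mapper, might look like this. The bean ids and the resource path are assumptions:

```xml
<!-- Hypothetical reader configuration; the resource location is assumed -->
<bean id="ledgerReader"
      class="org.springframework.batch.item.file.FlatFileItemReader">
    <property name="resource" value="classpath:ledger.txt"/>
    <property name="lineMapper">
        <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
            <property name="lineTokenizer">
                <bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
                    <property name="names"
                              value="receiptDate,memberName,checkNumber,checkDate,paymentType,depositAmount,paymentAmount,comments"/>
                </bean>
            </property>
            <property name="fieldSetMapper" ref="ledgerMapper"/>
        </bean>
    </property>
</bean>
```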

Now for some of the Java code. Here is the DAO – LedgerDAOImpl.java
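The DAO listing itself is not reproduced here; a minimal sketch using Spring's JdbcTemplate, assuming a Ledger domain object with getters matching the file columns and the hypothetical LEDGER table above, could look like:

```java
// Sketch of LedgerDAOImpl; table/column names and the Ledger bean are assumptions
import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;

public class LedgerDAOImpl implements LedgerDAO {

    private JdbcTemplate jdbcTemplate;

    public void setDataSource(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    // Insert a single ledger row; invoked by the writer for each item
    public void saveLedger(Ledger ledger) {
        jdbcTemplate.update(
            "insert into LEDGER (RECEIPT_DATE, MEMBER_NAME, CHECK_NUMBER, CHECK_DATE, "
                + "PAYMENT_TYPE, DEPOSIT_AMOUNT, PAYMENT_AMOUNT, COMMENTS) "
                + "values (?,?,?,?,?,?,?,?)",
            ledger.getReceiptDate(), ledger.getMemberName(),
            ledger.getCheckNumber(), ledger.getCheckDate(),
            ledger.getPaymentType(), ledger.getDepositAmount(),
            ledger.getPaymentAmount(), ledger.getComments());
    }
}
```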

Here is the mapper – LedgerMapper.java
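The mapper converts each tokenized line (a FieldSet) into a domain object. A sketch of what LedgerMapper might look like, assuming a Ledger bean with setters for each column:

```java
// Sketch of LedgerMapper; the Ledger bean and its property names are assumptions
import org.springframework.batch.item.file.mapping.FieldSetMapper;
import org.springframework.batch.item.file.transform.FieldSet;

public class LedgerMapper implements FieldSetMapper<Ledger> {

    // Maps one tokenized line to a Ledger instance, reading fields
    // by the names configured on the DelimitedLineTokenizer
    public Ledger mapFieldSet(FieldSet fieldSet) {
        Ledger ledger = new Ledger();
        ledger.setReceiptDate(fieldSet.readDate("receiptDate"));
        ledger.setMemberName(fieldSet.readString("memberName"));
        ledger.setCheckNumber(fieldSet.readString("checkNumber"));
        ledger.setCheckDate(fieldSet.readDate("checkDate"));
        ledger.setPaymentType(fieldSet.readString("paymentType"));
        ledger.setDepositAmount(fieldSet.readDouble("depositAmount"));
        ledger.setPaymentAmount(fieldSet.readDouble("paymentAmount"));
        ledger.setComments(fieldSet.readString("comments"));
        return ledger;
    }
}
```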

Here is the writer – LedgerWriter.java
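Since the blog notes the writer is annotation-configured, a sketch of LedgerWriter as an annotated ItemWriter that delegates each chunk to the DAO might look like this (the component name and injected DAO are assumptions):

```java
// Sketch of LedgerWriter; bean name and DAO wiring are assumptions
import java.util.List;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component("ledgerWriter")
public class LedgerWriter implements ItemWriter<Ledger> {

    @Autowired
    private LedgerDAO ledgerDAO;

    // Called once per chunk with all items read in that chunk
    public void write(List<? extends Ledger> items) throws Exception {
        for (Ledger ledger : items) {
            ledgerDAO.saveLedger(ledger);
        }
    }
}
```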

The test driver class is a JUnit class.
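A sketch of such a JUnit driver, launching the job via the JobLauncher; the context file location and bean names are assumptions:

```java
// Hypothetical test driver; context location and job bean name are assumed
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"classpath:applicationContext.xml"})
public class LedgerJobTest {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job ledgerJob;

    @Test
    public void launchesLedgerJob() throws Exception {
        // Runs the job end-to-end: read the flat file, write to HSQLDB
        jobLauncher.run(ledgerJob, new JobParameters());
    }
}
```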

Running the test case inserts approximately 200,000 rows into the ledger table. The entire process took roughly 1 minute and 12 seconds to execute.

Next, move on to Spring Batch – Part III – From Database to Flat File. You can download the Maven project from GitHub – https://github.com/thomasma/springbatch3part.