JDBC Driver for Avro

Build 22.0.8462

Batch Processing

The CData JDBC Driver for Avro enables you to take advantage of the bulk load support in Avro through the JDBC batch API. You can use the batch API to group related SQL data manipulation statements and submit them to the server together.

Using the JDBC Batch API

The following examples show how to execute bulk operations with the PreparedStatement class.

Bulk Insert

To perform a bulk insert with a PreparedStatement, call addBatch for each set of parameters you want to execute as part of the bulk insert. After adding all the parameter sets to the batch, execute the bulk insert by calling executeBatch.

Note: To perform a bulk insert in Avro, first create a new table using CREATE TABLE Statements. The new table is created in the directory specified by the URI connection property. Currently, bulk insert works only when URI points to a local file or a local directory.
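For example, the target table for the bulk insert below might be created as sketched here. The connection string format and the SampleTable_1 schema are illustrative assumptions; adjust the URI so it points to a local directory where the driver can create the new file.

// Illustrative connection string; URI points to a local directory
Connection conn = DriverManager.getConnection("jdbc:avro:URI=C:/avro-data;");
Statement stmt = conn.createStatement();
// Create the table that the bulk insert below targets
stmt.execute("CREATE TABLE SampleTable_1 (Column1 VARCHAR(255))");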

The executeBatch method returns an array that contains the update counts for each statement. For example:

String query = "INSERT INTO SampleTable_1 (Column1) VALUES(?)";
PreparedStatement pstmt = conn.prepareStatement(query);

// Bind the parameters for the first row and add it to the batch
pstmt.setString(1, "Jon Doe");
pstmt.addBatch();

// Bind the parameters for the second row and add it to the batch
pstmt.setString(1, "John");
pstmt.addBatch();

// Execute the batch and print the update count for each statement
int[] r = pstmt.executeBatch();
for (int i : r)
  System.out.println(i);

Setting the Batch Size

Setting a maximum batch size can be necessary when the server limits the size of the request that can be submitted. Set the BatchSize connection property to split the entire batch into smaller batches of the specified size. Each smaller batch is submitted to the server individually.

Alternatively, set BatchSize to 0 to submit the entire batch in a single request.
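For example, BatchSize might be set in the JDBC connection string alongside the other connection properties. The URL below is a sketch; the URI value and batch size of 100 are assumptions to adjust for your environment.

// Submit the batch in chunks of at most 100 statements per request
Connection conn = DriverManager.getConnection("jdbc:avro:URI=C:/avro-data;BatchSize=100;");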
