Batch Processing
The CData JDBC Driver for Spark SQL enables you to take advantage of the bulk load support in Spark SQL through the JDBC batch API. With the batch API, you can queue related SQL data manipulation statements and submit them to the server together as a single batch.
Using the JDBC Batch API
The following example shows how to execute bulk operations with the PreparedStatement class.
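As a minimal sketch of a batched INSERT (the connection string, the table name SampleTable_1, and the column names Column1 and Column2 are placeholders, not names from this documentation), the flow is: queue each set of parameters with addBatch, then submit them all with executeBatch.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchInsertExample {
  public static void main(String[] args) throws SQLException {
    // Placeholder connection string; adjust the server and other properties for your environment.
    String url = "jdbc:sparksql:Server=127.0.0.1;";
    String sql = "INSERT INTO SampleTable_1 (Column1, Column2) VALUES (?, ?)";
    try (Connection connection = DriverManager.getConnection(url);
         PreparedStatement pstmt = connection.prepareStatement(sql)) {
      // Queue the first row of parameters.
      pstmt.setString(1, "value_1a");
      pstmt.setString(2, "value_1b");
      pstmt.addBatch();
      // Queue a second row.
      pstmt.setString(1, "value_2a");
      pstmt.setString(2, "value_2b");
      pstmt.addBatch();
      // Submit all queued statements. The returned array contains the affected
      // row count (or SUCCESS_NO_INFO) for each statement in the batch.
      int[] counts = pstmt.executeBatch();
      System.out.println("Statements executed: " + counts.length);
    }
  }
}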
Setting the Batch Size
Setting a maximum batch size can be necessary when the server limits the size of a single request. Set the BatchSize property to split the queued statements into batches of the specified size; each batch is then submitted to the server in a separate request. Alternatively, set BatchSize to 0 to submit the entire batch in a single request.
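As a sketch, BatchSize can be set alongside the other connection properties in the connection string; the server address and batch size below are placeholder values, and the snippet assumes the same imports as the example above.

// BatchSize=500 splits the queued statements into requests of 500 statements each.
// BatchSize=0 would submit the entire batch in one request.
String url = "jdbc:sparksql:Server=127.0.0.1;BatchSize=500;";
Connection connection = DriverManager.getConnection(url);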