JDBC Driver for CSV

Build 20.0.7654

Establishing a Connection

Creating a JDBC Data Source

You can create a JDBC data source to connect from your Java application. Creating a JDBC data source based on the CData JDBC Driver for CSV consists of three basic steps:

  • Add the driver JAR file to the classpath. The JAR file is located in the lib subfolder of the installation directory. Note that the .lic file must be located in the same folder as the JAR file.
  • Provide the driver class. For example:
    cdata.jdbc.csv.CSVDriver
  • Provide the JDBC URL. For example:
    jdbc:csv:GenerateSchemaFiles=OnStart;URI=https://MyAPI;Location=C:\\MySchemaFolder;
    
    or
    
    jdbc:cdata:csv:GenerateSchemaFiles=OnStart;URI=https://MyAPI;Location=C:\\MySchemaFolder;

    The second format above can be used whenever there is a conflict in your application between drivers using the same URL format to ensure you are using the CData driver. The URL must start with either "jdbc:csv:" or "jdbc:cdata:csv:" and can include any of the connection properties in name-value pairs separated with semicolons.
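The URL is simply the prefix followed by name-value pairs separated by semicolons, so it can be assembled programmatically. Below is a minimal sketch of building such a URL from a property map; the class and method names are illustrative, not part of the driver, and the property values are the placeholders from the examples above:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CsvJdbcUrl {
    // Assembles a JDBC URL of the form jdbc:cdata:csv:Name1=Value1;Name2=Value2;
    // from connection properties. LinkedHashMap preserves insertion order.
    static String buildUrl(String prefix, Map<String, String> props) {
        StringBuilder url = new StringBuilder(prefix);
        for (Map.Entry<String, String> e : props.entrySet()) {
            url.append(e.getKey()).append('=').append(e.getValue()).append(';');
        }
        return url.toString();
    }

    public static void main(String[] args) {
        Map<String, String> props = new LinkedHashMap<>();
        props.put("GenerateSchemaFiles", "OnStart");
        props.put("URI", "https://MyAPI");
        // Use the jdbc:cdata:csv: prefix to avoid conflicts with other drivers.
        System.out.println(buildUrl("jdbc:cdata:csv:", props));
        // jdbc:cdata:csv:GenerateSchemaFiles=OnStart;URI=https://MyAPI;
    }
}
```

The resulting string would then be passed to DriverManager.getConnection to open the connection.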

Connecting to CSV Data Sources

The CData JDBC Driver for CSV allows connecting to local and remote CSV resources. Set the URI property to the CSV resource location, in addition to any other properties necessary to connect to your data source.

The list below shows the supported URI formats for each service provider, along with the connection properties used to authenticate. Each provider is covered in more detail in the sections that follow.

  • Local: localPath or file://localPath
  • HTTP or HTTPS: http://remoteStream or https://remoteStream (User, Password, and AuthScheme are optional)
  • Amazon S3: s3://remotePath (AWSAccessKey and AWSSecretKey are required; AWSRegion is optional)
  • Azure Blob Storage: azureblob://mycontainer/myblob/ (AzureAccount and AzureAccessKey are required)
  • Azure Data Lake Store Gen1: adl://remotePath or adl://Account.azuredatalakestore.net@remotePath (InitiateOAuth is required; the remaining OAuth properties are optional)
  • Azure Data Lake Store Gen2: abfs://myfilesystem/remotePath (AzureAccount and AzureAccessKey are required)
  • Azure Data Lake Store Gen2 with SSL: abfss://myfilesystem/remotePath (AzureAccount and AzureAccessKey are required)
  • Google Drive: gdrive://remotePath (the OAuth properties are optional; AuthScheme is required)
  • OneDrive: onedrive://remotePath (the OAuth properties are optional; AuthScheme is required)
  • Box: box://remotePath (OAuthClientId, OAuthClientSecret, and AuthScheme are required; the remaining OAuth properties are optional)
  • Dropbox: dropbox://remotePath (OAuthClientId, OAuthClientSecret, and AuthScheme are required; the remaining OAuth properties are optional)
  • SharePoint Online SOAP: sp://remotePath (User and Password are required)
  • SharePoint Online REST: sprest://remotePath (the OAuth properties are optional; AuthScheme is required)
  • FTP or FTPS: ftp://server:port/remotePath or ftps://server:port/remotePath (User and Password are required)
  • SFTP: sftp://server:port/remotePath (User and Password are optional)
  • Wasabi: wasabi://bucket1/remotePath (AWSAccessKey and AWSSecretKey are required; AWSRegion is optional)
  • Google Cloud Storage: gs://bucket/remotePath (the OAuth properties are optional; ProjectId is required)
  • Oracle Cloud Storage: os://bucket/remotePath (AccessKey, SecretKey, and OracleNamespace are required; Region is optional)

Connecting to Local Files

Set the URI to a folder containing CSV files.

Below is an example connection string:

URI=C:\folder1;

You can also aggregate multiple CSV files that share the same schema into a single table by setting AggregateFiles to True. Below is an example connection string:

URI=C:\folder; AggregateFiles=True;

If you would prefer to expose each individual CSV file as its own table instead, leave AggregateFiles set to False:

URI=C:\folder; AggregateFiles=False;
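With AggregateFiles set to False, each CSV file in the URI folder surfaces as its own table. The helper below previews the table names you could expect by listing the .csv files in a folder and stripping the extension; the naming convention is an assumption for illustration, and the class is not part of the driver:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class CsvTableNames {
    // Lists the .csv files in a folder and strips the extension, previewing
    // the per-file table names exposed when AggregateFiles=False (the
    // file-name-to-table-name mapping here is an assumption).
    static List<String> tableNames(File folder) {
        List<String> names = new ArrayList<>();
        File[] files = folder.listFiles((dir, name) -> name.toLowerCase().endsWith(".csv"));
        if (files == null) {
            return names; // not a directory, or not readable
        }
        for (File f : files) {
            String n = f.getName();
            names.add(n.substring(0, n.length() - ".csv".length()));
        }
        return names;
    }
}
```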

Connecting to HTTP CSV Streams

Set the URI to the HTTP or HTTPS URL of the CSV resource you want to access as a table. For example:

URI=http://www.host1.com/streamname1;
To authenticate, set AuthScheme and the corresponding properties. Specify additional headers in CustomHeaders. To modify the query string, set CustomUrlParams.

To query the CSV stream, reference streamedtable as the table name.

SELECT * FROM streamedtable

Connecting to Amazon S3

Set the URI to the bucket and folder. Additionally, set the following properties to authenticate:

  • AWSAccessKey: Set this to an Amazon Web Services Access Key (a username).
  • AWSSecretKey: Set this to an Amazon Web Services Secret Key.
Optionally, specify AWSRegion as well. For example:
URI=s3://bucket1/folder1; AWSAccessKey=token1; AWSSecretKey=secret1; AWSRegion=OHIO;

Note: It is also possible to connect to S3-compatible services by specifying their base URL. For example, if the URL connection property is set to http://s3.%region%.myservice.com and Region is region-1, the driver generates request URLs like https://s3.region-1.myservice.com/bucket/... (or like https://bucket.s3.region-1.myservice.com/... if the UseVirtualHosting property is true).
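The URL-generation rule in the note can be sketched in code. The following is an illustration of the documented substitution and the virtual-hosting switch, not the driver's internals; the class and method names are hypothetical:

```java
public class S3UrlExample {
    // Substitutes the %region% placeholder in the base URL and forms either a
    // path-style or a virtual-hosted-style request URL, mirroring the examples
    // in the note above. Generated URLs use https, as the examples show.
    static String requestUrl(String baseUrlTemplate, String region,
                             String bucket, String key, boolean useVirtualHosting) {
        String host = baseUrlTemplate.replace("%region%", region)
                                     .replaceFirst("^https?://", "");
        if (useVirtualHosting) {
            return "https://" + bucket + "." + host + "/" + key;
        }
        return "https://" + host + "/" + bucket + "/" + key;
    }

    public static void main(String[] args) {
        System.out.println(requestUrl("http://s3.%region%.myservice.com",
                "region-1", "bucket", "data.csv", false));
        // https://s3.region-1.myservice.com/bucket/data.csv
        System.out.println(requestUrl("http://s3.%region%.myservice.com",
                "region-1", "bucket", "data.csv", true));
        // https://bucket.s3.region-1.myservice.com/data.csv
    }
}
```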

Connecting to Oracle Cloud Object Storage

Set the URI to the bucket and folder. Additionally, set the following properties to authenticate:

  • AccessKey: Set this to an Oracle cloud Access Key.
  • SecretKey: Set this to an Oracle cloud Secret Key.
  • OracleNamespace: Set this to an Oracle cloud namespace.
Optionally, specify Region as well. For example:
URI=os://bucket/remotePath/; AccessKey=token1; SecretKey=secret1; OracleNamespace=myNamespace; Region=us-ashburn-1;

Connecting to Wasabi

Set the URI to the bucket and folder. Additionally, set the following properties to authenticate:

  • AWSAccessKey: Set this to a Wasabi Access Key (a username).
  • AWSSecretKey: Set this to a Wasabi Secret Key.
Optionally, specify AWSRegion as well.

For example:

URI=wasabi://bucket1/folder1; AWSAccessKey=token1; AWSSecretKey=secret1; AWSRegion=OHIO;

Connect to Azure Blob Storage

Set the URI to the name of your container and the name of the blob. Additionally, set the following properties to authenticate:

  • AzureAccount: Set this to the account associated with the Azure blob.
  • AzureAccessKey: Set this to the access key associated with the Azure blob.
For example:
URI=azureblob://mycontainer/myblob/; AzureAccount=myAccount; AzureAccessKey=myKey;

Connect to Azure Data Lake Store Gen 2

Set the URI to the name of the file system and the name of the folder that contains your CSV files. Additionally, set the following properties to authenticate:

  • AzureAccount: Set this to the account associated with the Azure data lake store.
  • AzureAccessKey: Set this to the access key associated with the Azure data lake store.
For example:
  URI=abfs://myfilesystem/folder1; AzureAccount=myAccount; AzureAccessKey=myKey;
  URI=abfss://myfilesystem/folder1; AzureAccount=myAccount; AzureAccessKey=myKey;

Connecting to Box

Set the URI to the path to a folder containing CSV files. To authenticate to Box, use the OAuth authentication standard. See Connecting to Box for an authentication guide.

For example:

URI=box://folder1; InitiateOAuth=GETANDREFRESH; OAuthClientId=oauthclientid1; OAuthClientSecret=oauthclientsecret1; CallbackUrl=http://localhost:12345;

Connecting to Dropbox

Set the URI to the path to a folder containing CSV files. To authenticate to Dropbox, use the OAuth authentication standard. See Connecting to Dropbox for an authentication guide. You can authenticate with a user account or a service account. In the user account flow, you do not need to set any connection properties for your user credentials, as shown in the connection string below:

URI=dropbox://folder1/; InitiateOAuth=GETANDREFRESH; OAuthClientId=oauthclientid1; OAuthClientSecret=oauthclientsecret1; CallbackUrl=http://localhost:12345;

Connecting to SharePoint Online SOAP

Set the URI to a document library containing CSV files. To authenticate, set User, Password, and SharepointUrl.

For example:

URI=sp://Documents/folder1; User=user1; Password=password1; SharepointUrl=https://subdomain.sharepoint.com;

Connecting to SharePoint Online REST

Set the URI to a document library containing CSV files. SharepointUrl is optional. If not provided, the driver will work with the root drive. To authenticate, use the OAuth authentication standard.

For example:

URI=sp://Documents/folder1; InitiateOAuth=GETANDREFRESH; SharepointUrl=https://subdomain.sharepoint.com;

Connecting to FTP

Set the URI to the address of the server followed by the path to the folder to be used as the root folder. To authenticate, set User and Password.

For example:

URI=ftps://localhost:990/folder1; User=user1; Password=password1; 

Connecting to Google Cloud Storage

Set the URI to the bucket and the name of the folder that contains your CSV files. To authenticate to Google APIs, provide a ProjectId.

For example:

URI=gs://bucket/remotePath/; ProjectId=PROJECT_ID; 

Connecting to Other Sources: System Streams

You can also read from and write to system streams. Reference the stream from code with the ExtendedProperties connection property.

// Open the system stream and pass it in through ExtendedProperties
InputStream sourceStream = new FileInputStream(localPath);
Properties properties = new Properties();
properties.put("URI", "{streamedtable1}");
properties.put("ExtendedProperties", sourceStream);
Connection connection = DriverManager.getConnection("jdbc:csv:", properties);
Statement statement = connection.createStatement();
ResultSet resultSet = statement.executeQuery("SELECT * FROM streamedtable1");

Securing CSV Connections

By default, the driver attempts to negotiate SSL/TLS by checking the server's certificate against the system's trusted certificate store. To specify another certificate, see the SSLServerCert property for the available formats to do so.

Copyright (c) 2020 CData Software, Inc. - All rights reserved.