Databricks Connector for CData Sync

Build 23.0.8839
  • Databricks
    • Establishing a Connection
    • Advanced Features
      • SSL Configuration
      • Firewall and Proxy
    • Data Model
    • Connection String Options
      • Authentication
        • AuthScheme
        • Server
        • User
        • ProtocolVersion
        • Database
        • HTTPPath
        • Token
      • AWS Authentication
        • AWSAccessKey
        • AWSSecretKey
        • AWSRegion
        • AWSS3Bucket
      • Azure Authentication
        • AzureStorageAccount
        • AzureAccessKey
        • AzureTenant
        • AzureBlobContainer
      • AzureServicePrincipal Authentication
        • AzureTenantId
        • AzureClientId
        • AzureClientSecret
        • AzureSubscriptionId
        • AzureResourceGroup
        • AzureWorkspace
      • OAuth
        • OAuthClientId
      • SSL
        • SSLClientCert
        • SSLClientCertType
        • SSLClientCertPassword
        • SSLClientCertSubject
        • SSLServerCert
      • Firewall
        • FirewallType
        • FirewallServer
        • FirewallPort
        • FirewallUser
        • FirewallPassword
      • Proxy
        • ProxyAutoDetect
        • ProxyServer
        • ProxyPort
        • ProxyAuthScheme
        • ProxyUser
        • ProxyPassword
        • ProxySSLType
        • ProxyExceptions
      • Logging
        • LogModules
      • Schema
        • Location
        • BrowsableSchemas
        • Tables
        • Views
        • Catalog
        • PrimaryKeyIdentifiers
      • Databricks
        • CloudStorageType
        • StoreTableInCloud
        • QueryTableDetails
        • UseUploadApi
        • UseCloudFetch
        • UseLegacyDataModel
        • QueryAllMetadata
      • Miscellaneous
        • AllowPreparedStatement
        • ConnectRetryWaitTime
        • ApplicationName
        • AsyncQueryTimeout
        • DescribeCommand
        • DetectView
        • MaxRows
        • Other
        • PseudoColumns
        • QueryPassthrough
        • ServerConfigurations
        • Timeout
        • UseDescTableQuery
        • UseInsertSelectSyntax
        • UserDefinedViews

Databricks Connector for CData Sync

Overview

The CData Sync App provides a straightforward way to continuously pipeline your Databricks data to any database, data lake, or data warehouse, making it easily available for Analytics, Reporting, AI, and Machine Learning.

The Databricks connector can be used from the CData Sync application to pull data from Databricks and move it to any of the supported destinations.

Databricks Version Support

The Sync App leverages Databricks Thrift to enable bidirectional SQL access to Databricks data. It supports Databricks databases running Databricks Runtime Version 9.1 - 13.X and the Pro and Classic Databricks SQL versions.

Databricks Connector for CData Sync

Establishing a Connection

Adding a Connection to Databricks

To add a connection to Databricks:

  1. In the application console, navigate to the Connections page.
  2. At the Add Connections panel, select the icon for the connection you want to add.
  3. If the Databricks icon is not available, click the Add More icon to download and install the Databricks connector from the CData site.

For required properties, see the Settings tab.

For connection properties that are not typically required, see the Advanced tab.

Connecting to Databricks

To connect to a Databricks cluster, set the following properties:

  • Database: The name of the Databricks database.
  • Server: The Server Hostname of your Databricks cluster.
  • HTTPPath: The HTTP Path of your Databricks cluster.
  • Token: Your personal access token. You can obtain this value by navigating to the User Settings page of your Databricks instance and selecting the Access Tokens tab.

You can find the required values in your Databricks instance by navigating to Clusters, selecting the desired cluster, and selecting the JDBC/ODBC tab under Advanced Options.
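
For illustration, a connection string that combines these properties might look like the following; the server, path, and token values here are placeholders, not real credentials:

"Server=https://dbc-a1b2c3d4-e5f6.cloud.databricks.com;HTTPPath=sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh;Database=default;Token=dapiXXXXXXXXXXXXXXXX;"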

Configuring Cloud Storage

The Sync App supports DBFS, Azure Blob Storage, and AWS S3 for uploading CSV files.

DBFS Cloud Storage

To use DBFS for cloud storage, set the CloudStorageType property to DBFS.

Azure Blob Storage

Set the following properties:

  • CloudStorageType: Azure Blob storage.
  • StoreTableInCloud: True to store tables in cloud storage when creating a new table.
  • AzureStorageAccount: The name of your Azure storage account.
  • AzureAccessKey: The storage key associated with your Azure storage account. Find this in the Azure portal (using the root account): select your storage account and click Access Keys to find this value.
  • AzureBlobContainer: The name of your Azure Blob storage container.
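
For reference, a sketch of a connection string that stages data through Azure Blob Storage; the account, key, and container values are placeholders:

"Server=https://adb-1234567890123456.7.azuredatabricks.net;HTTPPath=sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh;Database=default;Token=dapiXXXXXXXXXXXXXXXX;CloudStorageType=Azure Blob storage;StoreTableInCloud=True;AzureStorageAccount=mystorageaccount;AzureAccessKey=myStorageKey;AzureBlobContainer=mycontainer;"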

AWS S3 Storage

Set the following properties:

  • CloudStorageType: AWS S3.
  • StoreTableInCloud: True to store tables in cloud storage when creating a new table.
  • AWSAccessKey: The AWS account access key. You can acquire this value from your AWS security credentials page.
  • AWSSecretKey: Your AWS account secret key. You can acquire this value from your AWS security credentials page.
  • AWSS3Bucket: The name of your AWS S3 bucket.
  • AWSRegion: The hosting region for your Amazon Web Services. You can obtain the AWS Region value by navigating to the Buckets List page of your Amazon S3 service, for example, us-east-1.
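
A comparable sketch for AWS S3 staging, with placeholder credentials, bucket, and region values (see the AWSRegion property for the accepted region names):

"Server=https://dbc-a1b2c3d4-e5f6.cloud.databricks.com;HTTPPath=sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh;Database=default;Token=dapiXXXXXXXXXXXXXXXX;CloudStorageType=AWS S3;StoreTableInCloud=True;AWSAccessKey=AKIAXXXXXXXXXXXXXXXX;AWSSecretKey=mySecretKey;AWSS3Bucket=my-bucket;AWSRegion=NORTHERNVIRGINIA;"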

Authenticating to Databricks

CData supports the following authentication schemes:

  • Basic
  • Personal Access Token
  • Azure Active Directory (AD)
  • Azure Service Principal

Basic

Basic authentication requires a username and password. Set the following:

  • AuthScheme: Basic.
  • User: Your username. This overrides the default value ("Token").
  • Token: Your password.
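
Put together, a Basic authentication connection string might look like this (placeholder values):

"Server=https://dbc-a1b2c3d4-e5f6.cloud.databricks.com;HTTPPath=sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh;Database=default;AuthScheme=Basic;User=myuser;Token=mypassword;"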

Personal Access Token

To authenticate, set the following:

  • AuthScheme: PersonalAccessToken.
  • Token: The token used to access the Databricks server. It can be obtained by navigating to the User Settings page of your Databricks instance and selecting the Access Tokens tab.

Azure Active Directory

To authenticate, follow these steps:

  1. Register an application with the AzureAD (now known as Microsoft Entra ID) endpoint in the Azure portal. See Configure an app in Azure portal for how to create and register the application. Alternatively, you can use an AzureAD application that is already registered.

  2. Set these properties:

    • AzureTenant: The "Directory (tenant) ID" on the AzureAD application "Overview" page.
    • OAuthClientId: The "Application (client) ID" on the AzureAD application "Overview" page.
    • CallbackURL: The "Redirect URIs" value on the AzureAD application "Authentication" page.

  3. When connecting, a web page opens that prompts you to authenticate. After successful authentication, the connection is established.

Here is an example of the connection string:

"Server=https://adb-8439982502599436.16.azuredatabricks.net;HTTPPath=sql/protocolv1/o/8439982502599436/0810-011933-odsz4s3r;database=default;
AuthScheme=AzureAD;InitiateOAuth=GETANDREFRESH;AzureTenant=94be69e7-edb4-4fda-ab12-95bfc22b232f;OAuthClientId=f544a825-9b69-43d9-bec2-3e99727a1669;CallbackURL=http://localhost;"

Azure AD Service Principal

To authenticate, set the following properties:

  • AuthScheme: AzureServicePrincipal.
  • AzureTenantId: The tenant ID of your Microsoft Azure Active Directory.
  • AzureClientId: The application (client) ID of your Microsoft Azure Active Directory application.
  • AzureClientSecret: The application (client) secret of your Microsoft Azure Active Directory application.
  • AzureSubscriptionId: The Subscription Id of your Microsoft Azure Databricks Service Workspace.
  • AzureResourceGroup: The Resource Group name of your Microsoft Azure Databricks Service Workspace.
  • AzureWorkspace: The name of your Microsoft Azure Databricks Service Workspace.
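
For example, a service principal connection string might be assembled as follows; all IDs, secrets, and resource names are placeholders:

"Server=https://adb-1234567890123456.7.azuredatabricks.net;HTTPPath=sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh;Database=default;AuthScheme=AzureServicePrincipal;AzureTenantId=00000000-0000-0000-0000-000000000000;AzureClientId=11111111-1111-1111-1111-111111111111;AzureClientSecret=myClientSecret;AzureSubscriptionId=22222222-2222-2222-2222-222222222222;AzureResourceGroup=myResourceGroup;AzureWorkspace=myWorkspace;"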

Databricks Connector for CData Sync

Advanced Features

This section details a selection of advanced features of the Databricks Sync App.

User Defined Views

The Sync App allows you to define virtual tables, called user defined views, whose contents are decided by a pre-configured query. These views are useful when you cannot directly control queries being issued to the drivers. See User Defined Views for an overview of creating and configuring custom views.

SSL Configuration

Use SSL Configuration to adjust how the Sync App handles TLS/SSL certificate negotiation. You can choose from various certificate formats; see the SSLServerCert property under "Connection String Options" for more information.

Firewall and Proxy

Configure the Sync App to connect through firewalls and proxy servers, including Windows system proxies and HTTP proxies. You can also set up tunnel connections.

Query Processing

The Sync App offloads as much of the SELECT statement processing as possible to Databricks and then processes the rest of the query in memory (client-side).

See Query Processing for more information.

Logging

See Logging for an overview of configuration settings that can be used to refine CData logging. For basic logging, you only need to set two connection properties, but there are numerous features that support more refined logging, where you can select subsets of information to be logged using the LogModules connection property.

Databricks Connector for CData Sync

SSL Configuration

Customizing the SSL Configuration

By default, the Sync App attempts to negotiate SSL/TLS by checking the server's certificate against the system's trusted certificate store.

To specify another certificate, see the SSLServerCert property for the available formats to do so.

Client SSL Certificates

The Databricks Sync App also supports setting client certificates. Set the following to connect using a client certificate.

  • SSLClientCert: The name of the certificate store for the client certificate.
  • SSLClientCertType: The type of key store containing the TLS/SSL client certificate.
  • SSLClientCertPassword: The password for the TLS/SSL client certificate.
  • SSLClientCertSubject: The subject of the TLS/SSL client certificate.
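
As a sketch, a client certificate stored in a PFX file could be supplied with settings like the following, where the file path and password are placeholders:

"SSLClientCert=C:\certs\client.pfx;SSLClientCertType=PFXFILE;SSLClientCertPassword=myCertPassword;SSLClientCertSubject=*;"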

Databricks Connector for CData Sync

Firewall and Proxy

Connecting Through a Firewall or Proxy

HTTP Proxies

To connect through the Windows system proxy, you do not need to set any additional connection properties. To connect to other proxies, set ProxyAutoDetect to false.

To authenticate to an HTTP proxy, set ProxyAuthScheme, ProxyUser, and ProxyPassword, in addition to ProxyServer and ProxyPort.
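
For instance, the proxy-related portion of a connection string for an authenticated HTTP proxy might look like this (host, port, and credentials are placeholders):

"ProxyAutoDetect=false;ProxyServer=192.168.1.100;ProxyPort=8080;ProxyAuthScheme=BASIC;ProxyUser=proxyuser;ProxyPassword=proxypassword;"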

Other Proxies

Set the following properties:

  • To use a proxy-based firewall, set FirewallType, FirewallServer, and FirewallPort.
  • To tunnel the connection, set FirewallType to TUNNEL.
  • To authenticate, specify FirewallUser and FirewallPassword.
  • To authenticate to a SOCKS proxy, additionally set FirewallType to SOCKS5.
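
As a sketch, the firewall-related portion of a connection string for an authenticated SOCKS5 proxy might be (host, port, and credentials are placeholders):

"FirewallType=SOCKS5;FirewallServer=192.168.1.101;FirewallPort=1080;FirewallUser=fwuser;FirewallPassword=fwpassword;"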

Databricks Connector for CData Sync

Data Model

The Sync App leverages Databricks Thrift to enable bidirectional SQL access to Databricks data. It supports Databricks databases running Databricks Runtime Version 9.1 - 13.X and the Pro and Classic Databricks SQL versions.

Discovering Schemas

The CData Sync App dynamically obtains the Databricks schemas; reconnect to pick up any metadata changes, such as added or removed columns or changes in data type.

Databricks Connector for CData Sync

Connection String Options

The connection string properties are the various options that can be used to establish a connection. This section provides a complete list of the options you can configure in the connection string for this provider. Click the links for further details.

For more information on establishing a connection, see Establishing a Connection.

Authentication


  • AuthScheme: The authentication scheme used. Accepted entries are PersonalAccessToken, AzureServicePrincipal.
  • Server: The host name or IP address of the server hosting the Databricks database.
  • User: The username used to authenticate with Databricks.
  • ProtocolVersion: The Protocol Version used to authenticate with Databricks.
  • Database: The name of the Databricks database.
  • HTTPPath: The path component of the URL endpoint.
  • Token: The token used to access the Databricks server.

AWS Authentication


  • AWSAccessKey: Your AWS account access key. This value is accessible from your AWS security credentials page.
  • AWSSecretKey: Your AWS account secret key. This value is accessible from your AWS security credentials page.
  • AWSRegion: The hosting region for your Amazon Web Services.
  • AWSS3Bucket: The name of your AWS S3 bucket.

Azure Authentication


  • AzureStorageAccount: The name of your Azure storage account.
  • AzureAccessKey: The storage key associated with your Azure account.
  • AzureTenant: The Microsoft Online tenant being used to access data. If not specified, your default tenant is used.
  • AzureBlobContainer: The name of your Azure Blob storage container.

AzureServicePrincipal Authentication


  • AzureTenantId: The Tenant id of your Microsoft Azure Active Directory.
  • AzureClientId: The application (client) id of your Microsoft Azure Active Directory application.
  • AzureClientSecret: The application (client) secret of your Microsoft Azure Active Directory application.
  • AzureSubscriptionId: The Subscription id of your Azure Databricks Service Workspace.
  • AzureResourceGroup: The Resource Group name of your Azure Databricks Service Workspace.
  • AzureWorkspace: The name of your Azure Databricks Service Workspace.

OAuth


  • OAuthClientId: The client Id assigned when you register your application with an OAuth authorization server.

SSL


  • SSLClientCert: The TLS/SSL client certificate store for SSL Client Authentication (2-way SSL).
  • SSLClientCertType: The type of key store containing the TLS/SSL client certificate.
  • SSLClientCertPassword: The password for the TLS/SSL client certificate.
  • SSLClientCertSubject: The subject of the TLS/SSL client certificate.
  • SSLServerCert: The certificate to be accepted from the server when connecting using TLS/SSL.

Firewall


  • FirewallType: The protocol used by a proxy-based firewall.
  • FirewallServer: The name or IP address of a proxy-based firewall.
  • FirewallPort: The TCP port for a proxy-based firewall.
  • FirewallUser: The user name to use to authenticate with a proxy-based firewall.
  • FirewallPassword: A password used to authenticate to a proxy-based firewall.

Proxy


  • ProxyAutoDetect: This indicates whether to use the system proxy settings or not.
  • ProxyServer: The hostname or IP address of a proxy to route HTTP traffic through.
  • ProxyPort: The TCP port the ProxyServer proxy is running on.
  • ProxyAuthScheme: The authentication type to use to authenticate to the ProxyServer proxy.
  • ProxyUser: A user name to be used to authenticate to the ProxyServer proxy.
  • ProxyPassword: A password to be used to authenticate to the ProxyServer proxy.
  • ProxySSLType: The SSL type to use when connecting to the ProxyServer proxy.
  • ProxyExceptions: A semicolon separated list of destination hostnames or IPs that are exempt from connecting through the ProxyServer.

Logging


  • LogModules: Core modules to be included in the log file.

Schema


  • Location: A path to the directory that contains the schema files defining tables, views, and stored procedures.
  • BrowsableSchemas: This property restricts the schemas reported to a subset of the available schemas. For example, BrowsableSchemas=SchemaA,SchemaB,SchemaC.
  • Tables: This property restricts the tables reported to a subset of the available tables. For example, Tables=TableA,TableB,TableC.
  • Views: Restricts the views reported to a subset of the available views. For example, Views=ViewA,ViewB,ViewC.
  • Catalog: The default catalog name.
  • PrimaryKeyIdentifiers: Set this property to define primary keys.

Databricks


  • CloudStorageType: Determine which cloud storage service will be used.
  • StoreTableInCloud: This option specifies whether the Databricks server will create and save tables in cloud storage.
  • QueryTableDetails: Specifies whether to use DESCRIBE FORMATTED ... to query detailed table information. If set to True, the query may take a long time to run.
  • UseUploadApi: This option specifies whether the Databricks Upload API will be used when executing Bulk INSERT operations.
  • UseCloudFetch: This option specifies whether to use Cloud Fetch to improve query efficiency when the data volume of the table is large.
  • UseLegacyDataModel: This option specifies whether to support Unity Catalog.
  • QueryAllMetadata: This option specifies whether to query all catalogs and schemas/databases, or only the default catalog and schema/database, if catalog and schema parameters are not specified when querying metadata. The default catalog is specified by the property Catalog. The default schema/database is specified by the property Database.

Miscellaneous


  • AllowPreparedStatement: Prepare a query statement before its execution.
  • ConnectRetryWaitTime: This property specifies the number of seconds to wait prior to retrying a connection request. It only applies to the following case: when attempting to establish a connection to the Databricks cluster, you receive the response 'HTTP response with error code 503: The Cluster is starting'.
  • ApplicationName: The application name connection string property expresses the HTTP User-Agent.
  • AsyncQueryTimeout: The timeout for asynchronous requests issued by the provider to download large result sets.
  • DescribeCommand: The describe command used to communicate with the Hive server. Accepted entries are DESCRIBE and DESC.
  • DetectView: Specifies whether to use DESCRIBE FORMATTED ... to detect whether the specified table is a view.
  • MaxRows: Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This takes precedence over LIMIT clauses.
  • Other: These hidden properties are used only in specific use cases.
  • PseudoColumns: This property indicates whether or not to include pseudo columns as columns to the table.
  • QueryPassthrough: This option passes the query to the Databricks server as is.
  • ServerConfigurations: A name-value list of server configuration variables to override the server defaults.
  • Timeout: The value in seconds until the timeout error is thrown, canceling the operation.
  • UseDescTableQuery: This option specifies whether the columns will be retrieved using a DESC TABLE query or the GetColumns Thrift API. The GetColumns Thrift API works for Apache Spark 3.0.0 or later.
  • UseInsertSelectSyntax: Specifies whether to use an INSERT INTO SELECT statement.
  • UserDefinedViews: A filepath pointing to the JSON configuration file containing your custom views.
Databricks Connector for CData Sync

Authentication

This section provides a complete list of the Authentication properties you can configure in the connection string for this provider.


  • AuthScheme: The authentication scheme used. Accepted entries are PersonalAccessToken, AzureServicePrincipal.
  • Server: The host name or IP address of the server hosting the Databricks database.
  • User: The username used to authenticate with Databricks.
  • ProtocolVersion: The Protocol Version used to authenticate with Databricks.
  • Database: The name of the Databricks database.
  • HTTPPath: The path component of the URL endpoint.
  • Token: The token used to access the Databricks server.
Databricks Connector for CData Sync

AuthScheme

The authentication scheme used. Accepted entries are PersonalAccessToken, AzureServicePrincipal.

Remarks

The Sync App supports the following authentication mechanisms. See the Getting Started chapter for authentication guides.

  • PersonalAccessToken: Set this to authenticate with Databricks' access token.
  • Basic: Set this to authenticate with Databricks' user and access token.
  • AzureServicePrincipal: Set this along with AzureTenantId, AzureClientId, AzureClientSecret, AzureSubscriptionId, AzureResourceGroup, and AzureWorkspace to authenticate with an Azure Service Principal. Follow the instructions in https://docs.microsoft.com/en-us/azure/databricks/dev-tools/api/latest/aad/service-prin-aad-token#--provision-a-service-principal-in-azure-portal to register an AzureAD application (client), and then follow the instructions in https://docs.microsoft.com/en-us/azure/role-based-access-control/role-assignments-portal?tabs=current to make sure that the service principal is assigned the Contributor or Owner role on the target Databricks workspace resource in Azure.
  • AzureAD: Set this along with AzureTenant, OAuthClientId, and CallbackURL to authenticate with Azure Active Directory OAuth. Follow the instructions in https://learn.microsoft.com/en-us/azure/databricks/dev-tools/api/latest/aad/app-aad-token#configure-an-app-in-azure-portal to register an AzureAD application (client).

Databricks Connector for CData Sync

Server

The host name or IP address of the server hosting the Databricks database.

Remarks

The host name or IP address of the server hosting the Databricks database.

Databricks Connector for CData Sync

User

The username used to authenticate with Databricks.

Remarks

The username used to authenticate with Databricks.

Databricks Connector for CData Sync

ProtocolVersion

The Protocol Version used to authenticate with Databricks.

Remarks

The Protocol Version used to authenticate with Databricks.

Databricks Connector for CData Sync

Database

The name of the Databricks database.

Remarks

The name of the Databricks database.

Databricks Connector for CData Sync

HTTPPath

The path component of the URL endpoint.

Remarks

This property is used to specify the path component of the URL endpoint.

This property can be found by following the path: Databricks main page -> Compute (in the left panel) -> {your Cluster} -> Advanced options (in the Configuration tab) -> JDBC/ODBC -> HTTP Path.

Databricks Connector for CData Sync

Token

The token used to access the Databricks server.

Remarks

The token can be obtained by navigating to the User Settings page of your Databricks instance and selecting the Access Tokens tab.

Databricks Connector for CData Sync

AWS Authentication

This section provides a complete list of the AWS Authentication properties you can configure in the connection string for this provider.


  • AWSAccessKey: Your AWS account access key. This value is accessible from your AWS security credentials page.
  • AWSSecretKey: Your AWS account secret key. This value is accessible from your AWS security credentials page.
  • AWSRegion: The hosting region for your Amazon Web Services.
  • AWSS3Bucket: The name of your AWS S3 bucket.
Databricks Connector for CData Sync

AWSAccessKey

Your AWS account access key. This value is accessible from your AWS security credentials page.

Remarks

Your AWS account access key. This value is accessible from your AWS security credentials page:

  1. Sign into the AWS Management console with the credentials for your root account.
  2. Select your account name or number and select My Security Credentials in the menu that is displayed.
  3. Click Continue to Security Credentials and expand the Access Keys section to manage or create root account access keys.

Databricks Connector for CData Sync

AWSSecretKey

Your AWS account secret key. This value is accessible from your AWS security credentials page.

Remarks

Your AWS account secret key. This value is accessible from your AWS security credentials page:

  1. Sign into the AWS Management console with the credentials for your root account.
  2. Select your account name or number and select My Security Credentials in the menu that is displayed.
  3. Click Continue to Security Credentials and expand the Access Keys section to manage or create root account access keys.

Databricks Connector for CData Sync

AWSRegion

The hosting region for your Amazon Web Services.

Remarks

The hosting region for your Amazon Web Services. Available values are OHIO, NORTHERNVIRGINIA, NORTHERNCALIFORNIA, OREGON, CAPETOWN, HONGKONG, JAKARTA, MUMBAI, OSAKA, SEOUL, SINGAPORE, SYDNEY, TOKYO, CENTRAL, BEIJING, NINGXIA, FRANKFURT, IRELAND, LONDON, MILAN, PARIS, STOCKHOLM, ZURICH, BAHRAIN, UAE, SAOPAULO, GOVCLOUDEAST, and GOVCLOUDWEST.

Databricks Connector for CData Sync

AWSS3Bucket

The name of your AWS S3 bucket.

Remarks

The name of your AWS S3 bucket.

Databricks Connector for CData Sync

Azure Authentication

This section provides a complete list of the Azure Authentication properties you can configure in the connection string for this provider.


  • AzureStorageAccount: The name of your Azure storage account.
  • AzureAccessKey: The storage key associated with your Azure account.
  • AzureTenant: The Microsoft Online tenant being used to access data. If not specified, your default tenant is used.
  • AzureBlobContainer: The name of your Azure Blob storage container.
Databricks Connector for CData Sync

AzureStorageAccount

The name of your Azure storage account.

Remarks

The name of your Azure storage account.

Databricks Connector for CData Sync

AzureAccessKey

The storage key associated with your Azure account.

Remarks

The storage key associated with your Azure storage account. You can retrieve it as follows:

  1. Sign in to the Azure portal (https://portal.azure.com/) with the credentials for your root account.
  2. Click Storage accounts and select the storage account you want to use.
  3. Under Settings, click Access keys.
  4. Your storage account name and key are displayed on that page.

Databricks Connector for CData Sync

AzureTenant

The Microsoft Online tenant being used to access data. If not specified, your default tenant is used.

Remarks

The Microsoft Online tenant being used to access data. For instance, contoso.onmicrosoft.com. Alternatively, specify the tenant Id. This value is the directory Id in the Azure Portal > Azure Active Directory > Properties.

Typically it is not necessary to specify the Tenant. It can be determined automatically by Microsoft when OAuthGrantType is set to CODE (the default). However, this may fail if the user belongs to multiple tenants. For instance, if an Admin of domain A invites a user of domain B to be a guest user, the user then belongs to both tenants. It is a good practice to specify the Tenant, although in general things should work without specifying it.

The AzureTenant is required when setting OAuthGrantType to CLIENT. When using client credentials, there is no user context. The credentials are taken from the context of the app itself. While Microsoft still allows client credentials to be obtained without specifying which Tenant, it has a much lower probability of picking the specific tenant you want to work with. For this reason, we require AzureTenant to be explicitly stated for all client credentials connections to ensure you get credentials that are applicable for the domain you intend to connect to.

Databricks Connector for CData Sync

AzureBlobContainer

The name of your Azure Blob storage container.

Remarks

The name of your Azure Blob storage container.

Databricks Connector for CData Sync

AzureServicePrincipal Authentication

This section provides a complete list of the AzureServicePrincipal Authentication properties you can configure in the connection string for this provider.


  • AzureTenantId: The Tenant id of your Microsoft Azure Active Directory.
  • AzureClientId: The application (client) id of your Microsoft Azure Active Directory application.
  • AzureClientSecret: The application (client) secret of your Microsoft Azure Active Directory application.
  • AzureSubscriptionId: The Subscription id of your Azure Databricks Service Workspace.
  • AzureResourceGroup: The Resource Group name of your Azure Databricks Service Workspace.
  • AzureWorkspace: The name of your Azure Databricks Service Workspace.
Databricks Connector for CData Sync

AzureTenantId

The Tenant id of your Microsoft Azure Active Directory.

Remarks

The Tenant id of your Microsoft Azure Active Directory.

Databricks Connector for CData Sync

AzureClientId

The application(client) id of your Microsoft Azure Active Directory application.

Remarks

The application (client) can be registered by following the instructions under AuthScheme -> AzureServicePrincipal.

Databricks Connector for CData Sync

AzureClientSecret

The application(client) secret of your Microsoft Azure Active Directory application.

Remarks

The application (client) can be registered by following the instructions under AuthScheme -> AzureServicePrincipal.

Databricks Connector for CData Sync

AzureSubscriptionId

The Subscription id of your Azure Databricks Service Workspace.

Remarks

The Subscription id of your Azure Databricks Service Workspace.

Databricks Connector for CData Sync

AzureResourceGroup

The Resource Group name of your Azure Databricks Service Workspace.

Remarks

The Resource Group name of your Azure Databricks Service Workspace.

Databricks Connector for CData Sync

AzureWorkspace

The name of your Azure Databricks Service Workspace.

Remarks

The name of your Azure Databricks Service Workspace.

Databricks Connector for CData Sync

OAuth

This section provides a complete list of the OAuth properties you can configure in the connection string for this provider.


  • OAuthClientId: The client Id assigned when you register your application with an OAuth authorization server.
Databricks Connector for CData Sync

OAuthClientId

The client Id assigned when you register your application with an OAuth authorization server.

Remarks

As part of registering an OAuth application, you will receive the OAuthClientId value, sometimes also called a consumer key, and a client secret, the OAuthClientSecret.

Databricks Connector for CData Sync

SSL

This section provides a complete list of the SSL properties you can configure in the connection string for this provider.


  • SSLClientCert: The TLS/SSL client certificate store for SSL Client Authentication (2-way SSL).
  • SSLClientCertType: The type of key store containing the TLS/SSL client certificate.
  • SSLClientCertPassword: The password for the TLS/SSL client certificate.
  • SSLClientCertSubject: The subject of the TLS/SSL client certificate.
  • SSLServerCert: The certificate to be accepted from the server when connecting using TLS/SSL.
Databricks Connector for CData Sync

SSLClientCert

The TLS/SSL client certificate store for SSL Client Authentication (2-way SSL).

Remarks

The name of the certificate store for the client certificate.

The SSLClientCertType field specifies the type of the certificate store specified by SSLClientCert. If the store is password protected, specify the password in SSLClientCertPassword.

SSLClientCert is used in conjunction with the SSLClientCertSubject field in order to specify client certificates. If SSLClientCert has a value, and SSLClientCertSubject is set, a search for a certificate is initiated. See SSLClientCertSubject for more information.

Designations of certificate stores are platform-dependent.

The following are designations of the most common User and Machine certificate stores in Windows:

  • MY: A certificate store holding personal certificates with their associated private keys.
  • CA: Certifying authority certificates.
  • ROOT: Root certificates.
  • SPC: Software publisher certificates.

In Java, the certificate store normally is a file containing certificates and optional private keys.

When the certificate store type is PFXFile, this property must be set to the name of the file. When the type is PFXBlob, the property must be set to the binary contents of a PFX file (for example, PKCS12 certificate store).

Databricks Connector for CData Sync

SSLClientCertType

The type of key store containing the TLS/SSL client certificate.

Remarks

This property can take one of the following values:

  • USER (default): For Windows, this specifies that the certificate store is a certificate store owned by the current user. Note that this store type is not available in Java.
  • MACHINE: For Windows, this specifies that the certificate store is a machine store. Note that this store type is not available in Java.
  • PFXFILE: The certificate store is the name of a PFX (PKCS12) file containing certificates.
  • PFXBLOB: The certificate store is a string (base-64-encoded) representing a certificate store in PFX (PKCS12) format.
  • JKSFILE: The certificate store is the name of a Java key store (JKS) file containing certificates. Note that this store type is only available in Java.
  • JKSBLOB: The certificate store is a string (base-64-encoded) representing a certificate store in JKS format. Note that this store type is only available in Java.
  • PEMKEY_FILE: The certificate store is the name of a PEM-encoded file that contains a private key and an optional certificate.
  • PEMKEY_BLOB: The certificate store is a string (base-64-encoded) that contains a private key and an optional certificate.
  • PUBLIC_KEY_FILE: The certificate store is the name of a file that contains a PEM- or DER-encoded public key certificate.
  • PUBLIC_KEY_BLOB: The certificate store is a string (base-64-encoded) that contains a PEM- or DER-encoded public key certificate.
  • SSHPUBLIC_KEY_FILE: The certificate store is the name of a file that contains an SSH-style public key.
  • SSHPUBLIC_KEY_BLOB: The certificate store is a string (base-64-encoded) that contains an SSH-style public key.
  • P7BFILE: The certificate store is the name of a PKCS7 file containing certificates.
  • PPKFILE: The certificate store is the name of a file that contains a PuTTY Private Key (PPK).
  • XMLFILE: The certificate store is the name of a file that contains a certificate in XML format.
  • XMLBLOB: The certificate store is a string that contains a certificate in XML format.

Databricks Connector for CData Sync

SSLClientCertPassword

The password for the TLS/SSL client certificate.

Remarks

If the certificate store is of a type that requires a password, this property is used to specify that password to open the certificate store.

Databricks Connector for CData Sync

SSLClientCertSubject

The subject of the TLS/SSL client certificate.

Remarks

When loading a certificate the subject is used to locate the certificate in the store.

If an exact match is not found, the store is searched for subjects containing the value of the property. If a match is still not found, the property is set to an empty string, and no certificate is selected.

The special value "*" picks the first certificate in the certificate store.

The certificate subject is a comma separated list of distinguished name fields and values. For example, "CN=www.server.com, OU=test, C=US, [email protected]". The common fields and their meanings are shown below.

  • CN: Common Name. This is commonly a host name like www.server.com.
  • O: Organization
  • OU: Organizational Unit
  • L: Locality
  • S: State
  • C: Country
  • E: Email Address

If a field value contains a comma, it must be quoted.

Databricks Connector for CData Sync

SSLServerCert

The certificate to be accepted from the server when connecting using TLS/SSL.

Remarks

If using a TLS/SSL connection, this property can be used to specify the TLS/SSL certificate to be accepted from the server. Any other certificate that is not trusted by the machine is rejected.

This property can take the following forms:

  • A full PEM certificate (example shortened for brevity): -----BEGIN CERTIFICATE----- MIIChTCCAe4CAQAwDQYJKoZIhv......Qw== -----END CERTIFICATE-----
  • A path to a local file containing the certificate: C:\cert.cer
  • The public key (example shortened for brevity): -----BEGIN RSA PUBLIC KEY----- MIGfMA0GCSq......AQAB -----END RSA PUBLIC KEY-----
  • The MD5 thumbprint (hex values can also be either space or colon separated): ecadbdda5a1529c58a1e9e09828d70e4
  • The SHA1 thumbprint (hex values can also be either space or colon separated): 34a929226ae0819f2ec14b4a3d904f801cbb150d

If not specified, any certificate trusted by the machine is accepted.

Use '*' to signify to accept all certificates. Note that this is not recommended due to security concerns.

Databricks Connector for CData Sync

Firewall

This section provides a complete list of the Firewall properties you can configure in the connection string for this provider.


  • FirewallType: The protocol used by a proxy-based firewall.
  • FirewallServer: The name or IP address of a proxy-based firewall.
  • FirewallPort: The TCP port for a proxy-based firewall.
  • FirewallUser: The user name to use to authenticate with a proxy-based firewall.
  • FirewallPassword: A password used to authenticate to a proxy-based firewall.
Databricks Connector for CData Sync

FirewallType

The protocol used by a proxy-based firewall.

Remarks

This property specifies the protocol that the Sync App will use to tunnel traffic through the FirewallServer proxy. Note that by default, the Sync App connects to the system proxy; to disable this behavior and connect to one of the following proxy types, set ProxyAutoDetect to false.

  • TUNNEL (default port 80): When this is set, the Sync App opens a connection to Databricks and traffic flows back and forth through the proxy.
  • SOCKS4 (default port 1080): When this is set, the Sync App sends data through the SOCKS 4 proxy specified by FirewallServer and FirewallPort and passes the FirewallUser value to the proxy, which determines if the connection request should be granted.
  • SOCKS5 (default port 1080): When this is set, the Sync App sends data through the SOCKS 5 proxy specified by FirewallServer and FirewallPort. If your proxy requires authentication, set FirewallUser and FirewallPassword to credentials the proxy recognizes.

To connect to HTTP proxies, use ProxyServer and ProxyPort. To authenticate to HTTP proxies, use ProxyAuthScheme, ProxyUser, and ProxyPassword.

Databricks Connector for CData Sync

FirewallServer

The name or IP address of a proxy-based firewall.

Remarks

This property specifies the IP address, DNS name, or host name of a proxy allowing traversal of a firewall. The protocol is specified by FirewallType: use this property with FirewallType to connect through SOCKS or to tunnel the connection. Use ProxyServer to connect to an HTTP proxy.

Note that the Sync App uses the system proxy by default. To use a different proxy, set ProxyAutoDetect to false.

Databricks Connector for CData Sync

FirewallPort

The TCP port for a proxy-based firewall.

Remarks

This specifies the TCP port for a proxy allowing traversal of a firewall. Use FirewallServer to specify the name or IP address. Specify the protocol with FirewallType.

Databricks Connector for CData Sync

FirewallUser

The user name to use to authenticate with a proxy-based firewall.

Remarks

The FirewallUser and FirewallPassword properties are used to authenticate against the proxy specified in FirewallServer and FirewallPort, following the authentication method specified in FirewallType.

Databricks Connector for CData Sync

FirewallPassword

A password used to authenticate to a proxy-based firewall.

Remarks

This property is passed to the proxy specified by FirewallServer and FirewallPort, following the authentication method specified by FirewallType.

Databricks Connector for CData Sync

Proxy

This section provides a complete list of the Proxy properties you can configure in the connection string for this provider.


  • ProxyAutoDetect: This indicates whether to use the system proxy settings or not.
  • ProxyServer: The hostname or IP address of a proxy to route HTTP traffic through.
  • ProxyPort: The TCP port the ProxyServer proxy is running on.
  • ProxyAuthScheme: The authentication type to use to authenticate to the ProxyServer proxy.
  • ProxyUser: A user name to be used to authenticate to the ProxyServer proxy.
  • ProxyPassword: A password to be used to authenticate to the ProxyServer proxy.
  • ProxySSLType: The SSL type to use when connecting to the ProxyServer proxy.
  • ProxyExceptions: A semicolon separated list of destination hostnames or IPs that are exempt from connecting through the ProxyServer.
Databricks Connector for CData Sync

ProxyAutoDetect

This indicates whether to use the system proxy settings or not.

Remarks

This takes precedence over other proxy settings, so you'll need to set ProxyAutoDetect to FALSE in order to use custom proxy settings.

To connect to an HTTP proxy, see ProxyServer. For other proxies, such as SOCKS or tunneling, see FirewallType.

Databricks Connector for CData Sync

ProxyServer

The hostname or IP address of a proxy to route HTTP traffic through.

Remarks

The hostname or IP address of a proxy to route HTTP traffic through. The Sync App can use the HTTP, Windows (NTLM), or Kerberos authentication types to authenticate to an HTTP proxy.

If you need to connect through a SOCKS proxy or tunnel the connection, see FirewallType.

By default, the Sync App uses the system proxy. If you need to use another proxy, set ProxyAutoDetect to false.

Databricks Connector for CData Sync

ProxyPort

The TCP port the ProxyServer proxy is running on.

Remarks

The port the HTTP proxy is running on that you want to redirect HTTP traffic through. Specify the HTTP proxy in ProxyServer. For other proxy types, see FirewallType.

Databricks Connector for CData Sync

ProxyAuthScheme

The authentication type to use to authenticate to the ProxyServer proxy.

Remarks

This value specifies the authentication type to use to authenticate to the HTTP proxy specified by ProxyServer and ProxyPort.

Note that the Sync App will use the system proxy settings by default, without further configuration needed; if you want to connect to another proxy, you will need to set ProxyAutoDetect to false, in addition to ProxyServer and ProxyPort. To authenticate, set ProxyAuthScheme and set ProxyUser and ProxyPassword, if needed.

The authentication type can be one of the following:

  • BASIC: The Sync App performs HTTP BASIC authentication.
  • DIGEST: The Sync App performs HTTP DIGEST authentication.
  • NEGOTIATE: The Sync App retrieves an NTLM or Kerberos token based on the applicable protocol for authentication.
  • PROPRIETARY: The Sync App does not generate an NTLM or Kerberos token. You must supply this token in the Authorization header of the HTTP request.

If you need to use another authentication type, such as SOCKS 5 authentication, see FirewallType.

Databricks Connector for CData Sync

ProxyUser

A user name to be used to authenticate to the ProxyServer proxy.

Remarks

The ProxyUser and ProxyPassword options are used to connect and authenticate against the HTTP proxy specified in ProxyServer.

You can select one of the available authentication types in ProxyAuthScheme. If you are using HTTP authentication, set this to the user name of a user recognized by the HTTP proxy. If you are using Windows or Kerberos authentication, set this property to a user name in one of the following formats:

user@domain
domain\user

Databricks Connector for CData Sync

ProxyPassword

A password to be used to authenticate to the ProxyServer proxy.

Remarks

This property is used to authenticate to an HTTP proxy server that supports NTLM (Windows), Kerberos, or HTTP authentication. To specify the HTTP proxy, you can set ProxyServer and ProxyPort. To specify the authentication type, set ProxyAuthScheme.

If you are using HTTP authentication, additionally set ProxyUser and ProxyPassword to credentials recognized by the HTTP proxy.

If you are using NTLM authentication, set ProxyUser and ProxyPassword to your Windows user name and password. You may also need these to complete Kerberos authentication.

For SOCKS 5 authentication or tunneling, see FirewallType.

By default, the Sync App uses the system proxy. If you want to connect to another proxy, set ProxyAutoDetect to false.

Databricks Connector for CData Sync

ProxySSLType

The SSL type to use when connecting to the ProxyServer proxy.

Remarks

This property determines when to use SSL for the connection to an HTTP proxy specified by ProxyServer. The applicable values are the following:

  • AUTO: The default setting. If the URL is an HTTPS URL, the Sync App uses the TUNNEL option. If the URL is an HTTP URL, the component uses the NEVER option.
  • ALWAYS: The connection is always SSL enabled.
  • NEVER: The connection is not SSL enabled.
  • TUNNEL: The connection is through a tunneling proxy. The proxy server opens a connection to the remote host and traffic flows back and forth through the proxy.

Databricks Connector for CData Sync

ProxyExceptions

A semicolon separated list of destination hostnames or IPs that are exempt from connecting through the ProxyServer.

Remarks

The ProxyServer is used for all addresses, except for addresses defined in this property. Use semicolons to separate entries.

Note that the Sync App uses the system proxy settings by default, without further configuration needed; if you want to explicitly configure proxy exceptions for this connection, you need to set ProxyAutoDetect = false, and configure ProxyServer and ProxyPort. To authenticate, set ProxyAuthScheme and set ProxyUser and ProxyPassword, if needed.

Databricks Connector for CData Sync

Logging

This section provides a complete list of the Logging properties you can configure in the connection string for this provider.


  • LogModules: Core modules to be included in the log file.
Databricks Connector for CData Sync

LogModules

Core modules to be included in the log file.

Remarks

Only the modules specified (separated by ';') will be included in the log file. By default all modules are included.

See the Logging page for an overview.

Databricks Connector for CData Sync

Schema

This section provides a complete list of the Schema properties you can configure in the connection string for this provider.


  • Location: A path to the directory that contains the schema files defining tables, views, and stored procedures.
  • BrowsableSchemas: This property restricts the schemas reported to a subset of the available schemas. For example, BrowsableSchemas=SchemaA,SchemaB,SchemaC.
  • Tables: This property restricts the tables reported to a subset of the available tables. For example, Tables=TableA,TableB,TableC.
  • Views: Restricts the views reported to a subset of the available views. For example, Views=ViewA,ViewB,ViewC.
  • Catalog: The default catalog name.
  • PrimaryKeyIdentifiers: Set this property to define primary keys.
Databricks Connector for CData Sync

Location

A path to the directory that contains the schema files defining tables, views, and stored procedures.

Remarks

The path to a directory which contains the schema files for the Sync App (.rsd files for tables and views, .rsb files for stored procedures). The folder location can be a relative path from the location of the executable. The Location property is only needed if you want to customize definitions (for example, change a column name, ignore a column, and so on) or extend the data model with new tables, views, or stored procedures.

If left unspecified, the default location is "%APPDATA%\CData\Databricks Data Provider\Schema", with %APPDATA% set to the user's configuration directory:

  • Windows: The value of the APPDATA environment variable
  • Linux: ~/.config

Databricks Connector for CData Sync

BrowsableSchemas

This property restricts the schemas reported to a subset of the available schemas. For example, BrowsableSchemas=SchemaA,SchemaB,SchemaC.

Remarks

Listing the schemas from databases can be expensive. Providing a list of schemas in the connection string improves the performance.

Databricks Connector for CData Sync

Tables

This property restricts the tables reported to a subset of the available tables. For example, Tables=TableA,TableB,TableC.

Remarks

Listing the tables from some databases can be expensive. Providing a list of tables in the connection string improves the performance of the Sync App.

This property can also be used as an alternative to automatically listing tables if you already know which ones you want to work with and there would otherwise be too many to work with.

Specify the tables you want in a comma-separated list. Each table should be a valid SQL identifier with any special characters escaped using square brackets, double-quotes or backticks. For example, Tables=TableA,[TableB/WithSlash],WithCatalog.WithSchema.`TableC With Space`.

Note that when connecting to a data source with multiple schemas or catalogs, you will need to provide the fully qualified name of the table in this property, as in the last example here, to avoid ambiguity between tables that exist in multiple catalogs or schemas.

Databricks Connector for CData Sync

Views

Restricts the views reported to a subset of the available views. For example, Views=ViewA,ViewB,ViewC.

Remarks

Listing the views from some databases can be expensive. Providing a list of views in the connection string improves the performance of the Sync App.

This property can also be used as an alternative to automatically listing views if you already know which ones you want to work with and there would otherwise be too many to work with.

Specify the views you want in a comma-separated list. Each view should be a valid SQL identifier with any special characters escaped using square brackets, double-quotes or backticks. For example, Views=ViewA,[ViewB/WithSlash],WithCatalog.WithSchema.`ViewC With Space`.

Note that when connecting to a data source with multiple schemas or catalogs, you will need to provide the fully qualified name of the view in this property, as in the last example here, to avoid ambiguity between views that exist in multiple catalogs or schemas.

Databricks Connector for CData Sync

Catalog

The default catalog name.

Remarks

When the property UseLegacyDataModel is set to True, this property also needs to be set to specify a default catalog. In most cases this should be "hive_metastore".

Databricks Connector for CData Sync

PrimaryKeyIdentifiers

Set this property to define primary keys.

Remarks

Databricks does not natively support primary keys, but for certain DML operations or database tools you may need to define them. By default this option is disabled so that no tables have primary keys.

Primary keys are defined using a list of rules that match tables and provide a list of key columns. For example, PrimaryKeyIdentifiers="*=my_key;my_table=my_key2,my_key3;my_nokeys_table=;" has three rules separated by semicolons:

  1. The first rule *=my_key means that every table without a more specific rule contains one primary key column called my_key. Tables without a my_key column do not have any primary keys. Multiple keys are supported; set *=my_key,my_key2 to specify them.
  2. The second rule my_table=my_key2,my_key3 means that the my_table table contains the two primary key columns my_key2 and my_key3. If any of those columns are missing from the table they are ignored.
  3. The third rule my_nokeys_table= means that the my_nokeys_table table has no primary keys. The only use that empty key lists have is overriding the default rule. If there is no default rule present, only the tables explicitly listed have primary keys.

Note that the table names can include

  • just the table
  • the table and schema
  • the table, schema, and catalog

You can use SQL quotes to specify column and table names:
/* Rules with just table names use the default connection Catalog and Schema. 
   All these rules refer to the same table with a connection where Catalog=someCatalog;Schema=someSchema */

someTable=a,b,c
someSchema.someTable=a,b,c
someCatalog.someSchema.someTable=a,b,c

/* Any table or column name may be quoted */
`someCatalog`."someSchema".[someTable]=`a`,[b],"c"

Databricks Connector for CData Sync

Databricks

This section provides a complete list of the Databricks properties you can configure in the connection string for this provider.


  • CloudStorageType: Determine which cloud storage service will be used.
  • StoreTableInCloud: This option specifies whether the Databricks server will create and save tables in cloud storage.
  • QueryTableDetails: Specifies whether to use DESCRIBE FORMATTED ... to query detailed table information. If set to True, the query may take a long time to run.
  • UseUploadApi: This option specifies whether the Databricks Upload API will be used when executing Bulk INSERT operations.
  • UseCloudFetch: This option specifies whether to use Cloud Fetch to improve query efficiency when the data volume of the table is large.
  • UseLegacyDataModel: This option specifies whether to support Unity Catalog.
  • QueryAllMetadata: This option specifies whether to query all catalogs and schemas/databases, or only the default catalog and schema/database, if catalog and schema parameters are not specified when querying metadata. The default catalog is specified by the property Catalog. The default schema/database is specified by the property Database.
Databricks Connector for CData Sync

CloudStorageType

Determine which cloud storage service will be used.

Remarks

By default, the "DBFS" provided by Databricks is used. If set to "Azure Blob storage", these properties are required: AzureStorageAccount AzureAccessKey AzureBlobContainer If set to "AWS S3", these properties are required: AWSAccessKey AWSSecretKey AWSS3Bucket AWSRegion

Databricks Connector for CData Sync

StoreTableInCloud

This option specifies whether Databricks server will create and save tables in cloud storage.

Remarks

Setting this property to "True" will create and save tables in cloud storage, in this case the CloudStorageType property cannot be "DBFS".

Databricks Connector for CData Sync

QueryTableDetails

Specifies whether to use DESCRIBE FORMATTED ... to query detailed table information. If set to True, the query may take a long time to run.

Remarks

Specifies whether to use DESCRIBE FORMATTED ... to query detailed table information. If set to True, the query may take a long time to run.

Databricks Connector for CData Sync

UseUploadApi

This option specifies whether the Databricks Upload API will be used when executing Bulk INSERT operations.

Remarks

Setting this property to true will improve performance if there is a large amount of data in a Bulk INSERT operation.

Databricks Connector for CData Sync

UseCloudFetch

This option specifies whether to use Cloud Fetch to improve query efficiency when the data volume of the table is large.

Remarks

This option specifies whether to use Cloud Fetch to improve query efficiency when the table contains over one million entries.

Databricks Connector for CData Sync

UseLegacyDataModel

This option specifies whether to support Unity Catalog.

Remarks

True by default. This enables multi-catalog support for both the Unity Catalog and the single-catalog case. A single catalog is usually named "hive_metastore".

Setting this property to False disables multi-catalog support, in which case there is only one catalog, named "CData".

Databricks Connector for CData Sync

QueryAllMetadata

This option specifies whether to query all catalogs and schemas/databases, or only the default catalog and schema/database, if catalog and schema parameters are not specified when querying metadata. The default catalog is specified by the property Catalog. The default schema/database is specified by the property Database.

Remarks

True by default: the driver queries metadata from all catalogs and schemas/databases.

Set this property to False to query metadata only from the default catalog and schema/database.
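
For example, to limit metadata discovery to a single default catalog and database, a connection string might include settings like these (names are placeholders):

"QueryAllMetadata=False;Catalog=hive_metastore;Database=default;"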

Databricks Connector for CData Sync

Miscellaneous

This section provides a complete list of the Miscellaneous properties you can configure in the connection string for this provider.


  • AllowPreparedStatement: Prepare a query statement before its execution.
  • ConnectRetryWaitTime: This property specifies the number of seconds to wait prior to retrying a connection request. It only applies to the following case: when attempting to establish a connection to the Databricks cluster, you receive the response 'HTTP response with error code 503: The Cluster is starting'.
  • ApplicationName: The application name connection string property expresses the HTTP User-Agent.
  • AsyncQueryTimeout: The timeout for asynchronous requests issued by the provider to download large result sets.
  • DescribeCommand: The describe command used to communicate with the Hive server. Accepted entries are DESCRIBE and DESC.
  • DetectView: Specifies whether to use DESCRIBE FORMATTED ... to detect whether the specified table is a view.
  • MaxRows: Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This takes precedence over LIMIT clauses.
  • Other: These hidden properties are used only in specific use cases.
  • PseudoColumns: This property indicates whether or not to include pseudo columns as columns to the table.
  • QueryPassthrough: This option passes the query to the Databricks server as is.
  • ServerConfigurations: A name-value list of server configuration variables to override the server defaults.
  • Timeout: The value in seconds until the timeout error is thrown, canceling the operation.
  • UseDescTableQuery: This option specifies whether the columns will be retrieved using a DESC TABLE query or the GetColumns Thrift API. The GetColumns Thrift API works for Apache Spark 3.0.0 or later.
  • UseInsertSelectSyntax: Specifies whether to use an INSERT INTO SELECT statement.
  • UserDefinedViews: A filepath pointing to the JSON configuration file containing your custom views.
Databricks Connector for CData Sync

AllowPreparedStatement

Prepare a query statement before its execution.

Remarks

If the AllowPreparedStatement property is set to false, statements are parsed each time they are executed. Setting this property to false can be useful if you are executing many different queries only once.

If you are executing the same query repeatedly, you will generally see better performance by leaving this property at the default, true. Preparing the query avoids recompiling the same query over and over. However, prepared statements also require the Sync App to keep the connection active and open while the statement is prepared.
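
As a sketch, a workload that issues many different one-off queries could disable preparation like this:

AllowPreparedStatement=False;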

Databricks Connector for CData Sync

ConnectRetryWaitTime

This property specifies the number of seconds to wait before retrying a connection request. It applies only when an attempt to connect to the Databricks cluster returns the response 'HTTP response with error code 503: The Cluster is starting'.

Remarks

To enable this feature, specify a reasonable positive integer value, generally 30-60 (seconds). The default value of -1 disables the feature. Specify the maximum number of retries with MaximumRequestRetries.
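
For example, waiting 30 seconds between retries while the cluster starts could be configured as below; the retry count given for MaximumRequestRetries is only a suggestion:

ConnectRetryWaitTime=30;MaximumRequestRetries=5;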

Databricks Connector for CData Sync

ApplicationName

The application name connection string property expresses the HTTP User-Agent.

Remarks

The format is:

[isv-name+product-name]/[product-version] [comment]

where

  • [isv-name+product-name] is the name of the application, with no spaces, parentheses, or new lines.
  • [product-version] is the version number of the application, with no spaces, parentheses, or new lines.
  • [comment] is optional, with no comma or new lines. Nested comments are not supported.
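
Putting the pieces together, a User-Agent built from a hypothetical ISV and product name might look like this (all values are illustrative):

ApplicationName=AcmeDataTools/2.1 (nightly sync job)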

Databricks Connector for CData Sync

AsyncQueryTimeout

The timeout for asynchronous requests issued by the provider to download large result sets.

Remarks

If the AsyncQueryTimeout property is set to 0, asynchronous operations do not time out; instead, they run until they complete successfully or encounter an error condition. This property is distinct from Timeout, which applies to individual operations, while AsyncQueryTimeout applies to the execution time of the operation as a whole.

If AsyncQueryTimeout expires and the asynchronous request has not finished being processed, the Sync App raises an error condition.
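
As an illustration, a configuration that allows large asynchronous downloads more time than individual operations might look like the fragment below; both values are arbitrary and assumed to be in seconds:

AsyncQueryTimeout=600;Timeout=120;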

Databricks Connector for CData Sync

DescribeCommand

The describe command used to communicate with the Hive server. Accepted entries are DESCRIBE and DESC.

Remarks

The describe command used to communicate with the Hive server. Accepted entries are DESCRIBE and DESC.

Databricks Connector for CData Sync

DetectView

Specifies whether to use DESCRIBE FORMATTED ... to detect whether the specified table is a view.

Remarks

Specifies whether to use DESCRIBE FORMATTED ... to detect whether the specified table is a view.

Databricks Connector for CData Sync

MaxRows

Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This takes precedence over LIMIT clauses.

Remarks

Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This takes precedence over LIMIT clauses.
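
For example, capping every unaggregated result set at 1000 rows, regardless of any LIMIT clause, could be sketched as follows (the cap value is arbitrary):

MaxRows=1000;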

Databricks Connector for CData Sync

Other

These hidden properties are used only in specific use cases.

Remarks

The properties listed below are available for specific use cases. Normal driver use cases and functionality should not require these properties.

Specify multiple properties in a semicolon-separated list.

Integration and Formatting

Property | Description
DefaultColumnSize | Sets the default length of string fields when the data source does not provide column length in the metadata. The default value is 2000.
ConvertDateTimeToGMT | Determines whether to convert date-time values to GMT, instead of the local time of the machine.
RecordToFile=filename | Records the underlying socket data transfer to the specified file.
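
Combining the entries above, multiple hidden properties can be passed through Other as a semicolon-separated list; the values are examples only, and the list may need to be quoted when embedded in a larger connection string:

Other="DefaultColumnSize=4000;ConvertDateTimeToGMT=True"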

Databricks Connector for CData Sync

PseudoColumns

This property indicates whether or not to include pseudo columns as columns to the table.

Remarks

This setting is particularly helpful in Entity Framework, which does not allow you to set a value for a pseudo column unless it is a table column. The value of this connection setting is of the format "Table1=Column1, Table1=Column2, Table2=Column3". You can use the "*" character to include all tables and all columns; for example, "*=*".
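
For example, the wildcard form exposes every pseudo column on every table, while a narrower setting names specific tables and columns (the table and column names below are hypothetical):

PseudoColumns=*=*
PseudoColumns=Orders=RowVersion, Customers=RowVersion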

Databricks Connector for CData Sync

QueryPassthrough

This option passes the query to the Databricks server as is.

Remarks

When this is set, queries are passed through directly to Databricks.
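
As a sketch, enabling passthrough and then issuing a Databricks SQL statement verbatim might look as follows; the statement itself is only an example:

QueryPassthrough=True;
SHOW TABLES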

Databricks Connector for CData Sync

ServerConfigurations

A name-value list of server configuration variables to override the server defaults.

Remarks

This property takes a comma separated list of configuration variables specified as name-value pairs. Any values specified here will be sent to the Hive server to override the default values.

Example: hive.enforce.bucketing=true,hive.enforce.sorting=true

Databricks Connector for CData Sync

Timeout

The value in seconds until the timeout error is thrown, canceling the operation.

Remarks

If Timeout = 0, operations do not time out. The operations run until they complete successfully or until they encounter an error condition.

If Timeout expires and the operation is not yet complete, the Sync App throws an exception.

Databricks Connector for CData Sync

UseDescTableQuery

This option specifies whether the columns will be retrieved using a DESC TABLE query or the GetColumns Thrift API. The GetColumns Thrift API works for Apache Spark 3.0.0 or later.

Remarks

When set to true, a DESC TABLE query will be issued to retrieve the columns for the table.

Databricks Connector for CData Sync

UseInsertSelectSyntax

Specifies whether to use an INSERT INTO SELECT statement.

Remarks

When set to true, an INSERT INTO SELECT statement will be used when executing insert statements. When set to false, an INSERT INTO VALUES statement will be used.

Unless explicitly specified, this option is configured automatically based on the Databricks version.
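
To make the difference concrete, the two statement shapes the option chooses between look roughly like this for a hypothetical Orders table (the SQL the Sync App actually generates may differ in detail):

INSERT INTO Orders SELECT 101, 'pending'
INSERT INTO Orders VALUES (101, 'pending')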

Databricks Connector for CData Sync

UserDefinedViews

A filepath pointing to the JSON configuration file containing your custom views.

Remarks

User Defined Views are defined in a JSON-formatted configuration file called UserDefinedViews.json. The Sync App automatically detects the views specified in this file.

You can also have multiple view definitions and control them using the UserDefinedViews connection property. When you use this property, only the specified views are seen by the Sync App.

This User Defined View configuration file is formatted as follows:

  • Each root element defines the name of a view.
  • Each root element contains a child element, called query, which contains the custom SQL query for the view.

For example:

{
	"MyView": {
		"query": "SELECT * FROM [CData].[Sample].Customers WHERE MyColumn = 'value'"
	},
	"MyView2": {
		"query": "SELECT * FROM MyTable WHERE Id IN (1,2,3)"
	}
}
Use the UserDefinedViews connection property to specify the location of your JSON configuration file. For example:
"UserDefinedViews", C:\Users\yourusername\Desktop\tmp\UserDefinedViews.json
Note that the specified path is not embedded in quotation marks.

Copyright (c) 2024 CData Software, Inc. - All rights reserved.
Build 23.0.8839