JDBC Driver for Spark SQL

Build 22.0.8462

Connection String Options

The connection string properties are the various options that can be used to establish a connection. This section provides a complete list of the options you can configure in the connection string for this provider. Click the links for further details.

For more information on establishing a connection, see Establishing a Connection.
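
As a minimal sketch, the Java program below opens a connection with a few of these properties set in the JDBC URL and lists the visible tables through standard JDBC metadata calls. It assumes the driver JAR is on the classpath and registers itself through JDBC 4 auto-loading; the jdbc:sparksql: URL prefix and all host, port, and credential values are illustrative placeholders, so substitute the values for your environment.

    import java.sql.Connection;
    import java.sql.DatabaseMetaData;
    import java.sql.DriverManager;
    import java.sql.ResultSet;

    public class SparkSqlConnect {
        public static void main(String[] args) throws Exception {
            // Placeholder host, port, and credentials; adjust for your deployment.
            String url = "jdbc:sparksql:Server=127.0.0.1;Port=10000;"
                       + "TransportMode=BINARY;AuthScheme=Plain;"
                       + "User=admin;Password=admin;Database=default;";
            try (Connection conn = DriverManager.getConnection(url)) {
                // List the tables reported by the driver using standard JDBC metadata.
                DatabaseMetaData meta = conn.getMetaData();
                try (ResultSet rs = meta.getTables(null, null, "%", null)) {
                    while (rs.next()) {
                        System.out.println(rs.getString("TABLE_NAME"));
                    }
                }
            }
        }
    }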

Authentication


Property | Description
AuthScheme | The authentication scheme used. Accepted entries are Plain, LDAP, NOSASL, and Kerberos.
Server | The host name or IP address of the server hosting the SparkSQL database.
Port | The port for the SparkSQL database.
User | The username used to authenticate with SparkSQL.
Password | The password used to authenticate with SparkSQL.
Database | The name of the SparkSQL database.
ProtocolVersion | The protocol version used to authenticate with SparkSQL.
ImpersonationProxyUser | The proxy user for Hive user impersonation.
SaslQop | Quality of protection for the SASL framework. The level of quality is negotiated between the client and server during authentication. Used by Kerberos authentication with TCP transport.
TransportMode | The transport mode to use to communicate with the Hive server. Accepted entries are BINARY and HTTP.
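
For example, a connection using plain authentication over the binary transport might combine these properties as follows; the jdbc:sparksql: prefix and all host, port, and credential values are illustrative placeholders:

    jdbc:sparksql:Server=127.0.0.1;Port=10000;TransportMode=BINARY;AuthScheme=Plain;User=admin;Password=admin;Database=default;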

Kerberos


Property | Description
KerberosKDC | The Kerberos Key Distribution Center (KDC) service used to authenticate the user.
KerberosRealm | The Kerberos Realm used to authenticate the user.
KerberosSPN | The service principal name (SPN) for the Kerberos Domain Controller.
KerberosKeytabFile | The Keytab file containing your pairs of Kerberos principals and encrypted keys.
KerberosServiceRealm | The Kerberos realm of the service.
KerberosServiceKDC | The Kerberos KDC of the service.
KerberosTicketCache | The full file path to an MIT Kerberos credential cache file.
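
As a hedged illustration, a Kerberos connection might look like the following; the KDC, realm, and SPN values are placeholders, and the exact combination of properties required depends on your Kerberos configuration:

    jdbc:sparksql:Server=spark.example.com;Port=10000;AuthScheme=Kerberos;KerberosKDC=kdc.example.com;KerberosRealm=EXAMPLE.COM;KerberosSPN=hive/spark.example.com;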

SSL


Property | Description
SSLClientCert | The TLS/SSL client certificate store for SSL Client Authentication (2-way SSL).
SSLClientCertType | The type of key store containing the TLS/SSL client certificate.
SSLClientCertPassword | The password for the TLS/SSL client certificate.
SSLClientCertSubject | The subject of the TLS/SSL client certificate.
SSLServerCert | The certificate to be accepted from the server when connecting using TLS/SSL.
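
For example, an SSL-enabled connection with a client certificate might set properties such as the following; the file paths and password are illustrative placeholders, and UseSSL (listed under Miscellaneous) is assumed to enable encryption:

    jdbc:sparksql:Server=spark.example.com;Port=10000;UseSSL=True;SSLServerCert=C:\certs\server.cer;SSLClientCert=C:\certs\client.pfx;SSLClientCertPassword=changeit;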

Firewall


Property | Description
FirewallType | The protocol used by a proxy-based firewall.
FirewallServer | The name or IP address of a proxy-based firewall.
FirewallPort | The TCP port for a proxy-based firewall.
FirewallUser | The user name to use to authenticate with a proxy-based firewall.
FirewallPassword | A password used to authenticate to a proxy-based firewall.
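
For example, to route the connection through a SOCKS-style firewall you might add properties such as the following; the firewall type, address, and credentials are illustrative placeholders, and the accepted FirewallType values are described on that property's page:

    jdbc:sparksql:Server=spark.example.com;Port=10000;FirewallType=SOCKS5;FirewallServer=192.168.1.100;FirewallPort=1080;FirewallUser=fwuser;FirewallPassword=fwpass;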

Proxy


Property | Description
ProxyAutoDetect | Indicates whether to use the system proxy settings. This takes precedence over other proxy settings, so set ProxyAutoDetect to FALSE in order to use custom proxy settings.
ProxyServer | The hostname or IP address of a proxy to route HTTP traffic through.
ProxyPort | The TCP port the ProxyServer proxy is running on.
ProxyAuthScheme | The authentication type to use to authenticate to the ProxyServer proxy.
ProxyUser | A user name to be used to authenticate to the ProxyServer proxy.
ProxyPassword | A password to be used to authenticate to the ProxyServer proxy.
ProxySSLType | The SSL type to use when connecting to the ProxyServer proxy.
ProxyExceptions | A semicolon-separated list of destination hostnames or IPs that are exempt from connecting through the ProxyServer.
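
For example, to route traffic through a custom HTTP proxy you would disable auto-detection and supply the proxy details; all values below are illustrative placeholders:

    jdbc:sparksql:Server=spark.example.com;Port=10000;ProxyAutoDetect=False;ProxyServer=192.168.1.200;ProxyPort=8080;ProxyUser=proxyuser;ProxyPassword=proxypass;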

Logging


Property | Description
Logfile | A filepath which designates the name and location of the log file.
Verbosity | The verbosity level that determines the amount of detail included in the log file.
LogModules | Core modules to be included in the log file.
MaxLogFileSize | A string specifying the maximum size in bytes for a log file (for example, 10 MB).
MaxLogFileCount | A string specifying the maximum file count of log files.
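
For example, to capture a rotating log you might set properties such as the following; the path, verbosity level, and size limits are illustrative placeholders:

    jdbc:sparksql:Server=spark.example.com;Port=10000;Logfile=C:\logs\sparksql.log;Verbosity=3;MaxLogFileSize=10MB;MaxLogFileCount=5;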

Schema


Property | Description
Location | A path to the directory that contains the schema files defining tables, views, and stored procedures.
BrowsableSchemas | This property restricts the schemas reported to a subset of the available schemas. For example, BrowsableSchemas=SchemaA,SchemaB,SchemaC.
Tables | This property restricts the tables reported to a subset of the available tables. For example, Tables=TableA,TableB,TableC.
Views | Restricts the views reported to a subset of the available views. For example, Views=ViewA,ViewB,ViewC.
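
For example, to limit the metadata the driver reports you might combine these properties as follows; the schema, table, and view names are illustrative placeholders:

    jdbc:sparksql:Server=spark.example.com;Port=10000;BrowsableSchemas=SchemaA,SchemaB;Tables=TableA,TableB;Views=ViewA;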

Caching


Property | Description
AutoCache | Automatically caches the results of SELECT queries into a cache database specified by either CacheLocation or both of CacheConnection and CacheDriver.
CacheDriver | The database driver used to cache data.
CacheConnection | The connection string for the cache database. This property is always used in conjunction with CacheDriver. Setting both properties will override the value set for CacheLocation for caching data.
CacheLocation | Specifies the path to the cache when caching to a file.
CacheTolerance | The tolerance for stale data in the cache, specified in seconds, when using AutoCache.
Offline | Use offline mode to get the data from the cache instead of the live source.
CacheMetadata | This property determines whether or not to cache the table metadata to a file store.
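
For example, to cache SELECT results to a local file-based cache you might set properties such as the following; the cache path and tolerance value are illustrative placeholders:

    jdbc:sparksql:Server=spark.example.com;Port=10000;AutoCache=True;CacheLocation=C:\cache\sparksql;CacheTolerance=600;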

Miscellaneous


Property | Description
AsyncQueryTimeout | The timeout for asynchronous requests issued by the provider to download large result sets.
BatchSize | The maximum size of each batch operation to submit.
ConnectionLifeTime | The maximum lifetime of a connection in seconds. Once the time has elapsed, the connection object is disposed.
ConnectOnOpen | This property specifies whether to connect to Spark SQL when the connection is opened.
DescribeCommand | The describe command used to communicate with the Hive server. Accepted entries are DESCRIBE and DESC.
DetectView | Specifies whether to use DESCRIBE FORMATTED ... to detect whether the specified table is a view.
HTTPPath | The path component of the URL endpoint when using the HTTP TransportMode.
MaxRows | Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This helps avoid performance issues at design time.
Other | These hidden properties are used only in specific use cases.
PoolIdleTimeout | The allowed idle time for a connection before it is closed.
PoolMaxSize | The maximum connections in the pool.
PoolMinSize | The minimum number of connections in the pool.
PoolWaitTime | The max seconds to wait for an available connection.
PseudoColumns | This property indicates whether or not to include pseudo columns as columns to the table.
QueryPassthrough | This option passes the query to the Spark SQL server as is.
Readonly | You can use this property to enforce read-only access to Spark SQL from the provider.
RTK | The runtime key used for licensing.
ServerConfigurations | A name-value list of server configuration variables to override the server defaults.
Timeout | The value in seconds until the timeout error is thrown, canceling the operation.
UseConnectionPooling | This property enables connection pooling.
UseDatabricksUploadApi | This option specifies whether the Databricks Upload API will be used when executing batch inserts.
UseDescTableQuery | This option specifies whether the columns will be retrieved using a DESC TABLE query or the GetColumns Thrift API. The GetColumns Thrift API works with SparkSQL 3.0.0 or later.
UseInsertSelectSyntax | Specifies whether to use an INSERT INTO SELECT statement.
UserDefinedViews | A filepath pointing to the JSON configuration file containing your custom views.
UseSSL | Specifies whether to use SSL encryption when connecting to Hive.
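
For example, a connection tuned for read-only reporting might combine several of these properties as shown below; the values are illustrative placeholders:

    jdbc:sparksql:Server=spark.example.com;Port=10000;Readonly=True;QueryPassthrough=True;MaxRows=1000;Timeout=60;UseConnectionPooling=True;PoolMaxSize=20;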

Copyright (c) 2023 CData Software, Inc. - All rights reserved.