Cmdlets for Spark SQL

Build 24.0.9060

Parameters (Connect-SparkSQL Cmdlet)

The following is the full list of the parameters of the cmdlet, with short descriptions.

Authentication


Property | Description
AuthScheme | The authentication scheme used. Accepted entries are Plain, LDAP, NOSASL, and Kerberos.
Server | The host name or IP address of the server hosting the SparkSQL database.
Port | The port for the SparkSQL database.
User | The username used to authenticate with SparkSQL.
Password | The password used to authenticate with SparkSQL.
Database | The name of the SparkSQL database.
ProtocolVersion | The protocol version used to authenticate with SparkSQL.
ImpersonationProxyUser | The proxy user for Hive user impersonation.
SaslQop | The quality of protection for the SASL framework. The level of quality is negotiated between the client and server during authentication. Used by Kerberos authentication with TCP transport.
TransportMode | The transport mode to use to communicate with the Hive server. Accepted entries are BINARY and HTTP.
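
The sketch below shows how these authentication properties might be combined for a basic connection. It is illustrative only; the server, port, credentials, and database name are placeholders.

    # A minimal sketch of a plain (username/password) connection.
    # Server, port, credentials, and database are placeholder values.
    $conn = Connect-SparkSQL -AuthScheme 'Plain' `
        -Server 'sparksql.example.com' -Port 10000 `
        -User 'admin' -Password 'p@ssw0rd' `
        -Database 'default' -TransportMode 'BINARY'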

Kerberos


Property | Description
KerberosKDC | The Kerberos Key Distribution Center (KDC) service used to authenticate the user.
KerberosRealm | The Kerberos Realm used to authenticate the user.
KerberosSPN | The service principal name (SPN) for the Kerberos Domain Controller.
KerberosUser | The principal name for the Kerberos Domain Controller. Used in the format host/user@realm.
KerberosKeytabFile | The Keytab file containing your pairs of Kerberos principals and encrypted keys.
KerberosServiceRealm | The Kerberos realm of the service.
KerberosServiceKDC | The Kerberos KDC of the service.
KerberosTicketCache | The full file path to an MIT Kerberos credential cache file.
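
As an illustrative sketch, a Kerberos connection could combine the core connection properties with the Kerberos settings above; the KDC, realm, SPN, and keytab path shown are placeholders.

    # Sketch: Kerberos authentication with a keytab file (all values are placeholders).
    $conn = Connect-SparkSQL -AuthScheme 'Kerberos' `
        -Server 'sparksql.example.com' -Port 10000 `
        -KerberosKDC 'kdc.example.com' `
        -KerberosRealm 'EXAMPLE.COM' `
        -KerberosSPN 'hive/sparksql.example.com' `
        -KerberosKeytabFile 'C:\krb\hiveuser.keytab'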

SSL


Property | Description
SSLClientCert | The TLS/SSL client certificate store for SSL Client Authentication (2-way SSL).
SSLClientCertType | The type of key store containing the TLS/SSL client certificate.
SSLClientCertPassword | The password for the TLS/SSL client certificate.
SSLClientCertSubject | The subject of the TLS/SSL client certificate.
SSLServerCert | The certificate to be accepted from the server when connecting using TLS/SSL.
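
For illustration only, the sketch below enables SSL and supplies a client certificate for 2-way SSL. The certificate path and passwords are placeholders; the PFXFILE store type and the wildcard server-certificate value are assumptions rather than values documented here, and UseSSL itself is listed under Miscellaneous below.

    # Sketch: 2-way SSL with a client certificate.
    # The PFXFILE store type and the '*' server-certificate value are assumptions;
    # paths and passwords are placeholders.
    $conn = Connect-SparkSQL -Server 'sparksql.example.com' -Port 10000 `
        -User 'admin' -Password 'p@ssw0rd' `
        -UseSSL $true `
        -SSLClientCert 'C:\certs\client.pfx' `
        -SSLClientCertType 'PFXFILE' `
        -SSLClientCertPassword 'certpassword' `
        -SSLServerCert '*'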

Firewall


Property | Description
FirewallType | The protocol used by a proxy-based firewall.
FirewallServer | The name or IP address of a proxy-based firewall.
FirewallPort | The TCP port for a proxy-based firewall.
FirewallUser | The user name to use to authenticate with a proxy-based firewall.
FirewallPassword | A password used to authenticate to a proxy-based firewall.
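
As a sketch, routing the connection through a proxy-based firewall might look like the following; the SOCKS5 firewall type is an assumed value, and the host, port, and credentials are placeholders.

    # Sketch: connecting through a proxy-based firewall (placeholder values;
    # SOCKS5 is an assumed FirewallType value).
    $conn = Connect-SparkSQL -Server 'sparksql.example.com' -Port 10000 `
        -User 'admin' -Password 'p@ssw0rd' `
        -FirewallType 'SOCKS5' `
        -FirewallServer 'firewall.example.com' -FirewallPort 1080 `
        -FirewallUser 'fwuser' -FirewallPassword 'fwpassword'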

Proxy


Property | Description
ProxyAutoDetect | When this connection property is set to True, the provider checks your system proxy settings for existing proxy server configurations (no need to manually supply proxy server details). Set to False to manually configure the provider to connect to a specific proxy server.
ProxyServer | The hostname or IP address of the proxy server that you want to route HTTP traffic through.
ProxyPort | The TCP port that the proxy server (specified in the ProxyServer connection property) is running on.
ProxyAuthScheme | The authentication method the provider uses when authenticating to the proxy server specified in the ProxyServer connection property.
ProxyUser | The username of a user account registered with the proxy server specified in the ProxyServer connection property.
ProxyPassword | The password associated with the user specified in the ProxyUser connection property.
ProxySSLType | The SSL type to use when connecting to the ProxyServer proxy.
ProxyExceptions | A semicolon-separated list of destination hostnames or IPs that are exempt from connecting through the ProxyServer.
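
A sketch of a manually configured proxy follows. ProxyAutoDetect is turned off so the explicit settings apply; the BASIC authentication scheme is an assumed value, and the host names and credentials are placeholders.

    # Sketch: explicit proxy configuration (BASIC is an assumed ProxyAuthScheme value;
    # host names and credentials are placeholders).
    $conn = Connect-SparkSQL -Server 'sparksql.example.com' -Port 10000 `
        -User 'admin' -Password 'p@ssw0rd' `
        -ProxyAutoDetect $false `
        -ProxyServer 'proxy.example.com' -ProxyPort 8080 `
        -ProxyAuthScheme 'BASIC' `
        -ProxyUser 'proxyuser' -ProxyPassword 'proxypassword' `
        -ProxyExceptions 'localhost;127.0.0.1'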

Logging


Property | Description
Logfile | A filepath which designates the name and location of the log file.
Verbosity | The verbosity level that determines the amount of detail included in the log file.
LogModules | Core modules to be included in the log file.
MaxLogFileSize | A string specifying the maximum size in bytes for a log file (for example, 10 MB).
MaxLogFileCount | A string specifying the maximum file count of log files.
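
The sketch below turns on file logging with rotation; the log path is a placeholder, and the verbosity value of 3 is illustrative.

    # Sketch: logging to a rotating file (placeholder path; verbosity 3 is illustrative).
    $conn = Connect-SparkSQL -Server 'sparksql.example.com' -Port 10000 `
        -User 'admin' -Password 'p@ssw0rd' `
        -Logfile 'C:\logs\sparksql.log' `
        -Verbosity 3 `
        -MaxLogFileSize '10 MB' `
        -MaxLogFileCount 5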

Schema


Property | Description
Location | A path to the directory that contains the schema files defining tables, views, and stored procedures.
BrowsableSchemas | This property restricts the schemas reported to a subset of the available schemas. For example, BrowsableSchemas=SchemaA,SchemaB,SchemaC.
Tables | This property restricts the tables reported to a subset of the available tables. For example, Tables=TableA,TableB,TableC.
Views | Restricts the views reported to a subset of the available views. For example, Views=ViewA,ViewB,ViewC.
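
For illustration, the sketch below limits the reported metadata to a handful of schemas, tables, and views, following the comma-separated format shown in the descriptions; the names are placeholders.

    # Sketch: restricting reported metadata (schema, table, and view names are placeholders).
    $conn = Connect-SparkSQL -Server 'sparksql.example.com' -Port 10000 `
        -User 'admin' -Password 'p@ssw0rd' `
        -BrowsableSchemas 'SchemaA,SchemaB' `
        -Tables 'TableA,TableB' `
        -Views 'ViewA'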

Caching


Property | Description
AutoCache | Automatically caches the results of SELECT queries into a cache database specified by either CacheLocation or both of CacheConnection and CacheProvider.
CacheLocation | Specifies the path to the cache when caching to a file.
CacheTolerance | The tolerance for stale data in the cache, specified in seconds, when using AutoCache.
Offline | Use offline mode to get the data from the cache instead of the live source.
CacheMetadata | This property determines whether or not to cache the table metadata to a file store.
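
As a sketch, automatic caching to a local file could be configured as follows; the cache path is a placeholder and the 600-second tolerance is illustrative.

    # Sketch: caching SELECT results locally (placeholder path; 600 s tolerance is illustrative).
    $conn = Connect-SparkSQL -Server 'sparksql.example.com' -Port 10000 `
        -User 'admin' -Password 'p@ssw0rd' `
        -AutoCache $true `
        -CacheLocation 'C:\cache\sparksql.db' `
        -CacheTolerance 600 `
        -CacheMetadata $true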

Miscellaneous


Property | Description
AsyncQueryTimeout | The timeout for asynchronous requests issued by the provider to download large result sets.
DescribeCommand | The describe command to use to communicate with the Hive server. Accepted entries are DESCRIBE and DESC.
DetectView | Specifies whether to use DESCRIBE FORMATTED ... to detect whether the specified table is a view.
HTTPPath | The path component of the URL endpoint when using the HTTP TransportMode.
MaxRows | Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This takes precedence over LIMIT clauses.
Other | These hidden properties are used only in specific use cases.
PseudoColumns | Specify a set of pseudocolumns to expose as columns.
QueryPassthrough | This option passes the query to the Spark SQL server as is.
Readonly | You can use this property to enforce read-only access to Spark SQL from the provider.
RTK | The runtime key used for licensing.
ServerConfigurations | A name-value list of server configuration variables to override the server defaults.
Timeout | The value in seconds until the timeout error is thrown, canceling the operation.
UseDatabricksUploadApi | This option specifies whether the Databricks Upload API will be used when executing batch inserts.
UseDescTableQuery | This option specifies whether columns are retrieved using a DESC TABLE query or the GetColumns Thrift API. The GetColumns Thrift API works with SparkSQL 3.0.0 or later.
UseInsertSelectSyntax | Specifies whether to use an INSERT INTO SELECT statement.
UserDefinedViews | A filepath pointing to the JSON configuration file containing your custom views.
UseSSL | Specifies whether to use SSL encryption when connecting to Hive.
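
To close, a sketch combining several of the miscellaneous properties: a read-only connection that passes queries through unmodified, with a timeout and row cap. The numeric values are illustrative and the connection details are placeholders.

    # Sketch: read-only passthrough connection (timeout and row cap are illustrative).
    $conn = Connect-SparkSQL -Server 'sparksql.example.com' -Port 10000 `
        -User 'admin' -Password 'p@ssw0rd' `
        -QueryPassthrough $true `
        -Readonly $true `
        -Timeout 60 `
        -MaxRows 1000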
