Miscellaneous
This section provides a complete list of the Miscellaneous properties you can configure in the connection string for this provider. A short connection string sketch follows the table.
Property | Description |
--- | --- |
AsyncQueryTimeout | The timeout for asynchronous requests issued by the provider to download large result sets. |
BatchSize | The maximum size of each batch operation to submit. |
ConnectionLifeTime | The maximum lifetime of a connection in seconds. Once the time has elapsed, the connection object is disposed. |
ConnectOnOpen | This property specifies whether to connect to Spark SQL when the connection is opened. |
DescribeCommand | Determines which describe command the provider uses to communicate with the Hive server. Accepted entries are DESCRIBE and DESC. |
DetectView | Specifies whether to use DESCRIBE FORMATTED ... to detect whether the specified table is a view. |
HTTPPath | The path component of the URL endpoint when using HTTP TransportMode. |
MaxRows | Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This helps avoid performance issues at design time. |
Other | These hidden properties are used only in specific use cases. |
PoolIdleTimeout | The allowed idle time for a connection before it is closed. |
PoolMaxSize | The maximum number of connections in the pool. |
PoolMinSize | The minimum number of connections in the pool. |
PoolWaitTime | The maximum number of seconds to wait for an available connection. |
PseudoColumns | This property indicates whether or not to include pseudo columns as columns in the table. |
QueryPassthrough | This option passes the query to the Spark SQL server as is. |
Readonly | You can use this property to enforce read-only access to Spark SQL from the provider. |
RTK | The runtime key used for licensing. |
ServerConfigurations | A name-value list of server configuration variables to override the server defaults. |
Timeout | The value in seconds until the timeout error is thrown, canceling the operation. |
UseConnectionPooling | This property enables connection pooling. |
UseDatabricksUploadApi | This option specifies whether the Databricks Upload API will be used when executing batch inserts. |
UseDescTableQuery | This option specifies whether the columns will be retrieved using a DESC TABLE query or the GetColumns Thrift API. The GetColumns Thrift API works with SparkSQL 3.0.0 or later. |
UseInsertSelectSyntax | Specifies whether to use an INSERT INTO SELECT statement. |
UserDefinedViews | A filepath pointing to the JSON configuration file containing your custom views. |
UseSSL | Specifies whether to use SSL Encryption when connecting to Hive. |
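
The Miscellaneous properties above are set alongside the other connection properties in the provider's connection string. The sketch below is a minimal, illustrative example of combining a few of them in an ODBC connection string opened with pyodbc; the driver name, server address, port, and the test query are placeholder assumptions and not values taken from this documentation. Only the Miscellaneous property names come from the table above.

```python
import pyodbc

# Placeholder driver and server details (assumptions), followed by a handful of
# Miscellaneous properties from the table above.
conn_str = (
    "DRIVER={Spark SQL ODBC Driver};"        # hypothetical driver name
    "Server=spark.example.com;Port=10000;"   # hypothetical server and port
    "QueryPassthrough=True;"                 # pass queries to Spark SQL as-is
    "Readonly=True;"                         # enforce read-only access
    "UseConnectionPooling=True;"             # enable connection pooling
    "PoolMaxSize=20;PoolIdleTimeout=60;"     # pool sizing and idle timeout
    "Timeout=120;"                           # seconds before a timeout error is thrown
)

conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
cursor.execute("SELECT 1")   # simple test query (assumption)
print(cursor.fetchall())
conn.close()
```

Properties not included in the connection string fall back to their defaults, so only the settings you want to override need to appear.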