Miscellaneous
This section provides a complete list of the Miscellaneous properties you can configure in the connection string for this provider. Brief usage sketches illustrating several of these properties follow the table.
| Property | Description |
| --- | --- |
| AsyncQueryTimeout | Specifies the timeout for asynchronous requests issued by the provider to download large result sets. |
| BatchSize | Specifies the maximum number of rows included in each batch submitted during batch operations. To submit the entire batch as a single request, set BatchSize to 0. |
| ConnectionLifeTime | Specifies the maximum lifetime of a connection in seconds. When the specified time elapses, the provider closes the connection. |
| ConnectOnOpen | Specifies whether the provider establishes a connection to Spark SQL immediately upon opening the connection. Set ConnectOnOpen to True if immediate connectivity verification is necessary. |
| DescribeCommand | Specifies which describe command the provider uses to communicate with the Hive server. Accepted values are DESCRIBE and DESC. |
| DetectView | Specifies whether to use DESCRIBE FORMATTED ... to detect whether the specified table is a view. |
| HTTPPath | The path component of the URL endpoint when using HTTP TransportMode. |
| MaxRows | Specifies the maximum number of rows returned for queries that do not include either aggregation or GROUP BY. |
| Other | Specifies additional hidden properties for specific use cases, to be used only when our Support team advises it to address specific issues. See Remarks for details. |
| PoolIdleTimeout | Specifies the maximum idle time, in seconds, that a connection can remain in the pool before being closed. Requires UseConnectionPooling=True. |
| PoolMaxSize | Specifies the maximum number of connections allowed in the connection pool. |
| PoolMinSize | Specifies the minimum number of connections to be maintained in the connection pool at all times. |
| PoolWaitTime | Specifies the maximum number of seconds a connection request waits for an available connection in the pool. If the wait exceeds this time, an error is returned. |
| PseudoColumns | Specifies the pseudocolumns to expose as table columns, expressed as a string in the format 'TableName=ColumnName;TableName=ColumnName'. |
| QueryPassthrough | Specifies whether to pass the query to the Spark SQL server as-is. |
| Readonly | Toggles read-only access to Spark SQL from the provider. |
| RTK | Specifies the runtime key for licensing the provider. If unset or invalid, the provider defaults to the standard licensing method. This property is only required in environments where the standard licensing method is unsupported or requires a runtime key. |
| ServerConfigurations | A name-value list of server configuration variables to override the server defaults. |
| Timeout | Specifies the maximum time, in seconds, that the provider waits for a server response before throwing a timeout error. |
| UseConnectionPooling | Enables the connection pooling feature, which allows the provider to reuse existing connections instead of creating new ones for each request. |
| UseDatabricksUploadApi | Specifies whether the Databricks Upload API is used when executing batch inserts. |
| UseDescTableQuery | Specifies whether columns are retrieved using a DESC TABLE query or the GetColumns Thrift API. The GetColumns Thrift API requires Spark SQL 3.0.0 or later. |
| UseInsertSelectSyntax | Specifies whether to use an INSERT INTO SELECT statement. |
| UserDefinedViews | Specifies a filepath to a JSON configuration file that defines custom views. The provider automatically detects and uses the views specified in this file. |
| UseSSL | Specifies whether to use SSL Encryption when connecting to Hive. |
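
As a quick illustration of how several of these properties combine in a connection string, the sketch below opens a connection whose URL sets ConnectOnOpen, Timeout, and the connection pooling options. This is a minimal sketch, not a definitive example: the jdbc:sparksql: URL prefix, the host and port, and the query are placeholder assumptions, and it presumes a JDBC flavor of the provider is on the classpath; if you use a different API flavor, the same property string applies with that API's connection class.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SparkSqlConnectSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical URL prefix and server address; only the property names
        // (ConnectOnOpen, Timeout, UseConnectionPooling, PoolMaxSize,
        // PoolIdleTimeout) come from the table above.
        String url = "jdbc:sparksql:Server=localhost;Port=10000;"
                + "ConnectOnOpen=True;Timeout=60;"
                + "UseConnectionPooling=True;PoolMaxSize=20;PoolIdleTimeout=120;";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}
```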
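
BatchSize (together with UseInsertSelectSyntax and UseDatabricksUploadApi) governs how rows are grouped during batch operations. The following sketch shows a standard JDBC batch insert against a hypothetical sample_table; with BatchSize=500 the buffered rows are submitted in groups of at most 500, or as a single request if BatchSize were 0. The URL and table name are placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class SparkSqlBatchInsertSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical URL and table; BatchSize caps the rows per submitted batch.
        String url = "jdbc:sparksql:Server=localhost;Port=10000;BatchSize=500;";

        try (Connection conn = DriverManager.getConnection(url);
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO sample_table (id, name) VALUES (?, ?)")) {
            for (int i = 0; i < 1000; i++) {
                ps.setInt(1, i);
                ps.setString(2, "row-" + i);
                ps.addBatch();
            }
            ps.executeBatch();  // rows are sent in groups of at most BatchSize
        }
    }
}
```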
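
PseudoColumns and UserDefinedViews both take string values with a particular shape: a list of 'TableName=ColumnName' pairs and a path to a JSON view-definition file, respectively. The sketch below only illustrates how such values might be assembled into a connection string; the URL prefix, table and column names, and file path are all hypothetical, and the JSON file itself must already exist in whatever format the provider expects.

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class SparkSqlViewsSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical values: only the property formats come from the table above.
        String url = "jdbc:sparksql:Server=localhost;Port=10000;"
                + "PseudoColumns='Orders=RowId;Customers=RowId';"
                + "UserDefinedViews=/etc/sparksql/UserDefinedViews.json;";

        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}
```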