ODBC Driver for Spark SQL

Build 22.0.8462

Linux DSN Configuration

This section describes how to set up ODBC connectivity and configure DSNs on several Linux distributions: Debian-based systems, like Ubuntu, and Red Hat Linux platforms, like Red Hat Enterprise Linux (RHEL), CentOS, and Fedora.

Minimum Linux Versions

Here are the minimum supported versions for Debian-based, Red Hat-based, and SUSE systems:

OS        Min. Version
Ubuntu    11.04
Debian    7
RHEL      6.9
CentOS    6.9
Fedora    13
SUSE      12.1

Installing the Driver Dependencies

Run the following commands as root or with sudo to install the necessary dependencies:

  • Debian/Ubuntu:
    apt-get install libc6 libstdc++6 zlib1g libgcc1
  • RHEL/CentOS/Fedora:
    yum install glibc libstdc++ zlib libgcc

Here are the corresponding libraries required by the driver:

Debian/Ubuntu Package    RHEL/CentOS/Fedora Package    File
libc6                    glibc                         linux-vdso.1
libc6                    glibc                         libm.so.6
libc6                    glibc                         librt.so.1
libc6                    glibc                         libdl.so.2
libc6                    glibc                         libpthread.so.0
libc6                    glibc                         libc.so.6
libc6                    glibc                         ld-linux-x86-64.so.2
libstdc++6               libstdc++                     libstdc++.so.6
zlib1g                   zlib                          libz.so.1
libgcc1                  libgcc                        libgcc_s.so.1
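
Once the driver is installed (see the next section), you can confirm that these dependencies resolve by running ldd against the driver's shared objects. The path below assumes the default installation directory and that the driver's .so files live under its lib folder; adjust it to match your installation:

ldd /opt/cdata/cdata-odbc-driver-for-sparksql/lib/*.so

Any library reported as "not found" in the output points to a missing package from the table above.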

Installing the Driver

You can use standard package management systems to install the driver.

On Debian-based systems, like Ubuntu, run the following command as root or with sudo:

dpkg -i /path/to/driver/setup/SparkSQLODBCDriverforUnix.deb 

On systems that support the RPM package format, run the following command as root or with sudo:

rpm -ivh /path/to/driver/SparkSQLODBCDriverforUnix.rpm 

Licensing the Driver

Run the following commands to license the driver. To activate a trial, omit the <key> input.

cd /opt/cdata/cdata-odbc-driver-for-sparksql/bin/
sudo ./install-license.sh <key>

Connecting through the Driver Manager

The driver manager loads the driver and passes function calls from the application to the driver. You need to register the driver with the driver manager, and you define DSNs in the driver manager's configuration files.

The driver installation registers the driver with the unixODBC driver manager and creates a system DSN. The unixODBC driver manager can be used from Python and from many other applications. Your application may embed another driver manager.
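
For reference, unixODBC keeps driver registrations in odbcinst.ini and DSNs in odbc.ini. The sketch below shows what such entries might look like; the driver name, library file name, and file locations are illustrative and may differ from what the installer created on your system, so treat the entries the installer actually wrote as authoritative (see Using unixODBC for details).

An odbcinst.ini entry registering the driver:

[CData ODBC Driver for SparkSQL]
Description = CData ODBC Driver for Spark SQL
Driver = /opt/cdata/cdata-odbc-driver-for-sparksql/lib/libsparksqlodbc.so

An odbc.ini entry defining a DSN that refers to that driver by name:

[CData SparkSQL Source]
Driver = CData ODBC Driver for SparkSQL

Connection properties for the DSN (Server, Port, and so on) are described in the sections below.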

Creating the DSN

See Using unixODBC to install unixODBC and configure DSNs. See Using the DataDirect Driver Manager to create a DSN to connect to OBIEE, Informatica, and SAS.

Connecting to Spark SQL

Specify the following to establish a connection with Spark SQL:

  • Server: Set this to the host name or IP address of the server hosting SparkSQL.
  • Port: Set this to the port for the connection to the SparkSQL instance.
  • TransportMode: The transport mode to use to communicate with the SparkSQL server. Accepted entries are BINARY and HTTP. BINARY is selected by default.
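
For example, a DSN entry in odbc.ini that combines these properties for a binary-mode connection might look like the following; the DSN name, host, and port are placeholders, and the Driver value must match the name registered with your driver manager:

[SparkSQL Source]
Driver = CData ODBC Driver for SparkSQL
Server = sparksql.example.com
Port = 10000
TransportMode = BINARY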

Securing Spark SQL Connections

To enable TLS/SSL in the driver, set UseSSL to True.

Authenticating to Spark SQL

You can authenticate to the service using the PLAIN, LDAP, NOSASL, or KERBEROS auth schemes.

PLAIN

To authenticate with PLAIN, set the following connection properties:

  • AuthScheme: Set this to PLAIN.
  • User: Set this to the user to log in as.
  • Password: Set this to the password of the user.
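
For example, appending PLAIN credentials to a DSN entry such as the one shown earlier might look like this; the user name and password are placeholders for your own credentials:

AuthScheme = PLAIN
User = spark_user
Password = spark_password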

LDAP

To authenticate with LDAP, set the following connection properties:

  • AuthScheme: Set this to LDAP.
  • User: Set this to the user to log in as.
  • Password: Set this to the password of the user.

NOSASL

When using NOSASL, no authentication is performed. Set the following connection properties:

  • AuthScheme: Set this to NOSASL.

Kerberos

Please see Using Kerberos for details on how to authenticate with Kerberos.

Connecting to Databricks

To connect to a Databricks cluster, set the properties as described below. Note: The needed values can be found in your Databricks instance by navigating to 'Clusters', selecting the desired cluster, and selecting the JDBC/ODBC tab under 'Advanced Options'.

  • Server: Set to the Server Hostname of your Databricks cluster.
  • Port: 443
  • TransportMode: HTTP
  • HTTPPath: Set to the HTTP Path of your Databricks cluster.
  • UseSSL: True
  • AuthScheme: PLAIN
  • User: Set this to the user to log in as.
  • Password: Set to your personal access token (value can be obtained by navigating to the User Settings page of your Databricks instance and selecting the Access Tokens tab).
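
Putting these together, a Databricks DSN entry in odbc.ini might look like the following sketch; replace the bracketed values with the details from your own workspace, and make sure the Driver value matches the name registered with your driver manager:

[Databricks Source]
Driver = CData ODBC Driver for SparkSQL
Server = <server hostname from the JDBC/ODBC tab>
Port = 443
TransportMode = HTTP
HTTPPath = <HTTP path from the JDBC/ODBC tab>
UseSSL = True
AuthScheme = PLAIN
User = <user to log in as>
Password = <personal access token>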

Set the Driver Encoding

The ODBC drivers need to specify which encoding to use with the ODBC Driver Manager. By default, the CData ODBC Drivers for Unix are configured to use UTF-16, which is compatible with unixODBC, but other driver managers may require an alternative encoding.

Alternatively, if you are using the ODBC driver from an application that uses the ANSI ODBC API, it may be necessary to set the ANSI code page. For example, to import Japanese characters in an ANSI application, you can specify the code page in the config file '/opt/cdata/cdata-odbc-driver-for-sparksql/lib/cdata.odbc.sparksql.ini':

[Driver]
AnsiCodePage = 932
