ODBC Driver for Databricks

Build 24.0.9060

Linux DSN Configuration

This section describes how to set up ODBC connectivity and configure DSNs on several Linux distributions: Debian-based systems, such as Ubuntu, and Red Hat-based platforms, such as Red Hat Enterprise Linux (RHEL) and Fedora.

Minimum Linux Versions

The following table lists the minimum supported version of each Linux distribution:

OS       Min. Version
Ubuntu   18.04
Debian   10
RHEL     8
Fedora   28
SUSE     15

Installing the Driver Dependencies

Run the following commands as root or with sudo to install the necessary dependencies:

  • Debian/Ubuntu:
    apt-get install libc6 libstdc++6 zlib1g libgcc1
  • RHEL/Fedora:
    yum install glibc libstdc++ zlib libgcc

Installing the Driver

You can use standard package management systems to install the driver.

On Debian-based systems, like Ubuntu, run the following command with root or sudo:

dpkg -i /path/to/driver/setup/DatabricksODBCDriverforUnix.deb 

On systems that support the RPM package format, run the following command with root or sudo:

rpm -ivh /path/to/driver/DatabricksODBCDriverforUnix.rpm 

Licensing the Driver

Run the following commands to license the driver. To activate a trial, omit the <key> input.

cd /opt/cdata/cdata-odbc-driver-for-databricks/bin/
sudo ./install-license.sh <key>

Connecting through the Driver Manager

The driver manager loads the driver and passes function calls from the application to the driver. You register the driver with the driver manager, and you define DSNs in the driver manager's configuration files.

The driver installation registers the driver with the unixODBC driver manager and creates a system DSN. The unixODBC driver manager can be used from Python and from many other applications. Your application may embed another driver manager.
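
For reference, unixODBC reads its configuration from plain INI files. The sketch below shows what a driver registration and a system DSN typically look like; the shared-library filename and the DSN name are illustrative, so check the entries the installer actually created in /etc/odbcinst.ini and /etc/odbc.ini on your system.

# /etc/odbcinst.ini - driver registration (library filename is illustrative)
[CData ODBC Driver for Databricks]
Description = CData ODBC Driver for Databricks
Driver      = /opt/cdata/cdata-odbc-driver-for-databricks/lib/libdatabricksodbc.x64.so

# /etc/odbc.ini - a system DSN that references the registered driver (values are placeholders)
[CData Databricks Source]
Driver   = CData ODBC Driver for Databricks
Server   = your-workspace.cloud.databricks.com
HTTPPath = sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh
Database = default
Token    = your-personal-access-token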

Creating the DSN

See Using unixODBC to install unixODBC and configure DSNs. See Using the DataDirect Driver Manager to create a DSN to connect to OBIEE, Informatica, and SAS.

Connecting to Databricks

To connect to a Databricks cluster, set the following properties:

  • Database: The name of the Databricks database.
  • Server: The Server Hostname of your Databricks cluster.
  • HTTPPath: The HTTP Path of your Databricks cluster.
  • Token: Your personal access token. You can obtain this value by navigating to the User Settings page of your Databricks instance and selecting the Access Tokens tab.

You can find the Server Hostname and HTTP Path values in your Databricks instance by navigating to Clusters, selecting the desired cluster, and opening the JDBC/ODBC tab under Advanced Options.
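
As an illustration, a DSN-less connection string that sets these properties might look like the following (the server, path, and token values are placeholders for your own cluster details):

Server=your-workspace.cloud.databricks.com;HTTPPath=sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh;Database=default;Token=your-personal-access-token;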

Configuring Cloud Storage

The driver supports DBFS, Azure Blob Storage, and AWS S3 for uploading CSV files.

DBFS Cloud Storage

To use DBFS for cloud storage, set the CloudStorageType property to DBFS.
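
For example, the setting might be appended to the connection properties described in Connecting to Databricks (placeholder values):

Server=your-workspace.cloud.databricks.com;HTTPPath=sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh;Database=default;Token=your-personal-access-token;CloudStorageType=DBFS;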

Azure Blob Storage

Set the following properties:

  • CloudStorageType: Azure Blob storage.
  • StoreTableInCloud: True to store tables in cloud storage when creating a new table.
  • AzureStorageAccount: The name of your Azure storage account.
  • AzureAccessKey: The access key for your Azure storage account. To find this value, log in to the Azure portal with the root account, select your storage account, and click Access Keys.
  • AzureBlobContainer: Set to the name of your Azure Blob storage container.
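
For example, these storage settings might appear in the connection alongside the base connection properties described in Connecting to Databricks (account, key, and container names are placeholders):

CloudStorageType=Azure Blob storage;StoreTableInCloud=True;AzureStorageAccount=mystorageaccount;AzureAccessKey=mystorageaccountkey;AzureBlobContainer=mycontainer;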

AWS S3 Storage

Set the following properties:

  • CloudStorageType: AWS S3.
  • StoreTableInCloud: True to store tables in cloud storage when creating a new table.
  • AWSAccessKey: The AWS account access key. You can acquire this value from your AWS security credentials page.
  • AWSSecretKey: Your AWS account secret key. You can acquire this value from your AWS security credentials page.
  • AWSS3Bucket: The name of your AWS S3 bucket.
  • AWSRegion: The hosting region for your Amazon Web Services account, for example, us-east-1. You can obtain this value by navigating to the Buckets List page of your Amazon S3 service.
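
For example, these storage settings might appear in the connection alongside the base connection properties described in Connecting to Databricks (key, bucket, and region values are placeholders):

CloudStorageType=AWS S3;StoreTableInCloud=True;AWSAccessKey=myawsaccesskey;AWSSecretKey=myawssecretkey;AWSS3Bucket=mybucket;AWSRegion=us-east-1;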

Authenticating to Databricks

CData supports the following authentication schemes:

  • Basic
  • Personal Access Token
  • Azure Active Directory (AD)
  • Azure Service Principal

Basic

Basic authentication requires a username and password. Set the following:

  • AuthScheme: Basic.
  • User: Your username. This overrides the default value ("Token").
  • Token: Your password.
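
For example, combined with the Server, HTTPPath, and Database properties described earlier (placeholder credentials):

AuthScheme=Basic;User=myuser;Token=mypassword;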

Personal Access Token

To authenticate, set the following:

  • AuthScheme: PersonalAccessToken.
  • Token: The token used to access the Databricks server. It can be obtained by navigating to the User Settings page of your Databricks instance and selecting the Access Tokens tab.
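
For example, combined with the Server, HTTPPath, and Database properties described in Connecting to Databricks (placeholder token):

AuthScheme=PersonalAccessToken;Token=your-personal-access-token;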

Azure Active Directory

To authenticate, follow these steps:

  1. Register an application with the Azure AD (now known as Microsoft Entra ID) endpoint in the Azure portal. See Configure an app in Azure portal for information on how to create and register the application. Alternatively, you can use an Azure AD application that is already registered.

  2. Set these properties:

    • AuthScheme: AzureAD.
    • AzureTenant: The "Directory (tenant) ID" on the Azure AD application's "Overview" page.
    • OAuthClientId: The "Application (client) ID" on the Azure AD application's "Overview" page.
    • CallbackURL: The "Redirect URIs" value on the Azure AD application's "Authentication" page.

  3. When connecting, a web page opens that prompts you to authenticate. After successful authentication, the connection is established.

Here is an example of the connection string:

"Server=https://adb-8439982502599436.16.azuredatabricks.net;HTTPPath=sql/protocolv1/o/8439982502599436/0810-011933-odsz4s3r;database=default;
AuthScheme=AzureAD;InitiateOAuth=GETANDREFRESH;AzureTenant=94be69e7-edb4-4fda-ab12-95bfc22b232f;OAuthClientId=f544a825-9b69-43d9-bec2-3e99727a1669;CallbackURL=http://localhost;"

Azure AD Service Principal

To authenticate, set the following properties:

  • AuthScheme: AzureServicePrincipal.
  • AzureTenantId: The tenant ID of your Microsoft Azure Active Directory.
  • AzureClientId: The application (client) ID of your Microsoft Azure Active Directory application.
  • AzureClientSecret: The application (client) secret of your Microsoft Azure Active Directory application.
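
For example, combined with the Server, HTTPPath, and Database properties described earlier (tenant, client, and secret values are placeholders):

AuthScheme=AzureServicePrincipal;AzureTenantId=your-tenant-id;AzureClientId=your-application-client-id;AzureClientSecret=your-client-secret;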

Set the Driver Encoding

The ODBC drivers need to specify which encoding to use with the ODBC driver manager. By default, the CData ODBC Drivers for Unix are configured to use UTF-16, which is compatible with unixODBC, but other driver managers may require an alternative encoding.

Alternatively, if you are using the ODBC driver from an application that uses the ANSI ODBC API, it may be necessary to set the ANSI code page. For example, to import Japanese characters in an ANSI application, you can specify the code page in the config file '/opt/cdata/cdata-odbc-driver-for-databricks/lib/cdata.odbc.databricks.ini':

[Driver]
AnsiCodePage = 932

Copyright (c) 2024 CData Software, Inc. - All rights reserved.