Linux DSN Configuration
This section describes how to set up ODBC connectivity and configure DSNs on several Linux distributions: Debian-based systems, such as Ubuntu, and Red Hat-based platforms, such as Red Hat Enterprise Linux (RHEL), CentOS, and Fedora.
Minimum Linux Versions
Here are the minimum supported versions for Debian-based, Red Hat-based, and SUSE systems:

| OS | Min. Version |
| --- | --- |
| Ubuntu | 11.04 |
| Debian | 7 |
| RHEL | 6.9 |
| CentOS | 6.9 |
| Fedora | 13 |
| SUSE | 12.1 |
Installing the Driver Dependencies
Run the following commands as root or with sudo to install the necessary dependencies:
- Debian/Ubuntu:
apt-get install libc6 libstdc++6 zlib1g libgcc1
- RHEL/CentOS/Fedora:
yum install glibc libstdc++ zlib libgcc
Here are the corresponding libraries required by the driver:
| Debian/Ubuntu Package | RHEL/CentOS/Fedora Package | File |
| --- | --- | --- |
| libc6 | glibc | linux-vdso.1 |
| libc6 | glibc | libm.so.6 |
| libc6 | glibc | librt.so.1 |
| libc6 | glibc | libdl.so.2 |
| libc6 | glibc | libpthread.so.0 |
| libc6 | glibc | libc.so.6 |
| libc6 | glibc | ld-linux-x86-64.so.2 |
| libstdc++6 | libstdc++ | libstdc++.so.6 |
| zlib1g | zlib | libz.so.1 |
| libgcc1 | libgcc | libgcc_s.so.1 |
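To confirm that these libraries are resolvable on your system, `ldd` lists a binary's shared-library dependencies. As a sketch, run it against the driver's own shared object after installation; `/bin/sh` is used below only so the example runs on any Linux system:

```shell
# Print the shared libraries a binary links against; unresolved
# dependencies show up as "not found" in the output.
# Substitute the driver's .so file for /bin/sh after installing.
ldd /bin/sh
```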
Installing the Driver
You can use standard package management systems to install the driver.
On Debian-based systems, like Ubuntu, run the following command as root or with sudo:
dpkg -i /path/to/driver/setup/ApacheHiveODBCDriverforUnix.deb
On systems that support the RPM package format, run the following command as root or with sudo:
rpm -ivh /path/to/driver/ApacheHiveODBCDriverforUnix.rpm
Licensing the Driver
Run the following commands to license the driver. To activate a trial, omit the <key> input.
cd /opt/cdata/cdata-odbc-driver-for-apachehive/bin/
sudo ./install-license.sh <key>
Connecting through the Driver Manager
The driver manager loads the driver and passes function calls from the application to the driver. You register the driver with the driver manager and define DSNs in the driver manager's configuration files.
The driver installation registers the driver with the unixODBC driver manager and creates a system DSN. The unixODBC driver manager can be used from Python and from many other applications. Your application may embed another driver manager.
Creating the DSN
See Using unixODBC to install unixODBC and configure DSNs. See Using the DataDirect Driver Manager to create a DSN to connect to OBIEE, Informatica, and SAS.
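As a minimal sketch, a system DSN entry in /etc/odbc.ini takes the following shape. The DSN name and connection values here are placeholders, and the Driver value must match the name the installer registered in odbcinst.ini:

```ini
# System DSNs live in /etc/odbc.ini; per-user DSNs go in ~/.odbc.ini.
[CData ApacheHive Source]
Driver = CData ODBC Driver for ApacheHive
Server = hive-server.example.com
Port = 10000
```

Once defined, the DSN name in brackets is what applications pass to the driver manager when connecting.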
Connecting to Apache Hive
Self-hosted Instances
Specify the following to establish a connection with Apache Hive:
- TransportMode: The transport mode used to communicate with the Hive server. Accepted values are BINARY and HTTP; BINARY is the default.
- Server: Set this to the host name or IP address of the server hosting HiveServer2.
- Port: Set this to the port for the connection to the HiveServer2 instance.
- UseSSL (optional): Set this to enable TLS/SSL.
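Put together, a self-hosted DSN definition in odbc.ini might carry these properties. This is a sketch: the DSN name, host name, and driver name are placeholders, and port 10000 is assumed as the common HiveServer2 default:

```ini
[Hive SelfHosted]
Driver = CData ODBC Driver for ApacheHive
TransportMode = BINARY
Server = hive-server.example.com
Port = 10000
UseSSL = true
```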
Amazon EMR Instances
Connections to Amazon EMR must be established through an SSH tunnel.
Use the following procedure to create an SSH tunnel to EMR.
- To begin, you will need an active EMR cluster and an EC2 key pair. The key pair can be in .ppk or .pem format.
- Next, authorize inbound traffic in your cluster settings.
Set the following to connect (while running an active tunnel session to EMR):
- Server: Set this to the master node (master-public-dns-name) where the Apache Hive server is running.
- Port: Set this to the port required to connect to Apache Hive.
- UseSSH: Set this to true.
- SSHServer: Set this to the master node (master-public-dns-name).
- SSHPort: Set this to 22.
- SSHAuthMode: Set this to PUBLIC_KEY.
- SSHUser: Set this to hadoop.
- SSHClientCert: Set this to the full path to the key file.
- SSHClientCertType: Set this to the type that corresponds to the key file, typically either PEMKEY_FILE or PPKFILE.
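The EMR settings above might be sketched in odbc.ini as follows. The DSN name, key path, and Hive port are placeholders, a .pem key is assumed, and the master node placeholders must be replaced with your cluster's actual master-public-dns-name:

```ini
[Hive EMR]
Driver = CData ODBC Driver for ApacheHive
Server = <master-public-dns-name>
Port = 10000
UseSSH = true
SSHServer = <master-public-dns-name>
SSHPort = 22
SSHAuthMode = PUBLIC_KEY
SSHUser = hadoop
SSHClientCert = /path/to/emr-key.pem
SSHClientCertType = PEMKEY_FILE
```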
Hadoop Cluster on Azure HDInsight Instances
You will need to supply the following to establish a connection to a Hadoop cluster hosted on Azure HDInsight:
- User: Set this to the cluster username that you specified when creating the cluster on Azure HDInsight.
- Password: Set this to the cluster password that you specified when creating the cluster on Azure HDInsight.
- Server: The server corresponding to your cluster. For example: myclustername.azurehdinsight.net.
- Port: Set this to the port running HiveServer2. This will be 443 by default.
- HTTPPath: Set this to the HTTP path for the hive2 service. This will be hive2 by default.
- TransportMode: Set this to HTTP.
- UseSSL: Set this to true.
- QueryPassthrough (optional): Set QueryPassthrough to true to bypass the SQL engine of the driver and execute HiveQL queries directly to Apache Hive.
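A DSN for an HDInsight cluster might look like the following sketch, with the DSN name, user, password, and cluster name as placeholders:

```ini
[Hive HDInsight]
Driver = CData ODBC Driver for ApacheHive
User = <cluster-username>
Password = <cluster-password>
Server = myclustername.azurehdinsight.net
Port = 443
HTTPPath = hive2
TransportMode = HTTP
UseSSL = true
```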
Google DataProc Instances
Before Connecting
Ensure that the Apache Hive server on DataProc was created with the DataProc Component Gateway enabled.
Next, obtain the external IP address of the Hive cluster. To find it, open Cloud Shell and list the instances:
gcloud compute instances list
Note the external IP of the relevant machine.
Build an SSH Tunnel to the Hive Cluster Web Interface
Navigate to the Hive cluster on DataProc and select the WEB INTERFACES tab. Select Create an SSH tunnel to connect to a web interface.
A cloud console command will be shown that can be used to create an SSH key pair. Download the private key from the directory specified in the console.
Configure the SSH tunnel in an SSH utility:
- Host Name: Set this to the external IP noted above.
- Port: 22
- Point the tool to your private SSH key.
- For the tunnel, map an open local port to localhost:10000; localhost resolves correctly on the server side of the tunnel.
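With the OpenSSH command-line client, the tunnel above can be expressed as a single command. The key path, username, external IP, and local port here are all hypothetical, and `echo` is used so the sketch prints the command rather than opening a connection:

```shell
# -i: the downloaded private key; -N: forward ports without running a remote shell;
# -L: bind local port 10000 and forward it to localhost:10000 on the cluster.
# echo is used here so the sketch prints the command instead of connecting.
echo ssh -i ~/.ssh/dataproc_key -N -L 10000:localhost:10000 username@203.0.113.10
```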
Connecting to Hive on Google DataProc
Specify the following information to connect to Apache Hive:
- TransportMode: Set this to BINARY.
- AuthScheme: Set this to Plain.
- Port: Set this to the chosen SSH Tunnel port on the local machine.
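Assuming the tunnel forwards local port 10000, a DataProc DSN might be sketched as follows. Pointing Server at localhost is an assumption here, since the connection travels through the forwarded local port:

```ini
[Hive DataProc]
Driver = CData ODBC Driver for ApacheHive
TransportMode = BINARY
AuthScheme = Plain
# Server/Port target the local end of the SSH tunnel (assumed values):
Server = localhost
Port = 10000
```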
Authenticating to Apache Hive
PLAIN
Set AuthScheme to PLAIN when the hive.server2.authentication property is set to None (uses Plain SASL), PAM, or CUSTOM. In addition, set the following connection properties:
- User: Set this to the user to log in as. If nothing is set, 'anonymous' will be sent instead.
- Password: Set this to the password of the user. If nothing is set, 'anonymous' will be sent instead.
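For example, a DSN using PLAIN authentication might include the following properties, with hypothetical credentials:

```ini
AuthScheme = PLAIN
User = hiveuser
Password = hivepassword
```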
LDAP
Set AuthScheme to LDAP when the hive.server2.authentication property is set to LDAP. In addition, set the following connection properties:
- User: Set this to the user to log in as.
- Password: Set this to the password of the user.
NOSASL
Set AuthScheme to NOSASL when the hive.server2.authentication property is set to NOSASL. No user credentials are submitted with this auth scheme.
Kerberos
Set AuthScheme to Kerberos when the hive.server2.authentication property is set to Kerberos. See Using Kerberos for details on how to authenticate with Kerberos.
Set the Driver Encoding
The ODBC drivers need to specify which encoding to use with the ODBC driver manager. By default, the CData ODBC Drivers for Unix are configured to use UTF-16, which is compatible with unixODBC, but other driver managers may require an alternative encoding.
Alternatively, if you are using the ODBC driver from an application that uses the ANSI ODBC API, it may be necessary to set the ANSI code page. For example, to import Japanese characters in an ANSI application, specify the code page in the configuration file '/opt/cdata/cdata-odbc-driver-for-apachehive/lib/cdata.odbc.apachehive.ini':
[Driver]
AnsiCodePage = 932