Apache Kafka Connector for CData Sync

Build 22.0.8462
  • Apache Kafka
    • Establishing a Connection
    • Advanced Features
      • SSL Configuration
      • Firewall and Proxy
    • Data Model
    • Connection String Options
      • Authentication
        • AuthScheme
        • User
        • Password
        • BootstrapServers
        • Topic
        • UseSSL
      • Connection
        • ConsumerGroupId
        • AutoCommit
      • Kerberos
        • KerberosKeytabFile
        • KerberosSPN
        • KerberosServiceName
        • UseKerberosTicketCache
      • SSL
        • SSLServerCert
        • SSLServerCertType
        • SSLServerCertPassword
        • SSLClientCert
        • SSLClientCertType
        • SSLClientCertPassword
        • SSLIdentificationAlgorithm
      • Schema Registry
        • RegistryUrl
        • RegistryType
        • RegistryService
        • RegistryAuthScheme
        • RegistryUser
        • RegistryPassword
        • RegistryClientCert
        • RegistryClientCertType
        • RegistryClientCertPassword
        • RegistryClientCertSubject
        • RegistryVersion
        • RegistryServerCert
      • Firewall
        • FirewallType
        • FirewallServer
        • FirewallPort
        • FirewallUser
        • FirewallPassword
      • Proxy
        • ProxyAutoDetect
        • ProxyServer
        • ProxyPort
        • ProxyAuthScheme
        • ProxyUser
        • ProxyPassword
        • ProxySSLType
        • ProxyExceptions
      • Logging
        • LogModules
      • Schema
        • Location
        • BrowsableSchemas
        • Tables
        • Views
      • Miscellaneous
        • AggregateMessages
        • CompressionType
        • ConsumerProperties
        • CreateTablePartitions
        • CreateTableReplicationFactor
        • EnableIdempotence
        • FlattenArrays
        • GenerateSchemaFiles
        • MaximumBatchSize
        • MaxRows
        • MessageKeyColumn
        • MessageKeyType
        • OffsetResetStrategy
        • Other
        • ProduceMeta
        • ProducerProperties
        • PseudoColumns
        • ReadDuration
        • RowScanDepth
        • SerializationFormat
        • Timeout
        • TypeDetectionScheme
        • UseConfluentAvroFormat
        • UserDefinedViews
        • ValidateRegistryTopics
    • Third Party Copyrights

Overview

The CData Sync App provides a straightforward way to continuously pipeline your Apache Kafka data to any database, data lake, or data warehouse, making it easily available for Analytics, Reporting, AI, and Machine Learning.

The Apache Kafka connector can be used from the CData Sync application to pull data from Apache Kafka and move it to any of the supported destinations.

Establishing a Connection

Create a connection to Apache Kafka by navigating to the Connections page in the Sync App application and selecting the corresponding icon in the Add Connections panel. If the Apache Kafka icon is not available, click the Add More icon to download and install the Apache Kafka connector from the CData site.

Required properties are listed under the Settings tab. The Advanced tab lists connection properties that are not typically required.

Connecting to Apache Kafka

Set the BootstrapServers and Topic properties to specify the address of your Apache Kafka server, as well as the topic you would like to interact with.

By default, the Sync App communicates with the data source in PLAINTEXT, which means that all data is sent in the clear. To encrypt communication, you should configure the Sync App to use SSL encryption. To do this, set UseSSL to true and configure SSLServerCert and SSLServerCertType to load the server certificates.

Note that proxy settings like ProxyServer and firewall settings like FirewallServer do not affect the connection to the Apache Kafka broker. Internally the Sync App connects to Apache Kafka using the official libraries which do not support proxies. These options are only used when the Sync App connects to the schema registry as described in Extracting Metadata From Topics.
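Putting the properties above together, a minimal SSL-enabled connection string might look like the following (the broker address, topic name, and certificate path are hypothetical):

```
BootstrapServers=kafka1.example.com:9092;Topic=SampleTopic;UseSSL=true;SSLServerCert=C:\certs\kafka-server.pem;SSLServerCertType=PEMKEY_FILE;
```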

Authenticating to Apache Kafka

The Apache Kafka data source supports the following authentication methods:

  • Anonymous
  • Plain
  • Scram
  • Kerberos

Anonymous

Certain on-premises deployments of Apache Kafka allow you to connect without setting any authentication connection properties. To do so, simply set AuthScheme to "None", and you are ready to connect.

SASL Plain

The User and Password properties should be specified. AuthScheme should be set to Plain.

SCRAM login module

The User and Password properties should be specified. The AuthScheme should be set to 'SCRAM' (for SCRAM-SHA-256) or 'SCRAM-SHA-512'.
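For example, a SCRAM connection string might look like the following (the credentials, broker address, and topic are hypothetical):

```
AuthScheme=SCRAM;User=kafkauser;Password=kafkapass;BootstrapServers=kafka1.example.com:9092;Topic=SampleTopic;
```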

SSL client certificates

The SSLClientCert and SSLClientCertType properties should be specified and AuthScheme should be set to SSLCertificate.

Kerberos

To authenticate to Apache Kafka using Kerberos, set the following properties:
  • AuthScheme: Set this to KERBEROS.
  • KerberosServiceName: This should match the service portion of the principal name of the Kafka brokers. For example, if the principal is "kafka/[email protected]", set KerberosServiceName=kafka.
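Combining these, a Kerberos connection string might look like the following (the broker address and topic are hypothetical):

```
AuthScheme=KERBEROS;KerberosServiceName=kafka;BootstrapServers=kafka1.example.com:9092;Topic=SampleTopic;
```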

Advanced Features

This section details a selection of advanced features of the Apache Kafka Sync App.

User Defined Views

The Sync App allows you to define virtual tables, called user defined views, whose contents are decided by a pre-configured query. These views are useful when you cannot directly control queries being issued to the drivers. See User Defined Views for an overview of creating and configuring custom views.

SSL Configuration

Use SSL Configuration to adjust how the Sync App handles TLS/SSL certificate negotiations. You can choose from various certificate formats; see the SSLServerCert property under "Connection String Options" for more information.

Firewall and Proxy

Configure the Sync App to connect through a firewall or proxy, including Windows system proxies and HTTP proxies. You can also set up tunnel connections.

Query Processing

The Sync App offloads as much of the SELECT statement processing as possible to Apache Kafka and then processes the rest of the query in memory (client-side).

See Query Processing for more information.

Logging

See Logging for an overview of configuration settings that can be used to refine CData logging. For basic logging, you only need to set two connection properties, but there are numerous features that support more refined logging, where you can select subsets of information to be logged using the LogModules connection property.

SSL Configuration

Customizing the SSL Configuration

By default, the Sync App attempts to negotiate SSL/TLS by checking the server's certificate against the system's trusted certificate store.

To specify another certificate, see the SSLServerCert property for the available formats to do so.

Client SSL Certificates

The Apache Kafka Sync App also supports setting client certificates. Set the following to connect using a client certificate.

  • SSLClientCert: The name of the certificate store for the client certificate.
  • SSLClientCertType: The type of key store containing the TLS/SSL client certificate.
  • SSLClientCertPassword: The password for the TLS/SSL client certificate.
  • SSLClientCertSubject: The subject of the TLS/SSL client certificate.

Firewall and Proxy

Connecting Through a Firewall or Proxy

HTTP Proxies

To connect through the Windows system proxy, you do not need to set any additional connection properties. To connect through another proxy, set ProxyAutoDetect to false and specify ProxyServer and ProxyPort.

To authenticate to an HTTP proxy, also set ProxyAuthScheme, ProxyUser, and ProxyPassword.

Other Proxies

Set the following properties:

  • To use a proxy-based firewall, set FirewallType, FirewallServer, and FirewallPort.
  • To tunnel the connection, set FirewallType to TUNNEL.
  • To authenticate, specify FirewallUser and FirewallPassword.
  • To authenticate to a SOCKS proxy, additionally set FirewallType to SOCKS5.
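For example, tunneling through an authenticated SOCKS5 proxy might look like the following (the host, port, and credentials are hypothetical):

```
FirewallType=SOCKS5;FirewallServer=192.168.1.100;FirewallPort=1080;FirewallUser=proxyuser;FirewallPassword=proxypass;
```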

Data Model

The CData Sync App dynamically models Apache Kafka topics as tables. A complete list of discovered topics can be obtained from the sys_tables system table.

Connections to Apache Kafka become part of a consumer group. Any message delivered to a consumer group is returned by a SELECT query against the topic only once.

When a topic is queried for the first time, the OffsetResetStrategy property influences which messages are retrieved.

Select Earliest to retrieve old and live messages, or select Latest to retrieve only live messages.

This means that, if Earliest was chosen as the OffsetResetStrategy, historical data is only read the first time the topic is queried.

By default, each Sync App connection is assigned a unique consumer group and can therefore retrieve historical data again. Set ConsumerGroupId to persist the message offset across connections. If the ConsumerGroupId is persisted, queries retrieve all messages received since the last connection. Any consistent value will work, though GUIDs are recommended.
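Since any consistent value works as a group ID, one simple approach is to generate a GUID once, store it, and reuse it in every connection string. A minimal sketch in Python (the broker address and topic are hypothetical; the property names are from this documentation):

```python
import uuid

# Generate a GUID once and store it; reusing the same value across
# connections persists the consumer group's message offsets.
group_id = str(uuid.uuid4())

# Hypothetical connection string using the persistent group ID.
conn = (
    "BootstrapServers=kafka1.example.com:9092;"
    "Topic=SampleTopic;"
    f"ConsumerGroupId={group_id};"
)
```

In practice the generated value would be saved (for example, in a configuration file) so that subsequent connections resume from the committed offsets rather than re-reading historical data.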

Tables

The Sync App reads the metadata from Apache Kafka topics and exposes them as tables. Set Topic to restrict operations to a specific topic.

SELECTing from a topic returns old messages from the topic, as well as live messages posted before the number of seconds specified by ReadDuration has elapsed.

Stored Procedures

Stored Procedures are function-like interfaces to Apache Kafka. They can be used to create schema files, commit messages, and more.

Connection String Options

The connection string properties are the various options that can be used to establish a connection. This section provides a complete list of the options you can configure in the connection string for this provider. Click the links for further details.

For more information on establishing a connection, see Establishing a Connection.

Authentication


  • AuthScheme: The scheme used for authentication with the Apache Kafka broker.
  • User: The user who is authenticating to Apache Kafka.
  • Password: The password used to authenticate to Apache Kafka.
  • BootstrapServers: The address of the Apache Kafka BootstrapServers to which you are connecting.
  • Topic: The topic used for read and write operations.
  • UseSSL: This field sets whether SSL is enabled. Automatically enabled if AuthScheme is set to SSL.

Connection


  • ConsumerGroupId: Specifies which group the consumers created by the driver should belong to.
  • AutoCommit: Specifies if the Apache Kafka consumer should autocommit after each poll.

Kerberos


  • KerberosKeytabFile: The Keytab file containing your pairs of Kerberos principals and encrypted keys.
  • KerberosSPN: The service principal name (SPN) for the Kerberos Domain Controller.
  • KerberosServiceName: The name of the Kerberos service you want to authenticate with.
  • UseKerberosTicketCache: Set this to use a ticket cache with the logged-in user instead of a keytab file.

SSL


  • SSLServerCert: The SSL server certificate used to validate to the Apache Kafka broker.
  • SSLServerCertType: The format of the SSL server certificate used to verify the Apache Kafka broker.
  • SSLServerCertPassword: The password used to decrypt the certificate in SSLServerCert.
  • SSLClientCert: The SSL client certificate used to connect to the Apache Kafka broker.
  • SSLClientCertType: The format of the SSL client certificate used to connect to the Apache Kafka broker.
  • SSLClientCertPassword: The password used to decrypt the certificate in SSLClientCert.
  • SSLIdentificationAlgorithm: The endpoint identification algorithm used by the Apache Kafka data provider client app to validate the server host name.

Schema Registry


  • RegistryUrl: The server for the schema registry. When this property is specified, the driver will read the Apache Avro schema from the server.
  • RegistryType: The type of the schema specified for a specific topic.
  • RegistryService: The Schema Registry service used for working with topic schemas.
  • RegistryAuthScheme: The scheme used to authenticate to the schema registry.
  • RegistryUser: Username to authorize with the server specified in RegistryUrl.
  • RegistryPassword: Password to authorize with the server specified in RegistryUrl.
  • RegistryClientCert: The TLS/SSL client certificate store for SSL Client Authentication (2-way SSL) with the schema registry.
  • RegistryClientCertType: The type of key store used by the TLS/SSL client certificate given in RegistryClientCert.
  • RegistryClientCertPassword: The password for the TLS/SSL client certificate given in RegistryClientCert.
  • RegistryClientCertSubject: The subject of the TLS/SSL client certificate given in RegistryClientCert.
  • RegistryVersion: The version of the schema read from RegistryUrl for the specified topic.
  • RegistryServerCert: The certificate to be accepted from the schema registry when connecting using TLS/SSL.

Firewall


  • FirewallType: The protocol used by a proxy-based firewall.
  • FirewallServer: The name or IP address of a proxy-based firewall.
  • FirewallPort: The TCP port for a proxy-based firewall.
  • FirewallUser: The user name to use to authenticate with a proxy-based firewall.
  • FirewallPassword: A password used to authenticate to a proxy-based firewall.

Proxy


  • ProxyAutoDetect: Indicates whether to use the system proxy settings. This takes precedence over other proxy settings, so set ProxyAutoDetect to FALSE in order to use custom proxy settings.
  • ProxyServer: The hostname or IP address of a proxy to route HTTP traffic through.
  • ProxyPort: The TCP port the ProxyServer proxy is running on.
  • ProxyAuthScheme: The authentication type to use to authenticate to the ProxyServer proxy.
  • ProxyUser: A user name to be used to authenticate to the ProxyServer proxy.
  • ProxyPassword: A password to be used to authenticate to the ProxyServer proxy.
  • ProxySSLType: The SSL type to use when connecting to the ProxyServer proxy.
  • ProxyExceptions: A semicolon-separated list of destination hostnames or IPs that are exempt from connecting through the ProxyServer.

Logging


  • LogModules: Core modules to be included in the log file.

Schema


  • Location: A path to the directory that contains the schema files defining tables, views, and stored procedures.
  • BrowsableSchemas: Restricts the schemas reported to a subset of the available schemas. For example, BrowsableSchemas=SchemaA,SchemaB,SchemaC.
  • Tables: Restricts the tables reported to a subset of the available tables. For example, Tables=TableA,TableB,TableC.
  • Views: Restricts the views reported to a subset of the available views. For example, Views=ViewA,ViewB,ViewC.

Miscellaneous


  • AggregateMessages: Specifies whether or not to return the message as a whole string.
  • CompressionType: Data compression type. Batches of data will be compressed together.
  • ConsumerProperties: Additional options used to configure Kafka consumers.
  • CreateTablePartitions: The number of partitions assigned to a topic created with CREATE TABLE.
  • CreateTableReplicationFactor: The number of replicas assigned to a topic created with CREATE TABLE.
  • EnableIdempotence: If set to true, Apache Kafka will ensure messages are delivered in the correct order and without duplicates.
  • FlattenArrays: By default, nested arrays won't show up if TypeDetectionScheme is set to SchemaRegistry. The FlattenArrays property can be used to flatten the elements of nested arrays into columns of their own. Set FlattenArrays to the number of elements you want to return from nested arrays.
  • GenerateSchemaFiles: Indicates the user preference as to when schemas should be generated and saved.
  • MaximumBatchSize: Specifies the maximum batch size to gather before sending a request.
  • MaxRows: Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This helps avoid performance issues at design time.
  • MessageKeyColumn: If specified, the message key sent to Apache Kafka will be read from this column.
  • MessageKeyType: If MessageKeyColumn is specified, this property must be set to the expected type for the pertinent column.
  • OffsetResetStrategy: Specifies an offset for the consumer group.
  • Other: These hidden properties are used only in specific use cases.
  • ProduceMeta: Specifies whether or not to send a meta message while producing the outgoing message.
  • ProducerProperties: Additional options used to configure Kafka producers.
  • PseudoColumns: This property indicates whether or not to include pseudo columns as columns in the table.
  • ReadDuration: The duration for which additional messages are allowed.
  • RowScanDepth: The maximum number of messages to scan for the columns available in the topic.
  • SerializationFormat: Specifies how to serialize/deserialize the incoming or outgoing message.
  • Timeout: The value in seconds until the timeout error is thrown, canceling the operation.
  • TypeDetectionScheme: Comma-separated list of options specifying how the provider will scan the data to determine the fields and datatypes for the bucket.
  • UseConfluentAvroFormat: Specifies how Avro data should be formatted during an INSERT.
  • UserDefinedViews: A filepath pointing to the JSON configuration file containing your custom views.
  • ValidateRegistryTopics: Specifies whether or not to validate schema registry topics against the Apache Kafka broker. Only has an effect when TypeDetectionScheme=SchemaRegistry.

Authentication

This section provides a complete list of the Authentication properties you can configure in the connection string for this provider.


  • AuthScheme: The scheme used for authentication with the Apache Kafka broker.
  • User: The user who is authenticating to Apache Kafka.
  • Password: The password used to authenticate to Apache Kafka.
  • BootstrapServers: The address of the Apache Kafka BootstrapServers to which you are connecting.
  • Topic: The topic used for read and write operations.
  • UseSSL: This field sets whether SSL is enabled. Automatically enabled if AuthScheme is set to SSL.

AuthScheme

The scheme used for authentication with the Apache Kafka broker.

Remarks

The supported schemes are described as follows:

  • Auto: Lets the Sync App decide automatically based on the other connection properties you have set.
  • None: Anonymous authentication is used; you can connect to the data source without specifying user credentials.
  • Plain: The plain-text login module is used.
  • SCRAM: The SCRAM login module is used with SHA-256 hashing.
  • SCRAM-SHA-512: The SCRAM login module is used with SHA-512 hashing.
  • Kerberos: Kerberos authentication is used. When using this value, the system Kerberos configuration file should be specified.
  • SSLCertificate: SSL client certificate authentication is used.

User

The user who is authenticating to Apache Kafka.

Remarks

If not specified, the driver will attempt an unauthorized connection.

Password

The password used to authenticate to Apache Kafka.

Remarks

If not specified, the driver will attempt an unauthorized connection.

BootstrapServers

The address of the Apache Kafka BootstrapServers to which you are connecting.

Remarks

Specify both the server and port. The server may be either a hostname or an IP address, for example: 10.1.2.3:9092. Multiple comma-separated addresses may be provided. As long as at least one of the bootstrap servers in the list responds, the connection will be successful.
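For instance, a fault-tolerant configuration might list several brokers (the addresses below are hypothetical):

```
BootstrapServers=10.1.2.3:9092,10.1.2.4:9092,10.1.2.5:9092;
```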

If you are connecting to Confluent Cloud, you can find this value in the Cluster settings.

Topic

The topic used for read and write operations.

Remarks

By default the Sync App supports producing into and consuming from all topics in Kafka. You can limit it to just one topic by setting this option.

UseSSL

This field sets whether SSL is enabled. Automatically enabled if AuthScheme is set to SSL.

Remarks

This field sets whether the Sync App will attempt to negotiate TLS/SSL connections to the server. By default, the Sync App checks the server's certificate against the system's trusted certificate store. To specify another certificate, set SSLServerCert.

Connection

This section provides a complete list of the Connection properties you can configure in the connection string for this provider.


  • ConsumerGroupId: Specifies which group the consumers created by the driver should belong to.
  • AutoCommit: Specifies if the Apache Kafka consumer should autocommit after each poll.

ConsumerGroupId

Specifies which group the consumers created by the driver should belong to.

Remarks

If not specified, the driver will assign a random string.

AutoCommit

Specifies if the Apache Kafka consumer should autocommit after each poll.

Remarks

If true, the consumer's offset will be periodically committed in the background.

Kerberos

This section provides a complete list of the Kerberos properties you can configure in the connection string for this provider.


  • KerberosKeytabFile: The Keytab file containing your pairs of Kerberos principals and encrypted keys.
  • KerberosSPN: The service principal name (SPN) for the Kerberos Domain Controller.
  • KerberosServiceName: The name of the Kerberos service you want to authenticate with.
  • UseKerberosTicketCache: Set this to use a ticket cache with the logged-in user instead of a keytab file.

KerberosKeytabFile

The Keytab file containing your pairs of Kerberos principals and encrypted keys.

Remarks

The Keytab file containing your pairs of Kerberos principals and encrypted keys.

KerberosSPN

The service principal name (SPN) for the Kerberos Domain Controller.

Remarks

The service principal name (SPN) for the Kerberos Domain Controller.

KerberosServiceName

The name of the Kerberos service you want to authenticate with.

Remarks

The name of the Kerberos service you want to authenticate with.

UseKerberosTicketCache

Set this to use a ticket cache with the logged in user instead of a keytab file.

Remarks

Set this to use a ticket cache with the logged-in user instead of a keytab file.

SSL

This section provides a complete list of the SSL properties you can configure in the connection string for this provider.


  • SSLServerCert: The SSL server certificate used to validate to the Apache Kafka broker.
  • SSLServerCertType: The format of the SSL server certificate used to verify the Apache Kafka broker.
  • SSLServerCertPassword: The password used to decrypt the certificate in SSLServerCert.
  • SSLClientCert: The SSL client certificate used to connect to the Apache Kafka broker.
  • SSLClientCertType: The format of the SSL client certificate used to connect to the Apache Kafka broker.
  • SSLClientCertPassword: The password used to decrypt the certificate in SSLClientCert.
  • SSLIdentificationAlgorithm: The endpoint identification algorithm used by the Apache Kafka data provider client app to validate the server host name.

SSLServerCert

The SSL server certificate used to validate to the Apache Kafka broker.

Remarks

The value of this property must be provided in the format described on the SSLServerCertType page. Please refer to it for more details.

SSLServerCertType

The format of the SSL server certificate used to verify the Apache Kafka broker.

Remarks

This property is used to determine what format the SSLServerCert property expects. This property can take one of the following values:

  • PEMKEY_FILE: The certificate store is the name of a PEM-encoded file that contains the server certificate.
  • PEMKEY_BLOB: The certificate store is a string that contains the server certificate.

SSLServerCertPassword

The password used to decrypt the certificate in SSLServerCert .

Remarks

Leave this blank if the server certificate isn't password protected.

Corresponds to the ssl.truststore.password property.

SSLClientCert

The SSL client certificate used to connect to the Apache Kafka broker.

Remarks

The value of this property must be provided in the format described on the SSLClientCertType page. Please refer to it for more details.

SSLClientCertType

The format of the SSL client certificate used to connect to the Apache Kafka broker.

Remarks

This property is used to determine what format the SSLClientCert property expects. This property can take one of the following values:

  • PEMKEY_FILE: The certificate store is the name of a PEM-encoded file that contains a private key and certificate.
  • PEMKEY_BLOB: The certificate store is a string that contains a private key and certificate, optionally encoded in base64.

SSLClientCertPassword

The password used to decrypt the certificate in SSLClientCert .

Remarks

Leave this blank if the client certificate isn't password protected.

SSLIdentificationAlgorithm

The endpoint identification algorithm used by the Apache Kafka data provider client app to validate server host name.

Remarks

The default value is 'https', which enables server host name validation. You can disable validation by setting this property to a blank space.

Schema Registry

This section provides a complete list of the Schema Registry properties you can configure in the connection string for this provider.


  • RegistryUrl: The server for the schema registry. When this property is specified, the driver will read the Apache Avro schema from the server.
  • RegistryType: The type of the schema specified for a specific topic.
  • RegistryService: The Schema Registry service used for working with topic schemas.
  • RegistryAuthScheme: The scheme used to authenticate to the schema registry.
  • RegistryUser: Username to authorize with the server specified in RegistryUrl.
  • RegistryPassword: Password to authorize with the server specified in RegistryUrl.
  • RegistryClientCert: The TLS/SSL client certificate store for SSL Client Authentication (2-way SSL) with the schema registry.
  • RegistryClientCertType: The type of key store used by the TLS/SSL client certificate given in RegistryClientCert.
  • RegistryClientCertPassword: The password for the TLS/SSL client certificate given in RegistryClientCert.
  • RegistryClientCertSubject: The subject of the TLS/SSL client certificate given in RegistryClientCert.
  • RegistryVersion: The version of the schema read from RegistryUrl for the specified topic.
  • RegistryServerCert: The certificate to be accepted from the schema registry when connecting using TLS/SSL.

RegistryUrl

The server for the schema registry. When this property is specified, the driver will read Apache Avro schema from the server.

Remarks

Note this property provides no additional features when SerializationFormat is not set to Avro.

  • If you are connecting to Confluent Cloud, this corresponds to the Schema Registry endpoint value in Schemas > Schema Registry > Instructions.
  • If you are connecting to the AWS Glue service, this corresponds to the ARN value of the AWS registry you want to connect to.

RegistryType

The type of the schema specified for a specific topic.

Remarks

Protobuf is not currently supported. If this value is set to Auto, the driver will try to detect the valid Schema Registry type for the selected Topic.

RegistryService

The Schema Registry service used for working with topic schemas.

Remarks

The Schema Registry service used for working with topic schemas.

RegistryAuthScheme

The scheme used to authenticate to the schema registry.

Remarks

The schemes are as follows. Note that some schemes are only available when connecting to a specific RegistryService:

  • Auto: Lets the Sync App decide automatically based on the other connection properties you have set.
  • None: No authentication is used.
  • Basic: RegistryUser and RegistryPassword are used. In Confluent these are the API user/password, while in Glue these are the IAM access key/secret key.
  • SSLCertificate: RegistryClientCert is used with SSL client authentication. This is only supported when connecting to a Confluent registry.
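For example, connecting to a Confluent schema registry with Basic authentication might look like the following (the registry URL and API credentials are hypothetical placeholders):

```
RegistryService=Confluent;RegistryUrl=https://registry.example.com;RegistryAuthScheme=Basic;RegistryUser=MY_API_KEY;RegistryPassword=MY_API_SECRET;
```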

RegistryUser

Username to authorize with the server specified in RegistryUrl .

Remarks

If you are connecting to Confluent Cloud, this corresponds to the Access Key value in Schemas > Schema Registry > API access.

RegistryPassword

Password to authorize with the server specified in RegistryUrl .

Remarks

If you are connecting to Confluent Cloud, this corresponds to the Secret Key value in Schemas > Schema Registry > API access.

RegistryClientCert

The TLS/SSL client certificate store for SSL Client Authentication (2-way SSL) with the schema registry.

Remarks

The name of the certificate store for the client certificate.

The RegistryClientCertType field specifies the type of the certificate store specified by RegistryClientCert. If the store is password protected, specify the password in RegistryClientCertPassword.

RegistryClientCert is used in conjunction with the RegistryClientCertSubject field in order to specify client certificates. If RegistryClientCert has a value, and RegistryClientCertSubject is set, a search for a certificate is initiated. See RegistryClientCertSubject for more information.

Designations of certificate stores are platform-dependent.

The following are designations of the most common User and Machine certificate stores in Windows:

  • MY: A certificate store holding personal certificates with their associated private keys.
  • CA: Certifying authority certificates.
  • ROOT: Root certificates.
  • SPC: Software publisher certificates.

In Java, the certificate store normally is a file containing certificates and optional private keys.

When the certificate store type is PFXFILE, this property must be set to the name of the file. When the type is PFXBLOB, the property must be set to the binary contents of a PFX file (for example, a PKCS12 certificate store).
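As a sketch, a PFX-based client certificate configuration for the schema registry might look like the following (the file path and password are hypothetical):

```
RegistryClientCert=C:\certs\registry-client.pfx;RegistryClientCertType=PFXFILE;RegistryClientCertPassword=secret;
```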

RegistryClientCertType

The type of key store used by the TLS/SSL client certificate given in RegistryClientCert .

Remarks

This property can take one of the following values:

  • USER (default): For Windows, this specifies that the certificate store is a certificate store owned by the current user. Note that this store type is not available in Java.
  • MACHINE: For Windows, this specifies that the certificate store is a machine store. Note that this store type is not available in Java.
  • PFXFILE: The certificate store is the name of a PFX (PKCS12) file containing certificates.
  • PFXBLOB: The certificate store is a string (base-64-encoded) representing a certificate store in PFX (PKCS12) format.
  • JKSFILE: The certificate store is the name of a Java key store (JKS) file containing certificates. Note that this store type is only available in Java.
  • JKSBLOB: The certificate store is a string (base-64-encoded) representing a certificate store in JKS format. Note that this store type is only available in Java.
  • PEMKEY_FILE: The certificate store is the name of a PEM-encoded file that contains a private key and an optional certificate.
  • PEMKEY_BLOB: The certificate store is a string (base-64-encoded) that contains a private key and an optional certificate.
  • PUBLIC_KEY_FILE: The certificate store is the name of a file that contains a PEM- or DER-encoded public key certificate.
  • PUBLIC_KEY_BLOB: The certificate store is a string (base-64-encoded) that contains a PEM- or DER-encoded public key certificate.
  • SSHPUBLIC_KEY_FILE: The certificate store is the name of a file that contains an SSH-style public key.
  • SSHPUBLIC_KEY_BLOB: The certificate store is a string (base-64-encoded) that contains an SSH-style public key.
  • P7BFILE: The certificate store is the name of a PKCS7 file containing certificates.
  • PPKFILE: The certificate store is the name of a file that contains a PuTTY Private Key (PPK).
  • XMLFILE: The certificate store is the name of a file that contains a certificate in XML format.
  • XMLBLOB: The certificate store is a string that contains a certificate in XML format.


RegistryClientCertPassword

The password for the TLS/SSL client certificate given in RegistryClientCert.

Remarks

If the certificate store is of a type that requires a password, this property is used to specify that password to open the certificate store.


RegistryClientCertSubject

The subject of the TLS/SSL client certificate given in RegistryClientCert.

Remarks

When loading a certificate the subject is used to locate the certificate in the store.

If an exact match is not found, the store is searched for subjects containing the value of the property. If a match is still not found, the property is set to an empty string, and no certificate is selected.

The special value "*" picks the first certificate in the certificate store.

The certificate subject is a comma separated list of distinguished name fields and values. For example, "CN=www.server.com, OU=test, C=US, [email protected]". The common fields and their meanings are shown below.

Field: Meaning
CN: Common Name. This is commonly a host name like www.server.com.
O: Organization
OU: Organizational Unit
L: Locality
S: State
C: Country
E: Email Address

If a field value contains a comma, it must be quoted.
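For example, to select a personal certificate from the current user's Windows store by its common name (the subject value is an illustrative placeholder):

RegistryClientCertType=USER;RegistryClientCert=MY;RegistryClientCertSubject=CN=registry-client.example.com;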


RegistryVersion

Version of the schema read from RegistryUrl for the specified topic.

Remarks

Version of the schema read from RegistryUrl for the specified topic.


RegistryServerCert

The certificate to be accepted from the schema registry when connecting using TLS/SSL.

Remarks

If using a TLS/SSL connection, this property can be used to specify the TLS/SSL certificate to be accepted from the server. Any other certificate that is not trusted by the machine is rejected.

This property can take the following forms:

Description Example
A full PEM Certificate (example shortened for brevity) -----BEGIN CERTIFICATE----- MIIChTCCAe4CAQAwDQYJKoZIhv......Qw== -----END CERTIFICATE-----
A path to a local file containing the certificate C:\cert.cer
The public key (example shortened for brevity) -----BEGIN RSA PUBLIC KEY----- MIGfMA0GCSq......AQAB -----END RSA PUBLIC KEY-----
The MD5 Thumbprint (hex values can also be either space or colon separated) ecadbdda5a1529c58a1e9e09828d70e4
The SHA1 Thumbprint (hex values can also be either space or colon separated) 34a929226ae0819f2ec14b4a3d904f801cbb150d

If not specified, any certificate trusted by the machine is accepted.

Use '*' to accept all certificates. Note that this is not recommended due to security concerns.


Firewall

This section provides a complete list of the Firewall properties you can configure in the connection string for this provider.


Property: Description
FirewallType: The protocol used by a proxy-based firewall.
FirewallServer: The name or IP address of a proxy-based firewall.
FirewallPort: The TCP port for a proxy-based firewall.
FirewallUser: The user name to use to authenticate with a proxy-based firewall.
FirewallPassword: A password used to authenticate to a proxy-based firewall.

FirewallType

The protocol used by a proxy-based firewall.

Remarks

This property specifies the protocol that the Sync App will use to tunnel traffic through the FirewallServer proxy. Note that by default, the Sync App connects to the system proxy; to disable this behavior and connect to one of the following proxy types, set ProxyAutoDetect to false.

Type Default Port Description
TUNNEL 80 When this is set, the Sync App opens a connection to Apache Kafka and traffic flows back and forth through the proxy.
SOCKS4 1080 When this is set, the Sync App sends data through the SOCKS 4 proxy specified by FirewallServer and FirewallPort and passes the FirewallUser value to the proxy, which determines if the connection request should be granted.
SOCKS5 1080 When this is set, the Sync App sends data through the SOCKS 5 proxy specified by FirewallServer and FirewallPort. If your proxy requires authentication, set FirewallUser and FirewallPassword to credentials the proxy recognizes.

To connect to HTTP proxies, use ProxyServer and ProxyPort. To authenticate to HTTP proxies, use ProxyAuthScheme, ProxyUser, and ProxyPassword.
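For example, a connection that tunnels through an authenticated SOCKS5 proxy might look like the following (the server name and credentials are placeholders):

ProxyAutoDetect=false;FirewallType=SOCKS5;FirewallServer=proxy.internal.example;FirewallPort=1080;FirewallUser=kafkauser;FirewallPassword=mypassword;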


FirewallServer

The name or IP address of a proxy-based firewall.

Remarks

This property specifies the IP address, DNS name, or host name of a proxy allowing traversal of a firewall. The protocol is specified by FirewallType: set this property along with FirewallType to connect through SOCKS or to tunnel the connection. Use ProxyServer to connect to an HTTP proxy.

Note that the Sync App uses the system proxy by default. To use a different proxy, set ProxyAutoDetect to false.


FirewallPort

The TCP port for a proxy-based firewall.

Remarks

This specifies the TCP port for a proxy allowing traversal of a firewall. Use FirewallServer to specify the name or IP address. Specify the protocol with FirewallType.


FirewallUser

The user name to use to authenticate with a proxy-based firewall.

Remarks

The FirewallUser and FirewallPassword properties are used to authenticate against the proxy specified in FirewallServer and FirewallPort, following the authentication method specified in FirewallType.


FirewallPassword

A password used to authenticate to a proxy-based firewall.

Remarks

This property is passed to the proxy specified by FirewallServer and FirewallPort, following the authentication method specified by FirewallType.


Proxy

This section provides a complete list of the Proxy properties you can configure in the connection string for this provider.


Property: Description
ProxyAutoDetect: This indicates whether to use the system proxy settings or not. This takes precedence over other proxy settings, so you'll need to set ProxyAutoDetect to FALSE in order to use custom proxy settings.
ProxyServer: The hostname or IP address of a proxy to route HTTP traffic through.
ProxyPort: The TCP port the ProxyServer proxy is running on.
ProxyAuthScheme: The authentication type to use to authenticate to the ProxyServer proxy.
ProxyUser: A user name to be used to authenticate to the ProxyServer proxy.
ProxyPassword: A password to be used to authenticate to the ProxyServer proxy.
ProxySSLType: The SSL type to use when connecting to the ProxyServer proxy.
ProxyExceptions: A semicolon-separated list of destination hostnames or IPs that are exempt from connecting through the ProxyServer.

ProxyAutoDetect

This indicates whether to use the system proxy settings or not. This takes precedence over other proxy settings, so you'll need to set ProxyAutoDetect to FALSE in order to use custom proxy settings.

Remarks

This takes precedence over other proxy settings, so you'll need to set ProxyAutoDetect to FALSE in order to use custom proxy settings.

To connect to an HTTP proxy, see ProxyServer. For other proxies, such as SOCKS or tunneling, see FirewallType.


ProxyServer

The hostname or IP address of a proxy to route HTTP traffic through.

Remarks

The hostname or IP address of a proxy to route HTTP traffic through. The Sync App can use the HTTP, Windows (NTLM), or Kerberos authentication types to authenticate to an HTTP proxy.

If you need to connect through a SOCKS proxy or tunnel the connection, see FirewallType.

By default, the Sync App uses the system proxy. If you need to use another proxy, set ProxyAutoDetect to false.


ProxyPort

The TCP port the ProxyServer proxy is running on.

Remarks

The port the HTTP proxy is running on that you want to redirect HTTP traffic through. Specify the HTTP proxy in ProxyServer. For other proxy types, see FirewallType.


ProxyAuthScheme

The authentication type to use to authenticate to the ProxyServer proxy.

Remarks

This value specifies the authentication type to use to authenticate to the HTTP proxy specified by ProxyServer and ProxyPort.

Note that the Sync App will use the system proxy settings by default, without further configuration needed; if you want to connect to another proxy, you will need to set ProxyAutoDetect to false, in addition to ProxyServer and ProxyPort. To authenticate, set ProxyAuthScheme and set ProxyUser and ProxyPassword, if needed.

The authentication type can be one of the following:

  • BASIC: The Sync App performs HTTP BASIC authentication.
  • DIGEST: The Sync App performs HTTP DIGEST authentication.
  • NEGOTIATE: The Sync App retrieves an NTLM or Kerberos token based on the applicable protocol for authentication.
  • PROPRIETARY: The Sync App does not generate an NTLM or Kerberos token. You must supply this token in the Authorization header of the HTTP request.

If you need to use another authentication type, such as SOCKS 5 authentication, see FirewallType.


ProxyUser

A user name to be used to authenticate to the ProxyServer proxy.

Remarks

The ProxyUser and ProxyPassword options are used to connect and authenticate against the HTTP proxy specified in ProxyServer.

You can select one of the available authentication types in ProxyAuthScheme. If you are using HTTP authentication, set this to the user name of a user recognized by the HTTP proxy. If you are using Windows or Kerberos authentication, set this property to a user name in one of the following formats:

user@domain
domain\user
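For example, to authenticate to an HTTP proxy using Windows (NTLM) or Kerberos credentials, the proxy settings might be combined as follows (all values are placeholders):

ProxyAutoDetect=false;ProxyServer=proxy.example.com;ProxyPort=8080;ProxyAuthScheme=NEGOTIATE;ProxyUser=DOMAIN\jsmith;ProxyPassword=mypassword;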


ProxyPassword

A password to be used to authenticate to the ProxyServer proxy.

Remarks

This property is used to authenticate to an HTTP proxy server that supports NTLM (Windows), Kerberos, or HTTP authentication. To specify the HTTP proxy, you can set ProxyServer and ProxyPort. To specify the authentication type, set ProxyAuthScheme.

If you are using HTTP authentication, additionally set ProxyUser and ProxyPassword to a user name and password recognized by the HTTP proxy.

If you are using NTLM authentication, set ProxyUser to your Windows user name and ProxyPassword to your Windows password. You may also need these to complete Kerberos authentication.

For SOCKS 5 authentication or tunneling, see FirewallType.

By default, the Sync App uses the system proxy. If you want to connect to another proxy, set ProxyAutoDetect to false.


ProxySSLType

The SSL type to use when connecting to the ProxyServer proxy.

Remarks

This property determines when to use SSL for the connection to an HTTP proxy specified by ProxyServer. The applicable values are the following:

AUTO: Default setting. If the URL is an HTTPS URL, the Sync App will use the TUNNEL option. If the URL is an HTTP URL, the component will use the NEVER option.
ALWAYS: The connection is always SSL enabled.
NEVER: The connection is not SSL enabled.
TUNNEL: The connection is through a tunneling proxy. The proxy server opens a connection to the remote host and traffic flows back and forth through the proxy.


ProxyExceptions

A semicolon-separated list of destination hostnames or IPs that are exempt from connecting through the ProxyServer.

Remarks

The ProxyServer is used for all addresses, except for addresses defined in this property. Use semicolons to separate entries.

Note that the Sync App uses the system proxy settings by default, without further configuration needed; if you want to explicitly configure proxy exceptions for this connection, you need to set ProxyAutoDetect = false, and configure ProxyServer and ProxyPort. To authenticate, set ProxyAuthScheme and set ProxyUser and ProxyPassword, if needed.


Logging

This section provides a complete list of the Logging properties you can configure in the connection string for this provider.


Property: Description
LogModules: Core modules to be included in the log file.

LogModules

Core modules to be included in the log file.

Remarks

Only the modules specified (separated by ';') will be included in the log file. By default all modules are included.

See the Logging page for an overview.


Schema

This section provides a complete list of the Schema properties you can configure in the connection string for this provider.


Property: Description
Location: A path to the directory that contains the schema files defining tables, views, and stored procedures.
BrowsableSchemas: This property restricts the schemas reported to a subset of the available schemas. For example, BrowsableSchemas=SchemaA,SchemaB,SchemaC.
Tables: This property restricts the tables reported to a subset of the available tables. For example, Tables=TableA,TableB,TableC.
Views: Restricts the views reported to a subset of the available views. For example, Views=ViewA,ViewB,ViewC.

Location

A path to the directory that contains the schema files defining tables, views, and stored procedures.

Remarks

The path to a directory which contains the schema files for the Sync App (.rsd files for tables and views, .rsb files for stored procedures). The folder location can be a relative path from the location of the executable. The Location property is only needed if you want to customize definitions (for example, change a column name, ignore a column, and so on) or extend the data model with new tables, views, or stored procedures.

If left unspecified, the default location is "%APPDATA%\CData\ApacheKafka Data Provider\Schema", where %APPDATA% is the user's configuration directory.


BrowsableSchemas

This property restricts the schemas reported to a subset of the available schemas. For example, BrowsableSchemas=SchemaA,SchemaB,SchemaC.

Remarks

Listing the schemas from databases can be expensive. Providing a list of schemas in the connection string improves the performance.


Tables

This property restricts the tables reported to a subset of the available tables. For example, Tables=TableA,TableB,TableC.

Remarks

Listing the tables from some databases can be expensive. Providing a list of tables in the connection string improves the performance of the Sync App.

This property can also be used as an alternative to automatically listing tables if you already know which ones you want to work with and there would otherwise be too many to work with.

Specify the tables you want in a comma-separated list. Each table should be a valid SQL identifier with any special characters escaped using square brackets, double-quotes or backticks. For example, Tables=TableA,[TableB/WithSlash],WithCatalog.WithSchema.`TableC With Space`.

Note that when connecting to a data source with multiple schemas or catalogs, you will need to provide the fully qualified name of the table in this property, as in the last example here, to avoid ambiguity between tables that exist in multiple catalogs or schemas.


Views

Restricts the views reported to a subset of the available views. For example, Views=ViewA,ViewB,ViewC.

Remarks

Listing the views from some databases can be expensive. Providing a list of views in the connection string improves the performance of the Sync App.

This property can also be used as an alternative to automatically listing views if you already know which ones you want to work with and there would otherwise be too many to work with.

Specify the views you want in a comma-separated list. Each view should be a valid SQL identifier with any special characters escaped using square brackets, double-quotes or backticks. For example, Views=ViewA,[ViewB/WithSlash],WithCatalog.WithSchema.`ViewC With Space`.

Note that when connecting to a data source with multiple schemas or catalogs, you will need to provide the fully qualified name of the view in this property, as in the last example here, to avoid ambiguity between views that exist in multiple catalogs or schemas.


Miscellaneous

This section provides a complete list of the Miscellaneous properties you can configure in the connection string for this provider.


Property: Description
AggregateMessages: Specifies whether or not to return the message as a whole string.
CompressionType: Data compression type. Batches of data will be compressed together.
ConsumerProperties: Additional options used to configure Kafka consumers.
CreateTablePartitions: The number of partitions assigned to a topic created with CREATE TABLE.
CreateTableReplicationFactor: The number of replicas assigned to a topic created with CREATE TABLE.
EnableIdempotence: If set to true, Apache Kafka will ensure messages are delivered in the correct order and without duplicates.
FlattenArrays: By default, nested arrays won't show up if TypeDetectionScheme is set to SchemaRegistry. The FlattenArrays property can be used to flatten the elements of nested arrays into columns of their own. Set FlattenArrays to the number of elements you want to return from nested arrays.
GenerateSchemaFiles: Indicates the user preference as to when schemas should be generated and saved.
MaximumBatchSize: Specifies the maximum batch size to gather before sending a request.
MaxRows: Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This helps avoid performance issues at design time.
MessageKeyColumn: If specified, the message key sent to Apache Kafka will be read from this column.
MessageKeyType: If MessageKeyColumn is specified, this property must be set to the expected type for the pertinent column.
OffsetResetStrategy: Specifies an offset for the consumer group.
Other: These hidden properties are used only in specific use cases.
ProduceMeta: Specifies whether or not to send a meta message while producing the outgoing message.
ProducerProperties: Additional options used to configure Kafka producers.
PseudoColumns: This property indicates whether or not to include pseudo columns as columns to the table.
ReadDuration: The duration during which additional messages are allowed.
RowScanDepth: The maximum number of messages to scan for the columns available in the topic.
SerializationFormat: Specifies how to serialize/deserialize the incoming or outgoing message.
Timeout: The value in seconds until the timeout error is thrown, canceling the operation.
TypeDetectionScheme: Comma-separated list of options specifying how the provider will scan the data to determine the fields and datatypes for the bucket.
UseConfluentAvroFormat: Specifies how Avro data should be formatted during an INSERT.
UserDefinedViews: A filepath pointing to the JSON configuration file containing your custom views.
ValidateRegistryTopics: Specifies whether or not to validate schema registry topics against the Apache Kafka broker. Only has an effect when TypeDetectionScheme=SchemaRegistry.

AggregateMessages

Specifies whether or not to return the message as a whole string.

Remarks

When set to false, the result will be parsed and detected fields will appear in the resultset.


CompressionType

Data compression type. Batches of data will be compressed together.

Remarks

The following values are supported:

NONE Messages will not be compressed.
GZIP Messages will be compressed using gzip.
SNAPPY Messages will be compressed using snappy.
LZ4 Messages will be compressed using lz4.


ConsumerProperties

Additional options used to configure Kafka consumers.

Remarks

The Sync App exposes several Kafka consumer configuration values directly as connection properties. Internally, these are all mapped into properties that are passed to the Kafka client libraries.

If the Sync App does not expose an option for the consumer configuration, it can be set here. This option takes a connection string value and passes all its options directly to the consumer. For example, security.protocol=SASL_SSL;sasl.mechanism=SCRAM-SHA-512 sets the security.protocol and sasl.mechanism consumer properties.
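For example, these consumer options might appear in a full connection string as follows (the broker address is a placeholder, and the quoting of the nested option list is illustrative):

BootstrapServers=broker1.example.com:9092;ConsumerProperties="security.protocol=SASL_SSL;sasl.mechanism=SCRAM-SHA-512";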


CreateTablePartitions

The number of partitions assigned to a topic created with CREATE TABLE.

Remarks

When executing a CREATE TABLE statement, the Sync App creates a new empty topic. By default, the Sync App creates this new topic with 1 partition.

You can create topics with more partitions by changing this setting. This can be useful if you plan on having multiple consumers process the messages on this topic.
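For example, with CreateTablePartitions=6 set in the connection string, a statement such as the following creates a topic named SampleTopic with six partitions (the topic and column definitions are illustrative):

CREATE TABLE SampleTopic (Id INTEGER, Message VARCHAR(255))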


CreateTableReplicationFactor

The number of replicas assigned to a topic created with CREATE TABLE.

Remarks

When executing a CREATE TABLE statement, the Sync App creates a new empty topic. By default, the Sync App creates this topic with a replication factor of 3.

You can create topics with a different number of replicas by changing this setting. There are two main cases where this is useful:

  • When your cluster has fewer than 3 nodes. This setting should never be higher than the number of nodes in your cluster. For example, creating topics with a replication factor of 3 on a cluster with 2 nodes will fail.
  • When your cluster has more than 3 nodes and you want more safety in case of failover. Apache Kafka uses replicas to prevent data loss when a node fails, and if all the replicas fail then the topic is unavailable.


EnableIdempotence

If set to true, Apache Kafka will ensure messages are delivered in the correct order and without duplicates.

Remarks

When enabled, each message is given a sequence number.


FlattenArrays

By default, nested arrays won't show up if TypeDetectionScheme is set to SchemaRegistry. The FlattenArrays property can be used to flatten the elements of nested arrays into columns of their own. Set FlattenArrays to the number of elements you want to return from nested arrays.

Remarks

Set FlattenArrays to the number of elements you want to return from nested arrays. The specified elements are returned as columns.

For example, you can return an arbitrary number of elements from an array of strings:

["FLOW-MATIC","LISP","COBOL"]
When FlattenArrays is set to 1, the preceding array is flattened into the following table:

Column Name: Column Value
languages.0: FLOW-MATIC
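Setting FlattenArrays to 3 instead would expose all three elements as columns (the column names assume the array is stored in a field named languages, as above):

languages.0: FLOW-MATIC
languages.1: LISP
languages.2: COBOL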


GenerateSchemaFiles

Indicates the user preference as to when schemas should be generated and saved.

Remarks

This property outputs schemas to .rsd files in the path specified by Location.

Available settings are the following:

  • Never: A schema file will never be generated.
  • OnUse: A schema file will be generated the first time a table is referenced, provided the schema file for the table does not already exist.
  • OnStart: A schema file will be generated at connection time for any tables that do not currently have a schema file.
  • OnCreate: A schema file will be generated when a CREATE TABLE SQL query is run.
Note that if you want to regenerate a file, you will first need to delete it.

Generate Schemas with SQL

When you set GenerateSchemaFiles to OnUse, the Sync App generates schemas as you execute SELECT queries. Schemas are generated for each table referenced in the query.

When you set GenerateSchemaFiles to OnCreate, schemas are only generated when a CREATE TABLE query is executed.

Generate Schemas on Connection

Another way to use this property is to obtain schemas for every table in your database when you connect. To do so, set GenerateSchemaFiles to OnStart and connect.
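For example, to generate schema files for every topic at connection time and store them in a custom directory (the path is an illustrative placeholder):

Location=C:\Kafka\Schemas;GenerateSchemaFiles=OnStart;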


MaximumBatchSize

Specifies the maximum batch size to gather before sending a request.

Remarks

A batch can be formed by one or more messages. The size is specified in bytes.


MaxRows

Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This helps avoid performance issues at design time.

Remarks

Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This helps avoid performance issues at design time.


MessageKeyColumn

If specified, the message key sent to Apache Kafka will be read from this column.

Remarks

If specified, the message key sent to Apache Kafka will be read from this column.


MessageKeyType

If MessageKeyColumn is specified, this property must be set to the expected type for the pertinent column.

Remarks

Note that when this value is set to Null, MessageKeyColumn is ignored.

The available types are as follows:

Null The key column type will be treated as null.
String The key column type will be treated as a string.
Long The key column type will be treated as a long.
Integer The key column type will be treated as an integer.


OffsetResetStrategy

Specifies an offset for the consumer group.

Remarks

Select one of the following strategies:

Latest Will only consume messages that are produced after the consumer group is created.
Earliest Will consume any unconsumed messages including any message produced before the lifetime of the consumer group.


Other

These hidden properties are used only in specific use cases.

Remarks

The properties listed below are available for specific use cases. Normal driver use cases and functionality should not require these properties.

Specify multiple properties in a semicolon-separated list.

Integration and Formatting

DefaultColumnSize: Sets the default length of string fields when the data source does not provide column length in the metadata. The default value is 2000.
ConvertDateTimeToGMT: Determines whether to convert date-time values to GMT, instead of the local time of the machine.
RecordToFile=filename: Records the underlying socket data transfer to the specified file.


ProduceMeta

Specifies whether or not to send a meta message while producing the outgoing message.

Remarks

This option is only used if SerializationFormat is set to CSV.


ProducerProperties

Additional options used to configure Kafka producers.

Remarks

This option is like ConsumerProperties but applies to producers instead. Please refer to that property for more information.


PseudoColumns

This property indicates whether or not to include pseudo columns as columns to the table.

Remarks

This setting is particularly helpful in Entity Framework, which does not allow you to set a value for a pseudo column unless it is a table column. The value of this connection setting is of the format "Table1=Column1, Table1=Column2, Table2=Column3". You can use the "*" character to include all tables and all columns; for example, "*=*".


ReadDuration

The duration during which additional messages are allowed.

Remarks

A timeout after which the Sync App stops waiting for additional messages to arrive.


RowScanDepth

The maximum number of messages to scan for the columns available in the topic.

Remarks

Setting a high value may decrease performance. Setting a low value may prevent the data type from being determined properly.


SerializationFormat

Specifies how to serialize/deserialize the incoming or outgoing message.

Remarks

Available formats:

NONE Message will be always BASE64 encoded on both consume and produce operations.
AUTO Attempt to automatically figure out the current topic's serialization format.
JSON Message will be serialized using the JSON format.
CSV Message will be serialized using the CSV format.
XML Message will be serialized using the XML format.
AVRO Message will be serialized using the AVRO format.


Timeout

The value in seconds until the timeout error is thrown, canceling the operation.

Remarks

If Timeout = 0, operations do not time out. The operations run until they complete successfully or until they encounter an error condition.

If Timeout expires and the operation is not yet complete, the Sync App throws an exception.


TypeDetectionScheme

Comma-separated list of options specifying how the provider will scan the data to determine the fields and datatypes for the bucket.

Remarks

The type detection schemes are:

None: Setting TypeDetectionScheme to None will return all columns as string type.
RowScan: Setting TypeDetectionScheme to RowScan will scan rows to heuristically determine the data type. The RowScanDepth property determines the number of rows to be scanned.
SchemaRegistry: Setting TypeDetectionScheme to SchemaRegistry will make use of the Schema Registry API and use a list of predefined AVRO schemas.
MessageOnly: Setting TypeDetectionScheme to MessageOnly will push all information as a single aggregate value on a column named Message.
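For example, to determine column types by heuristically scanning the first 50 messages of each topic:

TypeDetectionScheme=RowScan;RowScanDepth=50;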


UseConfluentAvroFormat

Specifies how Avro data should be formatted during an INSERT.

Remarks

By default the Sync App writes out Avro data as a series of file blocks (as defined in the Avro specification). Confluent tools and libraries cannot decode this format and it cannot be used with Confluent schema validation. However, it is more compact because it allows multiple rows of Avro data to be stored in a single message.

Enable this option if you use Confluent schema validation, or otherwise require compatibility with Confluent tools and libraries. Each row inserted into an Avro topic will be a separate message and contain a reference to a schema stored in the registry.

Note that this cannot be enabled if no RegistryUrl is set or if RegistryUrl points to an AWS Glue schema registry. AWS Glue schemas do not support schema IDs, which are a key part of how Confluent handles Avro data.


UserDefinedViews

A filepath pointing to the JSON configuration file containing your custom views.

Remarks

User Defined Views are defined in a JSON-formatted configuration file called UserDefinedViews.json. The Sync App automatically detects the views specified in this file.

You can also have multiple view definitions and control them using the UserDefinedViews connection property. When you use this property, only the specified views are seen by the Sync App.

This User Defined View configuration file is formatted as follows:

  • Each root element defines the name of a view.
  • Each root element contains a child element, called query, which contains the custom SQL query for the view.

For example:

{
	"MyView": {
		"query": "SELECT * FROM SampleTable_1 WHERE MyColumn = 'value'"
	},
	"MyView2": {
		"query": "SELECT * FROM MyTable WHERE Id IN (1,2,3)"
	}
}
Use the UserDefinedViews connection property to specify the location of your JSON configuration file. For example:
"UserDefinedViews", "C:\\Users\\yourusername\\Desktop\\tmp\\UserDefinedViews.json"


ValidateRegistryTopics

Specifies whether to validate schema registry topics against the Apache Kafka broker. This property only has an effect when TypeDetectionScheme=SchemaRegistry.

Remarks

Schema registries can include metadata for topics that cannot be accessed in Kafka. This can happen because the topic does not exist on the broker, or because the principal the connection is authenticated as does not have access to the topic.

By default, the Sync App gets the list of schemas from the registry and then filters out any that the broker does not report. All remaining valid topics are exposed as tables. You can disable this behavior by setting this option to false, which causes all schemas in the registry to be reported as tables, regardless of whether they are accessible on the broker.
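For example, to expose every schema in the registry as a table without checking the broker, a connection string might look like the following (the broker and registry addresses are placeholders):

```
BootstrapServers=kafka.example.com:9092;TypeDetectionScheme=SchemaRegistry;RegistryUrl=http://registry.example.com:8081;ValidateRegistryTopics=false;
```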


Third Party Copyrights

Java

These copyright notices apply only to the Java-based editions of the Sync App. This includes the JDBC edition, the TIBCO adapter, and the Tableau connector.

The Sync App makes use of the following libraries under the terms of the Apache 2.0 license:

  • kafka-clients
  • Jackson
  • jose4j
  • lz4-java
  • snappy-java
  • slf4j
  • OSGI

The Apache License

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

The Sync App also makes use of the zstd-jni library under the terms of the BSD license:

BSD-3-Clause License

Copyright (c) 2021 BSD All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

.NET

These copyright notices apply only to the ADO.NET edition of the Sync App.

The Sync App makes use of the Confluent.Kafka library under the terms of the Apache 2.0 license.

The Apache License

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Copyright (c) 2023 CData Software, Inc. - All rights reserved.
Build 22.0.8462