Configuring a Kafka® connection

Setting up connection parameters

Depending on how your Kafka® server is set up, there are a number of connection parameters you need to set. There are preset groups of options for common Kafka providers, as well as custom options that let you define any parameter you need. We’ll go through those now.

In this section we’re configuring the details of ClickHouse’s connection to Kafka: the URL of the Kafka server, the name of the topic, and any authentication parameters you need. Those parameters go in your Kafka configuration file. Other parameters configure a ClickHouse table that uses the Kafka engine; those are not part of the Kafka configuration file. (More on the Kafka table engine in a minute.)
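For contrast, here’s a rough sketch of a ClickHouse table that uses the Kafka engine. The broker address, topic, group name, columns, and format below are placeholders, not values from this guide:

```sql
-- A minimal Kafka engine table; all names and addresses are examples.
CREATE TABLE kafka_queue
(
    id UInt64,
    message String
)
ENGINE = Kafka
SETTINGS
    kafka_broker_list = 'kafka.example.com:9092',
    kafka_topic_list = 'my_topic',
    kafka_group_name = 'clickhouse_consumer',
    kafka_format = 'JSONEachRow';
```

Settings like kafka_broker_list and kafka_format belong to the table definition, not to the connection parameters discussed in this section.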

A word about certificates

If the connection to your Kafka server requires certificates, you need to go through a couple of steps:

  1. Create a new file in the config.d directory. To do that, create a new setting with the name of your certificate, then paste the value of the certificate in the text box. Here’s an example for the file service.cert:

    Figure 1 - Creating a certificate file in the config.d directory

    See the section Configuring Settings for all the details on modifying server settings.
  2. Once you’ve created the settings for all the certificates you need, you can specify their locations. The certificates are stored in the directory /etc/clickhouse-server/config.d; for service.cert, the location is /etc/clickhouse-server/config.d/service.cert.
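For example, if a parameter expects the location of a certificate file (such as the predefined ssl.ca.location option described later), you’d point it at the file created in step 1:

```
ssl.ca.location=/etc/clickhouse-server/config.d/service.cert
```

The file name service.cert here matches the example above; substitute whatever name you gave your certificate setting.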

Preset option groups

There are several groups of preset options available under the ADD PRESET button:

Figure 2 - The Kafka Preset options menu

Each set of options is targeted for a specific platform or Kafka deployment type, but check with your Kafka provider to see which parameters you need. In addition to the parameters added for you automatically, you can add your own custom parameters if needed.

Amazon MSK parameters

If your Kafka server is hosted by Amazon Managed Streaming for Apache Kafka (MSK), three parameters are added to the dialog:

  • security.protocol: Available options are plaintext, ssl, sasl_plaintext, or sasl_ssl.
  • sasl.username and sasl.password: Your SASL username and password

The parameters will look like this:

Figure 3 - Amazon MSK parameters

Fields with a down arrow icon let you select from a list of values; other fields let you type whatever you need.

You also need to create a VPC connection to your MSK service. See the page Amazon VPC endpoint for Amazon MSK for complete details on creating the VPC connection.
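Put together, a SASL-over-TLS connection to MSK fills in the three parameters along these lines (the username and password are placeholders):

```
security.protocol=sasl_ssl
sasl.username=msk-user
sasl.password=msk-secret
```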

SASL/SCRAM parameters

  • security.protocol: Available options are plaintext, ssl, sasl_plaintext, or sasl_ssl.
  • sasl.mechanism: Available options are GSSAPI, PLAIN, SCRAM-SHA-256, SCRAM-SHA-512, or OAUTHBEARER.
  • sasl.username and sasl.password: Your SASL username and password

The parameters will look like this:

Figure 4 - SASL/SCRAM parameters
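In plain text, a SCRAM-based connection typically looks like this. The username, password, and the choice of SCRAM-SHA-256 are placeholders; use the mechanism your server expects:

```
security.protocol=sasl_ssl
sasl.mechanism=SCRAM-SHA-256
sasl.username=scram-user
sasl.password=scram-secret
```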

Inline Kafka certificates parameters

  • ssl.key.pem: The path to your key.pem file.
  • ssl.certificate.pem: The path to your certificate.pem file.

See the section A word about certificates above for details on working with certificates.

The parameters will look like this:

Figure 5 - Inline Kafka certificates parameters
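As plain text, the two parameters are name/path pairs. The paths below assume certificate files created in the config.d directory as described above; the file names are examples:

```
ssl.key.pem=/etc/clickhouse-server/config.d/service.key
ssl.certificate.pem=/etc/clickhouse-server/config.d/service.cert
```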

Kerberos parameters

  • security.protocol: Available options are plaintext, ssl, sasl_plaintext, or sasl_ssl.
  • sasl.kerberos.keytab: The path to your .keytab file. See the section A word about certificates above for details on working with certificates.
  • sasl.kerberos.principal: The name of your Kerberos principal.

The parameters will look like this:

Figure 6 - Kerberos parameters
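In plain text form, a Kerberos setup might look like this. The keytab path and the principal name are placeholders; use the values issued by your Kerberos administrator:

```
security.protocol=sasl_ssl
sasl.kerberos.keytab=/etc/clickhouse-server/config.d/clickhouse.keytab
sasl.kerberos.principal=kafka/broker.example.com@EXAMPLE.COM
```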

Confluent Cloud parameters

The Confluent documentation site has the details of all the Confluent parameters. The ones provided for you automatically are:

  • security.protocol: Available options are plaintext, ssl, sasl_plaintext, or sasl_ssl.
  • sasl.mechanism: Available options are GSSAPI, PLAIN, SCRAM-SHA-256, SCRAM-SHA-512, or OAUTHBEARER.
  • sasl.username and sasl.password: Your SASL username and password
  • auto.offset.reset: Confluent supports three predefined values: smallest, latest, and none, although you may not need this parameter at all. Any other value throws an exception. See the auto.offset.reset documentation for all the details.
  • ssl.endpoint.identification.algorithm: https is the only option supported. This is typically used only with older servers; you can click the trash can icon to delete it if you don’t need it.
  • ssl.ca.location: The location of your SSL certificate file. See the discussion of certificates above for more information on working with certificates. As with ssl.endpoint.identification.algorithm, this is typically used only with older servers; you can click the trash can icon to delete it if you don’t need it.

Check the Confluent site to see which parameters you need and the values you should use. Here’s a working example:

Figure 7 - Confluent Cloud parameters
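As a plain-text sketch, a typical Confluent Cloud connection uses SASL/PLAIN with an API key and secret. The values below are placeholders; check the Confluent documentation for your cluster’s exact requirements:

```
security.protocol=sasl_ssl
sasl.mechanism=PLAIN
sasl.username=YOUR_CONFLUENT_API_KEY
sasl.password=YOUR_CONFLUENT_API_SECRET
auto.offset.reset=latest
```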

Adding other configuration options

The ADD OPTION button lets you add other configuration parameters. You can use a predefined option or create a custom option:

Figure 8 - The ADD OPTION menu

Predefined options

There are seven predefined options:

  • security.protocol: plaintext, ssl, sasl_plaintext, or sasl_ssl.
  • sasl.mechanism: GSSAPI, PLAIN, SCRAM-SHA-256, SCRAM-SHA-512, or OAUTHBEARER.
  • sasl.username and sasl.password
  • ssl.ca.location: the location of your SSL certificate
  • enable.ssl.certificate.verification: true or false
  • ssl.endpoint.identification.algorithm: https is the only option supported
  • debug: all is the only option supported

Custom options

A custom option simply gives you entry fields for a name and a value, letting you define any parameter your Kafka server needs. As an example, a connection to a Kafka topic hosted on Aiven Cloud looks like this:

Figure 9 - Custom parameters to connect to Aiven Cloud

Once your parameters are set, click the CHECK button to make sure the connection to your Kafka server is configured correctly.