The Event Streams UI provides help with creating an Apache Kafka Java client application and discovering connection details for a specific topic.
Creating an Apache Kafka Java client application
You can create Apache Kafka Java client applications to use with Event Streams.
Download the JAR file from Event Streams, and include it in your Java build and classpaths before compiling and running Kafka Java clients.
- Log in to your Event Streams UI.
- Click the Toolbox tab.
- Go to the Apache Kafka Java client section and click Find out more.
- Click the Apache Kafka Client JAR link to download the JAR file. The file contains the Java class files and related resources needed to compile and run client applications you intend to use with Event Streams.
- Download the JAR files for SLF4J required by the Kafka Java client for logging.
- Include the downloaded JAR files in your Java build and classpaths before compiling and running your Apache Kafka Java client.
- Ensure you set up security.
Creating an Apache Kafka Java client application using Maven or Gradle
If you are using Maven or Gradle to manage your project, you can use the following snippets to include the Kafka client JAR and dependent JARs on your classpath.
- For Maven, use the following snippet in the <dependencies> section of your pom.xml file:

<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka-clients</artifactId>
  <version>2.1.0</version>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-api</artifactId>
  <version>1.7.25</version>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <version>1.7.25</version>
</dependency>
- For Gradle, use the following snippet in the dependencies {} section of your build.gradle file:

implementation group: 'org.apache.kafka', name: 'kafka-clients', version: '2.1.0'
implementation group: 'org.slf4j', name: 'slf4j-api', version: '1.7.25'
implementation group: 'org.slf4j', name: 'slf4j-simple', version: '1.7.25'
- Ensure you set up security.
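As a minimal sketch of a client built against these dependencies, the following producer sends a single record. The class name ExampleProducer, the topic name my-topic, and the bootstrap address are placeholders, not values from Event Streams; in practice you also add the security properties described in the next section:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ExampleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Replace with the bootstrap address of your cluster, plus the security
        // properties described in "Securing the connection" below
        props.put("bootstrap.servers", "<broker_url>");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Send one record to a topic named my-topic (placeholder)
            producer.send(new ProducerRecord<>("my-topic", "key", "hello"));
        }
    }
}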
Securing the connection
You must secure the connection from your client applications to Event Streams. To secure the connection, you must obtain the following:
- A copy of the server-side public certificate added to your client-side trusted certificates.
- An API key generated from the IBM Cloud Private UI.
Before connecting an external client, ensure the necessary certificates are configured within your client environment. Use the TLS and CA certificates if you provided them during installation, or use the following instructions to retrieve a copy.
In Event Streams 2018.3.1, copy the server-side public certificate and generate an API key as follows:
- Log in to your Event Streams UI.
- Click Connect to this cluster on the right.
- On the Connect a client tab, copy the address from the Bootstrap server section. This gives the bootstrap address for Kafka clients.
- From the Certificates section, download the server certificate. If you are using a Java client, use the Java truststore. Otherwise, use the PEM certificate.
- To generate API keys, go to the API key section and follow the instructions.
In Event Streams 2018.3.0, copy the server-side public certificate and generate an API key as follows:
To copy the certificate and related details:
- Log in to your Event Streams UI.
- Click the Topics tab.
- Select any topic in the list of topics.
- Click the Connection information tab.
- Copy the Broker URL. This is the Kafka bootstrap server.
- In the Certificates section, download the Java truststore or PEM certificate and provide it to your client application.
To generate an API key:
- Log in to your IBM Cloud Private cluster management console from a supported web browser by using the URL https://<Cluster Master Host>:<Cluster Master API Port>. The master host and port for your cluster are set during the installation of IBM Cloud Private. For more information, see the IBM Cloud Private documentation.
- From the navigation menu, click Manage > Identity & Access > Service IDs.
- Click Create a Service ID.
- Provide a name, a description, and select your namespace. Then click Create.
- Click the service ID that you created.
- Click Create Service Policy.
- Select a role, select eventstreams as your service type, select the Event Streams release instance you want to apply the policy to, and provide a Resource Type (for example, topic) and a Resource Identifier (for example, the name of the topic). If you do not specify a resource type or identifier, the policy applies its role to all resources in the Event Streams instance.
- Click Add.
- Click the API keys tab.
- Click Create API key.
- Provide a name and a description. Then click Create.
- Click Download to download a file containing the API key.
Important: To have access to the Connection information tab in the UI, you must have at least one topic. For example, if you are just starting out, use the starter application to generate topics.
Configuring your client
Add the certificate details and the API key to your Kafka client application to set up a secure connection from your application to your Event Streams instance. For example, for Java:
import java.util.Properties;
import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.common.config.SaslConfigs;
import org.apache.kafka.common.config.SslConfigs;

// Connection and security settings for the Kafka client
Properties properties = new Properties();
properties.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, "<broker_url>");
properties.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
properties.put(SslConfigs.SSL_PROTOCOL_CONFIG, "TLSv1.2");
properties.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "<certs.jks_file_location>");
properties.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "<truststore_password>");
properties.put(SaslConfigs.SASL_MECHANISM, "PLAIN");
properties.put(SaslConfigs.SASL_JAAS_CONFIG, "org.apache.kafka.common.security.plain.PlainLoginModule required "
    + "username=\"token\" password=\"<api_key>\";");
Replace <broker_url> with your cluster's broker URL, <certs.jks_file_location> with the path to your truststore file, <truststore_password> with "password", and <api_key> with the API key copied from its downloaded file.
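To illustrate where these properties fit, the following sketch passes them to a KafkaConsumer and reads from a topic. The consumer group name, deserializer settings, and the topic name my-topic are assumptions for the example, not values provided by Event Streams:

import java.time.Duration;
import java.util.Collections;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

// Deserializers and a consumer group are needed in addition to the security properties above
properties.put("group.id", "example-group");
properties.put("key.deserializer", StringDeserializer.class.getName());
properties.put("value.deserializer", StringDeserializer.class.getName());

try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(properties)) {
    consumer.subscribe(Collections.singletonList("my-topic")); // topic name is a placeholder
    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
    for (ConsumerRecord<String, String> record : records) {
        System.out.println(record.value());
    }
}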
Note: In Event Streams 2018.3.1, you can copy the connection code snippet from the Event Streams UI with the broker URL already filled in for you. After logging in, click Connect to this cluster on the right, and click the Sample code tab. Copy the snippet from the Sample connection code section into your Kafka client application.