Post-installation tasks

Consider the following tasks after installing Event Streams.

Verifying an installation

To verify that your Event Streams installation deployed successfully, you can check the status of your instance through the OpenShift Container Platform web console or command line.

Check the status of the Event Streams instance through the OpenShift Container Platform web console

  1. Log in to the OpenShift Container Platform web console using your login credentials.
  2. Expand the Operators dropdown and select Installed Operators to open the Installed Operators page.
  3. Expand the Project dropdown and select the project that the instance is installed in. Select the IBM Event Streams operator that manages the project.
  4. Select the Event Streams tab, locate the installed instance in the Name column, and click its name.
  5. Check the Phase field, which displays the current state of the EventStreams custom resource. When the Event Streams instance is ready, the phase displays Ready, meaning that the deployment is complete.

Check the status of the Event Streams instance through the command line

After all the components of an Event Streams instance are active and ready, the EventStreams custom resource will have a Ready phase in the status. To verify the status:

  1. Log in to your Red Hat OpenShift Container Platform as a cluster administrator by using the oc CLI (oc login).
  2. Run the oc get command as follows: oc get eventstreams

For example, the installation of the instance called development is complete when the STATUS returned by the oc get command displays Ready:

$ oc get eventstreams
NAME             STATUS
development      Ready

Note: It might take several minutes for all the resources to be created and the EventStreams instance to become ready.
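The readiness check above can be scripted. The following sketch defines a small helper that inspects the tabular output of oc get eventstreams; the instance name development matches the example above and is an assumption for illustration.

```shell
# is_ready: reads `oc get eventstreams` output on stdin and succeeds
# only if the named instance reports a Ready status.
is_ready() {
  awk -v name="$1" '$1 == name && $2 == "Ready" { found = 1 } END { exit !found }'
}

# Example usage (assumes an instance called development, as above):
# oc get eventstreams | is_ready development && echo "development is ready"
```

Because the helper returns a success or failure exit code, it can also be placed in a wait loop while the instance resources are still being created.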

Installing the Event Streams command-line interface

The Event Streams CLI is a plug-in for the cloudctl CLI. Use the Event Streams CLI to manage your Event Streams instance from the command line. Examples of management activities include:

  • Creating, deleting, and updating Kafka topics.
  • Creating, deleting, and updating Kafka users.
  • Creating, deleting, and updating Kafka message schemas.
  • Managing geo-replication.
  • Displaying the cluster configuration and credentials.

To install the Event Streams CLI:

  1. Ensure you have the IBM Cloud Pak CLI (cloudctl) installed, either by retrieving the binary from your cluster or by downloading it from a release on the GitHub project.
    Note: Ensure you download the correct binary for your architecture and operating system.
  2. Log in to your Event Streams UI as an administrator.
  3. Click Toolbox in the primary navigation.
  4. Go to the Event Streams command-line interface section and click Find out more.
  5. Download the Event Streams CLI plug-in for your system by using the appropriate link.
  6. Install the plug-in by using the following command:
    cloudctl plugin install <path-to-plugin>
    

To start the Event Streams CLI and see all available command options, use the cloudctl es command. For a full list of commands, run:

cloudctl es --help

To get help for a specific command, run:

cloudctl es <command> --help

To run commands after installing, log in and initialize the CLI as described in logging in.
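As a sketch, a first session after installing the plug-in might look like the following; the cluster address, username, and namespace are placeholders to replace with your own values.

```shell
# Log in to the cluster (placeholders: replace with your own values).
cloudctl login -a https://<cluster-address> -u <username> -n <namespace>

# Initialize the Event Streams CLI against your instance.
cloudctl es init

# List the Kafka topics in the instance.
cloudctl es topics
```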

Firewall and load balancer settings

Consider the following guidance about firewall and load balancer settings for your deployment.

Using OpenShift Container Platform routes

Event Streams uses OpenShift routes. Ensure your OpenShift router is set up as required.

Connecting clients

For instructions about connecting a client to your Event Streams instance, see connecting clients.

Setting up access

Secure your installation by managing the access your users and applications have to your Event Streams resources.

For example, associate your IBM Cloud Pak foundational services teams with your Event Streams instance to grant access to resources based on roles.

Scaling your Kafka environment

Depending on the size of the environment that you are installing, consider scaling and sizing options. You might also need to change scale and size settings for your services over time. For example, you might need to add Kafka brokers as your workload grows.

See how to scale your Kafka environment.
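As an illustrative sketch only: the broker count is controlled through the EventStreams custom resource, so scaling can be done by patching that resource. The field path spec.strimziOverrides.kafka.replicas and the instance name development are assumptions; verify the exact field for your version in the scaling documentation.

```shell
# Increase the Kafka broker count by patching the EventStreams custom
# resource (field path and instance name are assumptions; check the
# scaling documentation for your version before applying).
oc patch eventstreams development --type merge \
  -p '{"spec":{"strimziOverrides":{"kafka":{"replicas":4}}}}'
```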

Considerations for GDPR readiness

Consider the requirements for GDPR, including encrypting your data to protect it from loss or unauthorized access.