Consider the following tasks after installing IBM Event Streams.
Verifying an installation
To verify that your Event Streams installation deployed successfully, you can check the status of your instance through the OpenShift Container Platform web console or command line.
Check the status of the EventStreams instance through the OpenShift Container Platform web console
- Log in to the OpenShift Container Platform web console using your login credentials.
- Expand the Operators dropdown and select Installed Operators to open the Installed Operators page.
- Expand the Project dropdown and select the project the instance is installed in. Click the operator called IBM Event Streams managing the project.
- Select the Event Streams tab and search the Name column for the installed instance and click it.
- The Phase field displays the current state of the EventStreams custom resource. When the Event Streams instance is ready, the phase displays Ready, meaning the deployment has completed.
Check the status of the Event Streams instance through the command line
After all the components of an Event Streams instance are active and ready, the EventStreams custom resource reports a Ready phase in its status.
To verify the status:
- Log in to your Red Hat OpenShift Container Platform as a cluster administrator by using the oc command-line tool (oc login).
- Run the oc get command as follows:
oc get eventstreams
For example, the installation of the instance called development is complete when the STATUS returned by the oc get command displays Ready:
oc get eventstreams
An example output:
$ oc get eventstreams
NAME          STATUS
development   Ready
Note: It might take several minutes for all the resources to be created and the EventStreams instance to become ready.
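The same check can be scripted. The helper below is a minimal sketch that reads oc get eventstreams tabular output on stdin and succeeds only when the named instance's STATUS column reads Ready (the instance name development is taken from the example above):

```shell
# is_ready: read "oc get eventstreams" output on stdin and succeed
# only if the named instance's STATUS column is "Ready".
is_ready() {
  awk -v name="$1" '$1 == name && $2 == "Ready" { found = 1 } END { exit !found }'
}

# Example usage (requires a logged-in oc session):
#   oc get eventstreams | is_ready development && echo "deployment complete"
```

This keeps the cluster query (oc get) separate from the check, so the helper can be reused in wait loops or CI scripts.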
Installing the Event Streams command-line interface
The Event Streams CLI is a plugin for the cloudctl CLI. Use the Event Streams CLI to manage your Event Streams instance from the command line.
Examples of management activities include:
- Creating, deleting, and updating Kafka topics.
- Creating, deleting, and updating Kafka users.
- Creating, deleting, and updating Kafka message schemas.
- Managing geo-replication.
- Displaying the cluster configuration and credentials.
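As an illustration of the first activity, topic management looks roughly like the session below. This is a sketch: the subcommand names and topic-create flags are assumptions based on typical usage, so confirm the exact commands with cloudctl es --help on your version.

```shell
# Sketch of topic management with the Event Streams CLI.
# Subcommand names and flags are assumptions; verify with "cloudctl es --help".
manage_topics() {
  cloudctl es topics                                              # list topics
  cloudctl es topic-create my-topic --partitions 3 --replication-factor 3
  cloudctl es topic-delete my-topic
}

# Run only when the CLI is available and initialized against an instance:
if command -v cloudctl >/dev/null 2>&1; then
  manage_topics
else
  echo "cloudctl not found; skipping"
fi
```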
To install the Event Streams CLI:
- Ensure you have the IBM Cloud Pak CLI (cloudctl) installed, either by retrieving the binary from your cluster or by downloading it from a release on the GitHub project.
Note: Ensure you download the correct binary for your architecture and operating system.
- Log in to your Event Streams instance as an administrator.
- Click Toolbox in the primary navigation.
- Go to the IBM Event Streams command-line interface section and click Find out more.
- Download the Event Streams CLI plug-in for your system by using the appropriate link.
- Install the plugin using the following command:
cloudctl plugin install <path-to-plugin>
To start the Event Streams CLI and check all available command options, use the cloudctl es command.
For an exhaustive list of commands, you can run:
cloudctl es --help
To get help for a specific command, run:
cloudctl es <command> --help
To use the Event Streams CLI against an OpenShift Container Platform cluster, do the following:
Log in to your cluster as an administrator by using the IBM Cloud Pak CLI:
cloudctl login -a https://<cluster_address>:<cluster_router_https_port>
To configure the CLI to connect to a specific Event Streams instance running in a namespace:
cloudctl es init -n <namespace>
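The two steps above can be wrapped into one convenience helper. This is a sketch only; the cluster address, router port, and namespace are placeholders you supply:

```shell
# es_connect: log in to the cluster, then target one Event Streams
# instance in the given namespace. All arguments are placeholders.
es_connect() {
  local cluster="$1" port="$2" namespace="$3"
  cloudctl login -a "https://${cluster}:${port}" || return 1
  cloudctl es init -n "$namespace"
}

# Example: es_connect cluster.example.com 443 event-streams
```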
Firewall and load balancer settings
Consider the following guidance about firewall and load balancer settings for your deployment.
Using OpenShift Container Platform routes
Event Streams uses OpenShift routes. Ensure your OpenShift router is set up as required.
For instructions about connecting a client to your Event Streams instance, see connecting clients.
Setting up access
Secure your installation by managing the access your users and applications have to your Event Streams resources.
For example, associate your IBM Cloud Pak foundational services teams with your Event Streams instance to grant access to resources based on roles.
Depending on the size of the environment that you are installing, consider scaling and sizing options. You might also need to change scale and size settings for your services over time, for example by adding Kafka brokers as demand grows.
See how to scale your environment.
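As a sketch of one such change, the Kafka broker count can be raised by patching the EventStreams custom resource. The field path spec.strimziOverrides.kafka.replicas shown here is an assumption based on the Strimzi-derived schema, so verify it against your version's EventStreams reference before running:

```shell
# scale_kafka: patch the EventStreams custom resource to set the Kafka
# broker replica count. The field path is an assumption; verify it
# against your version's EventStreams CRD before use.
scale_kafka() {
  local name="$1" namespace="$2" replicas="$3"
  oc patch eventstreams "$name" -n "$namespace" --type merge \
    -p "{\"spec\":{\"strimziOverrides\":{\"kafka\":{\"replicas\":${replicas}}}}}"
}

# Example: scale_kafka development event-streams 5
```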
Considerations for GDPR readiness
Consider the requirements for GDPR, including encrypting your data to protect it from loss or unauthorized access.