Attention: This version of Event Streams has reached End of Support. For more information about supported versions, see the support matrix.

Connecting to IBM MQ

You can set up connections between IBM MQ and Apache Kafka or Event Streams systems.

Available connectors

Connectors are available for copying data in both directions.

  • Kafka Connect source connector for IBM MQ:
    You can use the MQ source connector to copy data from IBM MQ into Event Streams or Apache Kafka. The connector copies messages from a source MQ queue to a target Kafka topic.
  • Kafka Connect sink connector for IBM MQ:
    You can use the MQ sink connector to copy data from Event Streams or Apache Kafka into IBM MQ. The connector copies messages from a Kafka topic into an MQ queue.
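To illustrate the two directions, the following is a sketch of a Kafka Connect configuration for the MQ source connector. The queue manager name, connection host, channel, queue, and topic are placeholder values to replace with your own; the property names follow IBM's kafka-connect-mq-source connector.

```json
{
  "name": "mq-source",
  "config": {
    "connector.class": "com.ibm.eventstreams.connect.mqsource.MQSourceConnector",
    "tasks.max": "1",
    "mq.queue.manager": "QM1",
    "mq.connection.name.list": "mq-host(1414)",
    "mq.channel.name": "MY.SVRCONN",
    "mq.queue": "MY.SOURCE.QUEUE",
    "mq.message.body.jms": "true",
    "topic": "mq-messages",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter"
  }
}
```

A sink connector configuration looks much the same, except that the class is `com.ibm.eventstreams.connect.mqsink.MQSinkConnector`, the Kafka side is given as `topics` (the topics to consume from), and `mq.queue` names the target queue to write to.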

Kafka Connect: MQ source and sink connectors

Important: If you want to use IBM MQ connectors on IBM z/OS, you must prepare your setup first.

When to use

Many organizations use both IBM MQ and Apache Kafka for their messaging needs. Although they are generally used to solve different kinds of messaging problems, users often want to connect them for various reasons. For example, IBM MQ can be integrated with systems of record, while Apache Kafka is commonly used for streaming events from web applications. Connecting the two systems enables scenarios in which these environments intersect.

Note: You can use an existing IBM MQ or Kafka installation, either on premises or in the cloud. For convenience, it is recommended to run the Kafka Connect worker on the same OpenShift Container Platform cluster as Event Streams. However, if the network latency between MQ and Event Streams is significant, you might prefer to run the Kafka Connect worker close to the queue manager to minimize the effect of that latency. For example, if you have a queue manager in your datacenter and Kafka in the cloud, it’s best to run the Kafka Connect worker in your datacenter.
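Wherever you run the worker, it needs a standard Kafka Connect distributed-mode configuration pointing at your Kafka cluster. The following is a minimal sketch; the bootstrap address, group ID, and internal topic names are placeholders to adapt to your environment.

```properties
# Kafka brokers the Connect worker talks to (placeholder address)
bootstrap.servers=my-kafka-bootstrap:9092

# Workers with the same group.id form one Connect cluster
group.id=mq-connect-cluster

# Converters applied to record keys and values
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter

# Internal topics where Connect stores offsets, configs, and status
offset.storage.topic=connect-offsets
config.storage.topic=connect-configs
status.storage.topic=connect-status
```

If your Event Streams cluster requires TLS or SASL credentials, the worker also needs the corresponding `security.protocol` and related client properties.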