You can use the MQ source connector to copy data from IBM MQ into Event Streams or Apache Kafka. The connector copies messages from a source MQ queue to a target Kafka topic.
Kafka Connect can be run in standalone or distributed mode. This document contains steps for running the connector in distributed mode in OpenShift Container Platform. In this mode, work balancing is automatic, scaling is dynamic, and tasks and data are fault-tolerant. For more details on the difference between standalone and distributed mode, see the explanation of Kafka Connect workers.
Prerequisites
To follow these instructions, ensure you have IBM MQ v8 or later installed.
Note: These instructions are for IBM MQ v9 running on Linux. If you are using a different version or platform, you might have to adjust some steps slightly.
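To confirm the installed version, you can run the dspmqver command on the system where IBM MQ is installed (this assumes the MQ commands are on your path):
dspmqver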
Setting up the queue manager
You can set up a queue manager by using the local operating system to authenticate, or by using the IBM MQ Operator.
By using the local operating system to authenticate
These sample instructions set up an IBM MQ queue manager that uses its local operating system to authenticate the user ID and password. The user ID and password you provide must already be created on the operating system where IBM MQ is running.
- Log in as a user authorized to administer IBM MQ, and ensure the MQ commands are on the path.
- Create a queue manager with a TCP/IP listener on port 1414:
crtmqm -p 1414 <queue_manager_name>
For example, to create a queue manager called QM1, use:
crtmqm -p 1414 QM1
- Start the queue manager:
strmqm <queue_manager_name>
- Start the runmqsc tool to configure the queue manager:
runmqsc <queue_manager_name>
- In runmqsc, create a server-connection channel:
DEFINE CHANNEL(<channel_name>) CHLTYPE(SVRCONN)
- Set the channel authentication rules so that connections are accepted only when they supply a valid user ID and password:
SET CHLAUTH(<channel_name>) TYPE(BLOCKUSER) USERLIST('nobody')
SET CHLAUTH('*') TYPE(ADDRESSMAP) ADDRESS('*') USERSRC(NOACCESS)
SET CHLAUTH(<channel_name>) TYPE(ADDRESSMAP) ADDRESS('*') USERSRC(CHANNEL) CHCKCLNT(REQUIRED)
- Set the identity of the client connections based on the supplied context (the user ID):
ALTER AUTHINFO(SYSTEM.DEFAULT.AUTHINFO.IDPWOS) AUTHTYPE(IDPWOS) ADOPTCTX(YES)
- Refresh the connection authentication information:
REFRESH SECURITY TYPE(CONNAUTH)
- Create a queue for the Kafka Connect connector to use:
DEFINE QLOCAL(<queue_name>)
- Authorize the IBM MQ user ID to connect to and inquire the queue manager:
SET AUTHREC OBJTYPE(QMGR) PRINCIPAL('<user_id>') AUTHADD(CONNECT,INQ)
- Authorize the IBM MQ user ID to use the queue:
SET AUTHREC PROFILE(<queue_name>) OBJTYPE(QUEUE) PRINCIPAL('<user_id>') AUTHADD(ALLMQI)
- Stop the runmqsc tool by typing END.
For example, for a queue manager called QM1, with user ID alice, creating a server-connection channel called MYSVRCONN and a queue called MYQSOURCE, you run the following commands in runmqsc:
DEFINE CHANNEL(MYSVRCONN) CHLTYPE(SVRCONN)
SET CHLAUTH(MYSVRCONN) TYPE(BLOCKUSER) USERLIST('nobody')
SET CHLAUTH('*') TYPE(ADDRESSMAP) ADDRESS('*') USERSRC(NOACCESS)
SET CHLAUTH(MYSVRCONN) TYPE(ADDRESSMAP) ADDRESS('*') USERSRC(CHANNEL) CHCKCLNT(REQUIRED)
ALTER AUTHINFO(SYSTEM.DEFAULT.AUTHINFO.IDPWOS) AUTHTYPE(IDPWOS) ADOPTCTX(YES)
REFRESH SECURITY TYPE(CONNAUTH)
DEFINE QLOCAL(MYQSOURCE)
SET AUTHREC OBJTYPE(QMGR) PRINCIPAL('alice') AUTHADD(CONNECT,INQ)
SET AUTHREC PROFILE(MYQSOURCE) OBJTYPE(QUEUE) PRINCIPAL('alice') AUTHADD(ALLMQI)
END
The queue manager is now ready to accept connections from the connector and supply messages from the queue.
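As a quick check that the authority records are in place, you can run the dspmqaut command outside runmqsc; the names below assume the QM1, alice, and MYQSOURCE example:
# Display alice's authority on the queue manager and on the source queue
dspmqaut -m QM1 -t qmgr -p alice
dspmqaut -m QM1 -n MYQSOURCE -t queue -p alice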
By using the IBM MQ Operator
You can also use the IBM MQ Operator to set up a queue manager. For more information about installing the IBM MQ Operator and setting up a queue manager, see the IBM MQ documentation.
If you are using the IBM MQ Operator to set up a queue manager, you can use the following YAML file to create a queue manager with the required configuration:
- Create a file called custom-source-mqsc-configmap.yaml and copy the following YAML content into the file to create the ConfigMap that has the details for creating a server-connection channel called MYSVRCONN and a queue called MYQSOURCE:
apiVersion: v1
kind: ConfigMap
metadata:
  name: custom-source-mqsc
data:
  source.mqsc: |
    DEFINE CHANNEL(MYSVRCONN) CHLTYPE(SVRCONN)
    SET CHLAUTH(MYSVRCONN) TYPE(BLOCKUSER) USERLIST('nobody')
    SET CHLAUTH('*') TYPE(ADDRESSMAP) ADDRESS('*') USERSRC(NOACCESS)
    SET CHLAUTH(MYSVRCONN) TYPE(ADDRESSMAP) ADDRESS('*') USERSRC(CHANNEL) CHCKCLNT(REQUIRED)
    ALTER AUTHINFO(SYSTEM.DEFAULT.AUTHINFO.IDPWOS) AUTHTYPE(IDPWOS) ADOPTCTX(YES)
    REFRESH SECURITY TYPE(CONNAUTH)
    DEFINE QLOCAL(MYQSOURCE)
    SET AUTHREC OBJTYPE(QMGR) PRINCIPAL('alice') AUTHADD(CONNECT,INQ)
    SET AUTHREC PROFILE(MYQSOURCE) OBJTYPE(QUEUE) PRINCIPAL('alice') AUTHADD(ALLMQI)
- Create the ConfigMap by using the following command:
oc apply -f custom-source-mqsc-configmap.yaml
- To create a queue manager with the required configuration, update the spec.queueManager section of the QueueManager custom resource YAML file:
  queueManager:
    # ...
    mqsc:
    - configMap:
        name: custom-source-mqsc
        items:
        - source.mqsc
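For reference, a complete QueueManager custom resource that picks up this ConfigMap might look like the following. This is a minimal sketch only: it assumes the mq.ibm.com/v1beta1 API provided by the IBM MQ Operator, and the name, license, version, and storage values are placeholders to replace with values appropriate to your environment (see the IBM MQ documentation):
apiVersion: mq.ibm.com/v1beta1
kind: QueueManager
metadata:
  name: quickstart-source-qm      # hypothetical resource name
spec:
  license:
    accept: true
    license: <license_id>         # the license ID for your MQ version, from the IBM MQ documentation
    use: NonProduction
  version: <mq_version>           # the MQ version to deploy
  queueManager:
    name: QM1
    mqsc:
    - configMap:
        name: custom-source-mqsc
        items:
        - source.mqsc
    storage:
      queueManager:
        type: ephemeral           # use persistent storage for anything beyond a quick test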
The queue manager is now ready to accept connections from the connector and supply messages from the queue.
Configuring the connector to connect to MQ
To connect to IBM MQ and to your Event Streams or Apache Kafka cluster, the connector requires configuration settings added to a KafkaConnector custom resource that represents the connector.
For IBM MQ connectors, you can generate the KafkaConnector custom resource YAML file from either the Event Streams UI or the CLI. You can also use the CLI to generate a JSON file, which you can use in distributed mode where you supply the connector configuration through REST API calls.
The connector connects to IBM MQ using a client connection. You must provide the following connection information for your queue manager (these configuration settings are added to the spec.config section of the KafkaConnector custom resource YAML):
- The name of the target Kafka topic.
- The name of the IBM MQ queue manager.
- The connection name (one or more host and port pairs).
- The channel name.
- The name of the source IBM MQ queue.
- The user name and password if the queue manager is configured to require them for client connections.
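These items correspond to connector properties in the spec.config section; using the property names from the full example later in this section, the skeleton looks like this (replace the placeholders with your own values):
config:
  topic: <kafka_topic>
  mq.queue.manager: <queue_manager_name>
  mq.connection.name.list: <host>(<port>)
  mq.channel.name: <channel_name>
  mq.queue: <queue_name>
  mq.user.name: <user_id>      # only if required by the queue manager
  mq.password: <password>      # only if required by the queue manager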
Using the UI
Use the Event Streams UI to generate and download the KafkaConnector custom resource YAML file for your IBM MQ source connector.
- Log in to the Event Streams UI from a supported web browser (see how to determine the login URL for your Event Streams UI).
- Click Toolbox in the primary navigation and scroll to the Connectors section.
- Go to the Add connectors to your Kafka Connect environment tile and click Connecting to IBM MQ?
- Ensure the MQ Source tab is selected and click Generate.
- In the dialog, enter the configuration of the MQ Source connector.
- Click Download to generate and download the configuration file with the supplied fields.
- Open the downloaded configuration file and change the values of mq.user.name and mq.password to the username and password that you used to configure your instance of MQ. Also set the label eventstreams.ibm.com/cluster to the name of your Kafka Connect instance.
Using the CLI
Use the Event Streams CLI to generate and download the KafkaConnector custom resource YAML file for your IBM MQ source connector. You can also use the CLI to generate a JSON file for distributed mode.
- Initialize the Event Streams CLI by following the instructions in logging in.
- Run the connector-config-mq-source command to generate the configuration file for the MQ Source connector.
For example, to generate a configuration file for an instance of MQ with the following information: a queue manager called QM1, with a connection point of localhost(1414), a channel name of MYSVRCONN, a queue of MYQSOURCE, and connecting to the topic TSOURCE, run the following command:
cloudctl es connector-config-mq-source --mq-queue-manager="QM1" --mq-connection-name-list="localhost(1414)" --mq-channel="MYSVRCONN" --mq-queue="MYQSOURCE" --topic="TSOURCE" --file="mq-source" --format yaml
Note: Omitting the --format yaml flag generates an mq-source.properties file, which can be used for standalone mode. Specifying --format json generates an mq-source.json file, which can be used for distributed mode outside the OpenShift Container Platform; a sketch of supplying that JSON file through the Kafka Connect REST API is included at the end of this section.
- Change the values of mq.user.name and mq.password to the username and password that you used to configure your instance of MQ. Also set the label eventstreams.ibm.com/cluster to the name of your Kafka Connect instance.
The final configuration file will resemble the following:
apiVersion: eventstreams.ibm.com/v1beta2
kind: KafkaConnector
metadata:
  name: mq-source
  labels:
    # The eventstreams.ibm.com/cluster label identifies the KafkaConnect instance
    # in which to create this connector. That KafkaConnect instance
    # must have the eventstreams.ibm.com/use-connector-resources annotation
    # set to true.
    eventstreams.ibm.com/cluster: <kafka_connect_name>
spec:
  class: com.ibm.eventstreams.connect.mqsource.MQSourceConnector
  tasksMax: 1
  config:
    topic: TSOURCE
    mq.queue.manager: QM1
    mq.connection.name.list: localhost(1414)
    mq.channel.name: MYSVRCONN
    mq.queue: MYQSOURCE
    mq.user.name: alice
    mq.password: passw0rd
    key.converter: org.apache.kafka.connect.storage.StringConverter
    value.converter: org.apache.kafka.connect.storage.StringConverter
    mq.record.builder: com.ibm.eventstreams.connect.mqsource.builders.DefaultRecordBuilder
A list of all the possible flags can be found by running the command cloudctl es connector-config-mq-source --help. Alternatively, see the sample properties file for a full list of properties you can configure, and see the GitHub README for all available configuration options.
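If you generated a JSON file for distributed mode outside the OpenShift Container Platform, you supply the connector configuration to your Kafka Connect runtime through its REST API instead of applying a KafkaConnector custom resource. The following is a minimal sketch; it assumes the Kafka Connect REST endpoint is reachable at localhost:8083, that the generated mq-source.json contains the connector name and config in the format the REST API expects, and that the connector name is mq-source:
# Create the connector from the generated JSON file
curl -X POST -H "Content-Type: application/json" --data @mq-source.json http://localhost:8083/connectors
# Check the connector and its tasks
curl http://localhost:8083/connectors/mq-source/status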
Downloading the MQ Source connector
- Log in to the Event Streams UI from a supported web browser (see how to determine the login URL for your Event Streams UI).
- Click Toolbox in the primary navigation and scroll to the Connectors section.
- Go to the Add connectors to your Kafka Connect environment tile and click Connecting to IBM MQ?
- Ensure the MQ Source tab is selected and click Go to GitHub. Download the JAR file from the list of assets for the latest release.
Configuring Kafka Connect
Follow the steps in Set up a Kafka Connect environment. When adding connectors, add the MQ connector JAR you downloaded, and when starting the connector, use the YAML file you created earlier.
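For example, assuming you saved the generated KafkaConnector custom resource as mq-source.yaml and your Kafka Connect environment is already running, you can start the connector and confirm that the resource was created as follows (the file and resource names come from the earlier examples):
# Create the KafkaConnector custom resource to start the connector
oc apply -f mq-source.yaml
# List KafkaConnector resources to confirm that mq-source was created
oc get kafkaconnectors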
Verify the log output of Kafka Connect includes the following messages that indicate the connector task has started and successfully connected to IBM MQ:
$ oc logs <kafka_connect_pod_name>
...
INFO Created connector mq-source
...
INFO Connection to MQ established
...
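You can also check readiness from the KafkaConnector custom resource status; for example, assuming the connector is named mq-source as in the earlier example:
oc get kafkaconnector mq-source -o jsonpath='{.status.conditions[?(@.type=="Ready")].status}'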
Send a test message
- To add messages to the IBM MQ queue, run the amqsput sample and type in some messages (a scripted variant is sketched after this list):
/opt/mqm/samp/bin/amqsput <queue_name> <queue_manager_name>
- Log in to the Event Streams UI from a supported web browser (see how to determine the login URL for your Event Streams UI).
- Click Topics in the primary navigation and select the connected topic. Messages will appear in the message browser of that topic.
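If you prefer to script the first step rather than typing messages interactively, amqsput reads one message per line from standard input, so you can pipe messages to it; a small sketch using the example queue and queue manager names:
# Each line becomes one MQ message; amqsput ends on a blank line or end of input
printf 'hello from MQ\nanother test message\n' | /opt/mqm/samp/bin/amqsput MYQSOURCE QM1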
Advanced configuration
For more details about the connector and to see all configuration options, see the GitHub README.
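For illustration, two settings that are often adjusted together are the record builder and JMS message body handling, for example to produce Kafka records from JSON message payloads. The property names below are taken from the connector's documented configuration, but treat this as a sketch and check the GitHub README for the authoritative list and defaults:
config:
  # Treat the MQ message body as a JMS message and build records from JSON payloads
  mq.message.body.jms: true
  mq.record.builder: com.ibm.eventstreams.connect.mqsource.builders.JsonRecordBuilder
  value.converter: org.apache.kafka.connect.json.JsonConverter
  value.converter.schemas.enable: false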