If you have producer or consumer applications created in languages other than Java, use the following guidance to set them up to use schemas. You can also use the REST producer API to send messages that are encoded with a schema.
For a producer application:
- Retrieve the schema definition that you will be using from Apicurio Registry in Event Streams and save it in a local file.
- Use an Apache Avro library for your programming language to read the schema definition from the local file and encode a Kafka message with it.
- Set the schema registry headers in the Kafka message so that consumer applications can determine which schema and version were used to encode the message, and which encoding format was used.
- Send the message to Kafka.
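The producer steps above can be sketched in Python. This is a minimal sketch under stated assumptions: the Book schema, the record values, and the schema and version IDs are all hypothetical placeholders, and the commented-out send call assumes the kafka-python client, which accepts per-message headers as a list of (key, bytes) tuples.

```python
import json

# Schema definition retrieved from Apicurio Registry and saved locally
# (hypothetical schema: a record with two string fields).
schema = json.loads("""
{
  "type": "record",
  "name": "Book",
  "fields": [
    {"name": "title", "type": "string"},
    {"name": "author", "type": "string"}
  ]
}
""")

# For a record of primitive, non-union fields, the Avro JSON encoding of a
# datum is simply its JSON representation, so json.dumps is sufficient here.
datum = {"title": "My Book", "author": "A. Writer"}
field_names = [f["name"] for f in schema["fields"]]
assert set(datum) == set(field_names)  # datum must match the schema fields
payload = json.dumps(datum).encode("utf-8")

# Schema registry headers: values are strings, sent as UTF-8 bytes.
# "my-schema" and "1" are placeholder schema and version IDs.
headers = [
    ("com.ibm.eventstreams.schemaregistry.schema.id", b"my-schema"),
    ("com.ibm.eventstreams.schemaregistry.schema.version", b"1"),
    ("com.ibm.eventstreams.schemaregistry.encoding", b"JSON"),
]

# With the kafka-python client, the message could then be sent as:
# producer.send("my-topic", value=payload, headers=headers)
```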
For a consumer application:
- Retrieve the schema definition that you will be using from Apicurio Registry in Event Streams and save it in a local file.
- Consume a message from Kafka.
- Check the headers of the Kafka message to ensure that they match the expected schema ID and schema version ID.
- Use an Apache Avro library for your programming language to read the schema definition from the local file and decode the Kafka message with it.
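The consumer steps above can be sketched in Python. The headers and message value shown are hypothetical placeholders standing in for what a Kafka client would deliver; the sketch checks the headers against the expected schema before decoding, and only handles Avro JSON encoding.

```python
import json

# A consumed message, as a Kafka client might deliver it
# (placeholder headers and value for illustration).
headers = [
    ("com.ibm.eventstreams.schemaregistry.schema.id", b"my-schema"),
    ("com.ibm.eventstreams.schemaregistry.schema.version", b"1"),
    ("com.ibm.eventstreams.schemaregistry.encoding", b"JSON"),
]
value = b'{"title": "My Book", "author": "A. Writer"}'

EXPECTED_SCHEMA_ID = "my-schema"
EXPECTED_VERSION_ID = "1"

# Header values are strings transmitted as UTF-8 bytes.
hdrs = {key: val.decode("utf-8") for key, val in headers}

# Check the message headers against the schema the application expects.
if (hdrs.get("com.ibm.eventstreams.schemaregistry.schema.id") != EXPECTED_SCHEMA_ID
        or hdrs.get("com.ibm.eventstreams.schemaregistry.schema.version") != EXPECTED_VERSION_ID):
    raise ValueError("message was encoded with an unexpected schema or version")

# Decode according to the encoding header; this sketch handles only
# Avro JSON encoding (binary decoding needs an Avro library).
if hdrs.get("com.ibm.eventstreams.schemaregistry.encoding") == "JSON":
    record = json.loads(value.decode("utf-8"))
else:
    raise NotImplementedError("BINARY decoding requires an Avro library")
```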
Retrieving the schema definition from the schema registry
Using the UI
- Log in to your Event Streams UI as an administrator from a supported web browser (see how to determine the login URL for your Event Streams UI).
- Click Schema registry in the primary navigation and find your schema in the list.
- Copy the schema definition into a new local file:
  - For the latest version of the schema, expand the row, and then copy and paste the schema definition into a new local file.
  - For a different version of the schema, click the row, select the version to use from the list of schema versions, and click the Schema definition tab. Then copy and paste the schema definition into a new local file.
Using the CLI
- Log in to your cluster as an administrator by using the IBM Cloud Pak CLI:
cloudctl login -a https://<cluster_address>:<cluster_router_https_port>
- Run the following command to initialize the Event Streams CLI on the cluster:
cloudctl es init
- Run the following command to list all the schemas in the schema registry:
cloudctl es schemas
- Select your schema from the list and run the following command to list all the versions of the schema:
cloudctl es schema <schema-name>
- Select your version of the schema from the list and run the following command to retrieve the schema definition for the version and copy it into a new local file:
cloudctl es schema <schema-name> --version <schema-version-id> > <schema-definition-file>.avsc
Note: <schema-version-id> is the integer ID that is displayed when you list schema versions by using the cloudctl es schema <schema-name> command.
Setting headers in the messages you send to Event Streams Kafka
Set the following headers in the message to enable applications that use the Event Streams serdes Java library to consume and deserialize the messages automatically. Setting these headers also enables the Event Streams UI to display additional details about the message.
The required message header keys and values are listed in the following table.
Header name | Header key | Header value |
---|---|---|
Schema ID | com.ibm.eventstreams.schemaregistry.schema.id | The schema ID as a string. |
Schema version ID | com.ibm.eventstreams.schemaregistry.schema.version | The schema version ID as a string. |
Message encoding | com.ibm.eventstreams.schemaregistry.encoding | Either JSON for Avro JSON encoding, or BINARY for Avro binary encoding. |
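To illustrate the two values of the encoding header, the following stdlib-only Python sketch encodes the same hypothetical two-string-field record both ways, following the Avro specification: in the binary encoding, a string is a ZigZag/varint-encoded length followed by its UTF-8 bytes, and a record is the concatenation of its field encodings in schema order; the JSON encoding of a record of primitive, non-union fields is its plain JSON form.

```python
import json

def zigzag_varint(n: int) -> bytes:
    # Avro long: ZigZag-encode the value, then emit it as a
    # variable-length little-endian base-128 integer.
    z = (n << 1) ^ (n >> 63)
    out = bytearray()
    while True:
        b = z & 0x7F
        z >>= 7
        if z:
            out.append(b | 0x80)
        else:
            out.append(b)
            return bytes(out)

def encode_string(s: str) -> bytes:
    # Avro binary string: length (as an Avro long) followed by UTF-8 bytes.
    raw = s.encode("utf-8")
    return zigzag_varint(len(raw)) + raw

# Hypothetical record with two string fields (title, author), matching a
# schema that declares the fields in that order.
datum = {"title": "My Book", "author": "A. Writer"}

# Avro binary encoding of a record: concatenate the field encodings
# in schema order. Send with the encoding header set to "BINARY".
binary_payload = encode_string(datum["title"]) + encode_string(datum["author"])

# Avro JSON encoding of a record of primitive fields: its plain JSON form.
# Send with the encoding header set to "JSON".
json_payload = json.dumps(datum).encode("utf-8")
```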