If you are using the Confluent Platform schema registry, Event Streams provides a migration path for moving your Kafka consumers and producers over to use the Apicurio Registry in Event Streams.
Important: Support for Apicurio client libraries version 2.3.x and earlier is deprecated. Ensure all applications connecting to Event Streams that use the schema registry are using Apicurio client libraries version 2.5.0 or later, then migrate to the latest Apicurio.
Migrating schemas to Apicurio Registry in Event Streams
To migrate schemas, you can use schema auto-registration in your Kafka producer, or you can manually migrate schemas by downloading the schema definitions from the Confluent Platform schema registry and adding them to the Apicurio Registry in Event Streams.
Migrating schemas with auto-registration
When auto-registration is used, the schema is automatically uploaded to the Apicurio Registry in Event Streams, and is named with the subject ID (derived from the subject name strategy in use) followed by a random suffix.
Auto-registration is enabled by default in the Confluent Platform schema registry client library. To disable it, set the `auto.register.schemas` property to `false`.
Note: To auto-register schemas in the Apicurio Registry in Event Streams, you need credentials that have producer permissions and permission to create schemas. You can generate credentials by using the Event Streams UI or CLI.
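Auto-registration can be controlled from the producer configuration. The following is a minimal sketch: the class and method are illustrative, but `auto.register.schemas` is the Confluent client library property described above.

```java
import java.util.Properties;

public class AutoRegisterConfig {
    // Build producer properties that control schema auto-registration.
    public static Properties props(boolean autoRegister) {
        Properties props = new Properties();
        // Enabled by default in the Confluent client library; set to
        // "false" when schemas are migrated manually instead.
        props.put("auto.register.schemas", String.valueOf(autoRegister));
        return props;
    }
}
```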
Migrating schemas manually
To manually migrate the schemas, download the schema definitions from the Confluent Platform schema registry, and add them to the Apicurio Registry in Event Streams. When manually adding schemas to the Apicurio Registry in Event Streams, the provided schema name must match the subject ID used by the Confluent Platform schema registry subject name strategy.
- If you are using the default `TopicNameStrategy`, the schema name must be `<TOPIC_NAME>-<'value'|'key'>`.
- If you are using the `RecordNameStrategy`, the schema name must be `<SCHEMA_DEFINITION_NAMESPACE>.<SCHEMA_DEFINITION_NAME>`.

For example, if you are using the default `TopicNameStrategy` as your subject name strategy, and you are serializing your data into the message value and producing to the `MyTopic` topic, then the schema name you provide when adding the schema in the UI must be `MyTopic-value`.

If you are using the `RecordNameStrategy` as your subject name strategy, and the schema definition file begins with the following, then the schema name you provide when adding the schema in the UI must be `org.example.Book`:
{
"type": "record",
"name": "Book",
"namespace": "org.example",
"fields": [
...
If you are using the Event Streams CLI, run the following command when adding the schema:
kubectl es schema-add --create --name org.example.Book --version 1.0.0 --file /path/to/Book.avsc
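The download step can be sketched against the Confluent Platform schema registry REST API. The registry URL and subject name below are placeholders, and `jq` is assumed to be available:

```shell
# Hypothetical registry endpoint and subject; substitute your own values.
REGISTRY_URL="http://my-registry:8081"
SUBJECT="MyTopic-value"

# The /subjects/<subject>/versions/latest endpoint returns JSON in which
# the "schema" field holds the schema definition as a string.
curl -s "$REGISTRY_URL/subjects/$SUBJECT/versions/latest" \
  | jq -r '.schema' > "$SUBJECT.avsc"
```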
Migrating a Kafka producer application
To migrate a Kafka producer application that uses the Confluent Platform schema registry, secure the connection from your application to Event Streams, and add additional properties to enable the Confluent Platform schema registry client library to interact with the Apicurio Registry in Event Streams.
- Configure your producer application to secure the connection between the producer and Event Streams.
- Retrieve the full URL for the Event Streams API endpoint, including the host name and port number, by using the following command:
kubectl es init
- Ensure you add the following schema properties to your Kafka producers:
| Property name | Property value |
| --- | --- |
| `schema.registry.url` | `https://<host name>:<API port>/apis/ccompat/v6` |
| `basic.auth.credentials.source` | `SASL_INHERIT` |
You can also use the following code snippet for Java applications:
props.put(AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "https://<host name>:<API port>/apis/ccompat/v6");
props.put(AbstractKafkaSchemaSerDeConfig.BASIC_AUTH_CREDENTIALS_SOURCE, "SASL_INHERIT");
- Set the Java SSL truststore JVM properties to allow the Confluent Platform schema registry client library to make HTTPS calls to the Apicurio Registry in Event Streams. For example:
export KAFKA_OPTS="-Djavax.net.ssl.trustStore=/path/to/es-cert.jks \
  -Djavax.net.ssl.trustStorePassword=password"
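Putting the producer steps together, the registry-related configuration might look like the following sketch. The endpoint values are placeholders, and the serializer class names assume the Confluent client library is on the classpath:

```java
import java.util.Properties;

public class MigratedProducerConfig {
    public static Properties build() {
        Properties props = new Properties();
        // Connection to the Event Streams Kafka listeners (placeholders).
        props.put("bootstrap.servers", "<host name>:<bootstrap port>");
        props.put("security.protocol", "SASL_SSL");
        // Confluent serializers continue to work against the Apicurio
        // Registry through its Confluent-compatible API.
        props.put("key.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "https://<host name>:<API port>/apis/ccompat/v6");
        // Reuse the SASL credentials already configured for Kafka.
        props.put("basic.auth.credentials.source", "SASL_INHERIT");
        return props;
    }
}
```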
Migrating a Kafka consumer application
To migrate a Kafka consumer application that uses the Confluent Platform schema registry, secure the connection from your application to Event Streams, and add additional properties to enable the Confluent Platform schema registry client library to interact with the Apicurio Registry in Event Streams.
- Configure your consumer application to secure the connection between the consumer and Event Streams.
- Retrieve the full URL for the Event Streams API endpoint, including the host name and port number, by using the following command:
kubectl es init
- Ensure you add the following schema properties to your Kafka consumers:
| Property name | Property value |
| --- | --- |
| `schema.registry.url` | `https://<schema_registry_endpoint>/apis/ccompat/v6` |
| `basic.auth.credentials.source` | `SASL_INHERIT` |
You can also use the following code snippet for Java applications:
props.put(AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "https://<schema_registry_endpoint>/apis/ccompat/v6");
props.put(AbstractKafkaSchemaSerDeConfig.BASIC_AUTH_CREDENTIALS_SOURCE, "SASL_INHERIT");
- Set the Java SSL truststore JVM properties to allow the Confluent Platform schema registry client library to make HTTPS calls to the Apicurio Registry in Event Streams. For example:
export KAFKA_OPTS="-Djavax.net.ssl.trustStore=/path/to/es-cert.jks \
  -Djavax.net.ssl.trustStorePassword=password"
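Putting the consumer steps together, the registry-related configuration might look like the following sketch. The endpoint values are placeholders, and the deserializer class names assume the Confluent client library is on the classpath:

```java
import java.util.Properties;

public class MigratedConsumerConfig {
    public static Properties build() {
        Properties props = new Properties();
        // Connection to the Event Streams Kafka listeners (placeholders).
        props.put("bootstrap.servers", "<host name>:<bootstrap port>");
        props.put("security.protocol", "SASL_SSL");
        // Confluent deserializers resolve schemas through the Apicurio
        // Registry's Confluent-compatible API.
        props.put("key.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("value.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("schema.registry.url", "https://<schema_registry_endpoint>/apis/ccompat/v6");
        // Reuse the SASL credentials already configured for Kafka.
        props.put("basic.auth.credentials.source", "SASL_INHERIT");
        return props;
    }
}
```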