Find out what is new in Event Processing version 1.2.x.
Release 1.2.3
Deploy flows that are customized for production or test environments
Event Processing version 1.2.3 introduces a new flow export format that you can use to deploy jobs customized for production or test environments to a Flink cluster running in application mode.
In most cases, this new deployment mechanism provides a better user experience, and it can be automated as part of a continuous integration and continuous delivery (CI/CD) pipeline.
Note: The Production - Flink Application cluster sample has been modified, and can only be used with the new deployment mechanism.
You can now edit flow details and export flows from within the canvas
In Event Processing 1.2.3 and later, the Edit details and Export features are also available in More options, next to Run flow in the navigation banner of the canvas. In addition, you can edit the flow name by clicking it in the canvas.
Support for Kubernetes 1.31
Event Processing version 1.2.3 introduces support for Kubernetes version 1.31 on platforms that support Red Hat Universal Base Image (UBI) containers.
Documentation: Highlighting differences between versions
Any difference in features or behavior introduced by Event Processing 1.2.3 compared to 1.2.2 or earlier is highlighted in this documentation by using the following graphic:
Security and bug fixes
Event Processing release 1.2.3 and IBM Operator for Apache Flink version 1.2.3 contain security and bug fixes.
Release 1.2.2
Use your Flink SQL with custom nodes
In Event Processing 1.2.2 and later, you can use Custom nodes to unlock advanced SQL capabilities and run complex queries. Three new custom nodes are available: SQL source, SQL processor, and SQL destination. These nodes support Flink SQL, and can be configured and edited to meet your specific use cases. With the introduction of custom nodes, it is now possible to create flows that support changelog streams. For more information, see custom nodes and the associated tutorial about deduplicating repeated events.
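For example, a custom SQL processor node can express the standard Flink SQL deduplication pattern that the tutorial walks through. The following is a minimal sketch only; the `orders` table and its columns are assumed names, and `event_time` is assumed to be a time attribute of the stream:

```sql
-- Minimal deduplication sketch (assumed table and column names):
-- keep only the first event seen for each order_id.
SELECT order_id, customer, amount, event_time
FROM (
  SELECT
    *,
    ROW_NUMBER() OVER (
      PARTITION BY order_id      -- events sharing the same key are duplicates
      ORDER BY event_time ASC    -- keep the earliest occurrence
    ) AS row_num
  FROM orders
)
WHERE row_num = 1;
```

Keeping the earliest row for each key produces an append-only result; ordering by descending time instead keeps the latest event and produces an updating (changelog) result.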
Event Processing add-on for IBM Cloud Pak for Integration
Event Processing 1.2.2 and later versions are available as an add-on for IBM Cloud Pak for Integration. For more information, see licensing.
Support for Red Hat OpenShift Container Platform 4.17
Event Processing version 1.2.2 introduces support for Red Hat OpenShift Container Platform 4.17.
Documentation: Highlighting differences between versions
Any difference in features or behavior introduced by Event Processing 1.2.2 compared to 1.2.1 or earlier is highlighted in this documentation by using the following graphic:
Security and bug fixes
Event Processing release 1.2.2 and IBM Operator for Apache Flink version 1.2.2 contain security and bug fixes.
Release 1.2.1
Security and bug fixes
Event Processing release 1.2.1 and IBM Operator for Apache Flink version 1.2.1 contain security and bug fixes.
Release 1.2.0
Support for key and headers in the event source node
In Event Processing release 1.2.0 and later, if your Kafka topic messages include key and headers information, Event Processing automatically attempts to determine the key and headers, and you can define them as properties in the event source node.
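For illustration only, the following is a minimal Flink SQL sketch of the underlying Kafka connector mechanism that exposes a message key and record headers on a source table. It is not the SQL that Event Processing generates, and the table name, topic, broker address, and columns are assumptions:

```sql
-- Minimal sketch (assumed names): read the Kafka message key as a column
-- and expose the record headers as a metadata column.
CREATE TABLE orders (
  order_id STRING,                              -- populated from the message key
  amount   DOUBLE,                              -- populated from the message value
  headers  MAP<STRING, BYTES> METADATA VIRTUAL  -- Kafka record headers
) WITH (
  'connector' = 'kafka',
  'topic' = 'orders',
  'properties.bootstrap.servers' = 'my-kafka:9092',
  'scan.startup.mode' = 'earliest-offset',
  'key.format' = 'json',
  'key.fields' = 'order_id',
  'value.format' = 'json',
  'value.fields-include' = 'EXCEPT_KEY'         -- key fields are not repeated in the value
);
```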
Apache Flink updated to 1.19.1
The IBM Operator for Apache Flink version 1.2.0 update includes Apache Flink version 1.19.1.
Updates to supported Kubernetes versions
To install Event Processing 1.2.0 and later 1.2.x versions, ensure that you are running Kubernetes version 1.25 or later. For more information about supported versions, see the support matrix.
Support for IBM z13 (s390x) is removed
Support for IBM z13 (s390x) is removed in Event Processing version 1.2.0 and later. Ensure that you deploy Event Processing 1.2.0 or later on IBM z14 or later systems.
For more information about supported versions, see the support matrix.
Security and bug fixes
Event Processing release 1.2.0 and IBM Operator for Apache Flink version 1.2.0 contain security and bug fixes.