Event Processing is a scalable, low-code event stream processing platform that helps you transform and act on data in real time.
Event Processing transforms event streaming data in real time, helping you turn events into insights. You can define flows that connect to event sources, which bring event data (messages) from Apache Kafka into your flow, and then apply the processing actions you want to take on those events.
The event flow is represented as a graph of event sources, processors (actions), and event destinations. You can use the results of the processing to get and share insights on the business data, or to build automations.
The flows are run as Apache Flink jobs. Apache Flink is a framework and distributed processing engine for stateful computations over event streams. In addition to serving as the processing engine for Event Processing, Flink can also be used as a standalone engine for running custom Flink SQL workloads.
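To illustrate the kind of custom Flink SQL workload such a standalone engine can run, the following is a minimal sketch that reads events from a Kafka topic and filters them. All names, the topic, and the connector options are hypothetical placeholders, not values from this product:

```sql
-- Hypothetical example: table name, topic, broker address, and format
-- are illustrative only.
CREATE TABLE orders (
  order_id STRING,
  amount   DOUBLE,
  ts       TIMESTAMP(3),
  WATERMARK FOR ts AS ts - INTERVAL '5' SECOND
) WITH (
  'connector' = 'kafka',
  'topic'     = 'orders',
  'properties.bootstrap.servers' = 'my-kafka:9092',
  'format'    = 'json'
);

-- A simple processing action: keep only high-value orders.
SELECT order_id, amount
FROM orders
WHERE amount > 100;
```

Event Processing builds equivalent logic for you through its low-code canvas, so writing SQL by hand like this is only needed for standalone Flink workloads.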
Features
Event Processing features include:
- A user interface (UI) designed to provide a low-code experience.
- A free-form layout canvas to create flows, with drag-and-drop functionality to add and join nodes.
- The ability to test your event flow while constructing it.
- The option to import and export flows in JSON format to reuse across different deployment instances.
- The option to download the output of the flow processing in a CSV file.