Building a simple flow

In this getting started tutorial, we’ll take you through how to use Event Processing to create a simple flow. The flow uses a filter node to select a subset of events.

For demonstration purposes, we use the scenario of a clothing company who want to move away from reviewing quarterly sales reports to reviewing orders in their region as they occur. Identifying large orders in real time helps the clothing company spot changes that are needed in sales forecasts much earlier. This information can also be fed back into their manufacturing cycle so that they can better respond to demand.

The steps in this getting started tutorial should take you about 10 minutes to complete.

Before you begin

  1. This getting started scenario assumes that all the capabilities in Event Automation are installed.
  2. Contact your cluster administrator to get the server address of the Kafka cluster for the topic that you want to access.
  3. Keep your Event Endpoint Management instance open on the topic page because you need to use information from it when you create your flow.
  4. Open Event Processing in a separate window or tab.

This getting started scenario assumes that order details are available through a Kafka topic, and that the topic is discoverable in Event Endpoint Management. For example, in this scenario, the clothing company have a topic in their Event Endpoint Management catalog called ORDERS.NEW. This topic emits events for every new order that is made.
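
For illustration, an event from a topic like ORDERS.NEW might look like the following sample message. The field names, including region, are hypothetical and are used only to illustrate the scenario; the actual structure is defined by the schema that is published for the topic.

    {
      "id": "ORD-10294",
      "customer": "Aisha Khan",
      "description": "M Navy Water-Resistant Jacket",
      "quantity": 12,
      "price": 49.99,
      "region": "EMEA",
      "ordertime": "2024-03-18T09:12:33.000Z"
    }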

Step 1: Create a flow

  1. On the Event Processing home page, click Create.
  2. Provide a name and optionally a description for your flow.
  3. Click Create. The canvas is displayed with an event source node on it.

    Note: When you create a flow, an event source node is automatically added to your canvas. A purple checkbox icon is displayed on the event source node, indicating that the node is yet to be configured.

  4. To configure an event source, hover over the event source node, and click Edit. The Configure event source window is displayed.

The clothing company created a flow called Filter and provided a description to explain that this flow will be used to identify orders made in a specific region.

Create a flow

Save

User actions are saved automatically. For save status updates, see the canvas header.

  • Saving: Indicates that saving is in progress.
  • Saved: Confirms that your changes are saved successfully.
  • Failed: Indicates that there are errors. If an action fails to save automatically, you receive a notification to try the save again. Click Retry to re-attempt the save. When a valid flow is saved, you can proceed to run the flow.
  • Stale: Indicates that another user modified the flow. A pop-up window is displayed, and depending on the cause of the Stale status, you are prompted to select one of the following actions:
    • Save as a copy: Select this action to save the current flow as a new one without incorporating the changes made by the other user. The new flow is called ‘Copy of <flow-name>’.
    • Accept changes: Select this action to apply the latest updates that are made by the other user to the flow. If the flow is running, you can view the running flow.
    • Home: Select this action to navigate back to the home page. This option is offered when the flow is no longer available because another user deleted it.

Step 2: Configure an event source

  1. You need to provide the source of events that you want to process. To do this, add an event source: select Add new event source, and then click Next.
  2. In the Details section, provide a name for the node.
  3. In the Connect to Kafka cluster section, provide the server address of the Kafka cluster that you want to connect to. You can get the server address for the event source from your cluster administrator. An example of the address format is shown after these steps.

    Note: To add more addresses, click Add URL + and enter the server address.

  4. Click Next. The Access credentials pane is displayed.
  5. Provide the credentials that are required to access your Kafka cluster and topic. You can generate access credentials for accessing a stream of events from the Event Endpoint Management page. For more information, see subscribing to topics.
  6. Click Next. The Topic selection pane is displayed.
  7. Use the radio buttons to confirm the name of the topic that you want to process events from.
  8. Click Next. The Define event structure pane is displayed.
  9. Provide a schema or sample message for the events available from the topic. To do this, click Upload a schema or sample message +, and then complete either the Topic schema tab or the Sample message tab.

    Enter an Avro schema in the Topic schema tab, or click the Sample message tab and enter a sample message in JSON format. For more information, see event information.

  10. Set an event time, and leave the option selected to save the event source for later reuse. Saving the connection details makes creating similar event sources quicker because you do not need to enter the same details again.
  11. Click Configure. The canvas is displayed and your event source node has a green checkmark, which indicates that the node has been configured successfully.
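
As an example of the server address format, a Kafka bootstrap server address is a host and port pair. The following address is hypothetical; use the address that your cluster administrator provides:

    kafka-bootstrap.mycompany.example.com:443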

The clothing company called their event source Orders and used the schema for their Order events topic in Event Endpoint Management to complete the Topic schema tab in Event Processing.
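
A schema for events like the sample message shown earlier might look like the following Avro record. This is a sketch for illustration only; the actual schema to use is the one published for the topic in Event Endpoint Management.

    {
      "type": "record",
      "name": "Order",
      "namespace": "com.example.orders",
      "fields": [
        { "name": "id", "type": "string" },
        { "name": "customer", "type": "string" },
        { "name": "description", "type": "string" },
        { "name": "quantity", "type": "int" },
        { "name": "price", "type": "double" },
        { "name": "region", "type": "string" },
        { "name": "ordertime", "type": "string" }
      ]
    }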

Define event structure

Step 3: Add a filter

  1. On the Palette, in the Processors section, drag a Filter node onto the canvas.
  2. Drag the output port on the event source node to the input port on the filter node to join the two nodes together.
  3. Hover over the filter node and click Edit.

The Configure filter window is displayed.

Step 4: Configure the filter

  1. Now, you need to configure the filter that defines the events that you are interested in. To do this, in the Details section, provide a name for the filter node.
  2. Click Next. The Define filter section is displayed.
  3. Use the Assistant to define a filter with your requirements by updating the Property to filter on, Condition, and Value fields.
  4. Click Add to expression.
  5. Click Configure. The canvas is displayed and your Filter node has a green checkmark, which indicates that the node has been configured successfully.

The clothing company called their filter EMEA orders and defined a filter that matches events with a region value of EMEA.

Defining a filter
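
For a property filter like this one, the assistant builds a simple comparison expression from the Property to filter on, Condition, and Value fields. Assuming the events have a region property, as in the hypothetical schema shown earlier, the resulting expression might look like the following sketch:

    `region` = 'EMEA'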

Step 5: Run the flow

  1. The last step is to run your Event Processing flow and view the results.
  2. In the navigation banner, expand Run and select either Events from now or Include historical to run your flow.

A live view of results from your running flow automatically opens. The results view shows the output from your flow: the result of processing any events that have been produced to your chosen Event Endpoint Management topic.

Tip: Include historical is useful while you are developing your flows because you don’t need to wait for new events to be produced to the topic. You can use all the events already on the Kafka topic to check that your flow is working the way that you want.

In the navigation banner, click Stop to stop the flow when you finish reviewing the results.

The clothing company selected Include historical to run the filter on the history of order events available on their Order events topic. All the orders from the EMEA region are displayed. This provides the company with real-time information about orders placed in the region, and helps them review orders as they occur.

Viewing results

Flow statuses

A flow status indicates the current state of the flow. A flow can be in one of the following states:

  • Draft: Indicates that the flow includes one or more nodes that need to be configured. The flow cannot be run.
  • Valid: Indicates that all nodes in the flow are configured and valid. The flow is ready to run.
  • Invalid: Indicates that the nodes in the flow are configured but have a validation error, or a required node is missing. The flow cannot be run.
  • Running: Indicates that the flow is configured, validated, running, and generating output.
  • Error: Indicates that an error occurred during the runtime of a previously running flow.