Enrich events with reference data

Scenario

The security team wants to process a live stream of door access events, as part of a project to identify and respond to unusual out-of-hours building access.

To begin, they want to create a stream of weekend building access events, enriched with additional information about the buildings from their database.

Before you begin

The instructions in this tutorial use the Tutorial environment, which includes a selection of topics, each with a live stream of events, created to allow you to explore features in IBM Event Automation. Following the setup instructions to deploy the demo environment gives you a complete instance of IBM Event Automation that you can use to follow this tutorial for yourself.

You will also need to run the optional instructions for creating a PostgreSQL database. This database will provide a source of reference data that you will use to enrich the Kafka events.

Versions

This tutorial uses the following versions of Event Automation capabilities. Screenshots may differ from the current interface if you are using a newer version.

  • Event Endpoint Management 11.3.1
  • Event Processing 1.1.5

Instructions

Step 1 : Discover the topic to use

For this scenario, you need a source of door badge events.

  1. Go to the Event Endpoint Management catalog.

    screenshot

    If you need a reminder about how to access the Event Endpoint Management catalog, you can review Accessing the tutorial environment.

  2. Find the Door badge events topic.

    screenshot

  3. Click into the topic to review the information about the events that are available here.

    Look at the sample message to see the properties in the door events, and get an idea of what to expect from events on this topic.

    screenshot
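
For illustration only, a door badge event might look something like the sketch below. The property names other than door and badgetime, and all of the values, are hypothetical, so treat the sample message shown in the catalog as the authoritative description of the events.

     {
         "recordid": "example-record-id-0001",
         "door": "H-0-36",
         "employee": "jane.doe",
         "badgetime": "2023-03-04 10:17:23.000"
     }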

Tip: Keep this page open. It is helpful to have the catalog available while you work on your event processing flows, as it allows you to refer to the documentation about the events as you work. Complete the following steps in a separate browser window or tab.

Step 2 : Create a flow

  1. Go to the Event Processing home page.

    screenshot

    If you need a reminder about how to access the Event Processing home page, you can review Accessing the tutorial environment.

  2. Create a flow, and give it a name and description to explain that you will use it to create an enriched stream of weekend door badge events.

    screenshot

Step 3 : Provide a source of events

The next step is to bring the stream of events you discovered in the catalog into Event Processing.

  1. Update the Event source node.

    screenshot

    Hover over the node and click the Edit icon to configure the node.

  2. Add a new event source.

    screenshot

    Click Next.

  3. Get the server address for the event source from the Event Endpoint Management topic page.

    screenshot

    Click the Copy icon next to the Servers address to copy the address to the clipboard.

  4. Configure the new event source.

    screenshot

    Give the node a name that describes this stream of events: door events.

    Paste in the server address that you copied from Event Endpoint Management in the previous step.

    Click Next.

  5. Generate access credentials for this stream of events from the Event Endpoint Management topic page.

    screenshot

    Click Generate access credentials at the top of the page, and provide your contact details.

  6. Copy the username and password from Event Endpoint Management and paste them into Event Processing to allow access to the topic.

    screenshot

    The username starts with eem-.

    Click Next.

  7. Select JSON as the message format used in this topic.

    screenshot

    Did you know? The catalog page for this topic tells you that events on this topic are JSON strings.

    Click Next.

  8. Get the sample message for door badge events from Event Endpoint Management.

    screenshot

    Click Copy in the Sample message section to copy the sample message to the clipboard.

    You need to give Event Processing a description of the events available from the topic. The information in the sample message enables Event Processing to give guidance for creating event processing nodes.

  9. Paste the sample message into the JSON sample message box.

    screenshot

  10. Confirm that the type of the badgetime property has automatically been detected as Timestamp.

    screenshot

  11. Configure the event source to use the badgetime property as the source of the event time, and to tolerate lateness of up to 3 minutes. A sketch of what this corresponds to in Flink SQL is included after these steps.

    screenshot

  12. Click Configure to finalize the event source.
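
Did you know? Event Processing flows run on Apache Flink, so setting the event time and a lateness tolerance in this step corresponds, roughly, to defining a watermark on the badgetime column. The following is a minimal sketch of that idea in Flink SQL, not something you need to create yourself: the table name, the other columns, and the connection details are all hypothetical.

     -- Minimal sketch: a Flink SQL source table that uses badgetime as
     -- the event time and tolerates events arriving up to 3 minutes late.
     -- Table name, other columns, and connection details are hypothetical.
     CREATE TABLE door_events (
         `door`       STRING,
         `badgetime`  TIMESTAMP(3),
         WATERMARK FOR `badgetime` AS `badgetime` - INTERVAL '3' MINUTE
     ) WITH (
         'connector' = 'kafka'
         -- topic, server, credential, and format properties omitted
     );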

Step 4 : Derive additional properties

The next step is to define transformations that will derive additional properties to add to the events.

  1. Add a Transform node and link it to your event source.

    screenshot

    Create a transform node by dragging one onto the canvas. You can find this in the Processors section of the left panel.

    Click and drag from the small gray dot on the event source to the matching dot on the transform node.

    Hover over the node and click the Edit icon to configure the node.

  2. Give the transform node a name that describes what it will do: additional properties.

    screenshot

    Click Next.

  3. Compute two additional properties using the transform node.

    screenshot

    You should call the first property day of week.

    This will identify the day of the week from the timestamp contained in the door event. The result is a number, where 1 means Sunday, 2 means Monday, and so on, up to 7 for Saturday.

    Use this function expression:

     DAYOFWEEK ( CAST(badgetime AS DATE) )
    

    You should call the second property building.

    Door IDs are made up of: <building id> - <floor number> - <door number>

    For example: H-0-36

    For your second property, you should use the function expression:

     REGEXP_EXTRACT(`door`, '([A-Z]+)-.*', 1)
    

    This expression will capture the building ID from the start of the door ID: H in the example above. A sketch showing how both expressions behave is included after these steps.

  4. Click Next and then Configure to finalize the transform.
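
To check your understanding of the two expressions, the following is a minimal sketch of the equivalent Flink SQL. In Event Processing you only supply the two expressions themselves; the table name here is hypothetical, and the example badgetime value is made up for illustration.

     -- Sketch of the two derived properties (hypothetical table name).
     -- For a badgetime of '2023-03-04 10:17:23' (a Saturday) and a door
     -- ID of 'H-0-36':
     --   DAYOFWEEK(...)      returns 7   (1 = Sunday ... 7 = Saturday)
     --   REGEXP_EXTRACT(...) returns 'H' (the first capture group)
     SELECT
         `badgetime`,
         `door`,
         DAYOFWEEK ( CAST(`badgetime` AS DATE) )  AS `day of week`,
         REGEXP_EXTRACT(`door`, '([A-Z]+)-.*', 1) AS `building`
     FROM door_events;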

Step 5 : Test the flow

The next step is to run your event processing flow and view the results.

  1. Use the Run menu, and select Include historical to run your flow on the history of door badge events available on this Kafka topic.

    screenshot

    Verify that the day of the week is correctly derived from the timestamp, and that the building ID is correctly extracted from the door ID.

Step 6 : Filter to events of interest

The next step is to identify door badge events that occur at weekends. The additional day of week property that you computed in the transform node will be helpful for this.

  1. Create a Filter node and link it to the transform node.

    screenshot

    Create a filter node by dragging one onto the canvas. You can find this in the Processors section of the left panel.

    Click and drag from the small gray dot on the transform node to the matching dot on the filter node.

    Hover over the node and click the Edit icon to configure the node.

  2. Give the filter node a name that describes the events it should identify: weekend door events.

    screenshot

    Click Next.

  3. Define a filter that matches door badge events with a day of week value that indicates Saturday or Sunday.

    screenshot

    Filter expression:

     `day of week` = CAST (1 AS BIGINT)
       OR
     `day of week` = CAST (7 AS BIGINT)
    

    Did you know? Including line breaks in your expressions can make them easier to read. An alternative, more compact way to write this condition is sketched after these steps.

  4. Click Configure to finalize the filter.
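
The expression language for filter nodes is based on Flink SQL, which also supports the IN predicate. The following is a sketch of an equivalent, more compact way to express the same condition, assuming your version of the filter node accepts it; both forms select the same events.

     `day of week` IN (CAST(1 AS BIGINT), CAST(7 AS BIGINT))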

Step 7 : Test the flow

The next step is to run your event processing flow again and view the results.

  1. Use the Run menu, and select Include historical to run your filter on the history of door badge events available on this Kafka topic.

    screenshot

    Verify that all of the events have a timestamp of a Saturday or a Sunday.

Step 8 : Enrich the events

The next step is to add additional information about the building to these out-of-hours door events.

  1. Add a Database node to the flow.

    screenshot

    Create a database node by dragging one onto the canvas. You can find this in the Enrichment section of the left panel.

  2. Give the database node a name and paste the JDBC URI for your database into the Database URL field.

    screenshot

    Get the JDBC URI for your PostgreSQL database by following the instructions for Accessing PostgreSQL database tables.

  3. Click Next.

    screenshot

    The database username and password were included in the JDBC URI, so no additional credentials are required.

  4. Choose the buildings database table.

    screenshot

  5. Use the assistant to define a join that matches events with the database row about the same building.

    screenshot

    Match the building value from the Kafka events with the database row using the buildingid column. A sketch of the equivalent join is shown after these steps.

    Click Next.

  6. Choose the database columns to include in your output events.

    screenshot

    Include the street name and security contact columns.

    There is no need to include the buildingid column because this value is already contained in the events.

    Click Next.

  7. Choose the properties to output.

    screenshot

    For example, the day of week property was useful for the filter, but you might not need to include it in the output events.

    The screenshot shows an example set of properties that could be useful for this demo scenario.

  8. Click Configure to finalize the enrichment.
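
For reference, a PostgreSQL JDBC URI typically takes the form jdbc:postgresql://<host>:<port>/<database>, with the username and password appended as URL parameters when they are included in the URI, as in step 3 above. Conceptually, the enrichment performed by the database node is a join between the stream of weekend door events and the buildings table. The following Flink SQL is an illustrative sketch only, not something you create yourself: the stream name and the streetname and securitycontact column names are assumptions based on the descriptions above, so check the actual names in your database.

     -- Conceptual sketch of the enrichment as a join. The stream name and
     -- the column names other than buildingid are assumptions.
     SELECT
         e.`badgetime`,
         e.`door`,
         e.`building`,
         b.`streetname`,
         b.`securitycontact`
     FROM weekend_door_events AS e
     JOIN buildings AS b
         ON e.`building` = b.`buildingid`;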

Step 9 : Test the flow

The final step is to run your event processing flow and view the results.

  1. Use the Run menu, and select Include historical to run your flow on the history of door badge events available on this Kafka topic.

    screenshot

Recap

You used a transform node to compute additional properties from a stream of events.

You then increased the business relevance of the stream by enriching the events with reference data from a database. The relevant rows in the database were identified using one of the new computed properties.