Converting timestamp formats

Scenario

The logistics team want to track the extremes of the temperature and humidity readings from the various sensors in the warehouse IoT network. They want the hourly min/max values for the sensors in each building.

However, the IoT network generates sensor readings with timestamps in a different format from the default SQL timestamp format.

{
    "sensortime": "Thu Jun 22 21:23:44 GMT 2023",
    "sensorid": "A-2-21",
    "temperature": 21.6,
    "humidity": 48
}

They have decided that they cannot use the timestamp from the Kafka message metadata (the time that the message is produced to the topic). This is because they know that transferring sensor readings from the IoT network to Kafka can be slow, causing events to be delayed by up to five minutes. It is important that they use the time that the sensor reading was taken, as recorded in the event payload.

They need to pre-process the events to convert the timestamp into a format that will be easier to use.
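For example, the sensortime value shown above would be rewritten as a standard SQL timestamp (yyyy-MM-dd HH:mm:ss):

```
Thu Jun 22 21:23:44 GMT 2023   ->   2023-06-22 21:23:44
```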

Before you begin

The instructions in this tutorial use the Tutorial environment, which includes a selection of topics, each with a live stream of events, created to allow you to explore features in IBM Event Automation. Following the setup instructions to deploy the demo environment gives you a complete instance of IBM Event Automation that you can use to follow this tutorial for yourself.

Versions

This tutorial uses the following versions of Event Automation capabilities. Screenshots may differ from the current interface if you are using a newer version.

  • Event Streams 11.3.0
  • Event Endpoint Management 11.1.1
  • Event Processing 1.1.1

Instructions

Step 1 : Discover the topic to use

For this scenario, you need access to the sensor reading events.

  1. Go to the Event Endpoint Management catalog.

    screenshot

    If you need a reminder about how to access the Event Endpoint Management catalog you can review Accessing the tutorial environment.

    If there are no topics in the catalog, you may need to complete the tutorial setup step to populate the catalog.

  2. The SENSOR.READINGS topic contains events from the IoT network.

    screenshot

    The documentation in the catalog helpfully explains that the timestamps in the events are recorded using an EEE MMM dd HH:mm:ss zzz yyyy format.
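    Reading that pattern against the sample timestamp value makes each token clear:

    ```
    EEE  MMM  dd  HH:mm:ss  zzz  yyyy
    Thu  Jun  22  21:23:44  GMT  2023
    ```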

Step 2 : Create the pre-processing flow

The first event processing flow will read the events from the sensor readings topic, and produce them to a new Kafka topic with the timestamps rewritten in an SQL format.

  1. Go to the Event Processing home page.

    screenshot

    If you need a reminder about how to access the Event Processing home page, you can review Accessing the tutorial environment.

  2. Create a flow, and give it a name and description to explain that it will pre-process sensor readings to rewrite the timestamp.

    screenshot

  3. Create an Event source node.

    screenshot

    Create an event source node by dragging one onto the canvas. You can find this in the Events section of the left panel.

  4. Add an event source.

    screenshot

    Hover over the event source node and click Edit icon Edit to configure the node.

  5. Use the server address from the Event Endpoint Management catalog.

    screenshot

    Click the Copy button next to the Servers address in the Event Endpoint Management catalog to copy the address to the clipboard.

  6. Use the Generate access credentials button in the Event Endpoint Management catalog to create a username and password, and use that to configure the Event Processing event source.

    screenshot

  7. Select the SENSOR.READINGS topic, then click “Next”.

  8. Click the “Upload a schema or sample message” button.

  9. Paste in the sample message from the catalog.

    screenshot

  10. Click Configure to finalize the event source.

  11. Add a Transform node and link it to your event source.

    screenshot

    Create a transform node by dragging one onto the canvas. You can find this in the Processors section of the left panel.

    Click and drag from the small gray dot on the event source to the matching dot on the transform node.

  12. Hover over the transform node and click Edit icon Edit to configure the node.

    Name the transform node rewrite timestamp.

  13. Create a new property called timestamp.

  14. Use the assistant to create a TO_TIMESTAMP expression.

    screenshot

    The catalog gives you:

    • The name of the property containing the existing timestamp value (sensortime).
    • The format that it is in (EEE MMM dd HH:mm:ss zzz yyyy).

    Use these values to create the expression.
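    The resulting expression should look something like the following sketch (the assistant generates the exact form for you):

    ```sql
    TO_TIMESTAMP(`sensortime`, 'EEE MMM dd HH:mm:ss zzz yyyy')
    ```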

  15. Click Insert into expression, and then click “Next”.

  16. Remove the event_time and sensortime properties as you won’t need them.

    screenshot

  17. Click Configure to finalize the transform.

Step 3 : Produce pre-processed events to a topic

The next step is to send the modified events to a new topic that can be used as a source for multiple other event processing flows.

  1. Go to the Event Streams home page.

    screenshot

    If you need a reminder about how to access the Event Streams web UI, you can review Accessing the tutorial environment.

  2. Click the Create a topic tile.

  3. Create a topic called SENSOR.READINGS.SQLTIME.

    screenshot

    You can use the default values for all the properties of the topic.

  4. Click into the new topic page, and then click Connect to this topic.

  5. Get the server address for the new topic from the Event Streams topic page.

    screenshot

    Click the copy button for the Internal Kafka listener to copy the address to the clipboard.

  6. Create an Event destination node.

    screenshot

    Create an event destination node by dragging one onto the canvas. You can find this in the Events section of the left panel.

  7. Configure the event destination node using the Event Streams server address and credentials.

    screenshot

    If you need a reminder of the password for the kafka-demo-apps user, you can review the Accessing Kafka topics section of the Tutorial Setup instructions.

  8. Choose the new SENSOR.READINGS.SQLTIME topic.

    screenshot

  9. Click Configure to finalize the event destination.

  10. Use the Run menu, and select Include historical to run your flow on the history of sensor reading events available on this Kafka topic.

    screenshot

  11. Confirm that the events have re-written timestamps from Event Processing.

    screenshot

  12. Confirm that the transformed events are produced to the new topic from Event Streams.

    screenshot

  13. You can leave the flow running to continue producing transformed events to the new topic.

Step 4 : Share the transformed events

The SENSOR.READINGS.SQLTIME topic now has events with a timestamp that can be used to perform time-based analysis.

Any other teams or colleagues who want to perform time-based analysis on the sensor reading events will need the timestamp reformatted in the same way, so it makes sense to share this topic in the catalog.

  1. Go to the Event Endpoint Management topics page.

    screenshot

  2. Click Add topic.

    screenshot

  3. Reuse the existing connection details to the Event Streams cluster.

    screenshot

  4. Choose the new SENSOR.READINGS.SQLTIME topic.

    screenshot

  5. Click Add topic.

  6. Click the new SENSOR.READINGS.SQLTIME topic.

  7. Use Edit information to provide documentation for the topic.

    screenshot

  8. Include a sample message from the Event Streams topic page.

    screenshot

  9. Click Save.

  10. Click the Create option button in the Options tab.

    screenshot

  11. Create a name for the access option you are creating.

    screenshot

    Calling the option “Self service” is a good way to describe a topic being published without any approval requirements.

  12. Create a topic alias for the option.

    As this is the only option, SENSOR.READINGS.SQLTIME is a reasonable alias.

  13. As we are not adding any additional controls, click Next.

    screenshot

  14. Click Publish.

    screenshot

  15. Choose the gateway group.

    screenshot

  16. Click Save.

Step 5 : Process the transformed sensor readings

The final step is to process the transformed sensor readings to identify the minimum and maximum sensor readings for each building in each hour.

  1. Go to the Event Processing home page.

  2. Create a flow, and give it a name and description to explain that it will process sensor readings.

    screenshot

  3. Create an Event source node using the SENSOR.READINGS.SQLTIME topic from the Event Endpoint Management catalog.

  4. Use the sample message from the catalog to configure the event source.

    screenshot

  5. Modify the type of the timestamp property that you created to be Timestamp (instead of the default “STRING”).

    screenshot

  6. Choose the timestamp property to be used as the source of event time, and allow events to be up to 5 minutes late.

    screenshot

    The catalog page for the original SENSOR.READINGS topic warns that events from this IoT sensor network can be delayed for up to five minutes after the sensor reading is captured. Configuring Event Processing to wait for up to five minutes for sensor readings ensures that events won’t be missed.
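    Event Processing flows run on Apache Flink, and this setting corresponds to a watermark on the event source table. A rough sketch of the equivalent Flink SQL, with illustrative property names and the connector details omitted:

    ```sql
    CREATE TABLE SensorReadings (
        sensorid      STRING,
        temperature   DOUBLE,
        humidity      INT,
        `timestamp`   TIMESTAMP(3),
        -- use `timestamp` as event time, tolerating events up to five minutes late
        WATERMARK FOR `timestamp` AS `timestamp` - INTERVAL '5' MINUTE
    ) WITH (
        'connector' = 'kafka'
        -- remaining connector properties omitted
    );
    ```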

  7. Click Configure to finalize the event source.

  8. Create a Transform node to extract the building name from the sensor ID.

    The sensor ID is made up of:

    <building id>-<floor number>-<sensor number>
    

    For example:

    DE-2-11
    

    screenshot

    Suggested function expression:

    REGEXP_EXTRACT(`sensorid`, '([A-Z]+)\-([0-9]+)\-([0-9]+)', 1)
    

    The regular expression captures the first group of letters, the building ID, that appears before the first hyphen character.

  9. Add an Aggregate node to the flow.

    screenshot

  10. Configure the aggregate to work in 1-hour windows.

    screenshot

  11. Create aggregate functions to get the MIN and MAX values for temperature and humidity.

    screenshot

  12. Group by building.

    screenshot

  13. Rename the output properties to be readable.

    screenshot
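    Taken together, the aggregation that steps 9 to 13 configure is equivalent to a tumbling-window Flink SQL query along these lines (the table and property names are illustrative; `building` is the property extracted by the transform node):

    ```sql
    SELECT
        building,
        TUMBLE_START(`timestamp`, INTERVAL '1' HOUR) AS window_start,
        MIN(temperature) AS min_temperature,
        MAX(temperature) AS max_temperature,
        MIN(humidity)    AS min_humidity,
        MAX(humidity)    AS max_humidity
    FROM SensorReadings
    GROUP BY
        building,
        TUMBLE(`timestamp`, INTERVAL '1' HOUR);
    ```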

Step 6 : Run the flow

The final step is to run your completed flow, tracking the min and max sensor readings for each building from your transformed stream of events.

  1. Use the Run menu, and select Include historical to run your flow on the history of sensor events available on this topic.

    screenshot

Recap

A pre-processing flow can allow you to work with events that include timestamps in a wide variety of formats.