IBM Cloud Patterns

Log Streaming

What is LogDNA Streaming?

Streaming enables LogDNA to produce log content to a message bus topic. LogDNA streaming lets you connect third-party consumers of those topics to ingest event log data into dashboards for visualization.

Third-party horizontal technologies such as Splunk, used in organizations for application management, security, and compliance, can leverage IBM Cloud Log Analysis through LogDNA Streaming.

In this pattern, we’ll walk through setting up a sample event stream using IBM Cloud Event Streams and LogDNA Streaming to produce events to a topic that a simple consumer application ingests.

IBM Cloud Event Streams is a high-throughput message bus built on Apache Kafka; see What is Event Streams and Choosing Your Plan

Create IBM Event Streams Instance

You’ll need an existing LogDNA instance, or you can create a new one; see Provisioning an Instance

For the demo, an IBM Cloud Event Streams instance is required. Log in to IBM Cloud > Services and create your Event Streams instance; see Provision Event Streams Instance

Once your IBM Event Streams instance is set up, create a Service Credential and then expand it to show the username, password (apikey), and kafka_brokers_sasl values, similar to the screen shown below.

Service Credential

Make a copy of the Service Credential username, password, and kafka_brokers_sasl URL.
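The values you copied map directly onto standard Kafka client settings. The sketch below (a minimal example, assuming the credential field names shown in the console: `user`, `password`, `kafka_brokers_sasl`; the broker hostnames and apikey are placeholders) builds a SASL_SSL client configuration from a Service Credential JSON:

```python
import json

# Example Service Credential as copied from the IBM Cloud console.
# All values here are placeholders -- substitute your own credential.
credential_json = """
{
  "user": "token",
  "password": "EXAMPLE_APIKEY",
  "kafka_brokers_sasl": [
    "broker-0.example.eventstreams.cloud.ibm.com:9093",
    "broker-1.example.eventstreams.cloud.ibm.com:9093"
  ]
}
"""

def kafka_client_config(credential: dict) -> dict:
    """Map an Event Streams Service Credential onto common Kafka
    client settings (SASL_SSL with the PLAIN mechanism)."""
    return {
        "bootstrap.servers": ",".join(credential["kafka_brokers_sasl"]),
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.username": credential["user"],
        "sasl.password": credential["password"],
    }

config = kafka_client_config(json.loads(credential_json))
print(config["bootstrap.servers"])
```

The same username/password/broker values are what you’ll paste into the LogDNA Streaming fields later in this pattern.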

Set Up Event Streams Demo

You’ll need to set up a sample Topic, Producer, and Consumer; see the Getting Started Tutorial.

For this demo, choose the kafka-java-console-sample from the GitHub repo, event-streams-samples.

Make sure to create the sample topic "kafka-java-console-sample-topic" in your IBM Cloud Event Streams instance, similar to the screen shown below.
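Besides the console UI, topics can also be created through the Event Streams Admin REST API. The sketch below is a hedged illustration, not executed against a real instance: the admin URL, apikey, and exact request schema are assumptions to check against your Service Credential and the Admin REST API reference.

```python
import json
import urllib.request

# Placeholder admin endpoint and apikey -- take the real values from
# your Event Streams Service Credential.
admin_url = "https://example.eventstreams.cloud.ibm.com"
api_key = "EXAMPLE_APIKEY"

# Assumed request schema for creating the demo topic.
topic_request = {
    "name": "kafka-java-console-sample-topic",
    "partitions": 1,
}
body = json.dumps(topic_request).encode("utf-8")

req = urllib.request.Request(
    admin_url + "/admin/topics",
    data=body,
    headers={
        "Authorization": "Bearer " + api_key,
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to actually send the request
print(req.full_url)
```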

Topic

Once your Producer is up and running with the kafka-java-console-sample-topic, you should see a screen similar to the one shown below.

Producer

After you’ve completed the Getting Started Tutorial, you should now have a Consumer running as well, similar to the screen shown below.

Consumer

Configure LogDNA Streaming

Note: You’ll need an instance of LogDNA, see Provisioning an Instance

Let’s get started by following the steps below:

  1. Log in to IBM Cloud, then go to Observability > Logging, and choose your instance of LogDNA.
  2. Select View LogDNA.
  3. Click the Settings gear icon and choose Streaming.

If Streaming has been enabled (whitelisted) for your LogDNA instance, you should see a screen similar to the one shown below.

LogDNA Streaming

To configure LogDNA Streaming, follow the instructions below:

  1. Obtain the IBM Cloud Event Streams Service Credential username, password, and kafka_brokers_sasl URL.
  2. Enter the Event Streams Service Credential information into the LogDNA Streaming fields.
  3. Click Save.
  4. Next, you’ll need to stop the Producer from producing events.
  5. Stop the Producer by pressing Ctrl+C in its terminal.
  6. Now bring up the Consumer terminal window and notice the repeated INFO No messages consumed text, similar to the screen shown below.
Consumer

Start LogDNA Streaming to Consumer

Let’s go ahead and start LogDNA Streaming. You may see a notice from LogDNA Streaming asking if you’ve received the sample messages, similar to the screen shown below; click Yes.

Samples

Add an email recipient and then click Start Streaming, similar to the screen shown below.

Streaming Configuration

LogDNA should now be successfully configured. If you have a Kubernetes cluster, VSI, or any other service configured with LogDNA enabled, you should see activity in the Consumer terminal, similar to the screen shown below.

LogDNA Streaming Started

LogDNA Streaming is active, similar to the screen shown below.

Consumer Received LogDNA Streams
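Each message the Consumer receives is a JSON document describing a log line. The sketch below pulls a few common fields out of one such message; the field names (`_host`, `_app`, `_line`, `_ts`, `level`) are an assumption modeled on LogDNA’s log-line format, so inspect a real message from your Consumer to confirm the exact schema.

```python
import json

# Hypothetical example of a single streamed LogDNA message -- the
# field names and values below are illustrative placeholders.
raw_message = """
{
  "_host": "my-k8s-worker",
  "_app": "my-service",
  "_line": "GET /healthz 200",
  "_ts": 1613750000000,
  "level": "INFO"
}
"""

event = json.loads(raw_message)

# Build a one-line summary suitable for a dashboard or console.
summary = f'{event.get("_host")} [{event.get("level")}] {event.get("_line")}'
print(summary)
```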