Log Streaming
What is LogDNA Streaming?
Streaming enables LogDNA to produce content to a message bus topic. LogDNA Streaming lets you connect third-party consumers of those topics to ingest event log data into dashboards for visualization.
Third-party technologies such as Splunk, used in organizations for application management, security, and compliance, can leverage IBM Cloud Log Analysis with LogDNA Streaming.
In this pattern, we'll walk through setting up a sample event stream using IBM Cloud Event Streams and LogDNA Streaming, producing events to a topic for a simple consumer application to ingest.
IBM Cloud Event Streams is a high-throughput message bus built on Apache Kafka; see What is Event Streams and Choosing Your Plan.
- Create IBM Event Streams Instance
- Setup Event Streams Demo
- Configure LogDNA Streaming
- Start LogDNA Streaming to Consumer
Create IBM Event Streams Instance
You'll need an existing LogDNA instance or to create a new one; see Provisioning an Instance.
For the demo, an IBM Cloud Event Streams instance is required. Log in to IBM Cloud > Services and create your Event Streams instance; see Provision Event Streams Instance.
Once your IBM Event Streams instance is set up, create a Service Credential and then expand it to show the username, password (apikey), and kafka_brokers_sasl, similar to the screen shown below.
Make a copy of the Service Credential username, password, and kafka_brokers_sasl URLs.
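As a quick sanity check, you can parse the Service Credential JSON and pull out the three values the rest of this walkthrough needs. The JSON below is a minimal illustrative sketch; a real credential contains additional fields and your own broker hostnames, and the placeholder values here are assumptions for demonstration only.

```python
import json

# Illustrative Service Credential JSON -- your real credential will contain
# more fields and your own broker hostnames (the values here are placeholders).
credential_json = """
{
  "user": "token",
  "password": "EXAMPLE-APIKEY",
  "kafka_brokers_sasl": [
    "broker-0-example.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
    "broker-1-example.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093"
  ]
}
"""

cred = json.loads(credential_json)
username = cred["user"]                         # typically the literal string "token"
apikey = cred["password"]                       # the API key used as the SASL password
brokers = ",".join(cred["kafka_brokers_sasl"])  # comma-separated broker list

print(username)
print(brokers)
```

Keep these three values handy; they are reused when configuring both the sample application and LogDNA Streaming.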
Setup Event Streams Demo
You'll need to set up a sample Topic, Producer, and Consumer; see the Getting Started Tutorial.
For this demo, choose the kafka-java-console-sample from the GitHub repo event-streams-samples.
Make sure to create the sample topic "kafka-java-console-sample-topic" in your IBM Cloud Event Streams instance, similar to the screen shown below.
Once your Producer is up and running with the kafka-java-console-sample-topic, you should see a screen similar to the one shown below.
After you've completed the Getting Started Tutorial, you should now have a Consumer running as well, similar to the screen shown below.
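Under the hood, both the sample Producer and Consumer connect to Event Streams over SASL_SSL. The sketch below shows how the Service Credential values map onto standard Kafka client configuration; the broker hostname and API key are placeholders, and the rendered properties are an illustration rather than the sample's exact config file.

```python
# A minimal sketch of the SASL_SSL client configuration that IBM Cloud
# Event Streams requires. The broker host and API key are placeholders;
# substitute the values copied from your Service Credential.
apikey = "EXAMPLE-APIKEY"
brokers = "broker-0-example.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093"

config = {
    "bootstrap.servers": brokers,
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    # Event Streams authenticates with the literal username "token"
    # and the API key as the password.
    "sasl.jaas.config": (
        'org.apache.kafka.common.security.plain.PlainLoginModule required '
        f'username="token" password="{apikey}";'
    ),
}

# Render it as a Java-style properties snippet, the form the
# kafka-java-console-sample's Kafka clients consume.
properties = "\n".join(f"{key}={value}" for key, value in config.items())
print(properties)
```

The same username/password/broker triple is what you will later paste into the LogDNA Streaming fields.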
Configure LogDNA Streaming
Note: You'll need an instance of LogDNA; see Provisioning an Instance.
Let's get started by following the steps below:
- Log in to IBM Cloud, then go to Observability > Logging and choose your instance of LogDNA.
- Select View LogDNA
- Click the Settings gear icon and choose Streaming.
If LogDNA Streaming has been whitelisted for your account, you should see a screen similar to the one shown below.
To configure LogDNA Streaming, follow the instructions below:
- Obtain the IBM Cloud Event Streams Service Credential username, password, and kafka_brokers_sasl URLs.
- Configure the Event Streams Service Credential information in the LogDNA Streaming fields.
- Click Save.
- Next, you'll need to stop the Producer from producing events.
- Stop the Producer by pressing Ctrl+C.
- Now bring up the Consumer terminal window and notice the constant "INFO No messages consumed" text, similar to the screen shown below.
Start LogDNA Streaming to Consumer
Let's go ahead and start LogDNA Streaming. You may see a notice from LogDNA Streaming asking whether you've received the sample messages, similar to the screen shown below; click Yes.
Add an email recipient and then click Start Streaming, similar to the screen shown below.
LogDNA should now be successfully configured. If you have a Kubernetes cluster, VSI, or any other service configured with LogDNA enabled, you should see activity in the Consumer terminal, similar to the screen shown below.
LogDNA Streaming is now active, similar to the screen shown below.
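The records that arrive in the Consumer are JSON log lines. The sketch below parses a hypothetical record to show the kind of fields you might extract; the field names here are assumptions for illustration, so inspect a real message from your own Consumer to confirm the actual schema.

```python
import json

# Hypothetical example of a streamed log record; the exact field names in
# your messages may differ, so check a real message from your consumer first.
message_value = json.dumps({
    "_ts": 1616161616000,
    "_host": "my-k8s-worker",
    "_app": "nginx",
    "level": "INFO",
    "_line": "GET /healthz 200"
})

record = json.loads(message_value)
summary = f'[{record["level"]}] {record["_app"]}@{record["_host"]}: {record["_line"]}'
print(summary)  # [INFO] nginx@my-k8s-worker: GET /healthz 200
```

A downstream dashboard would typically index on fields like the app, host, and timestamp rather than the raw line.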