ELK STACK - Elasticsearch, Logstash and Kibana Integration


Introduction

Elasticsearch is a distributed, free, and open search and analytics engine for all types of data, including textual, numerical, geospatial, structured, and unstructured. The ELK Stack is a collection of three open-source products: Elasticsearch, Logstash, and Kibana.

Elasticsearch allows you to store, search, and analyze huge volumes of data in near real time, returning answers in milliseconds. Logstash, one of the core products of the Elastic Stack, is used to aggregate and process data and send it to Elasticsearch. Kibana is a data visualization and management tool for Elasticsearch that provides real-time histograms, line graphs, pie charts, and maps.

There are two methods to integrate with the ELK Stack.

  1. The first method relies on the Test Data Webhook API to send data directly to Elasticsearch through a public-facing endpoint accessible over HTTP or HTTPS. The data can then be visualized using Kibana.
  2. The second method relies on Catchpoint's REST API to pull test data and store it in a CSV file. Logstash dynamically ingests, transforms, and ships the data to Elasticsearch using a CSV filter. Kibana lets you visualize the Catchpoint data in Elasticsearch and navigate the Elastic Stack.

Method 1: Test Data Webhook

Prerequisites

  • Elasticsearch
  • Kibana
  • Public facing endpoint to accept data from Catchpoint.
  • Catchpoint test enabled with Test Data Webhook.

Installation and Configuration

Setup In Catchpoint portal:

  1. In the Catchpoint Portal, go to the API page.
  2. Under Test Data Webhook, click Add URL.
    1. Append the path /<index_name>/_doc to the public endpoint pointing to Elasticsearch. This sends the data as a document to that index.
    2. If the endpoint requires authentication headers, expand the Request link and add them as request headers.
  3. Enter a Name.
  4. Enter endpoint URL.
  5. Under Format choose Template.
  6. Click Add New.
  7. Provide a template Name.
  8. Select Format as JSON.
  9. Paste the JSON template below.
    {"@timestamp": "${timestamp(YYYY-MM-DDTHH:MI:SS.MSCZ)}", "Catchpoint": ${JsonPayload}}
  10. Click Save.
  11. Select the newly created template.
  12. Add an email address to be notified if the webhook fails to send data over the API.
  13. Click the Save button at the top of the page.
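For reference, with the template above each webhook delivery arrives in Elasticsearch as a document of roughly the following shape. The fields under Catchpoint vary by test type; the ones shown here are illustrative only, not Catchpoint's documented payload:

```json
{
  "@timestamp": "2023-01-15T10:30:00.000Z",
  "Catchpoint": {
    "TestName": "Homepage",
    "TestId": 12345,
    "NodeName": "Chicago - Level3"
  }
}
```

The @timestamp field produced by the template is what you will later select as the time field when creating the index pattern in Kibana.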

Note: The index name is used to reference your data in Kibana.
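The webhook target is simply the Elasticsearch document endpoint for your chosen index. A minimal sketch of how that URL is composed (the buildWebhookUrl helper and the example host are illustrative, not part of Catchpoint or Elasticsearch):

```javascript
// Sketch: compose the Test Data Webhook target URL.
// Elasticsearch creates <index_name> automatically on the first document.
function buildWebhookUrl(baseEndpoint, indexName) {
  // Trim trailing slashes so we don't emit "//" in the path.
  return `${baseEndpoint.replace(/\/+$/, "")}/${indexName}/_doc`;
}

console.log(buildWebhookUrl("https://es.example.com/", "catchpoint"));
// -> https://es.example.com/catchpoint/_doc
```

The resulting string is what you paste as the endpoint URL in the webhook configuration above.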

Setup in Kibana:
Create an index pattern with a timestamp in Kibana. This can be done once data from Catchpoint is being pushed into Elasticsearch: the index is created with the first data push, and you can then use it to create an index pattern.

  1. Open Kibana.
  2. Expand Menu and select Stack Management.
  3. Click on Index Patterns.
  4. Click Create index pattern.
  5. Type in the index name you provided in the endpoint.
  6. Click Next Step.
  7. Under Time field, select @timestamp.
  8. Click Create index pattern.

Result

To view the data, open the menu, select Discover, and choose the index pattern you just created from the drop-down.

Build dashboards to consume the data based on your requirements.

Method 2: REST API

Prerequisites

  • Elasticsearch
  • Logstash
  • Kibana

Installation and Configuration

Set up the Catchpoint REST API endpoint

  1. In the Catchpoint Portal, go to the API page.
  2. Set up a REST API.
  3. Retrieve the Key and Secret. These correspond to CatchpointKey and CatchpointSecret in the app's .env file.

Set up NodeJS app locally

  1. Download and extract the Node.js app attached to this article.
  2. In the .env file in the Catchpoint - API directory:
    1. Update CatchpointKey and CatchpointSecret with the REST API values from above.
    2. Update CatchpointTestId with a Catchpoint test ID.
    3. Set Interval to the frequency (in minutes) at which to pull data from the Catchpoint API.
  3. Download and install packages and dependencies.
    $ npm install
  4. Use the provided script to retrieve the last 15 minutes of raw test data and save it to a CSV file.
    $ node index.js
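The attached app's internals aren't shown here, but the core of such a poller is computing a rolling time window from the configured Interval. A hedged sketch (buildTimeWindow and the startTime/endTime parameter names are assumptions for illustration, not Catchpoint's documented API):

```javascript
// Sketch: compute the start/end window a poller would request from the
// Catchpoint REST API, given the Interval (in minutes) from the .env file.
function buildTimeWindow(intervalMinutes, now = new Date()) {
  const end = now;
  const start = new Date(end.getTime() - intervalMinutes * 60 * 1000);
  // ISO 8601 without milliseconds; adjust to whatever the API expects.
  const fmt = (d) => d.toISOString().replace(/\.\d{3}Z$/, "Z");
  return { startTime: fmt(start), endTime: fmt(end) };
}

console.log(buildTimeWindow(15, new Date("2023-01-15T10:30:00Z")));
// -> { startTime: '2023-01-15T10:15:00Z', endTime: '2023-01-15T10:30:00Z' }
```

Running the script on a 15-minute schedule (for example via cron) with Interval=15 keeps the CSV file rolling forward without gaps or overlaps.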

Set up Logstash configuration file

  1. Download the .conf file and move it to the Logstash home directory.
  2. Update the column names depending on the test type. The column names are also provided in the file columns.txt attached to this article.
  3. Make sure you convert the required columns to integers using the mutate filter.
  4. Update hosts and index to match your setup. If authentication is required, provide the username and password for your Elasticsearch account.
  5. The sample catchpoint.conf file provided works for the web test type.

Note: One test type can be configured at a time in the .conf file.
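Putting steps 2-4 together, a minimal catchpoint.conf might look like the sketch below. The file path, column names, and index name are assumptions for a web test; replace them with the values from columns.txt and your own environment:

```
input {
  file {
    path => "/path/to/catchpoint-data/*.csv"   # where the Node app writes CSVs
    start_position => "beginning"
    sincedb_path => "/dev/null"                # re-read files on every run (dev only)
  }
}

filter {
  csv {
    separator => ","
    # Illustrative columns for a web test; use the full list from columns.txt.
    columns => ["TestId", "TestName", "NodeName", "Timestamp", "TotalTime"]
  }
  mutate {
    # Step 3: convert numeric columns so Kibana can aggregate them.
    convert => { "TestId" => "integer" "TotalTime" => "integer" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "catchpoint"
    # user => "elastic"          # uncomment if authentication is required
    # password => "changeme"
  }
}
```

Because the csv filter applies one fixed column list, this is why only one test type can be configured per .conf file.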
Running Logstash from the command line

  1. Navigate to Logstash home directory.
  2. Run Logstash from the command line.
    $ bin/logstash -f catchpoint.conf

Result

Create an index pattern in Kibana.

  1. Access Kibana from the browser.

  2. Open the main menu, then click Stack Management > Index Patterns.

  3. Click Create index pattern.

  4. Start typing in the Index pattern field, and Kibana looks for the names of Elasticsearch indices that match your input. Use a wildcard (*) to match multiple indices. For example: catchpoint*

  5. Click Next step.

  6. If Kibana detects an index with a timestamp, expand the Time field menu, and then specify the default field for filtering your data by time.

If your index doesn't have time-based data, or if you don't want to select the default timestamp field, choose I don't want to use the Time Filter.

If you don't set a default time field, you will not be able to use global time filters on your dashboards. This is useful if you have multiple time fields and want to create dashboards that combine visualizations based on different timestamps.

  7. Click Create index pattern. Kibana is now configured to use your Elasticsearch data.
  8. Select this index pattern when you search and visualize your data.

To create a visualization:

  1. Click on Visualize in the side navigation.
  2. Click the Create new visualization button or the + button.
  3. Choose the visualization type as Lens.
  4. Select Line as chart type.
  5. For the Horizontal axis, select Test Runtime.
  6. For the Vertical axis, select any metric. You can also break down by Test Name, Test Id, or Node Name.

ELK STACK.zip