Logstash: Using Logstash to analyze Service API data

In a previous article, "Logstash: API analysis using ELK stack", I used Logstash to analyze a metrics-style API. In today's exercise, I will show how to use Logstash to analyze a Service API that returns log-type data. In many cases, we can quickly use a script to pull data from a Service API, which helps us analyze that data quickly. During the import, we can use the rich set of filters provided by Logstash to clean, enrich, and transform the data.

In today's exercise, I will use the shodan.io website as an example:

Shodan is a search engine for Internet-connected devices. When we search for the word "china" on it, it displays the hosts that match that word. I won't introduce Shodan further here. Shodan provides a Service API for clients to call. We can apply for a developer account on the Shodan website and obtain an API key.

Below, I use a Python application to obtain the query results:


import os
import sys
import json
import logging

from shodan import Shodan

# Log to stdout so the output can be redirected to a file;
# emit the message only, with no logging prefix
logging.basicConfig(stream=sys.stdout, level=logging.INFO, format='%(message)s')

api_key = os.environ['SHODAN_API_KEY']
api = Shodan(api_key)

search = sys.argv[1]

# Search Shodan
results = api.search(search)

# print(f"Total count: {len(results['matches'])}")

# Write the results as one JSON document per line, so that the
# Logstash json codec can parse them later
for result in results['matches']:
    ip_address = result['ip_str']
    domains = result['domains']
    logging.info(json.dumps({'ip_addr': ip_address, 'domains': domains}))

In order to run the above application, we must install shodan:

pip3 install shodan

We can run it as follows:

python3 shodan_scanner.py "china" > shodan.log

Above, we search for the word "china" and save the search results to the shodan.log file. After the run completes, we can find logs like the following in shodan.log:
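Each line in shodan.log should be a self-contained JSON document. The following is a minimal sketch, with hypothetical values, of what one line looks like and how to check that it parses:

```python
import json

# A hypothetical example of one line the script above writes to shodan.log
line = '{"ip_addr": "93.184.216.34", "domains": ["example.com"]}'

# Each line must parse as standalone JSON so that the Logstash
# json codec can decode it
doc = json.loads(line)
print(doc['ip_addr'])   # -> 93.184.216.34
```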



From the above, we can see that each document has a field called ip_addr containing an IP address. We can use the geoip filter provided by Logstash to enrich this data.


input {
  file {
    path => [ "/Users/liuxg/python/shodan/shodan.log" ]
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => "json"
  }
}

filter {
  geoip {
    source => "ip_addr"
    target => "geo"
  }

  if [geo][latitude] and [geo][longitude] {
    mutate {
      # geo_point arrays are ordered [longitude, latitude]
      add_field => {
        "[geo][location]" => ["%{[geo][longitude]}", "%{[geo][latitude]}"]
      }
    }
    mutate {
      convert => {
        "[geo][location]" => "float"
      }
    }
  }
}

output {
  stdout {
    codec => rubydebug
  }

  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "shodan"
  }
}
Above, we import the data through the file input. When running this yourself, replace the path above with your own file path. I used the geoip filter to enrich the data, and the mutate filters to build a geo.location field that Elasticsearch can treat as a geo_point.
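To see why the mutate filters produce a valid geo_point, here is a small sketch, using hypothetical coordinates, of the [longitude, latitude] array the pipeline builds:

```python
# Hypothetical coordinates, mimicking what the geoip filter adds to an event
geo = {'latitude': 39.9042, 'longitude': 116.4074}

# geo_point arrays in Elasticsearch are ordered [longitude, latitude];
# converting to float mirrors the mutate convert step in the pipeline
location = [float(geo['longitude']), float(geo['latitude'])]
print(location)  # -> [116.4074, 39.9042]
```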

Before running the above Logstash pipeline, we enter the following command in Kibana:

PUT shodan
{
  "mappings": {
    "properties": {
      "geo": {
        "properties": {
          "location": {
            "type": "geo_point"
          }
        }
      }
    }
  }
}
The above command defines the data type of the geo.location field. It is of the geo_point data type.
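If you prefer to create the index outside of Kibana, the same mapping can be expressed and sent from Python. This is only a sketch; it assumes Elasticsearch runs unsecured at http://localhost:9200:

```python
import json

# The same mapping as the Kibana Dev Tools command above, as a Python dict
mapping = {
    "mappings": {
        "properties": {
            "geo": {
                "properties": {
                    "location": {"type": "geo_point"}
                }
            }
        }
    }
}

print(json.dumps(mapping, indent=2))

# Sending it would look like the following (requires the requests package
# and a running Elasticsearch; not executed here):
#   import requests
#   requests.put("http://localhost:9200/shodan", json=mapping)
```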

We can run Logstash with the following command:

sudo ./bin/logstash -f logstash.conf

We can see the following output in the terminal of Logstash:

From the above output, we can see that the geoip filter adds more location-related fields to each document.

We need to create an index pattern for the shodan index. We can then find the data in Discover:


Because each document also has a geo.location field, we can use the Maps application to display the locations of the documents:


Added by kostasls on Sat, 25 Dec 2021 20:54:09 +0200