Elasticsearch for Offensive Security – Marco Lancini

Have you ever been in a network penetration test where the scope is so huge you end up with dozens of files containing Nmap scan results, each of which, in turn, contains a multitude of hosts? If the answer is yes, you might be interested in this blog post.

Following is the process I recently went through to find a way to triage the results while enabling concurrent collaboration between teammates. We will see how using traditional “defensive” tools for offensive security data analysis has advantages over the traditional grep-based approach when parsing and analysing data.

Finally, I’m going to provide the full source code of the setup I ended up with. Hopefully this will give someone else with a similar need some help in the future.

The final setup can be found on Github: https://github.com/marco-lancini/docker_offensive_elk.

If you are still reading, it probably means you want to move away from the traditional grep-based approach. But what other alternatives do we have?

I started by taking a look at something I always overlooked: Nmap HTML reporting. I’m not sure how many people are aware of this and actually using it, but it is indeed possible to take an XML output file from Nmap and pass it to an XML processor (like xsltproc) that will turn it into an HTML file. For those interested, the full process for obtaining a result like the one shown in the image below can be found on the Nmap website:
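As a minimal sketch, assuming xsltproc is installed and the scan was saved as XML, the conversion is a one-liner (xsltproc picks up the stylesheet reference embedded in the Nmap report):

```
# Render the Nmap XML report to HTML using its embedded stylesheet reference
xsltproc scan.xml -o scan.html
```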

Recently, improved XSL implementations have started to appear. One example is nmap-bootstrap-xsl, an XSL stylesheet for Nmap based on Bootstrap:

However, this approach has a few drawbacks in my opinion. First of all, unless Nmap was started with the --webxml switch, one has to go through every single output file and replace the XSL stylesheet reference so that it points to the exact location of the file on the current machine. Second, and more importantly, this still doesn’t scale.
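For reference, this is how a scan would be started so that the report renders anywhere (the target range is just an example):

```
# --webxml makes the report reference the stylesheet hosted on nmap.org,
# so the resulting XML renders as HTML on any internet-connected machine
nmap -sV -oX scan.xml --webxml 192.168.1.0/24
```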

Having discarded the HTML path, I then remembered a blog post from my ex-colleague Vincent Yiu, where he started leveraging Splunk for offensive operations. This was an interesting thought, as more and more often we see people using so-called “defense” tools for offense as well. Splunk was definitely a no-go for me (due to licensing issues), but after some research I finally stumbled upon this blog post: “Using Nmap + Logstash to Gain Insight Into Your Network”.

I had heard of ELK (more on this below) before, but I had never properly looked at it, probably because I was classifying it as a “defense” tool used mainly by SOC analysts. What caught my eye was that the blog post above explained how to ingest Nmap scan results into Elasticsearch.

So, what is the ELK Stack? “ELK” is the acronym for three open source projects: Elasticsearch, Logstash, and Kibana. Elasticsearch is a search and analytics engine. Logstash is a server‑side data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a “stash” like Elasticsearch. Kibana lets users visualize data with charts and graphs in Elasticsearch.

I’m not going into much detail explaining the different components of this stack, but for anyone interested I highly recommend “The Complete Guide to the ELK Stack”, which gives a very nice overview of the stack and of its three major components (feel free to skip the “Installing ELK” section, as we will take a different approach here).

What I’m interested in here is seeing how Elasticsearch can be used not only for detection (defense), but for offense as well.

The following is a full walkthrough that led me to the final setup. Those uninterested can jump straight to the "Play with Data" section.

As a starting point we will use an awesome repository put together by deviantony, which allows us to spin up a full ELK stack in seconds, thanks to docker-compose:
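Assuming docker and docker-compose are already installed, getting a vanilla stack running is just:

```
git clone https://github.com/deviantony/docker-elk.git
cd docker-elk
docker-compose up -d
```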

After cloning the repository, we can see from the docker-compose.yml file that three services will be started. Here is the modified file, where I added container names (for clarity) and a means for Elasticsearch to persist data even after its container is removed, by mounting a volume on the host (./_data/elasticsearch):
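As a sketch, the relevant part of the Elasticsearch service definition would look something like this (exact build context and port list may differ depending on the docker-elk version you cloned):

```yaml
elasticsearch:
  build: elasticsearch/
  container_name: elasticsearch
  volumes:
    # Persist indexed data on the host, surviving container removal
    - ./_data/elasticsearch:/usr/share/elasticsearch/data
  ports:
    - "9200:9200"
    - "9300:9300"
  networks:
    - elk
```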

As some readers pointed out, create the _data folder and ensure it is owned by your own user:
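A minimal sketch, with the path matching the volume mount above:

```
# Create the host-side storage for Elasticsearch and fix its ownership
mkdir -p _data/elasticsearch
sudo chown -R "$USER":"$USER" _data/
```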

By default, the stack exposes the following ports:

- 5000: Logstash TCP input
- 9200: Elasticsearch HTTP
- 9300: Elasticsearch TCP transport
- 5601: Kibana

Give Kibana a few seconds to initialize, then access the Kibana web UI running at: http://localhost:5601.

For a complete ELK newbie, that was a bit of a challenge, until I found the following post: “How to Index NMAP Port Scan Results into Elasticsearch”. This wasn’t a complete solution, but a good starting point. Let’s start from there and build on it.

First of all, we will need the Logstash Nmap codec plugin. A Logstash codec simply provides a way to specify how raw data should be decoded, regardless of source. This means that we can use the Nmap codec to read Nmap XML from a variety of inputs. We could read it off a message queue or via syslog, for instance, before passing the data on to the Nmap codec. Luckily, plugging this in was as easy as modifying the Logstash Dockerfile located at logstash/Dockerfile:
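A sketch of the modified Dockerfile (the base image line will match whatever ships with your docker-elk checkout):

```
FROM docker.elastic.co/logstash/logstash-oss:6.2.4

# Add the Nmap codec so Logstash can decode Nmap XML reports
RUN logstash-plugin install logstash-codec-nmap
```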

Next, to put this into Elasticsearch we need to create a mapping. A mapping template is available from the Github repository of the Logstash Nmap codec. We can download it and place it in logstash/pipeline/.
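Assuming the template still lives at this path in the codec’s repository, fetching it is a one-liner:

```
wget https://raw.githubusercontent.com/logstash-plugins/logstash-codec-nmap/master/examples/elasticsearch/elasticsearch_nmap_template.json \
     -O logstash/pipeline/elasticsearch_nmap_template.json
```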

Finally, we need to modify the Logstash configuration file located at logstash/pipeline/logstash.conf so as to add filter and output options for the new Nmap plugin:
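A trimmed-down sketch of what the pipeline could look like; the TCP input port, index naming, and template name here are choices, not requirements:

```
input {
  tcp {
    port  => 5044
    codec => nmap
    tags  => ["nmap"]
  }
}

filter {
  if "nmap" in [tags] {
    # Don't emit documents for hosts that were down
    if [status][state] == "down" {
      drop {}
    }
  }
}

output {
  if "nmap" in [tags] {
    elasticsearch {
      hosts         => ["elasticsearch:9200"]
      index         => "nmap-%{+YYYY.MM}"
      template      => "/usr/share/logstash/pipeline/elasticsearch_nmap_template.json"
      template_name => "nmap"
    }
  }
}
```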

We are going to use a modified version of VulntoES to ingest the results and import them into Elasticsearch. To do so, I created a new folder for an ingestor service that will do the actual ingestion.
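A minimal layout for this service could look like the following (the file names are my own choice):

```
ingestor/
├── Dockerfile      # builds an image with the VulntoES dependencies
├── VulntoES.py     # the modified ingestion script
└── ingest          # wrapper that runs VulntoES.py on every report in /data
```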

In the listing above, the ingestor folder contains: the modified VulntoES.py script, a small ingest wrapper script that runs VulntoES.py against every report placed in the shared data folder, and a Dockerfile that ties everything together.

We now just need to add this new container to the docker-compose.yml file:
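A sketch of the service definition, with folder and network names following the docker-elk conventions used above:

```yaml
ingestor:
  build: ingestor/
  container_name: ingestor
  volumes:
    # Shared folder used to pass Nmap reports to the ingestor
    - ./_data/nmap:/data/
  networks:
    - elk
  depends_on:
    - elasticsearch
```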

Notice how we are mapping the local ./_data/nmap folder into the container under the path /data. We are going to use this “shared” folder to pass the Nmap results across.

This is how your project folder should look after all these modifications:
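Roughly something like this (the exact contents of the elasticsearch/, kibana/, and logstash/ folders depend on the docker-elk version you cloned):

```
docker_offensive_elk/
├── docker-compose.yml
├── elasticsearch/
├── ingestor/
│   ├── Dockerfile
│   ├── VulntoES.py
│   └── ingest
├── kibana/
└── logstash/
    ├── Dockerfile
    └── pipeline/
        ├── elasticsearch_nmap_template.json
        └── logstash.conf
```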

Once done, make sure to rebuild the images using the docker-compose build command.

The last step consists of creating the index that the data will be ingested into:
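With the stack running, a single call to the Elasticsearch API will do; the index name below is the one the modified VulntoES script expects, so adjust it if you changed it:

```
# Create the index the ingestor will write to
curl -XPUT 'http://localhost:9200/nmap-vuln-to-es'
```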

With ELK properly configured, it’s time to play with our data.

In order to be able to ingest our Nmap scans, we will have to output the results in an XML formatted report (-oX) that can be parsed by Elasticsearch. Once done with the scans, place the reports in the ./_data/nmap folder and run the ingestor:
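Assuming the ingest wrapper script sketched earlier, the whole round trip looks like this:

```
# Scan with XML output, drop the report into the shared folder, ingest it
nmap -sV -oX scan_192.168.1.0_24.xml 192.168.1.0/24
mv scan_192.168.1.0_24.xml _data/nmap/
docker-compose run ingestor ingest
```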

Now that we have imported some data, it’s time to start delving into Kibana’s capabilities.

The “Discover” view presents all the data in your index as a table of documents and lets you interactively explore it: you have access to every document in every index that matches the selected index pattern. You can submit search queries, filter the search results, and view document data. You can also see the number of documents that match the search query and get field value statistics. This is great for triaging targets by filtering, for example, by open port or service.
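For example, typing a Lucene query like the following in the search bar would narrow the table down to hosts exposing SSH (the exact field names depend on the mapping template in use, so treat these as placeholder assumptions):

```
ports.port:22 AND ports.service.name:ssh
```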

The “Dashboard” view, instead, displays a collection of visualizations and searches. You can arrange, resize, and edit the dashboard content, and then save the dashboard so you can share it. This can be used to create a highly customised overview of your data.

The dashboard itself is interactive: you can apply filters and see the visualizations update in real time to reflect the queried content (in the example below I filtered on a specific port).

For those interested, I exported my example dashboard as an easy-to-reimport JSON file.

Traditional “defensive” tools can be effectively used for offensive security data analysis, helping your team collaborate and triage scan results.

In particular, Elasticsearch offers the chance to aggregate a multitude of disparate data sources and query them with a unified interface, with the aim of extracting actionable knowledge from a huge amount of unclassified data.
