
Nipper and Elastic Integration

Reducing your mean time to detect misconfigurations and vulnerabilities in firewalls, switches and routers, Titania Nipper accurately audits network devices, prioritizes risks and provides exact technical fixes to help remediate issues.

Nipper’s accurate audit data – such as your detailed compliance posture against standards including DISA STIG, DHS CDM/NIST 800-53 and PCI – can now be injected into the Elastic Stack via JSON, where the combined solution provides greater scope to analyze and remediate large numbers of your devices on a daily basis.

The Kibana dashboard then gives you the power to examine your security posture from different angles, filtering by categories of error and drilling down to precise detail about devices/models impacted and how to mitigate risks.

This user guide shows you step-by-step how to aggregate your Nipper audit reports in Elasticsearch and use your Kibana dashboard to explore the data.




Prerequisites for Aggregating Nipper Audit Reports in Elasticsearch

Before you begin, please ensure you have completed the prerequisite technical set up:

»   You have downloaded the necessary scripts via Bitbucket: Nipper_Elastic_Integration

»   Nipper (v 2.6.3 or above) is licensed and installed on your local Windows 10 machine,

»   WSL is configured and available to run Logstash,

»   Elastic and Kibana are installed and running on your local machine*, there is no security on the Elastic Index, and

»   Docker Desktop is installed on Windows 10 (a PowerShell script is provided in the Nipper_Elastic_Integration folder to pull and run the containers).

*If Elastic and Kibana are installed remotely, the URLs provided in this guide will need to be updated accordingly, and the Logstash conf script adjusted to connect to the instance. An example file ‘ls_with_creds.conf’ is provided in the Nipper_Elastic_Integration folder.

For further information on installing the Elastic stack, please refer to the Elastic website.

New to Nipper?

You can download the Nipper Beginner’s Guide from the Titania website:

»   If you need to install Nipper:

  »   Go to the ‘Downloading Nipper’ section of the Nipper Beginner’s Guide

»   If you need to install your license:

  »   Go to the ‘Downloading your license’ section of the Nipper Beginner’s Guide

»   To audit your devices and generate reports:

  »   Open Nipper and select ‘New Report’ on the Nipper homepage. Step-by-step guides to generating each report can also be found on the website: www.titania.com/support

Step 1 – Configuring Nipper to emit JSON in the correct format

Logstash expects JSON in NDJSON (newline-delimited JSON) format. This means that each JSON object appears on a separate line in the file, rather than being encapsulated in an array.
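For example, rather than a single array of objects (the field names here are placeholders for illustration):

    [ {"id": 1}, {"id": 2} ]

NDJSON places each object on its own line, with no enclosing brackets:

    {"id": 1}
    {"id": 2}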

In order to configure Nipper to emit the JSON in the correct format you need to:

»   Open Nipper and click ‘Settings’

»   Click the ‘Logging’ icon and open the ‘File’ tab

»   Ensure that:

  »   ‘Enable logging to File’ is checked

  »   The file path for the output file is correct

  »   ‘Compact JSON’ is selected from the dropdown

  »   ‘Stream output’ is checked, and

  »   ‘Select All’ Logging Trigger Levels is checked

»   Finally, click ‘OK’ to confirm the settings.

Step 2 – Running an audit

»   Now click the ‘Reports’ icon to choose the audit you wish to run

»   Follow the onscreen instructions to choose the network device configurations you wish to include in your report’s scope

»   Click ‘Finish’

»   The file will now appear in the specified directory.

If there are lots of devices being audited and/or lots of audit types being conducted, it can take time to write out the file after the audit is complete.

Checking the size of the file a few times until it stops growing confirms that the write is complete.

Please note Nipper will append to this file if further audits are performed, so you may wish to move or delete the file before performing a subsequent audit.
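A simple way to do both from a shell, assuming nipper.json is in the current directory (the archive file name below is just an example):

    ls -l nipper.json                      # repeat until the reported size stops changing
    mv nipper.json nipper_previous.json    # set the completed file aside before the next audit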

The contents of nipper.json should look similar to the fragment below.
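As an indicative sketch only (the field names other than date_time are illustrative, not Nipper’s exact schema), a single record might look like:

    {"date_time": "2021-03-01T09:30:00", "device": "edge-fw-01", "finding": "Telnet service enabled", "severity": "High"}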

* Note there is no ‘[’ opening bracket, just a ‘{’ opening bracket, and each JSON record is all on one line.

Step 3 – Creating the Elastic index

»   Now open Kibana in your browser

»   Select the ‘Dev Tools’ icon from the left-hand toolbar

»   Now configure the index and apply a mapping. The mapping extends the indexed length of some fields and masks out those that are not needed.

Locate the .txt script (shown right) in the Nipper_Elastic_Integration folder, and copy and paste its contents into the Console panel.

»   Once the text has been pasted into the console, click anywhere inside the text, then click the ‘Run’ arrow in the top right-hand corner.

This action creates an index called ‘nipper’ with the correct mappings to accept the data from the tool.
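The exact mapping is defined in the .txt script provided; as a simplified, hypothetical sketch of the shape such a request takes in the Console (only date_time is a field name known from this guide):

    PUT /nipper
    {
      "mappings": {
        "properties": {
          "date_time": { "type": "date" },
          "device":    { "type": "keyword" },
          "finding":   { "type": "text" }
        }
      }
    }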

If the index already exists, then you will get an error in the right-hand pane after clicking ‘Run’.

»   If you wish to start afresh, issue a ‘DELETE /nipper’ on the Console pane, and then try again.

There is no need to replace the index creation text: just append the delete request below it in the Console window, click on it, then click ‘Run’. Once the index is deleted, you can return to the creation text, click that, and press ‘Run’ again.
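Appended below the creation text, the delete request is simply:

    DELETE /nipper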

You now have an index with the correct mapping to accept Titania data.

Step 4 – Use Logstash to inject Nipper output into the Elasticsearch index

The next step is to get the data into the index. An easy way to do this is using Logstash from the Elastic ELK stack. To do this, Logstash needs a config file.

» Locate the text file named ‘l.conf’ (shown right) in the Nipper_Elastic_Integration folder.

If you are using a cloud version of Elastic, the example file ls_with_creds.conf can be used instead, with appropriate changes for the Elastic URI and the access credentials. An example is shown on the right.
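As a rough sketch of the shape such a config takes (the l.conf provided in the folder is authoritative; the host below assumes a local, unsecured Elasticsearch):

    input {
      stdin { codec => json_lines }    # read NDJSON records piped in on stdin
    }
    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "nipper"              # the index created in Step 3
      }
    }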


» Now invoke Logstash:

    cat nipper.json | logstash -f l.conf

» The nipper.json data is now in Elastic.


Below, it is invoked on a WSL (Windows Subsystem for Linux) Ubuntu instance.
Note that the console output issues some warnings, but the run completes successfully:

Step 5 – Creating a Kibana index pattern

»   Firstly, click on the ‘Settings’ icon in the Kibana dashboard

»   And click on the ‘Index Patterns’ link

»   Click on the blue ‘Create Index Pattern’ button

»   Now type the name of the index you created into the index pattern box

You don’t have to type the complete name - you can use wildcards (this helps if you want Kibana to look across multiple Elastic indexes) - but in this case, typing nipper* works.

Kibana will confirm that the pattern has matched the Elastic index called ‘nipper’.

»   Click the ‘Next Step’ button

»   Select date_time from the drop-down box, and click the ‘Create index pattern’ button.

The date_time is the field you mapped to contain the date of the events in the Nipper JSON output.

You will now see that the index has been created.

»   Next, click on the ‘Discover’ icon on the left toolbar.

If the data you are analysing wasn’t created in the last 15 minutes, you will likely need to change the time window using the calendar control to see the data.

»   Now you should see the data loaded into Elastic. In this case there are 3221 records.
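If you prefer the Console, you can also confirm the record count with:

    GET /nipper/_count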

»   Load in the dashboard

»   And select the ‘Settings’ menu again

»   Select the ‘Saved Objects’ link

»   Click the ‘Import Objects’ button

»   Now, from the file dialog, import the nipper_kibana_dashboard.ndjson file provided in the Nipper_Elastic_Integration folder.

This file contains the definitions of example visualisations, as well as a dashboard containing those visualisations.

»   And finally, click on the Nipper dashboard link.

Step 6 – Exploring the data

»   You will now be presented with a dashboard like this, allowing you to click and filter the results in the usual Kibana manner

»   Scroll down the dashboard to see heat maps and detailed audit findings and vulnerabilities

Here you can explore your security posture from different angles, filtering by categories of error and drilling down to precise detail about the devices/models impacted and how to mitigate risks.

Conclusion and further help

If you have followed this guide, you will see how quick and easy it is to aggregate your Nipper audit reports in Elasticsearch.

Now you can explore your data in Kibana, prioritize your risks and use Nipper’s exact technical fixes to help remediate any vulnerabilities or issues on your network.

If you would like any help or advice about the steps or scripts included in this guide, simply contact our dedicated Support team on:

Tel: (+44)1905 888 785
Email: support@titania.com

Our solution advisors will be more than happy to help walk you through this or any other auditing processes with our Nipper software.

Example analytics show the prioritization of remediation that can be achieved when audit data is combined with value chain data on the mission criticality of the device/network.