First of all, thanks to Shaun Sixty, who may be the only other human on Earth, besides myself, interested in how to store ModSecurity logs in Elasticsearch :) I hope that one day we'll find other life forms…
Before starting, it's important to know that my setup is NOT based on ModSecurity 2.9 and Apache as usual. I'm using Nginx and libmodsecurity (ModSecurity v3), and in this article I'll show you my "audit log" configuration and a Python script that reads ModSecurity logs and sends them to Elasticsearch.
In my Nginx configuration, each website has its own ModSecurity rules file. This allows me to choose whether and where to store the audit logs for each site. For example:
server {
    listen 80;
    server_name www.example.com;

    modsecurity on;
    modsecurity_rules_file modsecurity/www.example.com.conf;
    ...
}
Inside the modsecurity_rules_file you need to edit the SecAuditLog* directives as follows:
SecRuleEngine On
SecRequestBodyAccess On
# ...
SecAuditEngine On
SecAuditLogRelevantStatus "^[0-9]+"
SecAuditLogParts ABCIJDEFHZ
SecAuditLogType concurrent
SecAuditLogFormat JSON
SecAuditLogStorageDir /usr/local/nginx/logs/modsecurity/www.example.com
A little explanation is required here. SecAuditEngine On simply activates the ModSecurity audit log engine. SecAuditLogRelevantStatus is a regular expression matched against the HTTP response status code (200, 404, etc.) and acts as a filter: an audit log entry is created only when it matches (in my case every response code matches; something like "^(?:5|4(?!04))" would log only 4xx and 5xx responses, excluding 404). SecAuditLogParts tells the audit log engine which parts of the request and of the response to log (see the reference manual for more information). SecAuditLogType is set to "concurrent", which means one file per transaction is used for audit logging; be careful, because this can become an I/O bottleneck on your disk (I solved it by using a DigitalOcean droplet, for example). SecAuditLogFormat JSON makes the engine write each entry as a JSON document, which is what the parser below expects. Finally, SecAuditLogStorageDir sets the directory where the log files are written (remember to use an absolute path here).
Then you need to create the following directories and make them writable by the Nginx user:
$ mkdir -p /usr/local/nginx/logs/modsecurity/www.example.com
$ chown -R nobody:nogroup /usr/local/nginx/logs/modsecurity
If you reload your Nginx configuration now, you should see many files growing under the SecAuditLogStorageDir path, one per transaction, all in JSON format. It's time to parse them and send them to Elasticsearch!
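Before wiring up the full parser, you can peek at one of these files with a few lines of Python. This is just a quick sketch: the file name is made up, and the keys follow the ModSecurity v3 JSON layout, so adjust them if your files differ:

import json

# Example path: pick any real file generated under your SecAuditLogStorageDir
path = '/usr/local/nginx/logs/modsecurity/www.example.com/some-audit-log-file'

with open(path) as f:
    log = json.load(f)

# In the v3 JSON format everything lives under a top-level "transaction" object
tx = log['transaction']
print(tx['client_ip'], tx['request']['uri'], tx['response']['http_code'])
for msg in tx.get('messages', []):
    print(msg.get('message'))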
Clone the Python parser from my GitHub (feel free to edit it):
$ cd /opt/
$ git clone https://github.com/theMiddleBlue/modsecurity-to-elasticsearch.git
$ cd modsecurity-to-elasticsearch/
$ vi modsec_parser.py
and check that the Elasticsearch URL is correct:
es = Elasticsearch(['http://127.0.0.1:9200'])
Before running it, you may need to install the elasticsearch-py SDK made by Elastic. You can use pip (apt-get install python-pip):
pip install elasticsearch
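If you want a quick sanity check that the client can reach your cluster, a couple of lines are enough:

from elasticsearch import Elasticsearch

# Same URL as in modsec_parser.py
es = Elasticsearch(['http://127.0.0.1:9200'])

# info() asks the cluster for its name and version, so it fails loudly
# if the URL is wrong or Elasticsearch is not running
print(es.info())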
Then run the parser:
python modsec_parser.py -d <auditlog directory>
The script recursively reads each file in the "-d" directory, loads the JSON from the file content, and sends it to Elasticsearch using the elasticsearch-py module. It also converts the "messages" list into several separate arrays, to work around the fact that Elasticsearch does not handle "objects in arrays" well (without a nested mapping, the fields of different messages get flattened together and can no longer be queried as pairs).
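To give you an idea of what happens to each file, here is a simplified sketch of that logic; it is not the actual code from the repository, and the index_audit_log helper and messages_* field names are made up for illustration:

import json
from datetime import datetime
from elasticsearch import Elasticsearch

es = Elasticsearch(['http://127.0.0.1:9200'])

def index_audit_log(path):
    # Load one concurrent audit log file (one JSON document per transaction)
    with open(path) as f:
        doc = json.load(f)
    tx = doc.get('transaction', {})

    # Replace the "messages" list of objects with parallel arrays, one per
    # field, so Elasticsearch can query them without a nested mapping
    messages = tx.pop('messages', [])
    tx['messages_message'] = [m.get('message') for m in messages]
    tx['messages_details'] = [json.dumps(m.get('details', {})) for m in messages]

    # One index per day: modsecurity_yyyymmdd
    index_name = 'modsecurity_' + datetime.now().strftime('%Y%m%d')
    es.index(index=index_name, body={'transaction': tx})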
It creates one index per day (named modsecurity_yyyymmdd) and writes each log to the index matching the date on which the file was read. You can change this if you need to store all logs in a single index. You can run it in the background with:
python modsec_parser.py -d <auditlog directory> > /dev/null 2>&1 &
Now open Kibana and configure an index pattern that matches the daily indices (for example modsecurity_*); then you can start making searches and reports!
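If you'd rather query Elasticsearch directly from Python instead of Kibana, here is a small example; the aggregation field name assumes a document layout like the sketch above, so use whatever fields your mapping actually contains:

from elasticsearch import Elasticsearch

es = Elasticsearch(['http://127.0.0.1:9200'])

# Count logged transactions per client IP across all daily indices
res = es.search(index='modsecurity_*', body={
    'size': 0,
    'aggs': {'clients': {'terms': {'field': 'transaction.client_ip.keyword'}}}
})
for bucket in res['aggregations']['clients']['buckets']:
    print(bucket['key'], bucket['doc_count'])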
Have fun! :)