Monday, February 23, 2009

Importing Nepenthes honeypot logs into MySQL

This is a quick and dirty way to import the Nepenthes submissions log into a MySQL database, then use PHP to generate some statistics.

The first part is to prepare the logs:
cat sub |grep http://|sed 's/\[//'|sed 's/\]//'|sed -e 's/->/ /g' | sed 's/http:/ http/g' | sed 's/:/ /3g' |sed 's\/\ \g' |sed 's/T/ /'|sed 's/ /,/g2'|sed 's/,,,/,/g'|sed 's/,,/,/g' >db_http

cat sub |sed 's/\[//'|sed 's/\]//'|grep link://|sed -e 's/->/ /g' | sed 's/http:/ /g' | sed 's/:/ /3g' |sed 's\//\ \' |sed 's\/\ \'|sed 's/T/ /'|sed 's/ /,/g2'|sed 's/,,,/,/g'|sed 's/,,/,/g' >db_link

cat sub |grep tftp://|sed 's/\[//'|sed 's/\]//'|sed -e 's/->/ /g' | sed 's/:/ /3g' |sed 's\/\ \g' |sed 's/T/ /'|sed 's/ /,/g2'|sed 's/,,,/,/g'|sed 's/,,/,/g' >db_tftp

cat sub |grep ' ftp'|sed 's/\[//'|sed 's/\]//'|sed 's\/\ \g' |sed 's/T/ /'|sed -e 's/->/ /g'|sed 's/:/ /3g'| sed 's/@/ /g'|awk '{print $1,$2,$3,$4,$5,$8,$9,$10,$11}'|sed 's/ /,/g2'|sed 's/,,,/,/g'|sed 's/,,/,/g' >db_ftp

The above four commands parse the submissions log and generate four files (db_http, db_link, db_tftp, and db_ftp) that are ready to be imported into the database with the following fields:
"date time, attacker_ip, sensor, protocol, malware_srv, malware_srv_port, file, md5"
(the date and time remain together in the first comma-separated field, matching the datetime column in the schema below).

Now it is easy to create a database with the above fields and import the 4 generated files into it.

#mysql --user=root --password=password

CREATE DATABASE nepenthes;
USE nepenthes;

CREATE TABLE submissions (
date datetime NOT NULL default '0000-00-00 00:00:00',
attacker_ip varchar(80) NOT NULL default '',
sensor varchar(80) NOT NULL default '',
protocol varchar(80) NOT NULL default '',
malware_srv varchar(80) NOT NULL default '',
malware_srv_port varchar(80) NOT NULL default '',
file varchar(80) NOT NULL,
md5 varchar(80) NOT NULL
);

ALTER TABLE submissions ADD `uniqueid` VARCHAR(32) NOT NULL default '';
ALTER TABLE submissions ADD INDEX ( `attacker_ip` );
ALTER TABLE submissions ADD INDEX ( `sensor` );
ALTER TABLE submissions ADD INDEX ( `malware_srv` );
ALTER TABLE submissions ADD INDEX ( `md5` );


LOAD DATA INFILE '/usr/local/src/nepenthes/db_http' INTO TABLE submissions FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
LOAD DATA INFILE '/usr/local/src/nepenthes/db_tftp' INTO TABLE submissions FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
LOAD DATA INFILE '/usr/local/src/nepenthes/db_ftp' INTO TABLE submissions FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
LOAD DATA INFILE '/usr/local/src/nepenthes/db_link' INTO TABLE submissions FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';


Please adjust the four file paths above to match where your files are located.

Now you have the logs loaded into your database.
The next step is to automate this process. I will not go through the automation here, but it should be easy using a daily cron job on the Nepenthes sensor to stop Nepenthes, rename the submissions log file, and send the renamed file to a remote reporting server, where the import above also runs from a daily cron job.
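Those cron jobs could be sketched as crontab entries like the ones below. Everything here is an assumption to adapt: the log path, the host name reporting-server, the init script location, and the import.sh wrapper that would run the parsing and LOAD DATA steps from above.

```shell
# Hypothetical /etc/cron.d entry on the Nepenthes sensor: stop the daemon,
# rotate the submissions log, ship it to the reporting server, restart.
0 1 * * * root /etc/init.d/nepenthes stop && mv /var/log/nepenthes/logged_submissions /tmp/sub && scp /tmp/sub reports@reporting-server:/usr/local/src/nepenthes/sub && /etc/init.d/nepenthes start

# Hypothetical entry on the reporting server: a script wrapping the parsing
# commands and the LOAD DATA statements above.
30 1 * * * root /usr/local/src/nepenthes/import.sh
```

Remember that crontab entries must stay on a single line, and that a literal % inside one must be escaped as \%.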

The second part is to generate web reports with GeoIP location. I used the free MaxMind GeoIP Country database: first install the MaxMind lite database locally, then use the sample code below. It is just a proof of concept; you should spend some time on it to make it user friendly...
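As a starting point for the GeoIP lookups, the geoiplookup command-line tool that ships with the MaxMind C library (the geoip-bin package on Debian) resolves an IP to a country. The snippet below parses a hard-coded sample of its output, so it runs without the GeoIP database installed; on a live system you would feed geoiplookup the attacker_ip or malware_srv values selected from the submissions table:

```shell
# Sample geoiplookup output, hard-coded here as an assumption so the snippet
# runs without the GeoIP database installed:
sample='GeoIP Country Edition: US, United States'
# Extract the two-letter country code for use in a report:
country=$(echo "$sample" | sed 's/^GeoIP Country Edition: \([A-Z][A-Z]\),.*/\1/')
echo "$country"
# -> US
```

On a live setup, `geoiplookup 1.2.3.4` produces a line in this format, and the same parsing applies to each IP pulled from the database.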