Saturday, July 16, 2005

Network Monitoring: Storage of capture data

I recently experimented with storing pcap capture data in a MySQL database so that I could analyze it and look for trends. I set tcpdump to roll the capture into 20MB full-content files so they would be easy to manipulate:

tcpdump -s 1515 -C 20 -w content.lpc
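
(For reference: -s 1515 raises the snaplength enough to capture full Ethernet frames, -C 20 rolls the capture over to a new file at roughly every 20 million bytes, and -w content.lpc writes the raw packets; tcpdump appends a number to each file after the first.)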

I next created a Ruby script that opens each pcap file and writes the fields I want to keep to a CSV file, which I then bulk load into the MySQL database. This part worked very well and very quickly. The surprise was the size: even though I store only the source IP, the destination IP and port, and the packet timestamp in an InnoDB table, 20 capture files (about 400MB of raw pcap) swelled to 1GB of table space. Not only that, it turned out to be over 1.3 million packets. That much data is really testing my SQL skills as I try to write intelligent queries that aggregate it on specific parameters.
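
For reference, the extraction script is essentially this. A minimal sketch, assuming the ruby-pcap bindings; the class and accessor names (Pcap::Capture.open_offline, ip_src, tcp_dport, and so on) are from memory, so check them against your installed version:

require 'pcap'

# Sketch: walk every packet in a capture file and write one CSV row
# per TCP packet: src IP, dst IP, dst port, timestamp.
infile = ARGV[0] || 'content.lpc'

File.open('packets.csv', 'a') do |csv|
  cap = Pcap::Capture.open_offline(infile)
  cap.each_packet do |pkt|
    next unless pkt.kind_of?(Pcap::TCPPacket)   # TCP only in this sketch
    csv.puts [pkt.ip_src, pkt.ip_dst, pkt.tcp_dport,
              pkt.time.strftime('%Y-%m-%d %H:%M:%S')].join(',')
  end
  cap.close
end

On the MySQL side, the table and bulk load look roughly like this (the schema just mirrors the CSV, keeping the IPs as dotted-quad strings):

CREATE TABLE packets (
  src_ip   VARCHAR(15) NOT NULL,
  dst_ip   VARCHAR(15) NOT NULL,
  dst_port SMALLINT UNSIGNED NOT NULL,
  pkt_time DATETIME NOT NULL,
  INDEX (src_ip),
  INDEX (dst_port)
) ENGINE=InnoDB;

LOAD DATA LOCAL INFILE 'packets.csv' INTO TABLE packets
  FIELDS TERMINATED BY ',';

A typical aggregation is then something like the top source/port pairs by packet count:

SELECT src_ip, dst_port, COUNT(*) AS pkts
FROM packets
GROUP BY src_ip, dst_port
ORDER BY pkts DESC
LIMIT 20;

Storing the addresses as INT UNSIGNED via INET_ATON()/INET_NTOA() instead of VARCHAR(15) would shrink the rows quite a bit, and InnoDB adds its own overhead for the clustered index and every secondary index, which probably explains some of the 400MB-to-1GB bloat.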

Anyone have any better solutions?

1 comment:

Josh Miller said...

Try looking at 100+GB of data in Ethereal and making sense of it... :)

I need something more robust, with much more capacity. Right now I'm building a distributed Ruby solution that runs on 4+ machines at 100% CPU, 24/7.
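
In case it helps anyone picture it, the shape is roughly this. A sketch only, using DRb (the "distributed Ruby" library in Ruby's standard library); the WorkQueue class, the druby:// URI, and the pcap2csv.rb helper are all made up for illustration:

require 'drb'
require 'thread'   # for Mutex on older Rubies

# Coordinator: hands capture files out to remote workers one at a time.
class WorkQueue
  def initialize(files)
    @files = files
    @lock  = Mutex.new
  end

  # Workers call this remotely to claim the next file; nil means done.
  def next_file
    @lock.synchronize { @files.shift }
  end
end

if ARGV[0] == 'coordinator'
  queue = WorkQueue.new(Dir.glob('/captures/content.lpc*'))
  DRb.start_service('druby://0.0.0.0:9000', queue)
  DRb.thread.join
else
  # Worker: pulls file names from the coordinator until the queue is empty.
  DRb.start_service
  queue = DRbObject.new(nil, 'druby://coordinator-host:9000')
  while (file = queue.next_file)
    system('ruby', 'pcap2csv.rb', file)   # hypothetical extraction script
  end
end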