I didn’t realize I hadn’t posted this to the forum and had only replied to one person.
First, thank you Daniel for your help.
I’m running Bro 2.4.1 on Ubuntu 12.04 and collecting live traffic.
I have Bro configured to print logs in JSON with timestamps in ISO8601. My problem is that some of the timestamps in the logs are days or even years in the future or past. Below is an example of log lines with timestamps in 2016; I have many other examples I’m happy to share if it would help.
So here are my configs:
For Bro, I use the following to switch the logs to JSON:
@load tuning/json-logs
redef LogAscii::json_timestamps = JSON::TS_ISO8601;
redef LogAscii::use_json = T;
If I search the printed logs on my sensor for dates that are in the future, I see the following (a small script that automates this check is sketched after the excerpt):
zcat -r /nsm/bro/logs/* | grep '"ts":"2016'
{"ts":"2016-08-27T16:10:11.269010Z","uid":"CfxBj3Sb7mz0Dmv8i","id.orig_h":"192.168.10.100","id.orig_p":62662,"id.resp_h":"104.85.62.152","id.resp_p":443,"name":"window_recision","notice":false,"peer":"satcon99-eth2-2"}
{"ts":"2016-02-06T19:13:33.416241Z","fuid":"FJsBOM25KVPMKSxu07","tx_hosts":["23.31.220.33"],"rx_hosts":["17.110.224.215"],"conn_uids":["CIxuLC3WAExVyJisll"],"source":"SSL","depth":0,"analyzers":["SHA1","MD5","X509"],"mime_type":"application/pkix-cert","duration":0.0,"local_orig":true,"is_orig":true,"seen_bytes":674,"missing_bytes":0,"overflow_bytes":0,"timedout":false,"md5":"2a465a2f7f29fc9677683337114d5cb0","sha1":"2f6854f72d69c47ebadd2a859cade1f7492190a2"}
{"ts":"2016-01-30T17:12:50.116824Z","uid":"Cw4Sfa3mwiDWVraPq5","id.orig_h":"23.31.220.33","id.orig_p":12912,"id.resp_h":"17.164.1.22","id.resp_p":443,"proto":"tcp","duration":0.042421,"orig_bytes":0,"resp_bytes":0,"conn_state":"SH","local_orig":true,"local_resp":false,"missed_bytes":0,"history":"FA","orig_pkts":2,"orig_ip_bytes":80,"resp_pkts":0,"resp_ip_bytes":0,"tunnel_parents":[],"resp_cc":"US","sensorname":"satcon99-eth3"}
{"ts":"2016-01-19T00:00:00.078926Z","uid":"CPrS9S3LLs3oGVRAQ2","id.orig_h":"192.168.10.115","id.orig_p":64606,"id.resp_h":"50.31.185.42","id.resp_p":80,"name":"unescaped_special_URI_char","notice":false,"peer":"satcon99-eth2-2"}
{"ts":"2016-10-20T23:59:59.681405Z","fuid":"FQsWce4xIknwzbuBCg","tx_hosts":["54.187.59.44"],"rx_hosts":["192.168.10.143"],"conn_uids":["C8EhsO2mX4IVRshw07"],"source":"HTTP","depth":0,"analyzers":["MD5","SHA1"],"mime_type":"image/gif","duration":0.0,"local_orig":false,"is_orig":false,"seen_bytes":43,"total_bytes":43,"missing_bytes":0,"overflow_bytes":0,"timedout":false,"md5":"325472601571f31e1bf00674c368d335","sha1":"2daeaa8b5f19f0bc209d976c02bd6acb51b00b0a"}
{"ts":"2016-06-09T12:00:00.819970Z","uid":"Con3l44xdVoLgXarc3","id.orig_h":"23.31.220.33","id.orig_p":12721,"id.resp_h":"66.119.33.141","id.resp_p":80,"name":"data_before_established","notice":false,"peer":"satcon99-eth3-1"}
{"ts":"2016-04-20T23:59:59.918852Z","uid":"CBXuzu4OTSZyy7OHS7","id.orig_h":"23.31.220.33","id.orig_p":5109,"id.resp_h":"173.194.123.104","id.resp_p":443,"name":"possible_split_routing","notice":false,"peer":"satcon99-eth3-1"}
{"ts":"2016-06-16T23:59:59.011608Z","uid":"CCmxue4UZiXJPZTcG4","id.orig_h":"192.168.10.104","id.orig_p":57418,"id.resp_h":"10.1.72.2","id.resp_p":5601,"proto":"tcp","duration":3.001216,"orig_bytes":0,"resp_bytes":0,"conn_state":"S0","local_orig":true,"local_resp":false,"missed_bytes":0,"history":"S","orig_pkts":4,"orig_ip_bytes":208,"resp_pkts":0,"resp_ip_bytes":0,"tunnel_parents":[],"sensorname":"satcon99-eth2"}
My assumption is that there is an error in the way Bro converts the UNIX time to ISO8601 when it writes the JSON logs. Is this a bug? Why would the sensor output logs with timestamps in the future?
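For what it’s worth, the epoch-to-ISO8601 step itself seems pretty mechanical. Here is a minimal Python sketch of the conversion I would expect the writer to perform (assuming ts is the usual UNIX epoch double); if the epoch value handed to the writer were already wrong, the ISO string would come out "correctly" wrong, which is part of what I’m trying to rule out:

# Minimal sketch of the epoch -> ISO8601 conversion I would expect,
# assuming ts is seconds.microseconds since the UNIX epoch, in UTC.
from datetime import datetime

def to_iso8601(epoch):
    return datetime.utcfromtimestamp(epoch).strftime("%Y-%m-%dT%H:%M:%S.%fZ")

print(to_iso8601(1440691811.269010))  # roughly 2015-08-27T16:10:11.269010Z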
It has been suggested that I try
redef LogAscii::json_timestamps = JSON::TS_MILLIS;
(Thanks Daniel)
I am going to try that in the morning, but I wanted to know whether anyone else has experienced this issue when printing JSON logs with ISO8601 timestamps.
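Once the TS_MILLIS run has data, my plan is to convert a few of those values back and compare them against the ISO8601 strings above. A quick sketch, assuming TS_MILLIS writes ts as milliseconds since the UNIX epoch:

# Convert an epoch-milliseconds ts back to ISO8601 for comparison.
# Assumes TS_MILLIS writes ts as milliseconds since the UNIX epoch.
from datetime import datetime

def millis_to_iso8601(ms):
    return datetime.utcfromtimestamp(ms / 1000.0).strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z"

print(millis_to_iso8601(1440691811269))  # roughly 2015-08-27T16:10:11.269Z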