bro_json-logs

I am using the following to output bro logs to JSON:

#!/bin/bash

# Configure Bro to write JSON logs

mkdir -p /opt/bro/share/bro/site/scripts
sudo tee /opt/bro/share/bro/site/scripts/json-logs.bro << EOF
@load tuning/json-logs

redef LogAscii::json_timestamps = JSON::TS_ISO8601;
redef LogAscii::use_json = T;
EOF

sudo tee -a /opt/bro/share/bro/site/local.bro << EOF

# Load policy for JSON output

@load scripts/json-logs
EOF

I ship my logs into my ELK stack with Logstash. In Logstash I use a date filter matching ts as ISO8601, and it reads the dates correctly. The issue, I think, is that Bro's JSON output is not writing the timestamps correctly in the first place.
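For reference, the Logstash side of this is just the stock date filter (a sketch; the field name ts is as it appears in my Bro logs):

```
filter {
  date {
    # ISO8601 parses timestamps like 2015-11-17T18:23:15.519645Z
    match => [ "ts", "ISO8601" ]
  }
}
```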

I am getting records with dates all over the map: some in the past (1970), some in the future (2024), and some correct. I have debugged my ELK stack down to the output of the Bro logs on the sensor, and it appears the dates printed in the Bro logs themselves are incorrect, causing strange indices downstream in my ELK stack.

Has anyone had this issue? Is there a bug in the way Bro is reading UNIX time and translating it to ISO8601? Any thoughts on correcting this? I don't mind having the TS in UNIX format and having Logstash do the translating for me, but how would I do that with Bro?
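A sketch of that route, assuming Bro 2.4's JSON::TS_EPOCH option and Logstash's UNIX date pattern (untested here):

```bro
redef LogAscii::use_json = T;
# Leave ts as seconds since the epoch instead of ISO8601
redef LogAscii::json_timestamps = JSON::TS_EPOCH;
```

with the matching Logstash filter:

```
filter {
  date {
    # UNIX parses seconds since the epoch (fractional seconds allowed)
    match => [ "ts", "UNIX" ]
  }
}
```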

Thanks

Do you have an example of one of these log entries?

Copied from my Kibana _source in the JSON view.
(I use a patch to change id.orig_h to id_orig_h for Elasticsearch 2.0.)
{
    "ts": "2015-11-17T18:23:15.519645Z",
    "uid": "CtqA6r1V0ob769KrQh",
    "id_orig_h": "192.168.1.122",
    "id_orig_p": 123,
    "id_resp_h": "17.253.84.253",
    "id_resp_p": 123,
    "proto": "udp",
    "duration": 0.393113,
    "orig_bytes": 48,
    "resp_bytes": 48,
    "conn_state": "SF",
    "missed_bytes": 0,
    "history": "Dd",
    "orig_pkts": 1,
    "orig_ip_bytes": 76,
    "resp_pkts": 1,
    "resp_ip_bytes": 76,
    "tunnel_parents": [],
    "resp_location": "37.323002,-122.032204"
  }

I didn't realize I hadn't posted this to the forum and had only responded to one person.

First, thank you Daniel for your help.

I’m running bro 2.4.1 on Ubuntu 12.04 and collecting live traffic.

I have Bro configured to print logs in JSON with timestamps in ISO8601. My problem is that some timestamps in the logs are printed with days and years in the future and the past. Below are examples of logs with the TS in 2016. I have many other examples I'm happy to share if it would help.

So here are my configs:

for bro I use the following to change the logs to json:

@load tuning/json-logs

redef LogAscii::json_timestamps = JSON::TS_ISO8601;

redef LogAscii::use_json = T;

If I perform a search of the printed logs on my sensor for dates that are in the future I see the following:

zcat -r /nsm/bro/logs/* | grep '"ts":"2016'

{"ts":"2016-08-27T16:10:11.269010Z","uid":"CfxBj3Sb7mz0Dmv8i","id.orig_h":"192.168.10.100","id.orig_p":62662,"id.resp_h":"104.85.62.152","id.resp_p":443,"name":"window_recision","notice":false,"peer":"satcon99-eth2-2"}

{"ts":"2016-02-06T19:13:33.416241Z","fuid":"FJsBOM25KVPMKSxu07","tx_hosts":["23.31.220.33"],"rx_hosts":["17.110.224.215"],"conn_uids":["CIxuLC3WAExVyJisll"],"source":"SSL","depth":0,"analyzers":["SHA1","MD5","X509"],"mime_type":"application/pkix-cert","duration":0.0,"local_orig":true,"is_orig":true,"seen_bytes":674,"missing_bytes":0,"overflow_bytes":0,"timedout":false,"md5":"2a465a2f7f29fc9677683337114d5cb0","sha1":"2f6854f72d69c47ebadd2a859cade1f7492190a2"}

{"ts":"2016-01-30T17:12:50.116824Z","uid":"Cw4Sfa3mwiDWVraPq5","id.orig_h":"23.31.220.33","id.orig_p":12912,"id.resp_h":"17.164.1.22","id.resp_p":443,"proto":"tcp","duration":0.042421,"orig_bytes":0,"resp_bytes":0,"conn_state":"SH","local_orig":true,"local_resp":false,"missed_bytes":0,"history":"FA","orig_pkts":2,"orig_ip_bytes":80,"resp_pkts":0,"resp_ip_bytes":0,"tunnel_parents":[],"resp_cc":"US","sensorname":"satcon99-eth3"}

{"ts":"2016-01-19T00:00:00.078926Z","uid":"CPrS9S3LLs3oGVRAQ2","id.orig_h":"192.168.10.115","id.orig_p":64606,"id.resp_h":"50.31.185.42","id.resp_p":80,"name":"unescaped_special_URI_char","notice":false,"peer":"satcon99-eth2-2"}

{"ts":"2016-10-20T23:59:59.681405Z","fuid":"FQsWce4xIknwzbuBCg","tx_hosts":["54.187.59.44"],"rx_hosts":["192.168.10.143"],"conn_uids":["C8EhsO2mX4IVRshw07"],"source":"HTTP","depth":0,"analyzers":["MD5","SHA1"],"mime_type":"image/gif","duration":0.0,"local_orig":false,"is_orig":false,"seen_bytes":43,"total_bytes":43,"missing_bytes":0,"overflow_bytes":0,"timedout":false,"md5":"325472601571f31e1bf00674c368d335","sha1":"2daeaa8b5f19f0bc209d976c02bd6acb51b00b0a"}

{"ts":"2016-06-09T12:00:00.819970Z","uid":"Con3l44xdVoLgXarc3","id.orig_h":"23.31.220.33","id.orig_p":12721,"id.resp_h":"66.119.33.141","id.resp_p":80,"name":"data_before_established","notice":false,"peer":"satcon99-eth3-1"}

{"ts":"2016-04-20T23:59:59.918852Z","uid":"CBXuzu4OTSZyy7OHS7","id.orig_h":"23.31.220.33","id.orig_p":5109,"id.resp_h":"173.194.123.104","id.resp_p":443,"name":"possible_split_routing","notice":false,"peer":"satcon99-eth3-1"}

{"ts":"2016-06-16T23:59:59.011608Z","uid":"CCmxue4UZiXJPZTcG4","id.orig_h":"192.168.10.104","id.orig_p":57418,"id.resp_h":"10.1.72.2","id.resp_p":5601,"proto":"tcp","duration":3.001216,"orig_bytes":0,"resp_bytes":0,"conn_state":"S0","local_orig":true,"local_resp":false,"missed_bytes":0,"history":"S","orig_pkts":4,"orig_ip_bytes":208,"resp_pkts":0,"resp_ip_bytes":0,"tunnel_parents":[],"sensorname":"satcon99-eth2"}

My assumption is that there is an error in the way Bro converts the UNIX time to ISO8601 for the JSON output. Is this a bug? Why would the sensor output logs with a TS in the future?

It has been suggested to try
redef LogAscii::json_timestamps = JSON::TS_MILLIS;
(Thanks Daniel)

I am trying that in the morning but I wanted to know if there were any other who have experienced this issue when printing json logs with ISO8601 timestamps.

Thank you Daniel and Derek; it appears that changing the Bro timestamps to JSON::TS_MILLIS and using a Logstash date filter matching UNIX_MS has fixed the strange date issue.

This leads me to believe there is some issue in how Bro prints ISO8601 timestamps, but for now this workaround will suffice.
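For anyone hitting the same thing, a sketch of the combination that worked for me (field name ts as in my Bro logs):

```bro
# Print ts as milliseconds since the epoch instead of ISO8601
redef LogAscii::json_timestamps = JSON::TS_MILLIS;
```

```
filter {
  date {
    # UNIX_MS parses integer milliseconds since the epoch
    match => [ "ts", "UNIX_MS" ]
  }
}
```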

Thanks again

That's really odd, because it is not interpreting them. The timestamp is already in seconds; the writer just passes it on to strftime:

    time_t t = time_t(val->val.double_val);
    if ( strftime(buffer, sizeof(buffer), "%Y-%m-%dT%H:%M:%S", gmtime(&t)) > 0 )

the TS_MILLIS setting just outputs val->val.double_val * 1000 as-is