offloading the processing to different nodes

Dear all,

Let's imagine that I have a low-end machine capturing traffic and I want to send the pre-filtered events to a beefier remote machine for analysis and event capturing. Can I do that?

Based on Bro's Input framework, I believe I can redirect an entire tcpdump into it, BUT I want some filtering to happen up front, while the MAIN processing work is executed somewhere else.
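To give a concrete idea of what I mean by filtering up front, I was picturing something along these lines on the capture box (the interface name and the BPF expression are just placeholders for my actual setup):

    # capture only the traffic I care about and write a filtered pcap,
    # then feed that pcap to Bro on the beefy machine
    tcpdump -i eth0 -s 0 -w filtered.pcap 'not port 22 and (port 80 or port 443)'
    bro -r filtered.pcap local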

From what I understand (based on this architectural description), my low-end computer in charge of the sniffing would run the “manager” code and my beefy machine(s) would run the workers. Is that how I would set things up?
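For concreteness, here is roughly how I imagined the node.cfg for that layout (hostnames and the interface are made up, and I may well have the roles the wrong way around):

    # low-end box that sees the traffic
    [manager]
    type=manager
    host=lowend.example.com

    # the docs suggest a proxy is also needed somewhere
    [proxy-1]
    type=proxy
    host=beefy1.example.com

    # beefy machine(s) doing the heavy lifting
    [worker-1]
    type=worker
    host=beefy1.example.com
    interface=eth0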

And who writes the outputs: is it the workers, OR do the workers pass the results back to the manager?

Also, if I were to use the File Analysis framework, would it be possible to extract and analyze the files on the beefy computers instead of on the manager node? I suspect I'd have to transfer the full connection flow (so the file can be extracted), and that would generate a LOT of traffic (which is something I want to avoid). Are my assumptions correct?
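For reference, the kind of extraction I have in mind is the usual File Analysis hook, something like the following (the output filename is just an illustrative placeholder, and I'm not sure on which node this handler would actually end up running):

    event file_new(f: fa_file)
        {
        # attach the extraction analyzer to every file Bro sees;
        # the extract_filename here is only an example name
        Files::add_analyzer(f, Files::ANALYZER_EXTRACT,
                            [$extract_filename=fmt("extracted-%s", f$id)]);
        }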

Thank you,

Luis Silva