Problems parallelizing btests

btest:topic/robin/parallel has a version of btest that can run tests
in parallel. That works pretty well, except for two issues with Bro's
standard tests:

- The coverage analysis doesn't like running in parallel, it messes
   up the state file. Jon, do you think we could get that to work
   somehow?

- The communication tests can't be parallelized because they use the
   same port. The btest in the branch supports groups of tests that
   are executed sequentially, which solves the problem. But
   unfortunately that comes at the expense of losing much of the
   speed-up that parallelizing would otherwise be able to achieve
   (because the communication tests take the longest).

   I'm wondering if we could randomize the ports being used in some
   form. But I'm not sure what that would look like.

Other than these, it's actually pretty cool to run the tests in
parallel. :-)

Btw, that branch also has a new option to rerun only tests that failed
last time.

Robin

I'm wondering if we could randomize the ports being used in some
form. But I'm not sure what that would look like.

How about we read in the port to use as an environment variable? Btest could just set that before running each test (maybe we could limit it to only set it for communication tests?).

Other than these, it's actually pretty cool to run the tests in
parallel. :-)

Looking forward to playing with it!

  .Seth

Do they really need to be random or just unique for each test? If the latter, maybe the port could be derived from the test names which themselves could be numbered.

- The coverage analysis doesn't like running in parallel, it messes
  up the state file. Jon, do you think we could get that to work
  somehow?

Yeah, what I'm thinking is to have Brofiler.cc pass BRO_PROFILER_FILE through mkstemp() instead of fopen() and then change that env. var. in the btest.cfg files to use some .XXXXX suffix so that each bro instance writes coverage state to a unique file. Then I've already got a script in testing/scripts/coverage-calc that glues coverage files together. I'll go ahead and try that real quick and commit if it works out.
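As a sketch of the idea (in Python rather than the C++ of Brofiler.cc, and with a made-up template path), expanding a trailing ".XXXXXX" suffix via mkstemp() could look like:

```python
# Illustrative sketch only: Brofiler.cc would do this in C++ with
# mkstemp(3); here is the same idea in Python. The template path used
# below is made up for illustration.
import os
import tempfile

def unique_coverage_file(template):
    """If the path ends in ".XXXXXX", create a unique file via mkstemp()."""
    if template.endswith(".XXXXXX"):
        d, base = os.path.split(template[:-7])   # strip the ".XXXXXX" suffix
        fd, path = tempfile.mkstemp(prefix=base + ".", dir=d or None)
        os.close(fd)                             # Bro would fdopen() instead
        return path
    return template                              # no template: use path as-is
```

Each Bro instance then writes its coverage state to its own file, and a post-processing step (like coverage-calc) glues them together.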

+Jon

How about we read in the port to use as an environment variable?
Btest could just set that before running each test

Do they really need to be random or just unique for each test? If the
latter, maybe the port could be derived from the test names which
themselves could be numbered.

Just unique is indeed fine. How about a combination of the two: btest
numbers all tests internally and passes the current test's number on
via an environment variable. The test can then derive a port from
that, and it's a bit more general in that we might end up using the
number for other purposes too.

The remaining piece is then using the environment variable to
configure the port for Bro and Broccoli. It would be nice if we could
do that centrally somehow, rather than manually in each test that
needs it.
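A minimal sketch of that derivation, assuming btest exports the test number in an environment variable (the name BTEST_TEST_NUMBER and the base port 20000 are made up for illustration, not actual btest names):

```python
# Sketch only: derive a unique port from a per-test number that btest
# would export. "BTEST_TEST_NUMBER" and the base port are illustrative
# assumptions.
import os

def test_port(base=20000):
    """Base port plus the test's number; unique as long as numbers are."""
    return base + int(os.environ.get("BTEST_TEST_NUMBER", "0"))
```

Bro and Broccoli test setups could then read the same variable in one central place instead of hard-coding a port in each test.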

This branch could use some testing as well, btw. I've also
restructured things internally a bit. README isn't updated yet but the
new options are:

  -j THREADS, --jobs=THREADS
                        number of threads running tests simultaneously; 0
                        disables threading
  -g GROUP, --group=GROUP
                        execute only tests of the given group, or '-' for
                        those without any group
  -r, --rerun           execute commands for tests that failed last time

(For the Bro tests, one currently needs to remove the
BRO_PROFILER_FILE variable from btest.cfg to make it work.)

Robin

It appears that OutputHandlers.py isn't getting installed. When I run
btest (after doing "python setup.py install"), I see this error:

ImportError: No module named OutputHandlers

Oops, yeah. I always run it directly from the source directory, which
is why I didn't notice. Will fix.

Robin

It seems that using the "-f" option (without "-b" or "-v") now
prevents the status message for each test from being output.
The following patch should fix this bug:

--- a/btest
+++ b/btest
@@ -927,11 +927,9 @@ if Options.diagfile:

  if Options.verbose:
      output_handlers += [Verbose(Options, )]