A suite of regression tests to help automate validation of new builds/features in j2ms2/tConvert/other tools



Provide regression tests for j2ms2 / tConvert that can be run in a (semi-)automated fashion. The programs translate JIVE software correlator (SFXC) output into CASA MeasurementSet and FITS-IDI format.

The goal of this test suite is to verify that changes in the source code do not unexpectedly alter the data and metadata content, by running the toolchain-under-test on data with known properties.

The test suite tests the following:

  • verify that the j2ms2 and tConvert under test run successfully
  • compare several aspects of the newly created output with pre-existing known-good output:
    • data content is still the same (integrated weight per baseline/source combination)
    • metadata content is still the same (antenna, source, and frequency table content)
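
The weight comparison can be pictured with a small, self-contained sketch. This is an illustration only: the function name `compare_integrated_weights`, the dict representation, and the tolerance are assumptions, not the suite's actual code, which extracts the values from the MeasurementSet itself.

```python
# Hypothetical sketch of the data-content check: assume the integrated
# weights arrive as plain dicts keyed by (baseline, source) combination.

def compare_integrated_weights(known_good, under_test, rel_tol=1e-6):
    """Return a list of human-readable discrepancies (empty means 'pass')."""
    problems = []
    if set(known_good) != set(under_test):
        problems.append("baseline/source combinations differ: %r vs %r"
                        % (sorted(known_good), sorted(under_test)))
        return problems
    for key, expected in known_good.items():
        actual = under_test[key]
        if abs(actual - expected) > rel_tol * abs(expected):
            problems.append("%r: weight %g != known-good %g"
                            % (key, actual, expected))
    return problems

# Identical weights pass, a changed weight is flagged
good = {("EF-WB", "3C84"): 1024.0, ("EF-O8", "3C84"): 980.5}
assert compare_integrated_weights(good, dict(good)) == []
assert compare_integrated_weights(good, {("EF-WB", "3C84"): 1000.0,
                                         ("EF-O8", "3C84"): 980.5}) != []
```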

The test suite cleans up after itself if all tests pass without errors. Otherwise, all artefacts (produced data sets and log files) are left in place and a message is printed containing their location.


The regression test suite depends on the following other tools:

  • Python
  • jplotter

j2ms2 and tConvert come as a pair, typically built or installed from the same source distribution. The framework uses Python's unittest module for execution. However, because `python -m unittest discover` does not allow passing arguments through to the tests, a workaround was introduced.

```
$> cd /path/to/checked-out-jive-toolchain-regression
$> ./main [options]
```

You can run `./main -h` for help on `[options]`.

Which python and/or jplotter to use?

Pass any/all of the following options to main to set the path to the desired version to use:

```
--python PATH       Path to the Python interpreter to use
--jplotter PATH     Path to the jplotter program to use
```

Which j2ms2/tConvert to test?

An optional parameter selects which j2ms2/tConvert to test. By default the program searches your $PATH environment variable for j2ms2 and tConvert and uses those.
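
The default lookup behaves like Python's `shutil.which`. The sketch below demonstrates the idea against a fabricated PATH entry; the helper `locate_toolchain` is illustrative, not part of the suite:

```python
import os
import shutil
import stat
import tempfile

def locate_toolchain(path=None):
    """Find j2ms2 and tConvert by searching the PATH
    (or an explicit search path, handy for testing)."""
    return {tool: shutil.which(tool, path=path)
            for tool in ("j2ms2", "tConvert")}

# Demonstration: a fake PATH directory containing dummy executables
fake_bin = tempfile.mkdtemp()
for tool in ("j2ms2", "tConvert"):
    exe = os.path.join(fake_bin, tool)
    with open(exe, "w") as f:
        f.write("#!/bin/sh\n")
    os.chmod(exe, os.stat(exe).st_mode
                  | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)

found = locate_toolchain(path=fake_bin)
```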

Since the toolchain moved to using CMake, the binaries can be in slightly different locations, depending on whether the freshly built binaries or the installed binaries are desired.

The test code supports the following (mutually exclusive) command line options:

```
--cmake-build-dir DIR       use binaries DIR/app/{j2ms2,tConvert}/{j2ms2,tConvert}
--cmake-install-dir DIR     use binaries DIR/bin/{j2ms2,tConvert}
```
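
Taken together, the option-to-path mapping can be sketched as follows. The helper `toolchain_paths` is hypothetical (the real argument handling lives in `main`), but the directory layouts match the two options above:

```python
import os

def toolchain_paths(cmake_build_dir=None, cmake_install_dir=None):
    """Map the mutually exclusive --cmake-build-dir / --cmake-install-dir
    options onto binary locations; with neither option given, fall back
    to bare names so the $PATH lookup applies."""
    if cmake_build_dir and cmake_install_dir:
        raise ValueError("--cmake-build-dir and --cmake-install-dir "
                         "are mutually exclusive")
    tools = ("j2ms2", "tConvert")
    if cmake_build_dir:
        # CMake build tree: DIR/app/<tool>/<tool>
        return {t: os.path.join(cmake_build_dir, "app", t, t) for t in tools}
    if cmake_install_dir:
        # Installed tree: DIR/bin/<tool>
        return {t: os.path.join(cmake_install_dir, "bin", t) for t in tools}
    return {t: t for t in tools}  # rely on $PATH lookup
```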