Unit Tests failing for Rosetta installed with MPI on Ubuntu 16.04
This topic has 15 replies, 4 voices, and was last updated 7 years, 11 months ago by Anonymous.
-
June 14, 2016 at 5:11 am #2440 (Anonymous)
Dear Users,
I have been trying to install and run the Rosetta 2016.15 weekly build (with MPI) on a Core i7 system running Ubuntu 16.04. For the build, I used the following command:
> python ./scons.py mode=release extras=mpi bin
and compiled the test scripts using:
> python ./scons.py mode=release extras=mpi cat=test
The system already has OpenMPI 1.10.2 installed. The build completed without errors. However, when I ran the unit tests, I got numerous errors, with only 27% of tests passing.
The following is the output of the unit tests:
> python test/run.py --mode=release --extras=mpi
Identifying platform…
Platform found: release/linux/4.4/64/x86/gcc/5.3/mpi
Running one suite: RestrictNonSurfaceToRepackingOperationTests
Test suite: RestrictNonSurfaceToRepackingOperationTests (test/protocols/toolbox/task_operations/RestrictNonSurfaceToRepackingOperation.cxxtest.hh)
[bplab-PC:26395] *** Process received signal ***
[bplab-PC:26395] Signal: Segmentation fault (11)
[bplab-PC:26395] Signal code: (128)
[bplab-PC:26395] Failing at address: (nil)
[bplab-PC:26395] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x354a0)[0x7fb1d766e4a0]
[bplab-PC:26395] [ 1] /lib/x86_64-linux-gnu/libc.so.6(strlen+0x26)[0x7fb1d76c3d16]
[bplab-PC:26395] [ 2] /usr/local/lib/libopen-pal.so.13(opal_argv_join+0x44)[0x7fb1d6407974]
[bplab-PC:26395] [ 3] /usr/local/lib/libmpi.so.12(ompi_mpi_init+0x29c)[0x7fb1d82e571c]
[bplab-PC:26395] [ 4] /usr/local/lib/libmpi.so.12(MPI_Init+0x15d)[0x7fb1d8308c2d]
[bplab-PC:26395] [ 5] /home/bp-lab/install-fold/rosetta_src_2016.15.58628_bundle/main/source/build/src/release/linux/4.4/64/x86/gcc/5.3/mpi/libcore.5.so(_ZN4core4init8init_mpiEiPPc+0x43)[0x7fb1dd9e2813]
[bplab-PC:26395] [ 6] /home/bp-lab/install-fold/rosetta_src_2016.15.58628_bundle/main/source/build/src/release/linux/4.4/64/x86/gcc/5.3/mpi/libcore.5.so(_ZN4core4init4initEiPPc+0x1a)[0x7fb1dd9e700a]
[bplab-PC:26395] [ 7] ./protocols.test[0x6830ab]
[bplab-PC:26395] [ 8] ./protocols.test[0xb39ce9]
[bplab-PC:26395] [ 9] ./protocols.test[0x53ad20]
[bplab-PC:26395] [10] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf0)[0x7fb1d7659830]
[bplab-PC:26395] [11] ./protocols.test[0x53ae59]
[bplab-PC:26395] *** End of error message ***
Running one suite: FragmentCandidatesTests
Test suite: FragmentCandidatesTests (test/protocols/frag_picker/FragmentCandidatesTests.cxxtest.hh)
[bplab-PC:26400] *** Process received signal ***
[bplab-PC:26400] Signal: Segmentation fault (11)
[bplab-PC:26400] Signal code: Address not mapped (1)
[bplab-PC:26400] Failing at address: 0x21
[bplab-PC:26400] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x354a0)[0x7f40924514a0]
[bplab-PC:26400] [ 1] /lib/x86_64-linux-gnu/libc.so.6(strlen+0x26)[0x7f40924a6d16]
[bplab-PC:26400] [ 2] /usr/local/lib/libopen-pal.so.13(opal_argv_join+0x44)[0x7f40911ea974]
[bplab-PC:26400] [ 3] /usr/local/lib/libmpi.so.12(ompi_mpi_init+0x29c)[0x7f40930c871c]
[bplab-PC:26400] [ 4] /usr/local/lib/libmpi.so.12(MPI_Init+0x15d)[0x7f40930ebc2d]
[bplab-PC:26400] [ 5] /home/bp-lab/install-fold/rosetta_src_2016.15.58628_bundle/main/source/build/src/release/linux/4.4/64/x86/gcc/5.3/mpi/libcore.5.so(_ZN4core4init8init_mpiEiPPc+0x43)[0x7f40987c5813]
[bplab-PC:26400] [ 6] /home/bp-lab/install-fold/rosetta_src_2016.15.58628_bundle/main/source/build/src/release/linux/4.4/64/x86/gcc/5.3/mpi/libcore.5.so(_ZN4core4init4initEiPPc+0x1a)[0x7f40987ca00a]
[bplab-PC:26400] [ 7] ./protocols.test[0x5d5d6b]
[bplab-PC:26400] [ 8] ./protocols.test[0xb2fc0e]
[bplab-PC:26400] [ 9] ./protocols.test[0xb39d3d]
[bplab-PC:26400] [10] ./protocols.test[0x53ad20]
[bplab-PC:26400] [11] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf0)[0x7f409243c830]
[bplab-PC:26400] [12] ./protocols.test[0x53ae59]
[bplab-PC:26400] *** End of error message ***
Running one suite: EntityCorrespondenceTests
Test suite: EntityCorrespondenceTests (test/protocols/pack_daemon/EntityCorrespondence.cxxtest.hh)
[same segmentation fault and MPI_Init backtrace as above]
Running one suite: RestrictToCDRH3LoopTest
Test suite: RestrictToCDRH3LoopTest (test/protocols/toolbox/task_operations/RestrictToCDRH3Loop.cxxtest.hh)
[same segmentation fault and MPI_Init backtrace as above]
Running one suite: SimpleCycpepPredictApplicationTests
Test suite: SimpleCycpepPredictApplicationTests (test/protocols/cyclic_peptide_predict/SimpleCycpepPredictApplication.cxxtest.hh)
[same segmentation fault and MPI_Init backtrace as above]
Running one suite: SandwichFeaturesTests
Test suite: SandwichFeaturesTests (test/protocols/features/SandwichFeaturesTests.cxxtest.hh)
[same segmentation fault and MPI_Init backtrace as above]
Running one suite: JD2ResourceManagerJobInputterMultipleJobTagsTests
Test suite: JD2ResourceManagerJobInputterMultipleJobTagsTests (test/protocols/jd2/JD2ResourceManagerJobInputterMultipleJobTags.cxxtest.hh)
(0) Adding implicit resource 'pdb_resource_1xu1FH_D.pdb' for job whose startstruct is given as pdb='1xu1FH_D.pdb'.
[the line above was printed seven times in total]
Running one suite: MakeCanonicalHelixTest
Test suite: MakeCanonicalHelixTest (test/protocols/membrane/benchmark/MakeCanonicalHelixTest.cxxtest.hh)
[same segmentation fault and MPI_Init backtrace as above]
A lot more of the above followed. I noticed a lot of segmentation faults, and all of the errors seem to involve the MPI shared libraries, particularly the MPI_Init call made from libcore.5.so.
I also tested a Rosetta build without MPI; although it fared much better (97% of tests passed), the result was still not 100%.
Kindly advise as to how these errors may be rectified.
On the same note, I need one clarification: do we need to build Rosetta with MPI to run it on all 8 cores of an i7 system? I was wondering whether Rosetta built without MPI can use all 8 cores (I suspect it cannot, as during the unit tests I noticed that only a single core was being used). I have also run GROMACS on the same system; it does not require MPI libraries to run on all cores.
Thank you in advance.
Abhishek Acharya
-
June 14, 2016 at 4:48 pm #11633 (Anonymous)
I'm going to guess these are bleeding-edge errors: you are using versions of OpenMPI and gcc newer than what we test against. I can't fix any of them without far more detail than we have here (and a system with similar specs, which I don't have). I've asked our test-server admin to let me know which versions are proofed on the testing server, so you can perhaps try those.
The non-MPI unit-test failures concern me a lot more than the MPI-enabled ones. (I've never run the unit tests under MPI, but I think you can.) Which tests are failing without MPI?
Except for a handful of specific tools, MPI is *never* needed; it just makes file management easier. Running Rosetta on 8 processors independently requires running it in 8 separate directories so the outputs don't collide (input_0001.pdb gets overwritten 7 times, etc.). The main use of MPI is simply to provide enough communication that the processes use different output numbers. So, no, you don't need MPI. Exceptions include mpi_msd, probably loophash, and a few other tools.
Most Rosetta algorithms are Monte Carlo, so you are expected to generate hundreds or thousands of structures and then sort for your "results". This parallelizes perfectly because the runs are independent anyway, which is why you can usually just run on all N processors independently, as in the sketch below.
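As a minimal sketch of what that looks like in practice (the app name, flags file, and paths here are placeholders for whatever you actually run, not a prescription):

# launch 8 independent, non-MPI Rosetta jobs, one working directory each
for i in 1 2 3 4 5 6 7 8; do
    mkdir -p run_$i
    ( cd run_$i && \
      ~/rosetta/main/source/bin/score_jd2.default.linuxgccrelease \
          @../flags -nstruct 125 > log.txt 2>&1 ) &
done
wait   # returns once all 8 background jobs have finished

Each job writes into its own run_$i directory, so nothing gets overwritten, and together the 8 jobs cover your total nstruct.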
-
June 14, 2016 at 10:39 pm #11634 (Anonymous)
Versions we test against:
clang version 3.4.2
GCC: 4.8.3
Open MPI: 1.6.4
We don't run the unit tests under MPI (only selected integration tests), which means that probably none of them are expected to work with MPI. The unit tests certainly do some bizarre bits of setup that may be incompatible with MPI, so we probably ought not to look too deeply into the MPI unit-test failures. Do let me know about the non-MPI unit tests.
-
June 17, 2016 at 4:32 am #11656 (Anonymous)
I will be using Rosetta mostly for active-site design and protein engineering. I also expect to use it for docking and for generating peptide libraries. I can't foresee the exact applications, as we are still exploring Rosetta and its capabilities.
Will it work if I install a local copy of the working (older) gcc version and build Rosetta by pointing it to that local copy? I am not sure whether Rosetta has other dependencies. Please let me know if I need anything else.
Thank you.
-
December 26, 2016 at 2:56 pm #12041 (Anonymous)
I am using gcc 4.8.5 on Ubuntu 16.04.1 LTS.
I ran the unit tests with the command "sudo python test/run.py"; the result is:
Total number of tests: 2451
number tests passed: 2445
number tests failed: 6
failed tests:
protocols.test: StructureDataTests:test_move_segments
protocols.test: StructureDataTests:test_enzdes_remarks
protocols.test: StructureDataTests:test_non_peptidic_bonds
protocols.test: AlignResiduesMoverTests:test_align_theozyme
protocols.test: StructureDataTests:test_delete_segment
protocols.test: AlignResiduesMoverTests:test_tomponent_cstfile
Success rate: 99%
End of Unit test summary
Done!
What should I do about it?
-
June 16, 2016 at 6:34 am #11648 (Anonymous)
Thank you for your response.
Here is the summary of the unit tests on the non-MPI build. This used GCC 5.3.1; clang is not installed on the system.
Unit test summary
Total number of tests: 2285
number tests passed: 2235
number tests failed: 50
failed tests:
numeric.test: GeneralizedEigenSolverTests:test_4x4_complex_problem
protocols.test: DynamicAggregateFunctionTests:test_daf_vector_NPD_PROPERTY_command
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_missing_correspondence_file
protocols.test: AntibodyTaskOps:test_AddCDRProfilesOperation
protocols.test: InterfaceTest:test_InterfaceTest
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_EXPRESSION_command_nocap_IN
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command_missing_listfile2
protocols.test: DynamicAggregateFunctionTests:test_daf_POSE_ENERGY_into_VECTOR_VARIABLE_command
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_VARIABLE_command_func_vecvarname
protocols.test: DynamicAggregateFunctionTests:test_daf_mean_of_vector_variable_command
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command_duplicate_varname
protocols.test: AntibodyTaskOps:test_AddCDRProfileSetsOperation
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_missing_secondary_resfile
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command_missing_listfile
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command_missing_varname2
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_AA_SET_command_missing_right_curly_brace
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_VARIABLE_command_bad_varname
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_SET_CONDITION_command_from_AA_SET
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_EXPRESSION_command_two_local_variables
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command_function_varname
protocols.test: DynamicAggregateFunctionTests:test_initialize_empty_DAF
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_EXPRESSION_command_duplicated_local_varname
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_VARIABLE_command_bad_vecvarname
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_varname_function_name
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_missing_pdb_file
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_SET_CONDITION_command_from_aa_list_missing_lcurly
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command_missing_varname
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_SET_CONDITION_command_from_aa_list
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_EXPRESSION_command_two_local_variables_misplaced_comma
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_SET_CONDITION_command_from_AA_SET_bad_name
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command
protocols.test: DynamicAggregateFunctionTests:test_daf_POSE_ENERGY_command
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_varname_illegal_name
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_missing_pdb_file2
protocols.test: DynamicAggregateFunctionTests:test_daf_scalar_NPD_PROPERTY_command
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_missing_secondary_resfile2
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_EXPRESSION_command
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_varname_duplicated
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_SUB_EXPRESSION_command
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_AA_SET_command_missing_equals_sign
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_read_simple_file
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_SET_CONDITION_command_from_aa_list_missing_comma
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command_illegal_varname
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_missing_correspondence_file2
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_AA_SET_command_missing_commas
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_AA_SET_command
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_VARIABLE_command
protocols.test: TestCCDLoopClosureMover:test_CCDLoopClosureMover_exceptions
utility.test: UtilityNumbersTests:test_undefined
Success rate: 97%
End of Unit test summary
Done!
Please let me know if I need to install an older version of gcc to make this work.
Thanks again.
-
June 16, 2016 at 12:55 pm #11649 (Anonymous)
It's probably "working" fine as-is: those tests are mostly for classes you are unlikely to use. I'll post your compiler version and the list of failing tests to the mailing list to see if anyone is interested in poking at it.
-
June 16, 2016 at 10:32 pm #11655 (Anonymous)
The mailing list thinks there's an issue with std::map in GCC 5.2+ that's being exposed by this code, but nobody's in a position to debug it (you're using newer versions than we are). If you let me know which applications you intend to use, I can estimate whether the code covered by the failing tests is likely to be exercised.
-
June 17, 2016 at 4:17 pm #11659 (Anonymous)
Do you have the clang compiler installed already? If you add "cxx=clang" to the scons command line, it will build with clang (executables will end in .linuxclangrelease). If you then run the unit tests, you will also have to add "--compiler=clang" to the test/run.py command line.
Many people in the Rosetta community use clang, and if you're worried about your gcc build, it might give you some peace of mind. You probably wouldn't want to do an MPI build with clang (unless your sysadmin has set up the MPI libraries for clang), but as you're probably not going to need MPI, that shouldn't matter.
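Putting those flags into the commands used earlier in this thread, that would look something like:
> python ./scons.py mode=release cxx=clang bin
> python test/run.py --mode=release --compiler=clang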
-
June 17, 2016 at 8:25 pm #11668 (Anonymous)
"Will it work if I install a local copy of the working (older) gcc version and build Rosetta by pointing it to that local copy?"
Probably, with some effort. There's a cxx_ver flag to change the gcc version that scons uses; see the sketch below. I'd just use Rosetta as-is and not worry about it, though.
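An untested sketch (it assumes a gcc 4.8 toolchain is installed and visible on your PATH):
> python ./scons.py mode=release cxx=gcc cxx_ver=4.8 bin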
-
June 18, 2016 at 11:20 am #11671 (Anonymous)
I ran the unit tests on Rosetta compiled with clang (without MPI). The version available on this system was clang 3.8.0. The summary is as follows:
Unit test summary
Total number of tests: 2280
number tests passed: 2234
number tests failed: 46
failed tests:
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_AA_SET_command
protocols.test: DynamicAggregateFunctionTests:test_daf_vector_NPD_PROPERTY_command
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_missing_correspondence_file
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_varname_duplicated
protocols.test: InterfaceTest:test_InterfaceTest
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_EXPRESSION_command_nocap_IN
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command_missing_listfile2
protocols.test: DynamicAggregateFunctionTests:test_daf_POSE_ENERGY_into_VECTOR_VARIABLE_command
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_VARIABLE_command_func_vecvarname
protocols.test: DynamicAggregateFunctionTests:test_daf_mean_of_vector_variable_command
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command_missing_listfile
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command_missing_varname2
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_AA_SET_command_missing_right_curly_brace
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_SET_CONDITION_command_from_AA_SET
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_EXPRESSION_command_two_local_variables
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command_function_varname
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_EXPRESSION_command_duplicated_local_varname
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_VARIABLE_command_bad_vecvarname
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_varname_function_name
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_missing_pdb_file
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_SET_CONDITION_command_from_aa_list_missing_lcurly
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command_missing_varname
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_VARIABLE_command
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_SET_CONDITION_command_from_aa_list
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_EXPRESSION_command_two_local_variables_misplaced_comma
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_SET_CONDITION_command_from_AA_SET_bad_name
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command_duplicate_varname
protocols.test: DynamicAggregateFunctionTests:test_daf_POSE_ENERGY_command
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_varname_illegal_name
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_missing_pdb_file2
protocols.test: DynamicAggregateFunctionTests:test_daf_scalar_NPD_PROPERTY_command
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_missing_secondary_resfile2
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_EXPRESSION_command
protocols.test: DynamicAggregateFunctionTests:test_initialize_empty_DAF
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_SUB_EXPRESSION_command
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_AA_SET_command_missing_equals_sign
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_read_simple_file
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_SET_CONDITION_command_from_aa_list_missing_comma
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_VECTOR_command_illegal_varname
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_missing_correspondence_file2
protocols.test: DynamicAggregateFunctionTests:test_EntityFunc_AA_SET_command_missing_commas
protocols.test: DynamicAggregateFunctionTests:test_daf_VECTOR_VARIABLE_command_bad_varname
protocols.test: DynamicAggregateFunctionTests:test_daf_STATE_command_missing_secondary_resfile
protocols.test: TestCCDLoopClosureMover:test_CCDLoopClosureMover_exceptions
Success rate: 97%
End of Unit test summary
Done!
I now notice that smlewis mentioned that Rosetta has been tested with clang 3.4.2; this is probably another bleeding-edge error, since I am using a more recent version of clang. I went back to check whether clang 3.4.2 is available in the Ubuntu software archive, but the oldest version available for install is 3.5; version 3.4 is obsolete for Ubuntu 16.04. I don't know how different clang 3.5 is from 3.4.2, but I will try to compile Rosetta with clang 3.5 (see the commands below) and post the results.
If that fails, I think I will work with Rosetta in its current state for now and hope that none of the failing classes are needed in my case.
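For reference, a sketch of what I plan to try; I am assuming the Ubuntu package is named clang-3.5 and that the cxx_ver scons flag also selects clang versions (neither is verified):
> sudo apt-get install clang-3.5
> python ./scons.py mode=release cxx=clang cxx_ver=3.5 bin
> python test/run.py --mode=release --compiler=clang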
Thank you for your help.
-
January 1, 2017 at 8:14 pm #12046 (Anonymous)
A) It strikes me as odd that you need sudo to run the unit tests (a possible workaround is sketched at the end of this post).
B) Looking back on this thread: all of the failures in DynamicAggregateFunctionTests turned out to be due to errors in our test code (ubsan or addsan errors, I forget which [and don't really know what they are anyway]). Probably there are bugs in our tests that your particular setup is exposing.
C) You don't mention which version of Rosetta you are testing, but recent weeklies (and I think 3.7 as well) are C++11, which gcc 4.8 does not completely support. That might be the issue.
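If a permissions problem is what is forcing the sudo, reclaiming ownership of the checkout usually avoids it (a sketch; substitute your actual Rosetta path):
> sudo chown -R "$USER" /path/to/rosetta/main/source
> cd /path/to/rosetta/main/source && python test/run.py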