Member Site › Forums › Rosetta 3 › Rosetta 3 – Build/Install › rosetta mpi build
- This topic has 4 replies, 3 voices, and was last updated 5 years, 2 months ago by Anonymous.
October 18, 2019 at 6:58 am #3280 Anonymous
I’m trying to build Rosetta with MPI, and it fails to build because `src/utility/crash_report.cc` line 178 is missing a semicolon at the end of the line.
This happens with both version 3.11 and 2019.40 (the latest).
Also, is it possible to build Rosetta with CMake?
I’m currently trying to build with CMake in the `cmake/build_sharedmpi` directory using Ninja and the Intel compiler, but it fails.
Is there a recommended compiler for this build (maybe GCC)?
-
October 18, 2019 at 1:11 pm #15017 Anonymous
You’re right, this looks like a bug. We can fix this in master, but the fix may take some time to reach you. Are you comfortable editing `Rosetta/main/source/src/utility/crash_report.cc` and adding a semicolon at the end of that line? That should fix your problem immediately.
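For anyone else hitting this before the fix lands in master: the edit is literally one character. Here is a hypothetical illustration of the pattern; the actual statement at `crash_report.cc:178` differs, and this is not the real Rosetta code — just a sketch of why a missing trailing semicolon stops the whole build:

```cpp
#include <csignal>

// Hypothetical stand-in for a signal-handler registration like the one
// in crash_report.cc. The real Rosetta code differs.
void crash_handler(int /*sig*/) {
    // ... write a crash report ...
}

// If the std::signal() call below were written without its trailing
// semicolon, the compiler would report something like
// "error: expected ';'" and the build would fail at this file.
bool install_handler() {
    return std::signal(SIGSEGV, crash_handler) != SIG_ERR;  // semicolon required
}
```

With the semicolon in place, the file compiles and the handler installs normally.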
-
October 21, 2019 at 6:14 am #15023 Anonymous
Yes, I already fixed the source code on my PC, and that file now compiles fine.
-
October 18, 2019 at 3:23 pm #15018 Anonymous
Out of curiosity, which MPI implementation are you using? (This error should only show up if you’re using an implementation of MPI that we didn’t consider, and I thought we had considered most of the major ones.)
Regarding CMake, the CMake compile is somewhat supported, but to a rather limited extent. (Mainly just for internal development people to use.) Scons is still the official way to build Rosetta. — One issue with CMake is that there are different levels of support for the various builds. The regular release and debug builds are well-used under CMake, but some of the other builds are less so. In particular, some of the more niche builds (build_sharedmpi is probably one of them) aren’t necessarily robust. They may have been written for a particular person on a particular system, without regard for how well they work on other machines or with other compilers. (For build_sharedmpi I’d probably assume GCC with OpenMPI, though I don’t know for sure.)
Generally, if you’re having problems building, I would recommend sticking with the scons build commands.
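For reference, the stock scons invocation for an MPI release build looks like the following. The `-j8` value is an assumption — set it to your available core count — and `extras=mpi` assumes your MPI compiler wrappers are reachable (you may need to point `site.settings` at your MPI installation on a cluster):

```shell
# Run from the Rosetta source directory (path assumed from the log above).
cd Rosetta/main/source
./scons.py -j8 mode=release extras=mpi bin
```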
-
October 21, 2019 at 5:01 pm #15024 Anonymous
I’m currently trying to compile it on a computer cluster, which uses HPE MPI or the Intel MPI library.
I loaded the Intel MPI library, which is based on MPICH2, so it might work if I change some build options. But I’m not sure, because the error during the CMake compile looks like an executable-linking problem, as shown below.
[1/597] Linking CXX executable torsional_potential_corrections
FAILED: torsional_potential_corrections
: && /home/app/intel/impi/2018.4.274/bin64/mpicxx -O3 -finline-limit=20000 -s -pipe -w -O3 -ffast-math -fno-finite-math-only -funroll-loops -finline-functions -finline-limit=20000 -s -std=c++11 -pipe -ffor-scope -ftemplate-depth-256 -fPIC -DBOOST_ERROR_CODE_HEADER_ONLY -DBOOST_SYSTEM_NO_DEPRECATED -I /usr/include -I /usr/local/include -I src -I external/include -I src/platform/linux -Wl,--no-as-needed -O3 -ffast-math -fno-finite-math-only -funroll-loops -finline-functions -finline-limit=20000 -s -Wall -Wextra -pedantic -Werror -Wno-long-long -Wno-strict-aliasing -Wno-unused-variable -Wno-unused-parameter -Wno-unused-function -rdynamic CMakeFiles/torsional_potential_corrections.dir/home/userid/rosetta_src_2019.35.60890_bundle/main/source/src/apps/public/weight_optimization/torsional_potential_corrections.cc.o -o torsional_potential_corrections -L/home/userid/rosetta_src_2019.35.60890_bundle/main/source/cmake/build_sharedmpi/../../external/boost_1_55_0 -L/home/userid/rosetta_src_2019.35.60890_bundle/main/source/cmake/build_sharedmpi/../../external/lib -Wl,-rpath,/home/userid/rosetta_src_2019.35.60890_bundle/main/source/cmake/build_sharedmpi/../../external/boost_1_55_0:/home/userid/rosetta_src_2019.35.60890_bundle/main/source/cmake/build_sharedmpi/../../external/lib:/home/userid/rosetta_src_2019.35.60890_bundle/main/source/cmake/build_sharedmpi: libdevel.so libprotocols.8.so libprotocols.7.so libprotocols_e.6.so libprotocols_d.6.so libprotocols_c.6.so libprotocols_b.6.so libprotocols_a.6.so libprotocols_h.5.so libprotocols_g.5.so libprotocols_f.5.so libprotocols_e.5.so libprotocols_d.5.so libprotocols_c.5.so libprotocols_b.5.so libprotocols_a.5.so libprotocols.4.so libprotocols.3.so libprotocols_b.2.so libprotocols_a.2.so libprotocols.1.so libcore.5.so libcore.4.so libcore.3.so libcore.2.so libcore.1.so libbasic.so libnumeric.so libutility.so libObjexxFCL.so -lz -ldl -lstdc++ libcifparse.so liblibxml2.so libcppdb.so libsqlite3.so && :
libprotocols_a.2.so: undefined reference to `protocols::genetic_algorithm::EntityElement::operator=(protocols::genetic_algorithm::EntityElement const&)'
ninja: build stopped: subcommand failed.
I’m also trying scons, though it stops compiling with a different error message, which might be caused by my environment variables.
Actually, I successfully compiled it on my local PC (with OpenMPI installed), so this might be compiler-dependent.