gcc error during MPI compilation, unrecognized option '-plugin'
January 12, 2017 at 3:45 am #2564 Anonymous
Hello,
I am trying to compile the week 46 build (rosetta_bin_linux_2016.46.59086_bundle) on a machine with 1) python/2.7.11, 2) gcc/4.9.2, and 3) openmpi/1.10.1.
I get the following error:
mpicc -o build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/libsqlite3.so -shared build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/dbio/sqlite3/sqlite3.os -L/opt/applications/openmpi/1.10.1/gnu/lib -L/opt/applications/openmpi/1.10.1/gnu/lib/openmpi -L/opt/rh/devtoolset-3/root/usr/lib -L/opt/rh/devtoolset-3/root/usr/lib64 -L/opt/applications/python/2.7.11/gnu/lib -L/opt/applications/python/2.7.11/gnu/third-party/pytom/external/lib -L/usr/lib -L/usr/local/lib -Lexternal/lib
/usr/bin/ld: unrecognized option '-plugin'
/usr/bin/ld: use the --help option for usage information
collect2: error: ld returned 1 exit status
What's the easiest way to get around this? Any suggestion would be highly appreciated.
Thanks!
-
January 12, 2017 at 3:28 pm #12081 Anonymous
I don't see plugin as an option in our settings files, nor do I see it in the error snippet posted. Can you post more of the error? It should be relatively fast to compile at -j1; it will pretty quickly skip to where the error was, and then all the important bits will be at the bottom.
Am I correct in assuming this is a cluster administered by “somebody else” or do you have control of the machine?
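For example, rerunning the build single-threaded would look roughly like this (a sketch; the command just mirrors the one used elsewhere in this thread):

# from rosetta .../main/source; -j1 keeps the output ordered so the failing
# command and the linker error appear right next to each other
python ./scons.py bin -j1 mode=release extras=mpi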
-
January 12, 2017 at 3:43 pm #12082 Anonymous
Are you sure the mpicc is what you think it is? What does `mpicc --version` print for you?
Also, do you know if the mpi compiler is installed correctly? Have you been able to successfully compile other MPI-using programs with it?
If this is a cluster administered by someone else, double-check the documentation or consult the sysadmins to make sure you're setting everything up correctly for an MPI compile. Often with clusters there are cluster-specific module-loading commands that you have to run to make sure the MPI compilation environment is set up correctly. (Attempting to compile without running them first will lead to errors.)
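A rough sketch of that kind of environment check (the module names below simply mirror the ones listed earlier in this thread; your cluster's names may differ):

# load the toolchain the build expects (names are cluster-specific)
module load python/2.7.11 gcc/4.9.2 openmpi/1.10.1

# confirm which wrapper, compiler, and linker are actually being picked up
which mpicc
mpicc --version
mpicc --showme        # OpenMPI wrapper: print the underlying compile/link command
which ld
ld --version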
-
January 13, 2017 at 3:36 pm #12089 Anonymous
Um, if “plugin” doesn’t appear as an argument to ld, then it’s pretty hard for us to debug where “plugin” is appearing.
Maybe your ld isn't the normal linker? Do you have it aliased to something? What does "which ld" return? What does "/usr/bin/ld --help" return?
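Concretely, something along these lines (a sketch of those checks):

type -a ld            # reveals aliases/shell functions as well as every ld on PATH
which ld
ld --version          # GNU ld reports its binutils version here
/usr/bin/ld --version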
-
January 13, 2017 at 6:12 pm #12090 Anonymous
Thanks for your reply!
"which ld" returns "/usr/bin/ld" and I have attached the output of "/usr/bin/ld --help" here. How can I test whether my ld is the normal linker or whether it's aliased to something?
-
January 16, 2017 at 2:20 am #12095 Anonymous
Looks like regular ld to me. I’m kind of stumped, since “plugin” doesn’t occur in the error message. Does non-MPI compiling work? Does the cluster have an MPI hello world tool to check the MPI install?
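If the cluster doesn't provide one, a minimal hello-world check is easy to put together by hand (a sketch, assuming mpicc and mpirun from the loaded OpenMPI module are on the PATH):

# write a tiny MPI program
cat > mpi_hello.c <<'EOF'
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("hello from rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
EOF

# if this link fails with the same "unrecognized option '-plugin'" error,
# the problem is in the compiler/linker environment, not Rosetta's build scripts
mpicc mpi_hello.c -o mpi_hello
mpirun -np 2 ./mpi_hello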
-
January 12, 2017 at 11:37 pm #12086 Anonymous
Yes, you are correct. This is a cluster and I don't have admin control. "plugin" appears nowhere else in the output. Here is more of the error:
mpicc -o build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/libxml2/encoding.os -c -std=c99 -isystem external/boost_1_55_0/ -isystem external/ -isystem external/include/ -isystem external/dbio/ -pipe -Wno-long-long -Wno-strict-aliasing -march=core2 -mtune=generic -O3 -ffast-math -fno-finite-math-only -funroll-loops -finline-functions -finline-limit=20000 -s -Wno-unused-variable -Wno-unused-parameter -DTRIO_HAVE_CONFIG_H -fPIC -DBOOST_ERROR_CODE_HEADER_ONLY -DBOOST_SYSTEM_NO_DEPRECATED -DBOOST_MATH_NO_LONG_DOUBLE_MATH_FUNCTIONS -DPTR_STD -DNDEBUG -DUSEMPI -Iexternal/include -Iexternal/boost_1_55_0 -Iexternal/libxml2/include -Iexternal -Iexternal/dbio -I/usr/include -I/usr/local/include external/libxml2/encoding.c
mpicc -o build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/libxml2/entities.os -c -std=c99 -isystem external/boost_1_55_0/ -isystem external/ -isystem external/include/ -isystem external/dbio/ -pipe -Wno-long-long -Wno-strict-aliasing -march=core2 -mtune=generic -O3 -ffast-math -fno-finite-math-only -funroll-loops -finline-functions -finline-limit=20000 -s -Wno-unused-variable -Wno-unused-parameter -DTRIO_HAVE_CONFIG_H -fPIC -DBOOST_ERROR_CODE_HEADER_ONLY -DBOOST_SYSTEM_NO_DEPRECATED -DBOOST_MATH_NO_LONG_DOUBLE_MATH_FUNCTIONS -DPTR_STD -DNDEBUG -DUSEMPI -Iexternal/include -Iexternal/boost_1_55_0 -Iexternal/libxml2/include -Iexternal -Iexternal/dbio -I/usr/include -I/usr/local/include external/libxml2/entities.c
mpiCC -o build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/libcifparse.so -Wl,-rpath=/gpfs/home/maziar/rosetta/rosetta_bin_linux_2016.46.59086_bundle/main/source/build/external/release/linux/2.6/64/x86/gcc/4.9/mpi -Wl,-rpath=$ORIGIN -Wl,-rpath=$ORIGIN/../lib -shared build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/BlockIO.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/CifDataInfo.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/CifExcept.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/CifFile.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/CifFileReadDef.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/CifParentChild.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/CifParserBase.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/CifScannerBase.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/CifString.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/DataInfo.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/DicFile.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/DICParserBase.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/DICScannerBase.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/Exceptions.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/GenCont.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/GenString.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/ISTable.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/ITTable.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/mapped_ptr_vector.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/mapped_vector.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/ParentChild.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/RcsbFile.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/RcsbPlatform.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/Serializer.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/TableFile.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/TTable.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/CifParser.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/CifScanner.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/DICParser.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/DICScanner.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/regcomp.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/regerror.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/regexec.os build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/cifparse/regfree.os -Lexternal/lib -Lbuild/external/release/linux/2.6/64/x86/gcc/4.9/mpi -Lexternal -L/opt/applications/openmpi/1.10.1/gnu/lib -L/opt/applications/openmpi/1.10.1/gnu/lib/openmpi -L/opt/rh/devtoolset-3/root/usr/lib -L/opt/rh/devtoolset-3/root/usr/lib64 -L/opt/applications/python/2.7.11/gnu/lib -L/opt/applications/python/2.7.11/gnu/third-party/pytom/external/lib -L/usr/lib -L/usr/local/lib
/usr/bin/ld: unrecognized option '-plugin'
/usr/bin/ld: use the --help option for usage information
collect2: error: ld returned 1 exit status
scons: *** [build/external/release/linux/2.6/64/x86/gcc/4.9/mpi/libcifparse.so] Error 1
scons: building terminated because of errors.
-
January 12, 2017 at 11:51 pm #12087 Anonymous
"mpicc --version" returns "gcc (GCC) 4.9.2 20150212 (Red Hat 4.9.2-6)".
I replaced "site.settings" with "site.settings.killdevil". Then I load the 1) python/2.7.11, 2) gcc/4.9.2, and 3) openmpi/1.10.1 modules, run "python ./scons.py bin -j7 mode=release extras=mpi", and get the error above.
Is there any other module that I need to load?
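For reference, the sequence described above written out as commands (a sketch: the site.settings templates normally live under tools/build/ in main/source, but adjust the path if your bundle differs, and module names are cluster-specific):

cd main/source
# start from the killdevil settings template shipped with Rosetta
cp tools/build/site.settings.killdevil tools/build/site.settings
module load python/2.7.11 gcc/4.9.2 openmpi/1.10.1
python ./scons.py bin -j7 mode=release extras=mpi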
-
January 17, 2017 at 5:41 am #12096 Anonymous
“python scons.py bin/rosetta_scripts.linuxgccrelease -j7 mode=release” produces the same error.
-
January 19, 2017 at 4:59 pm #12099 Anonymous
Okay, then it’s really looking like there might be an issue with your compiler/linker installation.
I'd *highly* recommend getting in contact with the people who run/administer the cluster and talking to them about the proper way of setting up your environment to compile programs. There's unfortunately just too much variation in the way clusters are set up for us to give general advice on how to work with your particular system; you need the advice of someone who is familiar with your particular cluster.
Also, try downloading and running https://raw.githubusercontent.com/RosettaCommons/rosetta_clone_tools/master/rosetta_compiler_test.py; this does a very basic compiler test to see if things are working appropriately. If it reports failures (in the Main tests), that means there's an issue with your compiler setup itself, rather than with the Rosetta build settings in particular.
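For example (a sketch; the URL is the one given above):

# fetch the standalone compiler check and run it with the same python module loaded
curl -O https://raw.githubusercontent.com/RosettaCommons/rosetta_clone_tools/master/rosetta_compiler_test.py
python rosetta_compiler_test.py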