Member Site › Forums › PyRosetta › PyRosetta – General › Gradient Descent

Viewing 1 reply thread
  • Author
    Posts
    • #3591
      Anonymous

        Hello,


        I’m looking for gradient descent in PyRosetta, but I can’t find it. What exactly is gradient descent called in PyRosetta?

      • #15534
        Anonymous

          The minimizer is the Rosetta module that performs gradient-descent minimization. In PyRosetta, it is most easily accessed through the MinMover (https://graylab.jhu.edu/PyRosetta.documentation/pyrosetta.rosetta.protocols.minimization_packing.html#pyrosetta.rosetta.protocols.minimization_packing.MinMover).
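          A minimal sketch of using the MinMover, assuming a standard PyRosetta installation; the poly-alanine sequence, score function, and tolerance here are arbitrary choices for illustration:

          ```python
          import pyrosetta
          from pyrosetta.rosetta.protocols.minimization_packing import MinMover

          pyrosetta.init()

          # A small test pose; any pose you are working with would do.
          pose = pyrosetta.pose_from_sequence("AAAAAAAA")

          # The MoveMap controls which degrees of freedom the minimizer may move.
          movemap = pyrosetta.MoveMap()
          movemap.set_bb(True)   # backbone torsions
          movemap.set_chi(True)  # side-chain torsions

          scorefxn = pyrosetta.create_score_function("ref2015")

          # MinMover(movemap, scorefxn, min_type, tolerance, use_nb_list)
          min_mover = MinMover(movemap, scorefxn, "lbfgs_armijo_nonmonotone", 0.001, True)

          before = scorefxn(pose)
          min_mover.apply(pose)
          after = scorefxn(pose)
          print(before, after)  # the score should not increase after minimization
          ```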

          • #15535
            Anonymous

              Thanks for your quick reply. Can I change this optimization method to something else like stochastic gradient descent?

            • #15537
              Anonymous

                Rosetta currently doesn’t have an option for stochastic gradient descent; since computing the whole gradient vector is very rapid, there was never any reason to use partial gradients. There are, however, several flavours of minimization, most of which differ in how they approximate the inverse of the Hessian matrix of second derivatives. True gradient descent using only gradients is implemented as the “linmin_iterated” minimization type, but it converges slowly and is recommended only for debugging. The default type is “lbfgs_armijo_nonmonotone”, a quasi-Newton gradient-descent method that uses the limited-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS) algorithm to approximate the inverse of the Hessian matrix; it converges much more quickly.
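              The speed difference is easy to see on a toy problem. The sketch below (plain NumPy, not Rosetta code) minimizes an ill-conditioned quadratic; plain gradient descent crawls along the shallow direction, while a step that uses curvature information (here the exact inverse Hessian, which is what L-BFGS approximates) lands on the minimum immediately:

              ```python
              import numpy as np

              # f(x) = 0.5 * x^T A x, with A ill-conditioned (condition number 100).
              A = np.diag([1.0, 100.0])
              x0 = np.array([1.0, 1.0])

              def grad(x):
                  return A @ x

              # Plain gradient descent: the step size is limited by the stiffest
              # direction (it must stay below 2 / lambda_max), so progress along
              # the shallow direction is slow.
              x = x0.copy()
              step = 1.0 / 100.0
              for _ in range(500):
                  x = x - step * grad(x)
              gd_error = np.linalg.norm(x)        # still noticeably nonzero

              # Newton step: pre-multiply the gradient by the inverse Hessian.
              # For a quadratic this reaches the minimum in a single step.
              x_newton = x0 - np.linalg.solve(A, grad(x0))
              newton_error = np.linalg.norm(x_newton)

              print(gd_error, newton_error)
              ```

              Quasi-Newton methods such as L-BFGS get most of this benefit without ever forming the Hessian, by building a low-memory approximation to its inverse from recent gradient history.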
