D070_Refinement.py


    • #1406
      Anonymous

        Hi,

        I’m trying to understand the D070_Refinement.py code posted by the Gray lab on PyRosetta.org. In short, the code makes small stabilizing backbone perturbations and repacks the sidechains of random residues in your pose. However, I don’t know how many residues get perturbed/repacked. What input parameter specifies the number of residues that get perturbed? The “cycles” parameter is set to 9 and is fed to the RepeatMover() function, and the comments say that “cycles” affects the amount of sampling, but I don’t think this is what I’m looking for.
        Does the PyJobDistributor have any control over the number of residues to perturb? Or is it only for parallel processing?

        Thanks in advance,
        thorx020

      • #7792
        Anonymous

          It does all residues (this appears to be the ClassicRelax algorithm).

          For backbone, you know it does all residues because:
          # 4. create a MoveMap, all backbone torsions free
          movemap = MoveMap()
          movemap.set_bb(True)

          For sidechains, you know it does all residues because:
          # 8. setup a PackRotamersMover
          to_pack = standard_packer_task(starting_pose)
          to_pack.restrict_to_repacking() # prevents design, packing only
          to_pack.or_include_current(True) # considers the original sidechains
          packmover = PackRotamersMover(scorefxn, to_pack)

          Admittedly, #8 doesn’t _say_ all residues, but the default packer task is all residues, and this code does not restrict that.

          If you wanted to relax a subset, alter the MoveMap created at step 4 and the PackerTask created at step 8 to allow movement only at your subset of interest.
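          As a minimal sketch (the residue range 10–20, the input file name, and the variable names are hypothetical, and the import paths follow the current PyRosetta 4 layout; the whole thing is guarded so it is inert if PyRosetta is not installed):

```python
# Hypothetical sketch: refine only residues 10-20 instead of the whole pose.
# Rosetta residue numbering is 1-based and inclusive.
subset = list(range(10, 21))

try:
    from pyrosetta import init, pose_from_pdb, get_fa_scorefxn, MoveMap, standard_packer_task
    from pyrosetta.rosetta.protocols.minimization_packing import PackRotamersMover

    init()
    pose = pose_from_pdb("test_in.pdb")  # any input structure (name assumed)
    scorefxn = get_fa_scorefxn()

    # Step 4, restricted: freeze all backbone torsions, then free only the subset.
    movemap = MoveMap()
    movemap.set_bb(False)
    for i in subset:
        movemap.set_bb(i, True)

    # Step 8, restricted: repack only the subset, fix every other sidechain.
    to_pack = standard_packer_task(pose)
    to_pack.restrict_to_repacking()      # prevents design, packing only
    to_pack.or_include_current(True)     # considers the original sidechains
    for i in range(1, pose.total_residue() + 1):
        if i not in subset:
            to_pack.nonconst_residue_task(i).prevent_repacking()

    packmover = PackRotamersMover(scorefxn, to_pack)
except ImportError:
    pass  # PyRosetta not available; the calls above are illustrative only
```

          The key point is that a PackerTask is subtractive: it starts with every residue designable, and you can only restrict it (restrict_to_repacking, prevent_repacking), which is why the unmodified task in the tutorial covers all residues.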

          “Does the PyJobDistributor have any control over the # of residues to perturb? Or is only for parallel processing?”
          No, and hope it never does – this would be a flagrant violation of encapsulation and make the code much harder to use. PyJobDistributor just lets you run the code in parallel in an easier-than-manual fashion. (Same for the C++ job distributors).
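          For illustration, the usual PyJobDistributor pattern just claims and writes output decoys; each job reruns the same protocol unchanged, with no per-residue control (the output prefix, decoy count, and input file name are assumptions, and the body is guarded so the sketch is inert without PyRosetta):

```python
# Hypothetical sketch of the PyJobDistributor usage pattern: it only manages
# parallel-safe job claiming and output files; the protocol itself is untouched.
OUTPUT_PREFIX = "refined"  # decoys written as refined_*.pdb (name assumed)
NSTRUCT = 10               # number of decoys (assumed)

try:
    from pyrosetta import init, pose_from_pdb, get_fa_scorefxn, PyJobDistributor

    init()
    scorefxn = get_fa_scorefxn()
    starting_pose = pose_from_pdb("test_in.pdb")  # assumed input file

    jd = PyJobDistributor(OUTPUT_PREFIX, NSTRUCT, scorefxn)
    while not jd.job_complete:
        pose = starting_pose.clone()
        # ... apply the refinement mover(s) to pose here ...
        jd.output_decoy(pose)  # writes this decoy and claims the next job
except ImportError:
    pass  # PyRosetta not installed; pattern shown for illustration only
```

          Several processes can run this same script against the same prefix; the distributor hands each one a different decoy index, which is all the “parallelism” amounts to.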
