optim


clone_optimizer(optim: TOptim, new_params: Parameter | Iterator[Parameter]) → TOptim

Clone an optimizer, creating a new optimizer instance with the same configuration but a new set of parameters.

WARNING: This is a temporary measure and should not be used in downstream code! Once tianshou's interfaces have moved to optimizer factories instead of optimizers, this function will be removed.

Parameters:
  • optim – the optimizer to clone

  • new_params – the new parameters to use

Returns:

a new optimizer with the same configuration as the old one

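A minimal usage sketch, assuming clone_optimizer is importable from tianshou's optimizer utilities (the module path used below is an assumption; adjust it to match your installation):

    import torch
    from torch import nn

    # Assumed import path; adjust to wherever clone_optimizer lives in
    # your tianshou version.
    from tianshou.utils.optim import clone_optimizer

    old_net = nn.Linear(4, 2)
    new_net = nn.Linear(4, 2)

    # Optimizer configured with a non-default learning rate.
    optim = torch.optim.Adam(old_net.parameters(), lr=3e-4)

    # new_optim is an Adam instance with the same hyperparameters,
    # but it now updates new_net's parameters.
    new_optim = clone_optimizer(optim, new_net.parameters())
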
optim_step(loss: Tensor, optim: Optimizer, module: Module | None = None, max_grad_norm: float | None = None) → None

Perform a single optimization step: zero_grad -> backward (-> clip_grad_norm) -> step.

Parameters:
  • loss – the loss tensor to backpropagate

  • optim – the optimizer with which to take the step

  • module – the module being optimized; required if max_grad_norm is passed (its parameters are used for gradient clipping)

  • max_grad_norm – if passed, gradients will be clipped to this maximum norm
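
A minimal sketch of using optim_step in place of the manual zero_grad/backward/clip/step sequence (the import path is an assumption; adjust it to match your tianshou version):

    import torch
    from torch import nn
    from torch.nn import functional as F

    # Assumed import path; adjust to wherever optim_step lives in
    # your tianshou version.
    from tianshou.utils.optim import optim_step

    net = nn.Linear(4, 1)
    optim = torch.optim.Adam(net.parameters(), lr=1e-3)

    x = torch.randn(8, 4)
    target = torch.randn(8, 1)
    loss = F.mse_loss(net(x), target)

    # Roughly equivalent to:
    #   optim.zero_grad()
    #   loss.backward()
    #   nn.utils.clip_grad_norm_(net.parameters(), max_norm=1.0)
    #   optim.step()
    optim_step(loss, optim, module=net, max_grad_norm=1.0)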