Sigmoid

class vision_architectures.schedulers.sigmoid.SigmoidScheduler(min_y=0.0, max_y=1.0, min_x=-7, max_x=7)

Bases: object

__init__(min_y=0.0, max_y=1.0, min_x=-7, max_x=7)
set_num_steps(num_steps)
is_ready()
is_completed()
get()
step()
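The methods above carry no docstrings in this build. The sketch below is a minimal, hypothetical stand-in illustrating the presumed behavior: step i of num_steps is mapped linearly onto [min_x, max_x], and the sigmoid of that value is rescaled to [min_y, max_y]. The class name `SigmoidSchedulerSketch` and all of its internals are assumptions, not the library's actual implementation.

```python
import math


class SigmoidSchedulerSketch:
    """Hypothetical stand-in for SigmoidScheduler (illustration only)."""

    def __init__(self, min_y=0.0, max_y=1.0, min_x=-7, max_x=7):
        self.min_y, self.max_y = min_y, max_y
        self.min_x, self.max_x = min_x, max_x
        self.num_steps = None
        self.current_step = 0

    def set_num_steps(self, num_steps):
        self.num_steps = num_steps

    def is_ready(self):
        # Presumably "ready" once the total number of steps is known.
        return self.num_steps is not None

    def is_completed(self):
        return self.is_ready() and self.current_step >= self.num_steps

    def get(self):
        # Map the current step linearly onto [min_x, max_x] ...
        x = self.min_x + (self.max_x - self.min_x) * self.current_step / self.num_steps
        # ... then squash through a sigmoid and rescale to [min_y, max_y].
        y = 1.0 / (1.0 + math.exp(-x))
        return self.min_y + (self.max_y - self.min_y) * y

    def step(self):
        self.current_step += 1


sched = SigmoidSchedulerSketch()
sched.set_num_steps(10)
values = []
while not sched.is_completed():
    values.append(sched.get())
    sched.step()
```

With the wide default range [-7, 7], the curve stays near min_y for the first few steps, rises steeply through the middle, and saturates near max_y, which is the usual motivation for a sigmoid warm-up schedule.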
class vision_architectures.schedulers.sigmoid.SigmoidLR(optimizer, min_lr, max_lr, total_steps, min_x=-3.0, max_x=3.0, last_epoch=-1, verbose='deprecated')

Bases: LRScheduler

__init__(optimizer, min_lr, max_lr, total_steps, min_x=-3.0, max_x=3.0, last_epoch=-1, verbose='deprecated')
get_lr()

Compute the next learning rate for each of the optimizer’s param_groups.

Returns:

A list of learning rates for each of the optimizer’s param_groups with the same types as their current group["lr"]s.

Return type:

list[float | Tensor]

Note

If you’re trying to inspect the most recent learning rate, use get_last_lr() instead.

Note

The returned Tensors are copies, and never alias the optimizer’s group["lr"]s.
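The docstring above is inherited from PyTorch's base LRScheduler and says nothing about the sigmoid curve itself. A plausible closed form, assuming SigmoidLR interpolates each group's learning rate from min_lr to max_lr along a sigmoid over total_steps, is sketched below; the function name `sigmoid_lr_sketch` is hypothetical and not part of the library.

```python
import math


def sigmoid_lr_sketch(step, min_lr, max_lr, total_steps, min_x=-3.0, max_x=3.0):
    """Hypothetical closed form for a sigmoid learning-rate curve."""
    # Map the current step linearly onto [min_x, max_x], clamping at the end.
    x = min_x + (max_x - min_x) * min(step, total_steps) / total_steps
    # Sigmoid in (0, 1), rescaled to [min_lr, max_lr].
    s = 1.0 / (1.0 + math.exp(-x))
    return min_lr + (max_lr - min_lr) * s


# The curve starts near min_lr and saturates near max_lr.
start = sigmoid_lr_sketch(0, 1e-5, 1e-3, total_steps=100)
end = sigmoid_lr_sketch(100, 1e-5, 1e-3, total_steps=100)
```

Note that with the narrower default range [-3, 3], the endpoints do not reach min_lr and max_lr exactly: sigmoid(-3) ≈ 0.047 and sigmoid(3) ≈ 0.953, so the schedule begins slightly above min_lr and ends slightly below max_lr.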

step(epoch=None)

Step the scheduler.

Parameters:

epoch (int, optional) –

Deprecated since version 1.4: If provided, sets last_epoch to epoch and uses _get_closed_form_lr() if it is available. This is not universally supported. Use step() without arguments instead.

Note

Call this method after calling the optimizer’s step().
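The required call order can be illustrated with minimal stand-ins; the `_OptimizerStub` and `_SchedulerStub` classes below are hypothetical placeholders for a torch.optim optimizer and SigmoidLR, used only to show that the parameter update precedes the schedule update in each iteration.

```python
class _OptimizerStub:
    """Stand-in for a torch.optim.Optimizer (illustration only)."""

    def __init__(self):
        self.calls = []

    def step(self):
        self.calls.append("optimizer.step")


class _SchedulerStub:
    """Stand-in for SigmoidLR (illustration only)."""

    def __init__(self, optimizer):
        self.optimizer = optimizer

    def step(self):
        self.optimizer.calls.append("scheduler.step")


optimizer = _OptimizerStub()
scheduler = _SchedulerStub(optimizer)
for _ in range(3):
    optimizer.step()   # update parameters first
    scheduler.step()   # then advance the learning-rate schedule
```

Calling the scheduler before the optimizer is a common mistake that, in real PyTorch, skips the first value of the schedule and triggers a warning.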