mlx.optimizers.Adam
- class Adam(learning_rate: float | Callable[[array], array], betas: List[float] = [0.9, 0.999], eps: float = 1e-08, bias_correction: bool = False)
The Adam optimizer [1]. In detail:
[1]: Kingma, D.P. and Ba, J., 2015. Adam: A method for stochastic optimization. ICLR 2015.
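The update below is the standard Adam recurrence from [1], reconstructed here in LaTeX; bias correction of the moment estimates is applied only when `bias_correction=True` and is omitted from the displayed formulas:

```latex
\begin{aligned}
m_{t+1} &= \beta_1 m_t + (1 - \beta_1) g_t \\
v_{t+1} &= \beta_2 v_t + (1 - \beta_2) g_t^2 \\
w_{t+1} &= w_t - \lambda \, \frac{m_{t+1}}{\sqrt{v_{t+1}} + \epsilon}
\end{aligned}
```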
- Parameters:
  - learning_rate (float or callable) – The learning rate λ.
  - betas (List[float], optional) – The coefficients (β₁, β₂) used for computing running averages of the gradient and its square. Default: [0.9, 0.999]
  - eps (float, optional) – The term ε added to the denominator to improve numerical stability. Default: 1e-8
  - bias_correction (bool, optional) – If set to True, applies bias correction to the moment estimates. Default: False
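A minimal usage sketch with the standard MLX training pattern; the `nn.Linear` model, MSE loss, and hyperparameter values are illustrative choices, not part of this page:

```python
import mlx.core as mx
import mlx.nn as nn
import mlx.optimizers as optim

# Illustrative model and data; any nn.Module works the same way.
model = nn.Linear(10, 1)
optimizer = optim.Adam(learning_rate=1e-3, betas=[0.9, 0.999], eps=1e-8)

def loss_fn(model, x, y):
    return nn.losses.mse_loss(model(x), y)

loss_and_grad_fn = nn.value_and_grad(model, loss_fn)

x = mx.random.normal((32, 10))
y = mx.random.normal((32, 1))

loss, grads = loss_and_grad_fn(model, x, y)
optimizer.update(model, grads)                 # applies the Adam update
mx.eval(model.parameters(), optimizer.state)   # force lazy evaluation
```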
Methods

- __init__(learning_rate[, betas, eps, ...])
- apply_single(gradient, parameter, state) – Performs the Adam parameter update and stores v and m in the optimizer state (see the sketch after this list).
- init_single(parameter, state) – Initialize the optimizer state.
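For intuition, a hypothetical stand-alone re-implementation of a single update step without bias correction; the actual apply_single method operates on the optimizer's internal state and is not this function:

```python
import mlx.core as mx

def adam_step(gradient, parameter, state, lr=1e-3, betas=(0.9, 0.999), eps=1e-8):
    """Hypothetical sketch of one Adam update (no bias correction)."""
    m, v = state            # first and second moment estimates
    b1, b2 = betas
    m = b1 * m + (1 - b1) * gradient               # update first moment
    v = b2 * v + (1 - b2) * mx.square(gradient)    # update second moment
    parameter = parameter - lr * m / (mx.sqrt(v) + eps)
    return parameter, (m, v)
```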