mlx.optimizers.Adam
- class Adam(learning_rate: float | Callable[[array], array], betas: List[float] = [0.9, 0.999], eps: float = 1e-08, bias_correction: bool = False)
The Adam optimizer [1]. In detail:
[1]: Kingma, D.P. and Ba, J., 2015. Adam: A method for stochastic optimization. ICLR 2015.
\[\begin{split}m_{t+1} &= \beta_1 m_t + (1 - \beta_1) g_t \\
v_{t+1} &= \beta_2 v_t + (1 - \beta_2) g_t^2 \\
w_{t+1} &= w_t - \lambda \frac{m_{t+1}}{\sqrt{v_{t+1}} + \epsilon}\end{split}\]
- Parameters:
  - learning_rate (float or Callable): The learning rate \(\lambda\).
  - betas (List[float], optional): The coefficients \((\beta_1, \beta_2)\) used for computing running averages of the gradient and its square. Default: [0.9, 0.999].
  - eps (float, optional): The term \(\epsilon\) added to the denominator to improve numerical stability. Default: 1e-08.
  - bias_correction (bool, optional): If set to True, apply bias correction to the moment estimates. Default: False.
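A minimal usage sketch follows. The model shape, data, and hyperparameters are illustrative; the mlx.nn and mlx.optimizers calls shown are the library's standard training API.

```python
import mlx.core as mx
import mlx.nn as nn
import mlx.optimizers as optim

# Toy regression model and data (illustrative shapes).
model = nn.Linear(4, 2)
x = mx.random.normal((8, 4))
y = mx.random.normal((8, 2))

def loss_fn(model, x, y):
    return nn.losses.mse_loss(model(x), y)

optimizer = optim.Adam(learning_rate=1e-3, betas=[0.9, 0.999], eps=1e-8)
loss_and_grad_fn = nn.value_and_grad(model, loss_fn)

for _ in range(10):
    loss, grads = loss_and_grad_fn(model, x, y)
    # Apply the Adam update to every parameter; m and v live in optimizer.state.
    optimizer.update(model, grads)
    mx.eval(model.parameters(), optimizer.state)
```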
Methods

- __init__(learning_rate[, betas, eps, ...])
- apply_single(gradient, parameter, state): Performs the Adam parameter update and stores \(v\) and \(m\) in the optimizer state (see the sketch after this list).
- init_single(parameter, state): Initialize the optimizer state.
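To make the recursion concrete, here is a hedged sketch of what apply_single computes when bias_correction is False. The standalone function adam_apply_single and the dict-based state layout are illustrative assumptions, not the library's internal implementation.

```python
import mlx.core as mx

# Hypothetical standalone version of the update; the real method reads
# learning_rate/betas/eps from the optimizer and mutates `state` in place.
def adam_apply_single(gradient, parameter, state, lr=1e-3,
                      betas=(0.9, 0.999), eps=1e-8):
    b1, b2 = betas
    # Exponential moving averages of the gradient and its square.
    m = b1 * state["m"] + (1 - b1) * gradient
    v = b2 * state["v"] + (1 - b2) * mx.square(gradient)
    state["m"], state["v"] = m, v
    # Update rule from the equations above (no bias correction).
    return parameter - lr * m / (mx.sqrt(v) + eps)

# Example: one update step on a single parameter tensor.
p = mx.zeros((3,))
g = mx.ones((3,))
state = {"m": mx.zeros_like(p), "v": mx.zeros_like(p)}
p_new = adam_apply_single(g, p, state)
```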