tensorflow-ops-0.3.0.0: Friendly layer around TensorFlow bindings.
Safe Haskell: None
Language: Haskell2010

TensorFlow.Minimize

Documentation

type Minimizer a = forall m. MonadBuild m => [Variable a] -> [Tensor Value a] -> m ControlNode

Functions that minimize a loss w.r.t. a set of Variables.

A Minimizer generally performs only one step of an iterative algorithm.

Minimizers are defined as functions of the gradients, rather than of the loss itself, so that users can apply transformations to the gradients before they are consumed.
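For example, gradient scaling (or clipping) can be layered on top of any existing Minimizer by rewriting the gradient list before delegating. The sketch below uses a hypothetical `withScaledGrads` combinator that is not part of this module; it assumes `render` from TensorFlow.Core and `mul`/`scalar` from TensorFlow.Ops:

```haskell
{-# LANGUAGE RankNTypes #-}

import qualified TensorFlow.Core as TF
import qualified TensorFlow.Ops as TF
import TensorFlow.Minimize (Minimizer)

-- Hypothetical combinator: scale every gradient by a constant factor
-- before handing the list to the wrapped minimizer.
withScaledGrads :: Float -> Minimizer Float -> Minimizer Float
withScaledGrads c inner vars grads = do
    scaled <- mapM (TF.render . TF.mul (TF.scalar c)) grads
    inner vars scaled
```

Because a Minimizer receives the gradients explicitly, no change to the loss graph is needed; such a wrapper composes with minimizeWith like any other minimizer.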

minimizeWith
  :: (MonadBuild m, GradientCompatible a)
  => Minimizer a
  -> Tensor v a       -- ^ Loss.
  -> [Variable a]     -- ^ Parameters of the loss function.
  -> m ControlNode

Convenience wrapper around gradients and a Minimizer.
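A minimal end-to-end use, following the import layout of the tensorflow-haskell README (the module scheme is an assumption about your setup): fit a single scalar weight toward 3 by repeatedly running the ControlNode returned by minimizeWith.

```haskell
import Control.Monad (replicateM_)
import Control.Monad.IO.Class (liftIO)
import qualified TensorFlow.Core as TF
import qualified TensorFlow.Minimize as TF
import qualified TensorFlow.Ops as TF hiding (initializedVariable)
import qualified TensorFlow.Variable as TF

-- Minimize (w - 3)^2 with plain gradient descent.
main :: IO ()
main = TF.runSession $ do
    w <- TF.initializedVariable (TF.scalar (0 :: Float))
    let d    = TF.readValue w `TF.sub` TF.scalar 3
        loss = d `TF.mul` d
    -- minimizeWith yields a single training step; run it repeatedly.
    trainStep <- TF.minimizeWith (TF.gradientDescent 0.01) loss [w]
    replicateM_ 500 (TF.run_ trainStep)
    w' <- TF.run (TF.readValue w)
    liftIO (print (TF.unScalar (w' :: TF.Scalar Float)))
```

The printed value should be close to 3.0, since each run of trainStep shrinks the error by a constant factor.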

gradientDescent
  :: GradientCompatible a
  => a                -- ^ Learning rate.
  -> Minimizer a

Perform one step of the gradient descent algorithm.

data AdamConfig t

Constructors

  AdamConfig

Instances

  Fractional t => Default (AdamConfig t)
    Defined in TensorFlow.Minimize

    Methods
      def :: AdamConfig t

adam :: (OneOf AdamDataTypes t, Fractional t) => Minimizer t

Perform one step of the Adam algorithm.

See https://arxiv.org/abs/1412.6980.

NOTE: Currently requires all Variables to have an initializedValue.
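Through minimizeWith, adam is a drop-in replacement for gradientDescent. Variables created with initializedVariable from TensorFlow.Variable carry an initializedValue, satisfying the note above. A sketch, under the same README-style import assumptions:

```haskell
import Control.Monad (replicateM_)
import Control.Monad.IO.Class (liftIO)
import qualified TensorFlow.Core as TF
import qualified TensorFlow.Minimize as TF
import qualified TensorFlow.Ops as TF hiding (initializedVariable)
import qualified TensorFlow.Variable as TF

-- Minimize (w - 3)^2 with Adam and its default hyperparameters.
main :: IO ()
main = TF.runSession $ do
    w <- TF.initializedVariable (TF.scalar (0 :: Float))
    let d    = TF.readValue w `TF.sub` TF.scalar 3
        loss = d `TF.mul` d
    trainStep <- TF.minimizeWith TF.adam loss [w]
    replicateM_ 500 (TF.run_ trainStep)
    w' <- TF.run (TF.readValue w)
    liftIO (print (TF.unScalar (w' :: TF.Scalar Float)))
```

Non-default hyperparameters would instead go through AdamConfig (via its Default instance) where the library exposes a config-taking variant.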