Backprop.Net
Networks consist of backpropagatable functions.
type 'a t
The backpropagation "functor": it consists of the result of the evaluation (the primal) and a function to perform the backpropagation given the gradient.
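For intuition, here is a minimal sketch of one possible representation of this type: the primal value paired with a continuation that backpropagates a gradient of the same type. The name net and the concrete representation are assumptions for illustration, not necessarily the library's actual definition.

type 'a net = 'a * ('a -> unit)

(* The primal: the value computed by the forward pass. *)
let primal ((x, _) : 'a net) : 'a = x

(* Backpropagate a gradient through the stored continuation. *)
let backward ((_, k) : 'a net) (gradient : 'a) : unit = k gradient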
val eval : 'a t -> 'a
Evaluate the result of a computation.
val update : 'a -> 'a t -> unit
Update according to parameter.
val run : unit t -> unit
Run gradient descent.
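Under the representation sketched above, eval, update and run could be realized as follows. This is only one plausible reading, in which the first argument of update is the gradient fed to the backward pass; the library's actual implementations may differ.

type 'a net = 'a * ('a -> unit)  (* as sketched above *)

(* Evaluate: return the primal without triggering backpropagation. *)
let eval ((x, _) : 'a net) : 'a = x

(* Update: feed a value of the parameter's type to the backward pass. *)
let update (gradient : 'a) ((_, k) : 'a net) : unit = k gradient

(* A network of type unit net needs no gradient: backpropagating ()
   runs one step of gradient descent through the whole network. *)
let run (net : unit net) : unit = update () net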
val cst : 'a -> 'a t
A constant.
val of_differentiable : ('a, 'b) Differentiable.t -> 'a t -> 'b t
A backpropagatable function from a differentiable one.
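As a sketch of what such a lifting can look like, suppose a differentiable function from 'a to 'b is represented by a forward pass returning the output together with a pullback from output gradients to input gradients (an assumption about Differentiable.t). Lifting it then chains the pullback in front of the upstream backward pass.

type 'a net = 'a * ('a -> unit)

(* Assumed representation of a differentiable function. *)
type ('a, 'b) differentiable = 'a -> 'b * ('b -> 'a)

let of_differentiable (f : ('a, 'b) differentiable) ((x, k) : 'a net) : 'b net =
  let y, pullback = f x in
  (* Forward: the output of f. Backward: pull the output gradient back
     to an input gradient and pass it upstream. *)
  (y, fun dy -> k (pullback dy))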
val softmax : Algebra.Vector.t t -> Algebra.Vector.t t
Softmax.
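For reference, a standalone sketch of softmax and its backward pass on plain float arrays (standing in for Algebra.Vector.t, which is an assumption): if y = softmax x and dy is the gradient with respect to y, then dx_i = y_i * (dy_i - sum_j dy_j * y_j).

let softmax_with_pullback (x : float array) : float array * (float array -> float array) =
  (* Shift by the maximum for numerical stability. *)
  let m = Array.fold_left max neg_infinity x in
  let e = Array.map (fun xi -> exp (xi -. m)) x in
  let s = Array.fold_left ( +. ) 0. e in
  let y = Array.map (fun ei -> ei /. s) e in
  let pullback dy =
    (* dx_i = y_i *. (dy_i -. sum_j dy_j *. y_j) *)
    let dot = ref 0. in
    Array.iteri (fun i yi -> dot := !dot +. dy.(i) *. yi) y;
    Array.mapi (fun i yi -> yi *. (dy.(i) -. !dot)) y
  in
  (y, pullback)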
Fold a function over a series of inputs.
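The signature of the fold is not shown above; as a sketch, folding a backpropagatable step function over a list of inputs is an ordinary fold, since each intermediate value already carries its own backward pass. The signature below is hypothetical.

type 'a net = 'a * ('a -> unit)

(* Fold a backpropagatable step over the inputs, threading the accumulator. *)
let fold (step : 'a -> 'b net -> 'b net) (inputs : 'a list) (init : 'b net) : 'b net =
  List.fold_left (fun acc x -> step x acc) init inputs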
val dup : int -> float t -> float * (float -> unit)
Duplicate a value so that it can be used the given number of times.
val drop : float t -> unit
Values of this type are linear. This operator must be used when a value is never used.
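One way to realize dup and drop under the representation sketched above: dup n waits for n incoming gradients, sums them, and only then backpropagates, while drop backpropagates a zero gradient. The gradient-counting strategy here is an assumption about how this could work, not necessarily the library's implementation.

type 'a net = 'a * ('a -> unit)

let dup (n : int) ((x, k) : float net) : float net =
  let remaining = ref n in
  let acc = ref 0. in
  let backward dx =
    acc := !acc +. dx;
    decr remaining;
    (* Fire the upstream backward pass once all n gradients have arrived. *)
    if !remaining = 0 then k !acc
  in
  (x, backward)

(* An unused value still has to be consumed: backpropagate a zero gradient. *)
let drop ((_, k) : float net) : unit = k 0.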
module Linear : sig ... end
Linear transformations.
module Vector : sig ... end
Operations on vectors.