Module Backprop.Net

Networks consist of backpropagatable functions.

type 'a t = 'a * ('a -> unit)

The backpropagation "functor": it consists of the result of the evaluation (the primal) and a function to perform the backpropagation given the gradient.
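
Since the type is exposed concretely, a constant network can be written directly as such a pair; a minimal sketch, assuming the library is opened with open Backprop (this is essentially what the cst building block below provides):

open Backprop

(* The primal is 3.; a constant has nothing to update, so the
   backpropagation function discards the incoming gradient. *)
let three : float Net.t = (3., fun _gradient -> ())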

val observe : ('a -> unit) -> 'a t -> 'a t

Observe a value being evaluated.

val observe_descent : ('a -> unit) -> 'a t -> 'a t

Observe a value being optimized.
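
The two observers can be chained to trace both passes; a sketch, assuming open Backprop as above (the printing is purely illustrative):

(* Print the primal on the forward pass and the gradient on the
   backward pass, leaving the network otherwise unchanged. *)
let traced (x : float Net.t) : float Net.t =
  x
  |> Net.observe (fun v -> Printf.printf "forward: %f\n" v)
  |> Net.observe_descent (fun g -> Printf.printf "gradient: %f\n" g)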

val eval : 'a t -> 'a

Evaluate the result of a computation.

val update : 'a -> 'a t -> unit

Update the network by backpropagating the given gradient.
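
Together, eval and update give a manual forward and backward pass; a sketch, where seeding the backward pass with gradient 1. is an assumption for a scalar output:

(* Read the primal, then backpropagate a gradient of 1. through the net. *)
let forward_backward (net : float Net.t) : float =
  let y = Net.eval net in
  Net.update 1. net;
  y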

val climb : 'a -> 'a t -> unit t

Perform gradient climbing.

val descent : float -> float t -> unit t

Perform gradient descent.

val run : unit t -> unit

Run gradient descent.
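
A hypothetical training loop built from descent and run; eta, steps and loss are illustrative names, not part of the library:

(* One gradient-descent step per iteration; [loss] rebuilds the scalar
   network so that each pass reads the current parameter values. *)
let train ~eta ~steps (loss : unit -> float Net.t) : unit =
  for _ = 1 to steps do
    Net.run (Net.descent eta (loss ()))
  done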

Building blocks

val cst : 'a -> 'a t

A constant.

module VarMake (E : sig ... end) : sig ... end
include sig ... end
module Var : sig ... end
val var : float Stdlib.ref -> float t
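
A trainable parameter is thus a float reference lifted with var; a minimal sketch, assuming open Backprop as above, with an arbitrary initial value:

(* The reference holds the parameter; descent steps mutate it in place,
   so a network rebuilt with [Net.var w] reads the updated value. *)
let w = ref 0.5
let weight () : float Net.t = Net.var w
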
val of_differentiable : ('a, 'b) Differentiable.t -> 'a t -> 'b t

A backpropagatable function from a differentiable one.
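
For instance, if Differentiable exposed a cosine (Differentiable.cos below is hypothetical, purely for illustration), it could be lifted to networks like this:

(* Hypothetical: lift a differentiable cosine to networks. *)
let cos : float Net.t -> float Net.t =
  Net.of_differentiable Differentiable.cos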

val sigmoid : float t -> float t

Sigmoid.

val relu : float t -> float t

ReLU.

val sin : float t -> float t

Sine.

val log : float t -> float t

Log.

val square : float t -> float t

Square.
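
These pointwise building blocks compose by plain function application; a sketch, assuming open Backprop as above:

(* A small pipeline: sine, then square, then sigmoid. *)
let f (x : float Net.t) : float Net.t =
  x |> Net.sin |> Net.square |> Net.sigmoid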

Softmax.

val fold : ('a t -> 'b t -> 'b t) -> 'a t Stdlib.Seq.t -> 'b t -> 'b t

Fold a function over a sequence of inputs.
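
A sketch of its use: since values are linear, each input must be consumed, so this contrived step discards its input (with drop, documented below) and squares the accumulator:

(* Thread an accumulator network through a sequence of inputs. *)
let squares (inputs : float Net.t Seq.t) (init : float Net.t) : float Net.t =
  Net.fold (fun x acc -> Net.drop x; Net.square acc) inputs init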

val pair : 'a t -> 'b t -> ('a * 'b) t

Pair two values.

val unpair : ('a * 'b) t -> 'a t * 'b t

Unpair two values.
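
A round trip through the two, sketched:

(* Bundle two scalars into a pair network and take it apart again,
   swapping them in the process. *)
let swap (x : float Net.t) (y : float Net.t) : float Net.t * float Net.t =
  let x', y' = Net.unpair (Net.pair x y) in
  (y', x')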

val dup : int -> float t -> float t
val drop : float t -> unit

Values of this type are linear: each one must be consumed exactly once. This operator must therefore be used to explicitly discard a value which would otherwise never be consumed.
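
A sketch assuming dup n x yields a value that may be consumed n times, its backpropagation accumulating the n gradient contributions:

(* Use the same value in two sub-networks: duplicate it first. *)
let both (x : float Net.t) : (float * float) Net.t =
  let x = Net.dup 2 x in
  Net.pair (Net.square x) (Net.sin x)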

val mux : 'a t array -> 'a array t
val demux : float array t -> float t array
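
A sketch of the round trip: mux bundles scalar networks into an array network, and demux splits one back into scalars:

(* Bundle, then split; mixing scalar and array-level operations
   typically happens between these two steps. *)
let roundtrip (xs : float Net.t array) : float Net.t array =
  Net.demux (Net.mux xs)
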
module Linear : sig ... end

Linear transformations.

module Vector : sig ... end

Operations on vectors.