package owl-base

include Owl_algodiff_generic_sig.Sig
Type definition
type arr

General ndarray type

type elt

Scalar type

type trace_op

Trace type

type t =
  | F of float
  | Arr of arr
  | DF of t * t * int
  | DR of t * t ref * trace_op * int ref * int

Abstract number type
Supported Maths functions
module Maths : sig ... end
module Mat : sig ... end
module Arr : sig ... end
Core functions
val diff : (t -> t) -> t -> t

``diff f x`` returns the exact derivative of a function ``f : scalar -> scalar`` at point ``x``. Simply calling ``diff f`` returns its derivative function ``g`` of the same type, i.e. ``g : scalar -> scalar``.

Applying ``diff`` repeatedly gives higher-order derivatives of ``f``, i.e. ``f |> diff |> diff |> diff |> ...``

val diff' : (t -> t) -> t -> t * t

Similar to ``diff``, but returns ``(f x, diff f x)``.
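
A minimal sketch of ``diff`` and ``diff'``, assuming this signature is instantiated as ``Owl.Algodiff.D`` (the double-precision instance shipped in the full ``owl`` package):

  (* assumes Owl.Algodiff.D as a concrete instance of this signature *)
  open Owl.Algodiff.D

  (* f : scalar -> scalar *)
  let f x = Maths.(sin x * cos x)

  (* first derivative of f at x = 1. *)
  let d1 = diff f (F 1.) |> unpack_flt

  (* third derivative via repeated application *)
  let d3 = (f |> diff |> diff |> diff) (F 1.) |> unpack_flt

  (* value and derivative in one pass *)
  let y, dy = diff' f (F 1.)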

val grad : (t -> t) -> t -> t

Gradient of ``f : vector -> scalar`` at ``x``, computed with reverse-mode AD.

val grad' : (t -> t) -> t -> t * t

Similar to ``grad``, but returns ``(f x, grad f x)``.
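
A minimal sketch of ``grad``, under the same ``Owl.Algodiff.D`` assumption; ``Maths.sum'`` (scalar-valued sum of all elements) is also an assumption about the ``Maths`` module:

  open Owl.Algodiff.D

  (* g : vector -> scalar; Maths.sum' assumed available *)
  let g x = Maths.(sum' (x * x))

  let x = pack_arr (Owl.Arr.uniform [| 1; 5 |])

  (* dg/dx via reverse-mode AD; same shape as x *)
  let dx = grad g x
  let y, dx' = grad' g x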

val jacobian : (t -> t) -> t -> t

Jacobian of ``f : vector -> vector`` at ``x``; both the input ``x`` and the output ``f x`` are row vectors.

val jacobian' : (t -> t) -> t -> t * t

Similar to ``jacobian``, but returns ``(f x, jacobian f x)``.

val jacobianv : (t -> t) -> t -> t -> t

Jacobian-vector product of ``f : vector -> vector`` at ``x`` along ``v``, computed with forward-mode AD. Namely, it calculates ``(jacobian f x) v``.

val jacobianv' : (t -> t) -> t -> t -> t * t

Similar to ``jacobianv``, but returns ``(f x, jacobianv f x v)``.

val jacobianTv : (t -> t) -> t -> t -> t

Transposed Jacobian-vector product of ``f : vector -> vector`` at ``x`` along ``v``, computed with reverse-mode AD. Namely, it calculates ``transpose (jacobianv f x v)``.

val jacobianTv' : (t -> t) -> t -> t -> t * t

Similar to ``jacobianTv``, but returns ``(f x, transpose (jacobianv f x v))``.
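
A minimal sketch of the Jacobian functions, under the same ``Owl.Algodiff.D`` assumption:

  open Owl.Algodiff.D

  (* f : vector -> vector, applied elementwise here *)
  let f x = Maths.(sin x * x)

  let x = pack_arr (Owl.Arr.uniform [| 1; 3 |])
  let v = pack_arr (Owl.Arr.uniform [| 1; 3 |])

  let j   = jacobian f x       (* full 3 x 3 Jacobian matrix                       *)
  let jv  = jacobianv f x v    (* (jacobian f x) v, forward mode                   *)
  let jtv = jacobianTv f x v   (* transposed Jacobian-vector product, reverse mode *)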

val hessian : (t -> t) -> t -> t

Hessian of ``f : vector -> scalar`` at ``x``, i.e. the matrix of its second-order partial derivatives.

val hessian' : (t -> t) -> t -> t * t

Similar to ``hessian``, but returns ``(f x, hessian f x)``.

val hessianv : (t -> t) -> t -> t -> t

Hessian-vector product of ``f : vector -> scalar`` at ``x`` along ``v``. Namely, it calculates ``(hessian f x) v``.

val hessianv' : (t -> t) -> t -> t -> t * t

Similar to ``hessianv``, but returns ``(f x, hessianv f x v)``.

val laplacian : (t -> t) -> t -> t

Laplacian of ``f : vector -> scalar`` at ``x``, i.e. the trace of its Hessian.

val laplacian' : (t -> t) -> t -> t * t

Similar to ``laplacian``, but returns ``(f x, laplacian f x)``.
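
A minimal sketch of the Hessian and Laplacian functions, under the same ``Owl.Algodiff.D`` and ``Maths.sum'`` assumptions:

  open Owl.Algodiff.D

  (* f : vector -> scalar *)
  let f x = Maths.(sum' (x * x * x))

  let x = pack_arr (Owl.Arr.uniform [| 1; 4 |])
  let v = pack_arr (Owl.Arr.uniform [| 1; 4 |])

  let h   = hessian f x       (* 4 x 4 Hessian matrix           *)
  let hv  = hessianv f x v    (* (hessian f x) v                *)
  let lap = laplacian f x     (* trace of the Hessian, a scalar *)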

val gradhessian : (t -> t) -> t -> t * t

Returns ``(grad f x, hessian f x)`` for ``f : vector -> scalar``.

val gradhessian' : (t -> t) -> t -> t * t * t

Similar to ``gradhessian``, but returns ``(f x, grad f x, hessian f x)``.

val gradhessianv : (t -> t) -> t -> t -> t * t

Returns the gradient-vector product and the Hessian-vector product of ``f : vector -> scalar`` at ``x`` along ``v``.

val gradhessianv' : (t -> t) -> t -> t -> t * t * t

Similar to ``gradhessianv``, but also returns ``f x`` as the first element of the tuple.
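
A minimal sketch of the combined gradient/Hessian functions, under the same assumptions as above:

  open Owl.Algodiff.D

  let f x = Maths.(sum' (x * x * x))    (* Maths.sum' assumed available *)
  let x = pack_arr (Owl.Arr.uniform [| 1; 4 |])
  let v = pack_arr (Owl.Arr.uniform [| 1; 4 |])

  let g, h = gradhessian f x          (* gradient and Hessian in one call      *)
  let y, g', h' = gradhessian' f x    (* additionally returns f x              *)
  let gv, hv = gradhessianv f x v     (* gradient- and Hessian-vector products *)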

Low-level functions
val pack_flt : elt -> t

``pack_flt x`` wraps a scalar of type ``elt`` into the abstract number type ``t``.

val unpack_flt : t -> elt

``unpack_flt x`` extracts the scalar wrapped in ``x``; the inverse of ``pack_flt``.

val pack_arr : arr -> t

``pack_arr x`` wraps an ndarray of type ``arr`` into the abstract number type ``t``.

val unpack_arr : t -> arr

``unpack_arr x`` extracts the ndarray wrapped in ``x``; the inverse of ``pack_arr``.
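
A minimal sketch of packing and unpacking, under the same ``Owl.Algodiff.D`` assumption:

  open Owl.Algodiff.D

  let s = pack_flt 2.                          (* a scalar lifted into t   *)
  let a = pack_arr (Owl.Arr.ones [| 3; 3 |])   (* an ndarray lifted into t *)

  let s' = unpack_flt s    (* : float       *)
  let a' = unpack_arr a    (* : Owl.Arr.arr *)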

val tag : unit -> int

Returns a fresh integer tag, used to distinguish nested levels of differentiation.

val primal : t -> t

Returns the primal value, removing one layer of ``DF``/``DR`` wrapping.

val primal' : t -> t

Like ``primal``, but applied recursively until a plain ``F`` or ``Arr`` value is reached.

val adjval : t -> t

Returns the adjoint value accumulated in a ``DR`` node during the reverse pass.

val adjref : t -> t ref

Returns the reference cell that holds the adjoint of a ``DR`` node.

val tangent : t -> t

Returns the tangent component of a ``DF`` node.

val make_forward : t -> t -> int -> t

``make_forward p t i`` constructs a ``DF`` node with primal ``p``, tangent ``t`` and tag ``i``, for forward-mode AD.

val make_reverse : t -> int -> t

``make_reverse p i`` constructs a ``DR`` node with primal ``p`` and tag ``i``, ready for the reverse pass.

val reverse_prop : t -> t -> unit

``reverse_prop v y`` propagates the adjoint ``v`` backwards from the output ``y`` through the recorded computation graph.
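
A sketch of how these primitives fit together: the hypothetical ``my_grad`` helper below reproduces, roughly, what ``grad`` does internally (same ``Owl.Algodiff.D`` assumption):

  open Owl.Algodiff.D

  let my_grad f x0 =
    let x = make_reverse x0 (tag ()) in   (* wrap x0 as a DR node with a fresh tag     *)
    let y = f x in                        (* run f, recording the computation graph    *)
    reverse_prop (F 1.) y;                (* propagate adjoint 1. back from the output *)
    adjval x                              (* read the accumulated adjoint of x         *)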

val type_info : t -> string

Returns a string describing the constructor and type of the value, mainly for debugging.

val shape : t -> int array

Returns the shape of the ndarray wrapped in the value.

val copy_primal' : t -> t

TODO

val clip_by_value : amin:elt -> amax:elt -> t -> t
val clip_by_l2norm : elt -> t -> t
Helper functions
val to_trace : t list -> string

``to_trace [t0; t1; ...]`` returns the trace of the computation graph in a human-readable format, suitable for printing on the terminal.

val to_dot : t list -> string

``to_dot [t0; t1; ...]`` returns the computation graph in the dot format, which can be rendered with external tools such as Graphviz.
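
A minimal sketch of dumping a computation graph, under the same ``Owl.Algodiff.D`` assumption:

  open Owl.Algodiff.D

  let x = make_reverse (F 2.) (tag ())
  let y = Maths.(sin x * cos x)

  (* human-readable trace *)
  let () = print_endline (to_trace [ y ])

  (* dot output, e.g. render with `dot -Tpdf graph.dot -o graph.pdf` *)
  let () =
    let oc = open_out "graph.dot" in
    output_string oc (to_dot [ y ]);
    close_out oc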

val pp_num : Format.formatter -> t -> unit

``pp_num t`` pretty prints the abstract number used in ``Algodiff``.

Utils module
module Utils : sig ... end
Learning_Rate module
module Learning_Rate : sig ... end
Batch module
module Batch : sig ... end
Loss module
module Loss : sig ... end
Gradient module
module Gradient : sig ... end
Momentum module
module Momentum : sig ... end
Regularisation module
module Regularisation : sig ... end
Clipping module
module Clipping : sig ... end
Stopping module
module Stopping : sig ... end
Checkpoint module
module Checkpoint : sig ... end
Params module
module Params : sig ... end
Core functions
val minimise_weight : ?state:Checkpoint.state -> Params.typ -> (t -> t -> t) -> t -> t -> t -> Checkpoint.state * t

This function minimises the weight ``w`` of the passed-in function ``f``.

* ``f`` is a function of type ``f : w -> x -> y``.
* ``w`` is a row vector, but ``y`` can have any shape.

val minimise_network : ?state:Checkpoint.state -> Params.typ -> (t -> t * t array array) -> (t -> t array array * t array array) -> (t array array -> 'a) -> (string -> unit) -> t -> t -> Checkpoint.state

This function is specifically designed for minimising the weights in a neural network of graph structure. In Owl's earlier versions, the functions in the regression module were actually implemented using this function.

val minimise_fun : ?state:Checkpoint.state -> Params.typ -> (t -> t) -> t -> Checkpoint.state * t

This function minimises ``f : x -> y`` w.r.t. ``x``.

``x`` is an ndarray and ``y`` is a scalar value.
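
A minimal sketch of ``minimise_fun`` on a simple quadratic. The module path ``Owl.Optimise.D`` is an assumption (the concrete instantiation of this optimiser signature may differ between Owl versions), as is ``Maths.sum'``:

  open Owl.Algodiff.D
  module O = Owl.Optimise.D   (* assumed concrete instance of this optimiser signature *)

  (* f : x -> y, x is an ndarray, y is a scalar *)
  let f x = Maths.(sum' ((x - F 3.) * (x - F 3.)))

  let x0 = pack_arr (Owl.Arr.uniform [| 1; 5 |])

  let params = O.Params.default ()               (* default hyper-parameters *)
  let _state, x_min = O.minimise_fun params f x0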
