Divergences.jl

Divergences.jl is a Julia package that makes it easy to evaluate divergence measures between two vectors. The package allows calculating the gradient and the diagonal of the Hessian of several divergences. These divergences are used to good effect by the MomentBasedEstimators package.

Supported divergences

The package defines an abstract Divergence type with the following subtypes:

  • Kullback-Leibler divergence KullbackLeibler
  • Chi-square distance ChiSquared
  • Reverse Kullback-Leibler divergence ReverseKullbackLeibler
  • Cressie-Read divergences CressieRead

These divergences differ from the equivalent ones defined in the Distances package because they are normalized. Also, the package provides methods for calculating their gradient and the (diagonal elements of the) Hessian matrix.

The constructors for the types above are straightforward

KullbackLeibler()
ChiSquared()
ReverseKullbackLeibler()

The CressieRead type defines a family of divergences indexed by the parameter alpha. The constructor for CressieRead is

CR(::Real)

The Hellinger divergence is obtained with CR(-1/2). For certain values of alpha, CressieRead corresponds to a divergence that also has its own dedicated type. For instance, CR(1) is equivalent to ChiSquared, although the underlying code for evaluating the divergence and calculating its gradient and Hessian is different.
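
As a minimal sketch (the vectors below are arbitrary positive weights chosen for illustration, not part of the package), the Hellinger case and the equivalence between CR(1) and ChiSquared can be checked numerically:

using Divergences

x = [0.2, 0.3, 0.5]
y = [0.25, 0.25, 0.5]

# Hellinger divergence through the Cressie-Read family
d_hellinger = evaluate(CR(-1/2), x, y)

# CR(1) and ChiSquared define the same divergence, evaluated by different code paths
isapprox(evaluate(CR(1), x, y), evaluate(ChiSquared(), x, y))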

Three versions of each divergence in the list above are currently implemented: a vanilla version, a modified version, and a fully modified version. The modifications extend the domain of the divergence.

The modified version takes an additional argument that specifies the point at which the divergence is modified by a convex extension.

ModifiedKullbackLeibler(theta::Real)
ModifiedReverseKullbackLeibler(theta::Real)
ModifiedCressieRead(alpha::Real, theta::Real)

Similarly, the fully modified version takes two additional arguments that specify the points at which the divergence is modified by convex extensions.

FullyModifiedKullbackLeibler(phi::Real, theta::Real)
FullyModifiedReverseKullbackLeibler(phi::Real, theta::Real)
FullyModifiedCressieRead(alpha::Real, phi::Real, theta::Real)
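
For example (the cut-off values 0.9 and 1.1 below are purely illustrative choices, not package defaults):

using Divergences

# Illustrative cut-off points; pick values suited to the application
mkl   = ModifiedKullbackLeibler(1.1)
fmrkl = FullyModifiedReverseKullbackLeibler(0.9, 1.1)

x = [0.2, 0.3, 0.5]
y = [0.25, 0.25, 0.5]

evaluate(mkl, x, y)
evaluate(fmrkl, x, y)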

Basic usage

Divergence between two vectors

Each divergence corresponds to a divergence type. You can compute a given divergence between two vectors using the following syntax

d = evaluate(div, x, y)

Here, div is an instance of a divergence type. For example, the type for the Kullback-Leibler divergence is KullbackLeibler (the available divergence types are described in more detail below), so the Kullback-Leibler divergence between x and y can be computed as

d = evaluate(KullbackLeibler(), x, y)

We can also calculate the divergence between the vector x and the unit vector

r = evaluate(KullbackLeibler(), x)
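
Putting the two calls together, a minimal sketch (the vectors are arbitrary positive weights chosen for illustration):

using Divergences

x = [0.2, 0.3, 0.5]
y = [0.25, 0.25, 0.5]

# Kullback-Leibler divergence between x and y
d = evaluate(KullbackLeibler(), x, y)

# Kullback-Leibler divergence between x and the unit vector
r = evaluate(KullbackLeibler(), x)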

The Divergence type is a subtype of PreMetric defined in the Distances package. As such, the divergences can be evaluated row-wise and column-wise for X::Matrix and Y::Matrix.

rowwise(div, X, Y)
colwise(div, X, Y)
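
As a sketch, assuming the Distances-style convention in which colwise compares matching columns of the two matrices:

using Divergences

# Two matrices of the same size; each column is treated as a vector
X = rand(3, 4)
Y = rand(3, 4)

d = colwise(ChiSquared(), X, Y)   # length-4 vector, one divergence per pair of columns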

Gradient of the divergence

To calculate the gradient of div::Divergence with respect to x::AbstractArray{Float64, 1}, the gradient method can be used

g = gradient(div, x, y)

or through its in-place version

gradient!(Array{Float64}(undef, size(x)), div, x, y)
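
A short sketch of both calls (the vectors are illustrative; similar(x) allocates an output buffer with the same size and element type as x):

using Divergences

x = [0.2, 0.3, 0.5]
y = [0.25, 0.25, 0.5]
div = KullbackLeibler()

g = gradient(div, x, y)        # allocating version

g_buf = similar(x)             # pre-allocated output buffer
gradient!(g_buf, div, x, y)    # in-place version fills g_buf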

Hessian of the divergence

The hessian method calculates the Hessian of the divergence with respect to x

h = hessian(div, x, y)

Its in-place variant is also defined

hessian!(Array{Float64}(undef, size(x)), div, x, y)

Notice that the Hessian of a divergence is diagonal: the diagonal entries are the only ones different from zero. For this reason, hessian(div, x, y) returns an Array{Float64,1} containing the diagonal entries of the Hessian.
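
A short sketch (illustrative vectors; the returned vector holds only the diagonal of the Hessian):

using Divergences

x = [0.2, 0.3, 0.5]
y = [0.25, 0.25, 0.5]
div = KullbackLeibler()

h = hessian(div, x, y)         # Vector{Float64} with the diagonal entries

h_buf = similar(x)
hessian!(h_buf, div, x, y)     # in-place version fills h_buf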

List of divergences

[To be added]
