neural

Neural Networks in native Haskell

https://github.com/brunjlar/neural

Latest on Hackage: 0.3.0.1

This package is not currently in any snapshots. If you're interested in using it, we recommend adding it to Stackage Nightly. Doing so will make builds more reliable, and allow stackage.org to host generated Haddocks.

MIT licensed by Lars Bruenjes
Maintained by [email protected]

The goal of neural is to provide a modular and flexible neural network library written in native Haskell.

Features include

  • composability via arrow-like instances and pipes,

  • automatic differentiation for automatic gradient descent/backpropagation training (using Edward Kmett's fabulous ad library); a minimal sketch follows this list.
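
To illustrate the underlying mechanism, here is a minimal sketch that uses only the ad library's grad function (not this package's API): reverse-mode differentiation turns an ordinary numeric Haskell function into its gradient.

    import Numeric.AD (grad)

    -- Reverse-mode gradient of a simple two-variable function,
    -- computed with the ad library's 'grad'.
    f :: Num a => [a] -> a
    f [x, y] = x * x + x * y + y * y
    f _      = 0

    main :: IO ()
    main = print (grad f [1, 2 :: Double])  -- prints [4.0,5.0]

Gradients obtained in this way are what drive the gradient descent/backpropagation training mentioned above.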

The idea is to make it easy to define new components and to wire them up in flexible, possibly complicated ways (deep convolutional networks, etc.).

Four examples are included as proof of concept:

  • A simple neural network that approximates the sine function on [0, 2 pi] (a standalone sketch of this idea appears after the list).

  • Another simple neural network that approximates the sqrt function on [0,4].

  • A slightly more complicated neural network that solves the famous Iris flower problem.

  • A first (still simple) neural network for recognizing handwritten digits from the equally famous MNIST database.
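
To give a feel for what the first example does, here is a minimal standalone sketch: a single-hidden-layer network trained by plain gradient descent, with the ad library supplying the gradients, to approximate sin on [0, 2 pi]. It deliberately does not use this package's API, and the layer size, learning rate and iteration count are arbitrary choices made purely for illustration.

    import Numeric.AD (grad)

    -- A single-hidden-layer network whose parameters live in one flat
    -- list: hidden weights, hidden biases, output weights, output bias.
    hiddenUnits :: Int
    hiddenUnits = 8

    net :: Floating a => [a] -> a -> a
    net ps x =
        let (w1, rest)  = splitAt hiddenUnits ps
            (b1, rest') = splitAt hiddenUnits rest
            (w2, b2:_)  = splitAt hiddenUnits rest'
            hs          = zipWith (\w b -> tanh (w * x + b)) w1 b1
        in  sum (zipWith (*) w2 hs) + b2

    -- Mean squared error against sin on a fixed grid of sample points.
    loss :: Floating a => [a] -> a
    loss ps = sum [(net ps x - sin x) ^ 2 | x <- xs] / fromIntegral (length xs)
      where xs = [fromIntegral i * 2 * pi / 50 | i <- [0 :: Int .. 50]]

    -- One step of plain gradient descent, using ad's reverse-mode 'grad'.
    step :: Double -> [Double] -> [Double]
    step lr ps = zipWith (\p g -> p - lr * g) ps (grad loss ps)

    main :: IO ()
    main = do
        let ps0     = [0.05 * fromIntegral i | i <- [1 .. 3 * hiddenUnits + 1]]
            trained = iterate (step 0.02) ps0 !! 2000
        putStrLn $ "initial loss: " ++ show (loss ps0)
        putStrLn $ "trained loss: " ++ show (loss trained)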

The library is still very much experimental at this point.