
sibe

A simple, experimental machine learning library for Haskell.

A simple neural network:

module Main where
  import Sibe
  import Numeric.LinearAlgebra
  import Data.List

  main :: IO ()
  main = do
    let learning_rate = 0.5
        (iterations, epochs) = (2, 1000)
        a = (logistic, logistic') -- the activation function and its derivative
        rnetwork = randomNetwork 0 2 [(8, a)] (1, a) -- 2 inputs, a single hidden layer of 8 nodes, 1 output

        inputs = [vector [0, 1], vector [1, 0], vector [1, 1], vector [0, 0]] -- training dataset
        labels = [vector [1], vector [1], vector [0], vector [0]] -- training labels

        -- initial cost of the untrained network, measured with the crossEntropy cost function
        initial_cost = zipWith crossEntropy (map (`forward` rnetwork) inputs) labels

        -- train the network
        network = session inputs rnetwork labels learning_rate (iterations, epochs)

        -- run the inputs through the trained network
        -- note: we evaluate on the training data here only to demonstrate how the library
        --       is used; in practice the network should be tested on a separate dataset
        results = map (`forward` network) inputs

        -- compute the cost again, now on the trained network
        cost = zipWith crossEntropy (map (`forward` network) inputs) labels

    -- print both costs so the effect of training is visible
    putStrLn "initial cost (cross entropy):"
    print initial_cost
    putStrLn "cost after training (cross entropy):"
    print cost
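
The hidden layers are given to randomNetwork as a list of (nodes, activation) pairs, so a deeper network can presumably be described by adding more pairs to that list. The snippet below is only a sketch based on the shape of the call above; the two-hidden-layer form and the name deeper are assumptions, not taken from the library's documentation:

    -- hypothetical: 2 inputs, hidden layers of 8 and 4 nodes, 1 output
    deeper = randomNetwork 0 2 [(8, a), (4, a)] (1, a)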

See other examples:

stack exec example-xor
stack exec example-naivebayes-doc-classifier
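
The example executables are compiled by Stack along with the library, so on a fresh checkout they have to be built before they can be run. A typical sequence (assuming the provided stack.yaml is used) would be:

stack setup   # install the GHC version pinned by stack.yaml, if needed
stack build   # compile the library and the example executables
stack exec example-xor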