lewuathe / dllib   0.0.11

dllib is a distributed deep learning library running on Apache Spark

Scala versions: 2.11

dllib is a distributed deep learning framework running on Apache Spark. See the documentation for more details. dllib is designed to be simple and easy for Spark users to use.

Since dllib provides the same interface as MLlib algorithms, MLlib libraries can be used alongside it for feature engineering or transformation.
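
For example, an MLlib feature transformer can prepare a DataFrame before it is handed to the dllib network. The snippet below is a minimal sketch, assuming a DataFrame df with label and features columns and the multilayerPerceptron estimator built as in the MNIST example further down; how the estimator picks up the transformed column is an assumption here, not a documented API.

import org.apache.spark.ml.feature.StandardScaler

// Feature engineering with a standard MLlib transformer.
val scaler = new StandardScaler()
  .setInputCol("features")
  .setOutputCol("scaledFeatures")

val scaled = scaler.fit(df).transform(df)

// Hand the transformed DataFrame to the dllib estimator
// (see the MNIST example below; column wiring is assumed).
val trainedModel = multilayerPerceptron.fit(scaled)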

How to use

dllib is published on Spark Packages. You can use it directly from spark-shell.

$ ./bin/spark-shell --packages Lewuathe:dllib:0.0.9
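
Once the shell starts with the package on the classpath, the dllib classes shown in the example below can be imported as usual:

scala> import com.lewuathe.dllib.graph.Graph
scala> import com.lewuathe.dllib.network.Network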

If you want to depend on the jar package to extend dllib, add the following to your pom.xml.

<dependency>
    <groupId>com.lewuathe</groupId>
    <artifactId>dllib_2.11</artifactId>
    <version>0.0.9</version>
</dependency>
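
If you build with sbt instead of Maven, the equivalent dependency would presumably be the line below (assuming the artifact is published for Scala 2.11 under the com.lewuathe group, as the pom entry above indicates):

libraryDependencies += "com.lewuathe" %% "dllib" % "0.0.9"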

Example

This is an example of classification on the MNIST dataset. The full code can be seen here.

import com.lewuathe.dllib.graph.Graph
import com.lewuathe.dllib.layer.{AffineLayer, ReLULayer, SoftmaxLayer}
import com.lewuathe.dllib.network.Network

// Define the network structure as a computation graph.
val graph = new Graph(Array(
  new AffineLayer(100, 784),
  new ReLULayer(100, 100),
  new AffineLayer(10, 100),
  new SoftmaxLayer(10, 10)
))

// The model holds all of the network parameters to be trained.
// The default is an in-memory model.
val model = Model(graph)

val nn3 = Network(model, graph)

// MultiLayerPerceptron defines the optimization algorithm and hyperparameters.
val multilayerPerceptron = new MultiLayerPerceptron("MNIST", nn3)

// A Spark Dataset/DataFrame can be passed directly to the network for training.
val trainedModel = multilayerPerceptron.fit(df)

val result = trainedModel.transform(df)

result.filter("label = prediction").count()
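
The example above assumes df is a DataFrame of MNIST records. As a rough sketch, such a DataFrame could be prepared with Spark's built-in LIBSVM reader; the file path and the exact column layout dllib expects are assumptions here:

// Load MNIST in LIBSVM format into a DataFrame with "label" and "features"
// columns. The path is a placeholder; any 784-feature MNIST dump works.
val df = spark.read.format("libsvm")
  .option("numFeatures", "784")
  .load("data/mnist.scale")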

License

Apache v2

Author