Keras Model Import: Supported Features

Little-known fact: Deeplearning4j's creator, Skymind, has two of the top five Keras contributors on our team, making it the largest contributor to Keras after Keras creator Francois Chollet, who's at Google.

While not every concept in DL4J has an equivalent in Keras and vice versa, many of the key concepts can be matched. Importing Keras models into DL4J is done in our deeplearning4j-modelimport module. Below is a comprehensive list of currently supported features.
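For orientation, here is a minimal sketch of what an import call looks like, assuming a Keras model has already been saved to HDF5 (the file names below are placeholders). A Keras Sequential model is imported as a DL4J MultiLayerNetwork, while a functional-API model is imported as a ComputationGraph.

```java
import org.deeplearning4j.nn.graph.ComputationGraph;
import org.deeplearning4j.nn.modelimport.keras.KerasModelImport;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

public class KerasImportExample {
    public static void main(String[] args) throws Exception {
        // A Keras Sequential model saved with model.save("sequential_model.h5")
        // maps to a DL4J MultiLayerNetwork.
        MultiLayerNetwork sequential =
                KerasModelImport.importKerasSequentialModelAndWeights("sequential_model.h5");

        // A Keras functional-API model maps to a DL4J ComputationGraph.
        ComputationGraph functional =
                KerasModelImport.importKerasModelAndWeights("functional_model.h5");

        System.out.println(sequential.summary());
        System.out.println(functional.summary());
    }
}
```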

Layers

Mapping Keras layers to DL4J layers is done in the layers sub-module of model import. The structure of this project loosely reflects the structure of Keras.

Core Layers

Convolutional Layers

Pooling Layers

Locally-connected Layers

Recurrent Layers

Embedding Layers

Merge Layers

  • Add / add
  • Multiply / multiply
  • Subtract / subtract
  • Average / average
  • Maximum / maximum
  • Concatenate / concatenate
  • Dot / dot

Advanced Activation Layers

Normalization Layers

Noise Layers

Layer Wrappers

Losses

  • mean_squared_error
  • mean_absolute_error
  • mean_absolute_percentage_error
  • mean_squared_logarithmic_error
  • squared_hinge
  • hinge
  • categorical_hinge
  • logcosh
  • categorical_crossentropy
  • sparse_categorical_crossentropy
  • binary_crossentropy
  • kullback_leibler_divergence
  • poisson
  • cosine_proximity

Activations

  • softmax
  • elu
  • selu
  • softplus
  • softsign
  • relu
  • tanh
  • sigmoid
  • hard_sigmoid
  • linear

Initializers

  • Zeros
  • Ones
  • Constant
  • RandomNormal
  • RandomUniform
  • TruncatedNormal
  • VarianceScaling
  • Orthogonal
  • Identity
  • lecun_uniform
  • lecun_normal
  • glorot_normal
  • glorot_uniform
  • he_normal
  • he_uniform

Regularizers

  • l1
  • l2
  • l1_l2

Constraints

  • max_norm
  • non_neg
  • unit_norm
  • min_max_norm

Optimizers

  • SGD
  • RMSprop
  • Adagrad
  • Adadelta
  • Adam
  • Adamax
  • Nadam
  • TFOptimizer
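The losses and optimizers listed above only come into play when the saved training configuration is imported along with the model's architecture and weights. As a hedged sketch, KerasModelImport accepts an enforceTrainingConfig flag for this purpose (the file name below is a placeholder):

```java
import org.deeplearning4j.nn.modelimport.keras.KerasModelImport;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

public class KerasImportTrainingConfig {
    public static void main(String[] args) throws Exception {
        // enforceTrainingConfig = true: fail fast if the saved loss/optimizer
        // configuration contains anything the importer does not support.
        // enforceTrainingConfig = false: import for inference and log warnings
        // for unsupported training options instead.
        MultiLayerNetwork model = KerasModelImport.importKerasSequentialModelAndWeights(
                "trained_model.h5", /* enforceTrainingConfig = */ true);

        System.out.println(model.summary());
    }
}
```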