Usage of initializers

Initializers define the way to set the initial random weights of Keras layers.

The keyword arguments used for passing initializers to layers will depend on the layer. Usually these are simply kernel_initializer and bias_initializer:

    model.add(Dense(64,
                    kernel_initializer='random_uniform',
                    bias_initializer='zeros'))

Available initializers

The following built-in initializers are available as part of the keras.initializers module:

Initializer

    keras.initializers.Initializer()

Initializer base class: all initializers inherit from this class.

Zeros

    keras.initializers.Zeros()

Initializer that generates tensors initialized to 0.

Ones

    keras.initializers.Ones()

Initializer that generates tensors initialized to 1.

Constant

    keras.initializers.Constant(value=0)

Initializer that generates tensors initialized to a constant value.

Arguments

  • value: float; the value of the generated tensors.
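
For instance, a small positive constant bias is a common choice. A minimal sketch (the 0.1 value and the layer sizes are illustrative only):

    from keras import initializers
    from keras.models import Sequential
    from keras.layers import Dense

    model = Sequential()
    # Every bias in this layer starts at 0.1; the kernel keeps its default initializer.
    model.add(Dense(64, input_dim=32, bias_initializer=initializers.Constant(0.1)))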

RandomNormal

    keras.initializers.RandomNormal(mean=0.0, stddev=0.05, seed=None)

Initializer that generates tensors with a normal distribution.

Arguments

  • mean: a python scalar or a scalar tensor. Mean of the random values to generate.
  • stddev: a python scalar or a scalar tensor. Standard deviation of the random values to generate.
  • seed: A Python integer. Used to seed the random generator.
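
As a minimal sketch of the seed argument (the seed value 42 is arbitrary), fixing the seed makes the drawn initial weights reproducible across runs:

    from keras import initializers
    from keras.layers import Dense

    # Kernel weights drawn from a normal distribution with mean 0.0 and
    # standard deviation 0.05, reproducibly because the seed is fixed.
    model.add(Dense(64, kernel_initializer=initializers.RandomNormal(mean=0.0, stddev=0.05, seed=42)))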

RandomUniform

    keras.initializers.RandomUniform(minval=-0.05, maxval=0.05, seed=None)

Initializer that generates tensors with a uniform distribution.

Arguments

  • minval: A python scalar or a scalar tensor. Lower bound of the range of random values to generate.
  • maxval: A python scalar or a scalar tensor. Upper bound of the range of random values to generate.
  • seed: A Python integer. Used to seed the random generator.

TruncatedNormal

    keras.initializers.TruncatedNormal(mean=0.0, stddev=0.05, seed=None)

Initializer that generates a truncated normal distribution.

These values are similar to values from a RandomNormal, except that values more than two standard deviations from the mean are discarded and redrawn. This is the recommended initializer for neural network weights and filters.

Arguments

  • mean: a python scalar or a scalar tensor. Mean of the random values to generate.
  • stddev: a python scalar or a scalar tensor. Standard deviation of the random values to generate.
  • seed: A Python integer. Used to seed the random generator.

VarianceScaling

    keras.initializers.VarianceScaling(scale=1.0, mode='fan_in', distribution='normal', seed=None)

Initializer capable of adapting its scale to the shape of weights.

With distribution="normal", samples are drawn from a truncated normaldistribution centered on zero, with stddev = sqrt(scale / n) where n is:

  • number of input units in the weight tensor, if mode = "fan_in"
  • number of output units, if mode = "fan_out"
  • average of the numbers of input and output units, if mode = "fan_avg"

With distribution="uniform",samples are drawn from a uniform distributionwithin [-limit, limit], with limit = sqrt(3 * scale / n).

Arguments

  • scale: Scaling factor (positive float).
  • mode: One of "fan_in", "fan_out", "fan_avg".
  • distribution: Random distribution to use. One of "normal", "uniform".
  • seed: A Python integer. Used to seed the random generator.

Raises

  • ValueError: In case of an invalid value for the "scale", "mode" or "distribution" arguments.
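
The lecun, glorot and he initializers listed below are specific configurations of VarianceScaling. A minimal sketch of that relationship, following the formulas above (verify against your Keras version before relying on exact equivalence):

    from keras import initializers

    # Truncated normal with stddev = sqrt(2 / fan_in): matches he_normal described below.
    he_like = initializers.VarianceScaling(scale=2.0, mode='fan_in', distribution='normal')

    # Uniform within limit = sqrt(6 / (fan_in + fan_out)): matches glorot_uniform described below.
    glorot_like = initializers.VarianceScaling(scale=1.0, mode='fan_avg', distribution='uniform')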

Orthogonal

    keras.initializers.Orthogonal(gain=1.0, seed=None)

Initializer that generates a random orthogonal matrix.

Arguments

  • gain: Multiplicative factor to apply to the orthogonal matrix.
  • seed: A Python integer. Used to seed the random generator.

References

  • Exact solutions to the nonlinear dynamics of learning in deep linear neural networks (Saxe et al., 2014): https://arxiv.org/abs/1312.6120

Identity

    keras.initializers.Identity(gain=1.0)

Initializer that generates the identity matrix.

Only use for 2D matrices. If the desired matrix is not square, it gets padded with zeros for the additional rows/columns.

Arguments

  • gain: Multiplicative factor to apply to the identity matrix.
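
A minimal sketch of a square use case (the 64-unit sizes are illustrative; with input_dim equal to the layer width, the kernel is square and no zero padding is needed):

    from keras import initializers
    from keras.models import Sequential
    from keras.layers import Dense

    model = Sequential()
    # 64x64 kernel initialized to the identity matrix.
    model.add(Dense(64, input_dim=64, kernel_initializer=initializers.Identity(gain=1.0)))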

lecun_uniform

    keras.initializers.lecun_uniform(seed=None)

LeCun uniform initializer.

It draws samples from a uniform distribution within [-limit, limit] where limit is sqrt(3 / fan_in) and fan_in is the number of input units in the weight tensor.

Arguments

  • seed: A Python integer. Used to seed the random generator.

Returns

An initializer.

References

  • Efficient Backprop (LeCun et al., 1998): http://yann.lecun.com/exdb/publis/pdf/lecun-98b.pdf

glorot_normal

    keras.initializers.glorot_normal(seed=None)

Glorot normal initializer, also called Xavier normal initializer.

It draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(2 / (fan_in + fan_out)), where fan_in is the number of input units in the weight tensor and fan_out is the number of output units in the weight tensor.

Arguments

  • seed: A Python integer. Used to seed the random generator.

Returns

An initializer.

References

  • Understanding the difficulty of training deep feedforward neural networks (Glorot & Bengio, 2010): http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf

glorot_uniform

    keras.initializers.glorot_uniform(seed=None)

Glorot uniform initializer, also called Xavier uniform initializer.

It draws samples from a uniform distribution within [-limit, limit] where limit is sqrt(6 / (fan_in + fan_out)), fan_in is the number of input units in the weight tensor and fan_out is the number of output units in the weight tensor.

Arguments

  • seed: A Python integer. Used to seed the random generator.

Returns

An initializer.

References

  • Understanding the difficulty of training deep feedforward neural networks (Glorot & Bengio, 2010): http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf

he_normal

    keras.initializers.he_normal(seed=None)

He normal initializer.

It draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(2 / fan_in), where fan_in is the number of input units in the weight tensor.

Arguments

  • seed: A Python integer. Used to seed the random generator.

Returns

An initializer.

References

  • Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification (He et al., 2015): https://arxiv.org/abs/1502.01852

lecun_normal

    keras.initializers.lecun_normal(seed=None)

LeCun normal initializer.

It draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(1 / fan_in), where fan_in is the number of input units in the weight tensor.

Arguments

  • seed: A Python integer. Used to seed the random generator.

Returns

An initializer.

References

  • Self-Normalizing Neural Networks (Klambauer et al., 2017): https://arxiv.org/abs/1706.02515
  • Efficient Backprop (LeCun et al., 1998): http://yann.lecun.com/exdb/publis/pdf/lecun-98b.pdf

he_uniform

    keras.initializers.he_uniform(seed=None)

He uniform variance scaling initializer.

It draws samples from a uniform distribution within [-limit, limit] where limit is sqrt(6 / fan_in) and fan_in is the number of input units in the weight tensor.

Arguments

  • seed: A Python integer. Used to seed the random generator.

Returns

An initializer.

References

  • Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification (He et al., 2015): https://arxiv.org/abs/1502.01852

An initializer may be passed as a string (must match one of the available initializers above), or as a callable:

    from keras import initializers

    model.add(Dense(64, kernel_initializer=initializers.random_normal(stddev=0.01)))

    # also works; will use the default parameters.
    model.add(Dense(64, kernel_initializer='random_normal'))

Using custom initializers

If passing a custom callable, then it must take the arguments shape (shape of the variable to initialize) and dtype (dtype of the generated values):

    from keras import backend as K

    def my_init(shape, dtype=None):
        return K.random_normal(shape, dtype=dtype)

    model.add(Dense(64, kernel_initializer=my_init))
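
If the initializer needs its own hyperparameters, a factory that returns such a callable works as well. A minimal sketch (the scaled_normal name and the 0.05 scale are illustrative only, not part of the Keras API):

    from keras import backend as K
    from keras.layers import Dense

    def scaled_normal(scale=0.01):
        # Returns a callable with the required (shape, dtype) signature.
        def init(shape, dtype=None):
            return scale * K.random_normal(shape, dtype=dtype)
        return init

    model.add(Dense(64, kernel_initializer=scaled_normal(0.05)))

Note that a model saved with a custom initializer generally needs that callable supplied again (for example through custom_objects) when the model is reloaded.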