tfmri.initializers.LecunNormal
- class LecunNormal(seed=None)
Bases: keras.initializers.initializers_v2.LecunNormal
Lecun normal initializer.
Note
This initializer can be used as a drop-in replacement for tf.keras.initializers.LecunNormal. However, this one also supports initialization of complex-valued weights. Simply pass dtype='complex64' or dtype='complex128' to its __call__ method.
Also available via the shortcut function tf.keras.initializers.lecun_normal.
Initializers allow you to pre-specify an initialization strategy, encoded in the Initializer object, without knowing the shape and dtype of the variable being initialized.
Draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(1 / fan_in), where fan_in is the number of input units in the weight tensor.
Examples:
>>> # Standalone usage:
>>> initializer = tf.keras.initializers.LecunNormal()
>>> values = initializer(shape=(2, 2))

>>> # Usage in a Keras layer:
>>> initializer = tf.keras.initializers.LecunNormal()
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)
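To illustrate what a complex-valued LeCun normal draw could look like, here is a minimal NumPy sketch. The function name `lecun_normal_complex` is hypothetical (not part of tfmri's API), and two details are assumptions: the target variance 1/fan_in is split evenly between the real and imaginary parts, and an ordinary (untruncated) normal is used for brevity, whereas the actual initializer draws from a truncated normal.

```python
import numpy as np

def lecun_normal_complex(shape, seed=None):
    """Hypothetical sketch of a complex-valued LeCun normal draw.

    Assumes a dense-style kernel where fan_in is the second-to-last
    dimension (or the only dimension for a 1-D shape).
    """
    rng = np.random.default_rng(seed)
    fan_in = shape[-2] if len(shape) > 1 else shape[0]
    stddev = np.sqrt(1.0 / fan_in)
    # Assumption: split the variance evenly between real and imaginary
    # parts, so the complex samples have overall stddev sqrt(1 / fan_in).
    real = rng.normal(0.0, stddev / np.sqrt(2.0), size=shape)
    imag = rng.normal(0.0, stddev / np.sqrt(2.0), size=shape)
    return (real + 1j * imag).astype(np.complex64)

w = lecun_normal_complex((128, 64), seed=0)
```

Because `seed` feeds a fresh generator on every call, repeating the call with the same seed reproduces the same values, mirroring the determinism described under Args below.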
- Args:
- seed: A Python integer. Used to make the behavior of the initializer
deterministic. Note that a seeded initializer will not produce the same random values across multiple calls, but multiple initializers will produce the same sequence when constructed with the same seed value.
- References:
[Klambauer et al., 2017](https://arxiv.org/abs/1706.02515)