tfmri.initializers.VarianceScaling
- class VarianceScaling(scale=1.0, mode='fan_in', distribution='truncated_normal', seed=None)
Bases: keras.initializers.initializers_v2.VarianceScaling

Initializer capable of adapting its scale to the shape of weights tensors.
Note

This initializer can be used as a drop-in replacement for tf.keras.initializers.VarianceScaling. However, this one also supports initialization of complex-valued weights. Simply pass dtype='complex64' or dtype='complex128' to its __call__ method.

Also available via the shortcut function tf.keras.initializers.variance_scaling.

With distribution="truncated_normal" or "untruncated_normal", samples are drawn from a truncated/untruncated normal distribution with a mean of zero and a standard deviation (after truncation, if used) of stddev = sqrt(scale / n), where n is:

- number of input units in the weight tensor, if mode="fan_in"
- number of output units, if mode="fan_out"
- average of the numbers of input and output units, if mode="fan_avg"
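The fan rules above can be sketched in plain NumPy. Here `compute_fans` and `scaled_stddev` are illustrative helpers (assuming the Keras-style convention that higher-rank kernels fold the receptive field into both fans), not part of tfmri:

```python
import numpy as np

def compute_fans(shape):
    # Dense kernels are (fan_in, fan_out); for higher-rank kernels the
    # receptive field multiplies both fans (Keras-style convention,
    # assumed here for illustration).
    if len(shape) < 1:
        return 1, 1
    if len(shape) == 1:
        return shape[0], shape[0]
    receptive_field = int(np.prod(shape[:-2]))
    return shape[-2] * receptive_field, shape[-1] * receptive_field

def scaled_stddev(shape, scale=1.0, mode='fan_in'):
    # stddev = sqrt(scale / n), with n chosen by `mode` as listed above.
    fan_in, fan_out = compute_fans(shape)
    n = {'fan_in': fan_in,
         'fan_out': fan_out,
         'fan_avg': (fan_in + fan_out) / 2.0}[mode]
    return float(np.sqrt(scale / max(1.0, n)))

# A (64, 128) dense kernel with scale=2.0 and mode='fan_in'
# gives stddev = sqrt(2 / 64).
```

For example, He initialization corresponds to scale=2.0 with mode='fan_in', while Glorot initialization corresponds to scale=1.0 with mode='fan_avg'.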
With distribution="uniform", samples are drawn from a uniform distribution within [-limit, limit], where limit = sqrt(3 * scale / n).

Examples:
>>> # Standalone usage:
>>> initializer = tf.keras.initializers.VarianceScaling(
...     scale=0.1, mode='fan_in', distribution='uniform')
>>> values = initializer(shape=(2, 2))

>>> # Usage in a Keras layer:
>>> initializer = tf.keras.initializers.VarianceScaling(
...     scale=0.1, mode='fan_in', distribution='uniform')
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)
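The complex-valued support mentioned in the note can be mimicked in plain NumPy. A common convention, assumed here for illustration (this is a sketch, not tfmri's actual implementation), is to draw independent real and imaginary parts with stddev / sqrt(2) each, so the complex samples keep the target overall variance:

```python
import numpy as np

def complex_variance_scaling(shape, scale=1.0, mode='fan_in', seed=None):
    # Hypothetical sketch of complex-valued variance scaling; NOT
    # tfmri's actual implementation. Dense kernel shape assumed.
    rng = np.random.default_rng(seed)
    fan_in, fan_out = shape[-2], shape[-1]
    n = {'fan_in': fan_in,
         'fan_out': fan_out,
         'fan_avg': (fan_in + fan_out) / 2.0}[mode]
    stddev = np.sqrt(scale / n)
    # Split the variance evenly between real and imaginary parts so the
    # complex samples have overall standard deviation `stddev`.
    real = rng.normal(0.0, stddev / np.sqrt(2.0), size=shape)
    imag = rng.normal(0.0, stddev / np.sqrt(2.0), size=shape)
    return (real + 1j * imag).astype(np.complex64)

w = complex_variance_scaling((256, 256), scale=1.0, mode='fan_in', seed=0)
```

With scale=1.0 and fan_in=256, the samples' overall standard deviation is close to sqrt(1/256) = 0.0625.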
- Args:
  - scale: Scaling factor (positive float).
  - mode: One of "fan_in", "fan_out", "fan_avg".
  - distribution: Random distribution to use. One of "truncated_normal", "untruncated_normal" and "uniform".
  - seed: A Python integer. Used to make the behavior of the initializer deterministic. Note that a seeded initializer will produce the same random values across multiple calls.
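The seed semantics described above (the same values on every call) can be illustrated with a small NumPy analogue; `make_initializer` is a hypothetical helper, not part of tfmri:

```python
import numpy as np

def make_initializer(seed, stddev=0.05):
    # Hypothetical helper (not part of tfmri): re-seeding the generator
    # on every call makes repeated calls return identical values,
    # matching the seed semantics described above.
    def init(shape):
        rng = np.random.default_rng(seed)
        return rng.normal(0.0, stddev, size=shape)
    return init

init = make_initializer(seed=42)
a = init((2, 2))
b = init((2, 2))
# a and b are element-wise identical because the seed is fixed per call.
```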