These networks incorporate Scaled Exponential Linear Units (SELUs), which give them a self-normalizing property: as activations are propagated through the layers, their distribution is driven toward zero mean and unit variance.
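A minimal sketch of this effect, assuming the standard SELU constants from Klambauer et al. (2017) and LeCun-normal weight initialization (zero mean, variance 1/fan-in); the layer widths, depth, and helper names here are illustrative, not from the source:

```python
import math
import random

# Fixed-point constants alpha and lambda from the SELU definition
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x):
    """SELU activation: scaled ELU with the self-normalizing constants."""
    return SCALE * (x if x > 0 else ALPHA * (math.exp(x) - 1.0))

def dense_selu_layer(inputs, n_out, rng):
    """One fully connected layer with LeCun-normal weights and SELU."""
    n_in = len(inputs)
    std = 1.0 / math.sqrt(n_in)  # variance 1/n_in, as self-normalization assumes
    out = []
    for _ in range(n_out):
        pre = sum(rng.gauss(0.0, std) * x for x in inputs)
        out.append(selu(pre))
    return out

rng = random.Random(0)
# Start from standard-normal activations and push them through many layers
acts = [rng.gauss(0.0, 1.0) for _ in range(512)]
for _ in range(16):
    acts = dense_selu_layer(acts, 512, rng)

mean = sum(acts) / len(acts)
var = sum((a - mean) ** 2 for a in acts) / len(acts)
print(f"after 16 layers: mean={mean:.2f}, var={var:.2f}")
```

Even after many layers, the empirical mean and variance of the activations stay close to 0 and 1 respectively, rather than exploding or collapsing as they would with many other activation functions.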