Abstract: In this article, we mainly study the depth and width of autoencoders with rectified linear unit (ReLU) activation functions. An autoencoder is a layered neural network consisting of ...
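To make the object of study concrete, here is a minimal sketch of a fully connected autoencoder with ReLU activations, written with PyTorch. This is a generic illustration, not the construction analyzed in the article; the class name `ReLUAutoencoder` and the layer widths (784, 128, 32) are arbitrary assumptions chosen only to show what "depth" (number of layers) and "width" (layer sizes) refer to.

```python
import torch
import torch.nn as nn

class ReLUAutoencoder(nn.Module):
    """Illustrative ReLU autoencoder: encoder + decoder built from affine layers and ReLUs."""

    def __init__(self, input_dim: int = 784, hidden_dim: int = 128, code_dim: int = 32):
        super().__init__()
        # Encoder: maps the input down to a lower-dimensional code.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, code_dim),
            nn.ReLU(),
        )
        # Decoder: reconstructs the input from the code.
        self.decoder = nn.Sequential(
            nn.Linear(code_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, input_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

# Reconstruction error on a random batch, just to show the interface.
model = ReLUAutoencoder()
x = torch.randn(16, 784)
loss = nn.functional.mse_loss(model(x), x)
```

Here the depth is the total number of affine-plus-ReLU layers and the width is the dimension of each hidden layer; the article's results concern how small these can be while the network still performs its task.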