Abstract:
|
In learning a one-hidden-layer neural network, the filter parameters (weights) of a non-overlapping convolutional layer need to be learned. To learn both the filter weights and the output weights, this study proposes a multiple resolution model with an optimal setting of the combination of three factors: the size of the output weights, the size of the data set, and the resolution of the model. The multiple resolution model approximates the coefficient matrix with filter weights at various resolutions. Using nested uniform designs, the optimal combination of the three factors is found so that the filter weights remain stable. The model reaches the solution to the target problem with a small number of experimental runs. The parameter vectors of the filter weights and the output weights are identifiable under standard regularity assumptions. The filter weights are recovered via the truncated singular value decomposition. Theoretical support is provided for the variation of the filter weights, and the efficiency of the multiple resolution model is guaranteed by an upper bound on the matrix recovery error of the r-resolution filter weights.
|
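As a rough, hedged illustration of the setting described in the abstract (not the authors' algorithm): in a one-hidden-layer network with a non-overlapping convolution, the coefficient matrix can be viewed as approximately the outer product of the output weights and the filter weights, so a truncated singular value decomposition recovers the filter up to sign and scale. The sketch below assumes this linearized, rank-one setting; the sizes, noise level, and variable names are illustrative assumptions only.

```python
import numpy as np

# Illustrative sketch (assumed linearized, rank-one setting, not the paper's
# method): with non-overlapping patches x_1, ..., x_k and shared filter w,
# the coefficient matrix is approximately the outer product a w^T, where a
# holds the output weights. A truncated SVD then recovers w up to sign/scale.

rng = np.random.default_rng(0)
k, d = 8, 16                       # number of patches, filter length (assumed)
w_true = rng.normal(size=d)        # true filter weights
a_true = rng.normal(size=k)        # true output weights

# Noisy stand-in for an estimated coefficient matrix.
M = np.outer(a_true, w_true) + 0.01 * rng.normal(size=(k, d))

# Truncated SVD: keep only the leading singular triplet (rank r = 1).
U, s, Vt = np.linalg.svd(M, full_matrices=False)
w_hat = Vt[0]                      # recovered filter direction (unit norm)
a_hat = s[0] * U[:, 0]             # recovered output weights, scaled by sigma_1

# Recovery is only defined up to sign and scale, so compare directions.
cos = abs(w_hat @ w_true) / np.linalg.norm(w_true)
print(f"cosine similarity with true filter: {cos:.4f}")
```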