title: Learning and Generalisation in Neural Networks with Local Preprocessing
creator: Kutsia, Merab
subject: 530 Physics
description: We study the learning and generalisation abilities of a specific two-layer feed-forward neural network and compare its properties to those of a simple perceptron. The input patterns are mapped nonlinearly onto a hidden layer much larger than the input layer; this mapping is either fixed or may result from an unsupervised learning process. Such preprocessing of initially uncorrelated random patterns produces correlated patterns in the hidden layer. The hidden-to-output mapping of the network is performed by a simple perceptron, trained using a supervised learning process. We investigate the effects of these correlations on the learning and generalisation properties, as opposed to those of a simple perceptron with uncorrelated patterns. As it turns out, this architecture has some advantages over a simple perceptron.
date: 2007
type: Dissertation
type: info:eu-repo/semantics/doctoralThesis
type: NonPeerReviewed
format: application/pdf
identifier: https://archiv.ub.uni-heidelberg.de/volltextserver/7598/1/Kutsia_PhD.pdf
identifier: DOI:10.11588/heidok.00007598
identifier: urn:nbn:de:bsz:16-opus-75985
identifier: Kutsia, Merab (2007) Learning and Generalisation in Neural Networks with Local Preprocessing. [Dissertation]
relation: https://archiv.ub.uni-heidelberg.de/volltextserver/7598/
rights: info:eu-repo/semantics/openAccess
rights: http://archiv.ub.uni-heidelberg.de/volltextserver/help/license_urhg.html
language: eng