Limitations and Future Trends in Neural Computation (NATO Science)

This work presents critical analyses of complexity issues in the continuum setting and of generalization to new examples, two basic milestones in learning from examples with connectionist models. The problem of loading the weights of a neural network, typically framed as continuous optimization, has drawn considerable criticism, since the solvability of any learning problem is limited by the presence of local minima in the error function. The notion of an efficient solution needs to be formalized so as to allow meaningful comparison with the traditional theory of computational complexity in the discrete setting. The volume also covers up-to-date developments in computational mathematics.
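
As a minimal illustration of the local-minima issue mentioned above (a hypothetical sketch, not taken from the book), the toy Python example below runs plain gradient descent on a simple non-convex one-dimensional "error" function; the toy error function, learning rate, and initial points are all assumptions chosen only to show that the final weight depends on initialization.

```python
# Hypothetical sketch: gradient descent on a non-convex error function can
# stall in a local minimum depending on where the weight is initialized.

def error(w):
    # Toy non-convex error: global minimum near w ~ -1.30, local minimum near w ~ 1.14
    return w**4 - 3 * w**2 + w

def grad(w):
    # Derivative of the toy error function
    return 4 * w**3 - 6 * w + 1

def gradient_descent(w0, lr=0.01, steps=2000):
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# Two different initializations converge to different minima.
w_a = gradient_descent(-2.0)  # reaches the global minimum near w ~ -1.30
w_b = gradient_descent(2.0)   # stalls in the local minimum near w ~ 1.14
print(f"init=-2.0 -> w={w_a:.3f}, error={error(w_a):.3f}")
print(f"init=+2.0 -> w={w_b:.3f}, error={error(w_b):.3f}")
```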

Author(s): Sergey Ablameyko, NATO Advanced Research Workshop on Limitations and Future Trends in Neural Computation
Year: 2003

Language: English
Pages: 254