Dr. Bishop is a world-renowned expert in this field, but his book didn't work for me. Despite the title, it covers the more general topic of classification, not just neural networks, and it does so less well than my favorites (especially Hastie and Tibshirani). For specific discussion of nonlinear classifiers, I preferred Cristianini's treatment of SVMs.

The most positive feature is a detailed discussion of the Kolmogorov theorem, which I found very powerful, though perhaps not in the way the author intended. To paraphrase, the theorem states that a structure as simple as a single-hidden-layer neural network is dense in the space of continuous functions. I interpret that to mean a neural network model can fit anything, completely independent of any underlying relationship. I've always been skeptical of neural networks, and this property gives me a fact to support my bias.

Bishop's discussion of the Kolmogorov theorem captures my opinion of the book: he presents the theorem as an aside and does not propose to draw any inferences from it. I prefer authors who have more of a central thesis.

That said, many people really like this book; I bought it based on the near-universal praise. My dislike may be more a matter of personal taste than a reliable guideline.
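For context, the density property paraphrased above is usually cited in its universal-approximation form (due to Cybenko and Hornik, a result related to but distinct from Kolmogorov's superposition theorem). A standard informal statement, sketched here for reference:

```latex
% Universal approximation with one hidden layer (informal sketch).
% Let $\sigma$ be a continuous sigmoidal activation. Then for any
% continuous $f : K \to \mathbb{R}$ on a compact set
% $K \subset \mathbb{R}^d$ and any $\varepsilon > 0$, there exist an
% integer $N$, weight vectors $w_i \in \mathbb{R}^d$, and scalars
% $a_i, b_i$ such that the single-hidden-layer network
\[
  g(x) = \sum_{i=1}^{N} a_i \, \sigma\!\left(w_i^{\top} x + b_i\right)
\]
% approximates $f$ uniformly on $K$:
\[
  \sup_{x \in K} \bigl|\, f(x) - g(x) \,\bigr| < \varepsilon .
\]
```

Note that the theorem guarantees only that some such network exists; it says nothing about how many hidden units are needed or whether training will find it, which is consistent with the reviewer's reading that expressive power alone implies nothing about an underlying relationship.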
Author(s): Christopher M. Bishop
Publisher: Oxford University Press, USA
Year: 1996
Language: English
Pages: 496