Perhaps the best section of the book is its coverage of the field's history. Minsky and Papert published a paper in 1969 that disparaged neural networks and led to a sharp drop in funding, so much so that the book's authors call the years that followed the Dark Age. It lasted until the 1980s, when Hopfield published a series of seminal papers that revived the field. He took ideas from physics (especially solid state physics, his professional background) and applied them in novel ways to neural networks; the resulting Hopfield networks were subsequently described in many papers. This interdisciplinary mixing of physics and biology may prove inspirational to readers doing active research.
Later parts of the book explain the various types of neural networks currently in use, along with enough implementation detail to help you start your own work.
However, the book does [perhaps correctly] omit one thing. In the 80s, after Hopfield reinvigorated the subject, there was much speculation that the improved approaches might yield qualitatively new and striking phenomena, perhaps even something approaching a functioning, self-aware mind. Alas, this has not come to pass. Neural networks have certainly become an important and practical tool, but the excitement has died down.
Author(s): Russell C. Eberhart, Yuhui Shi
Edition: 1
Publisher: Morgan Kaufmann
Year: 2007
Language: English
Pages: 472