Rice University, 2008. 254 pp.
This book focuses on the discrete Fourier transform (DFT), discrete convolution, and, particularly, the fast algorithms to calculate them. These topics have been at the center of digital signal processing since its beginning, and new results in hardware, theory and applications continue to keep them important and exciting.
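To fix ideas, here is a minimal illustrative sketch (not taken from the book's accompanying programs) of the length-N DFT evaluated directly from its definition and checked against a library FFT; the use of NumPy is an assumption for this illustration.

```python
# Illustrative sketch, not from the book: direct O(N^2) DFT vs. a library FFT.
import numpy as np

def dft_direct(x):
    """Evaluate X[k] = sum_n x[n] * exp(-2j*pi*n*k/N) straight from the definition."""
    N = len(x)
    n = np.arange(N)
    W = np.exp(-2j * np.pi * np.outer(n, n) / N)  # N-by-N DFT matrix
    return W @ x

x = np.random.randn(16)
# A fast algorithm (Cooley-Tukey style, O(N log N)) gives the same result.
assert np.allclose(dft_direct(x), np.fft.fft(x))
```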
As far as we can tell, Gauss was the first to propose the techniques that we now call the fast Fourier transform (FFT) for calculating the coefficients in a trigonometric expansion of an asteroid's orbit in 1805 [174]. However, it was the seminal paper by Cooley and Tukey [88] in 1965 that caught the attention of the science and engineering community and, in a way, founded the discipline of digital signal processing (DSP).
The impact of the Cooley-Tukey FFT was enormous. Problems could be solved quickly that were not even considered a few years earlier. A flurry of research expanded the theory, developed excellent practical programs, and opened new applications [94]. In 1976, Winograd published a short paper [403] that set a second flurry of research in motion [86]. This was another type of algorithm that expanded the data lengths that could be transformed efficiently and reduced the number of multiplications required. The groundwork for this algorithm had been laid earlier by Good [148] and by Rader [308]. In 1997, Frigo and Johnson developed a program they called FFTW (the "fastest Fourier transform in the west") [130], [135], which combines many of the ideas in other algorithms with new results to give a robust, very fast system for general data lengths on a variety of computer and DSP architectures. This work won the 1999 Wilkinson Prize for Numerical Software.
It is hard to overemphasize the importance of the DFT, convolution, and fast algorithms. With a history that goes back to Gauss [174] and a compilation of references on these topics that by 1995 had grown to over 2400 entries [362], the FFT may be the most important numerical algorithm in science, engineering, and applied mathematics. New theoretical results are still appearing, advances in computers and hardware continually restate the basic questions, and new applications open new areas for research. It is hoped that this book will provide the background, references, programs, and incentive to encourage further research and results in this area, as well as tools for practical applications.
Studying the FFT is valuable not only for understanding a powerful tool; it is also a prototype of how algorithms can be made efficient and how a theory can be developed to define optimality. The history of this development also gives insight into the process of research, where timing and serendipity play interesting roles.
Preface: Fast Fourier Transforms
Introduction: Fast Fourier Transforms
Multidimensional Index Mapping
Polynomial Description of Signals
The DFT as Convolution or Filtering
Factoring the Signal Processing Operators
Winograd's Short DFT Algorithms
DFT and FFT: An Algebraic View
The Cooley-Tukey Fast Fourier Transform Algorithm
The Prime Factor and Winograd Fourier Transform Algorithms in Practice
Algorithms for Data with Restrictions
Convolution Algorithms
Comments: Fast Fourier Transforms
Conclusions: Fast Fourier Transforms
FFT Flowgraphs
Operation Counts for General Length FFT
FFT Computer Programs
Programs for Short FFTs