Springer, 2008. — 241 p.
Knowledge processing with interval methods has intrinsic merit. First, qualitative properties are often presented as ranges of data attributes rather than as specific points. For example, one's blood pressure is normal if it lies within the normal range (i.e., the normal interval). By grouping attribute values into meaningful intervals, we can omit insignificant quantitative differences and focus on processing datasets qualitatively. More importantly, interval-valued attributes carry more information than points and can represent variability and uncertainty. Finally, interval-valued computational results can be more meaningful and useful than point-valued output in a dynamic environment.
Statistical and probabilistic methods have been widely applied in knowledge discovery. However, although confidence intervals and fuzzy intervals have been used to deal with uncertainties, they may not always work well in practice. By integrating interval methods with stochastic models and fuzzy logic, this book provides additional, if not more powerful, tools for knowledge processing, especially for handling variability and uncertainty.
Successful applications have been moving interval computing into the mainstream of computing. In 2006, the C++ standards committee evaluated a detailed proposal to include interval computing in the ANSI/ISO C++ standard library. Interval arithmetic is already part of the kernel computations of Intel's Itanium-based architecture. Among many other software tools, Sun Microsystems has included interval arithmetic in its Sun Studio.
More importantly, by applying the unique properties of interval computing, new algorithms have been developed to solve some otherwise very difficult problems. For example, one can computationally find all roots of a nonlinear system of equations on a given domain with interval Newton/generalized bisection methods, and reliably find nonlinear global optima with interval branch-and-bound algorithms. Recently, Ferguson and Hales proved the nearly 400-year-old Kepler conjecture with the help of interval methods, and in 2007 they received the first AMS Robbins Prize for this work.
Knowledge processing with intervals is significantly different from that with points. In this book, we extend previous knowledge processing methods to interval-valued datasets. By embedding interval and soft computing methods into distributed homogeneous and/or heterogeneous database systems that collect and manage massive datasets, scientists may significantly enhance their ability to process massive datasets.
This book can be used as an introduction to interval methods and soft computing for knowledge processing for upper-level undergraduates or first-year graduate students. It can also be a reference for researchers and practitioners.
This book introduces only some initial applications of interval methods in knowledge processing. We sincerely hope to see more fruitful and significant results in both theory and application in the future.
Fundamentals of Interval Computing.
Soft Computing Essentials.
Relations between Interval Computing and Soft Computing.
Interval Matrices in Knowledge Discovery.
Interval Function Approximation and Applications.
Interval Rule Matrices for Decision Making.
Interval Matrix Games.
Interval-Weighted Graphs and Flow Networks.
Arithmetic on Bounded Families of Distributions: A DEnv Algorithm Tutorial.
IntBox: An Object-Oriented Interval Computing Software Toolbox in C++.