Emphasizing issues of computational efficiency, Michael Kearns and Umesh Vazirani introduce a number of central topics in computational learning theory for researchers and students in artificial intelligence, neural networks, theoretical computer science, and statistics.
The book contributes to the search for common methods underlying efficient learning algorithms and for the computational impediments to learning. Topics such as Occam's razor, the Vapnik-Chervonenkis dimension, and learning in the presence of noise are discussed to illuminate basic principles. For researchers in artificial intelligence and related fields. Annotation © Book News, Inc., Portland, OR (booknews.com)