Regularization, Optimization, Kernels, and Support Vector Machines (Chapman & Hall/CRC Machine Learning & Pattern Recognition)
Regularization, Optimization, Kernels, and Support Vector Machines offers a snapshot of the current state of the art of large-scale machine learning, providing a single multidisciplinary source for the latest research and advances in regularization, sparsity, compressed sensing, convex and large-scale optimization, kernel methods, and support vector machines. Consisting of 21 chapters authored by leading researchers in machine learning, this comprehensive reference:

- Covers the relationship between support vector machines (SVMs) and the Lasso
- Discusses multi-layer SVMs
- Explores nonparametric feature selection, basis pursuit methods, and robust compressive sensing
- Describes graph-based regularization methods for single- and multi-task learning
- Considers regularized methods for dictionary learning and portfolio selection
- Addresses non-negative matrix factorization
- Examines low-rank matrix and tensor-based models
- Presents advanced kernel methods for batch and online machine learning, system identification, domain adaptation, and image processing
- Tackles large-scale algorithms including conditional gradient methods, (non-convex) proximal techniques, and stochastic gradient descent

Regularization, Optimization, Kernels, and Support Vector Machines is ideal for researchers in machine learning, pattern recognition, data mining, signal processing, statistical learning, and related areas.