Regularization, Optimization, Kernels, and Support Vector Machines

Edited by Johan A.K. Suykens, Marco Signoretto, and Andreas Argyriou

Published October 23, 2014 by Chapman & Hall/CRC
Reference - 525 Pages - 93 B/W Illustrations
ISBN 9781482241396 - CAT# K23354
Series: Chapman & Hall/CRC Machine Learning & Pattern Recognition

Features

  • Covers the latest research and advances in regularization, sparsity, and compressed sensing
  • Describes recent progress in convex and large-scale optimization, kernel methods, and support vector machines
  • Discusses output kernel learning, domain adaptation, multi-layer support vector machines, and more
  • Offers a snapshot of the current state of the art of large-scale machine learning
  • Includes interdisciplinary contributions from worldwide experts

Summary

Regularization, Optimization, Kernels, and Support Vector Machines offers a snapshot of the current state of the art of large-scale machine learning, providing a single multidisciplinary source for the latest research and advances in regularization, sparsity, compressed sensing, convex and large-scale optimization, kernel methods, and support vector machines. Consisting of 21 chapters authored by leading researchers in machine learning, this comprehensive reference:

  • Covers the relationship between support vector machines (SVMs) and the Lasso
  • Discusses multi-layer SVMs
  • Explores nonparametric feature selection, basis pursuit methods, and robust compressive sensing
  • Describes graph-based regularization methods for single- and multi-task learning
  • Considers regularized methods for dictionary learning and portfolio selection
  • Addresses non-negative matrix factorization
  • Examines low-rank matrix and tensor-based models
  • Presents advanced kernel methods for batch and online machine learning, system identification, domain adaptation, and image processing
  • Tackles large-scale algorithms including conditional gradient methods, (non-convex) proximal techniques, and stochastic gradient descent

Regularization, Optimization, Kernels, and Support Vector Machines is ideal for researchers in machine learning, pattern recognition, data mining, signal processing, statistical learning, and related areas.