Regularization, Optimization, Kernels, and Support Vector Machines

1st Edition

Edited by Johan A.K. Suykens, Marco Signoretto, Andreas Argyriou

Chapman and Hall/CRC
Published October 23, 2014
Reference - 525 Pages - 93 B/W Illustrations
ISBN 9781482241396 - CAT# K23354
Series: Chapman & Hall/CRC Machine Learning & Pattern Recognition


USD$120.00


Features

  • Covers the latest research and advances in regularization, sparsity, and compressed sensing
  • Describes recent progress in convex and large-scale optimization, kernel methods, and support vector machines
  • Discusses output kernel learning, domain adaptation, multi-layer support vector machines, and more
  • Offers a snapshot of the current state of the art in large-scale machine learning
  • Includes interdisciplinary contributions from worldwide experts

Summary

Regularization, Optimization, Kernels, and Support Vector Machines offers a snapshot of the current state of the art in large-scale machine learning, providing a single multidisciplinary source for the latest research and advances in regularization, sparsity, compressed sensing, convex and large-scale optimization, kernel methods, and support vector machines. Consisting of 21 chapters authored by leading researchers in machine learning, this comprehensive reference:

  • Covers the relationship between support vector machines (SVMs) and the Lasso
  • Discusses multi-layer SVMs
  • Explores nonparametric feature selection, basis pursuit methods, and robust compressive sensing
  • Describes graph-based regularization methods for single- and multi-task learning
  • Considers regularized methods for dictionary learning and portfolio selection
  • Addresses non-negative matrix factorization
  • Examines low-rank matrix and tensor-based models
  • Presents advanced kernel methods for batch and online machine learning, system identification, domain adaptation, and image processing
  • Tackles large-scale algorithms including conditional gradient methods, (non-convex) proximal techniques, and stochastic gradient descent

Regularization, Optimization, Kernels, and Support Vector Machines is ideal for researchers in machine learning, pattern recognition, data mining, signal processing, statistical learning, and related areas.

Instructors

We provide complimentary e-inspection copies of primary textbooks to instructors considering our books for course adoption.
