
Statistical Learning with Sparsity: The Lasso and Generalizations


Discover New Methods for Dealing with High-Dimensional Data

A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data.

Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of ℓ1 penalties to generalized linear models and support vector machines, cover generalized penalties such as the elastic net and group lasso, and review numerical methods for optimization. They also present statistical inference methods for fitted (lasso) models, including the bootstrap, Bayesian methods, and recently developed approaches. In addition, the book examines matrix decomposition, sparse multivariate analysis, graphical models, and compressed sensing. It concludes with a survey of theoretical results for the lasso.
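The coordinate descent algorithm mentioned above admits a compact sketch: cycle over coordinates, solve each one-dimensional lasso problem in closed form via soft-thresholding. This is a minimal illustration, not code from the book; the function names and the assumption of standardized (mean-zero, unit-variance) columns of X are my own.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: the closed-form solution to the
    one-dimensional lasso problem."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iters=100):
    """Cyclic coordinate descent for the lasso objective
        (1/2n) * ||y - X @ beta||^2 + lam * ||beta||_1.
    Assumes columns of X are standardized (mean 0, unit variance),
    so each coordinate update is a single soft-threshold."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iters):
        for j in range(p):
            # Partial residual with feature j removed from the fit
            r_j = y - X @ beta + X[:, j] * beta[j]
            # Univariate least-squares coefficient for feature j ...
            z_j = X[:, j] @ r_j / n
            # ... then soft-threshold (denominator is 1 for unit-variance columns)
            beta[j] = soft_threshold(z_j, lam)
    return beta
```

With a sparse ground truth and moderate regularization, the returned coefficient vector is itself sparse: irrelevant features are set exactly to zero, which is the defining behavior of the ℓ1 penalty.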

In this age of big data, the number of features measured on a person or object can be large and might be larger than the number of observations. This book shows how the sparsity assumption allows us to tackle these problems and extract useful and reproducible patterns from big datasets. Data analysts, computer scientists, and theorists will appreciate this thorough and up-to-date treatment of sparse statistical modeling.

Normal price: DKK 469
Shipping: DKK 39 (6-8 business days)
Packaging fee: DKK 20


Product details
Language: English
Pages: 368
ISBN-13: 9780367738334
Binding: Paperback
Edition:
ISBN-10: 0367738333
Publication date: 18 Dec 2020
Length: 23mm
Width: 234mm
Height: 156mm
Publisher: Taylor & Francis Ltd
Print-run date: 18 Dec 2020
Author(s): Trevor Hastie, Robert Tibshirani, Martin Wainwright


Category: Automatic control & regulation engineering