Optimization for machine learning / edited by Suvrit Sra, Sebastian Nowozin, and Stephen J. Wright.
Material type: Text
Language: English
Series: Neural Information Processing series
Publication details: Cambridge, Mass. : MIT Press, c2012.
Description: ix, 494 p. : ill. ; 26 cm
ISBN:
- 9788120347540
- 9780262016469 (hardcover : alk. paper)
- 026201646X (hardcover : alk. paper)
DDC classification: 006.31 22 SRA
Item type | Current library | Collection | Call number | Status | Barcode
---|---|---|---|---|---
General Books | CUTN Central Library Generalia | Non-fiction | 006.31 SRA | Available | 34122
General Books | CUTN Central Library Generalia | Non-fiction | 006.31 SRA | Available | 34123
General Books | CUTN Central Library Generalia | Non-fiction | 006.31 SRA | Available | 34124
Browsing CUTN Central Library shelves, Shelving location: Generalia, Collection: Non-fiction
006.31 ROG A first course in machine learning / | 006.31 SHA Understanding machine learning : | 006.31 SRA Optimization for machine learning / | 006.31 VIE Introduction to deep learning business applications for developers : | 006.31015181 MOI Algorithmic aspects of machine learning /
Series Foreword. Preface.
1. Introduction: Optimization and Machine Learning
2. Convex Optimization with Sparsity-Inducing Norms
3. Interior-Point Methods for Large-Scale Cone Programming
4. Incremental Gradient, Subgradient, and Proximal Methods for Convex Optimization: A Survey
5. First-Order Methods for Nonsmooth Convex Large-Scale Optimization, I: General Purpose Methods
6. First-Order Methods for Nonsmooth Convex Large-Scale Optimization, II: Utilizing Problem's Structure
7. Cutting-Plane Methods in Machine Learning
8. Introduction to Dual Decomposition for Inference
9. Augmented Lagrangian Methods for Learning, Selecting, and Combining Features
10. The Convex Optimization Approach to Regret Minimization
11. Projected Newton-type Methods in Machine Learning
12. Interior-Point Methods in Machine Learning
13. The Tradeoffs of Large-Scale Learning
14. Robust Optimization in Machine Learning
15. Improving First and Second-Order Methods by Modeling Uncertainty
16. Bandit View on Noisy Optimization
17. Optimization Methods for Sparse Inverse Covariance Selection
18. A Pathwise Algorithm for Covariance Selection.
Includes bibliographical references.