
Statistical Learning Theory and Stochastic Optimization
Ecole d'Eté de Probabilités de Saint-Flour XXXI - 2001
Olivier Catoni - Lecture Notes in Mathematics series
Summary
Lecture Notes in Mathematics
This series reports on new developments in mathematical research and teaching - quickly, informally and at a high level. The types of material considered for publication include:
- Research monographs
- Lectures on a new field or presentations of a new angle in a classical field
- Summer schools and intensive courses on topics of current research
Texts that are out of print but still in demand may also be considered.
The timeliness of a manuscript is sometimes more important than its form, which may in such cases be preliminary or tentative.
Details of the editorial policy can be found on the inside front cover of a current volume. We recommend contacting the publisher or the series editors at an early stage of your project. Addresses are given on the inside back cover.
Manuscripts should be prepared according to Springer-Verlag's standard specifications. LaTeX style files may be found at www.springeronline.com [click on <Mathematics>, then on <For Authors> and look for <Macro Packages for books>]. Style files for other TeX versions, and additional technical instructions if necessary, are available on request from: lnm@springer.de.
Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea, as is often done in practice, to use a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of stochastic optimization algorithms of common use in computing estimators. The author focuses on non-asymptotic bounds on the statistical risk, allowing one to choose adaptively between rich and structured families of models and the corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.
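The Gibbs measures mentioned above can be given a concrete flavor. In the simplest finite setting, a Gibbs (exponential-weight) posterior over a family of candidate models weights each model by the exponential of minus its empirical risk, scaled by an inverse temperature. The sketch below is only an illustration of that construction, not the book's actual derivations; the function name, the toy risk values, and the choice of `beta` are all assumptions made for the example.

```python
import math

def gibbs_weights(empirical_risks, beta):
    """Gibbs (exponential-weight) posterior over a finite model family:
    weight_i proportional to exp(-beta * risk_i), normalized to sum to 1.
    (Illustrative sketch; names and values are not from the book.)"""
    # Subtract the minimum risk before exponentiating, for numerical stability;
    # this leaves the normalized weights unchanged.
    m = min(empirical_risks)
    unnorm = [math.exp(-beta * (r - m)) for r in empirical_risks]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Toy example: three candidate models with hypothetical empirical risks.
risks = [0.30, 0.25, 0.45]
w = gibbs_weights(risks, beta=10.0)
# The lowest-risk model receives the largest weight.
```

As `beta` grows, the Gibbs measure concentrates on the empirical risk minimizer; as `beta` tends to 0, it flattens toward the uniform prior — the temperature trades off fitting the data against staying close to the prior, which is the tension the book's entropy bounds quantify.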
Contents
- Universal lossless data compression
- Links between data compression and statistical estimation
- Non cumulated mean risk
- Gibbs estimators
- Randomized estimators and empirical complexity
- Deviation inequalities
- Markov chains with exponential transitions
Technical details
PRINT | |
Publisher | Springer |
Author | Olivier Catoni |
Series | Lecture Notes in Mathematics |
Publication date | 13/10/2004 |
Pages | 272 |
Dimensions | 15.5 x 23.5 cm |
Cover | Paperback |
Weight | 442 g |
Interior | Black and white |
EAN13 | 9783540225720 |