Foreign Science and Technology Book Summary

Title: An Introduction to Sparse Stochastic Processes

Authors: Michael Unser and Pouya D. Tafti, École Polytechnique Fédérale de Lausanne

ISBN/ISSN: 9781107058545, 1107058546

Publication year: 2014

Publisher: Cambridge University Press

Classification number:


Abstract

Providing a novel approach to sparsity, this comprehensive book presents the theory of stochastic processes that are ruled by linear stochastic differential equations, and that admit a parsimonious representation in a matched wavelet-like basis. Two key themes are the statistical property of infinite divisibility, which leads to two distinct types of behaviour - Gaussian and sparse - and the structural link between linear stochastic processes and spline functions, which is exploited to simplify the mathematical analysis. The core of the book is devoted to investigating sparse processes, including a complete description of their transform-domain statistics. The final part develops practical signal-processing algorithms that are based on these models, with special emphasis on biomedical image reconstruction. This is an ideal reference for graduate students and researchers with an interest in signal/image processing, compressed sensing, approximation theory, machine learning, or statistics.
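The Gaussian-versus-sparse dichotomy highlighted above can be illustrated with a short simulation. The following Python sketch is not taken from the book; the jump rate, jump-amplitude scale, and variable names are illustrative assumptions. It compares a discretized Brownian motion with a compound-Poisson Lévy process (the simplest sparse process): a first-order finite difference decouples both signals, but the resulting increments are dense in the Gaussian case and mostly exactly zero in the compound-Poisson case.

    # Minimal illustrative sketch (not from the book): Gaussian vs. sparse innovations.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1024

    # Gaussian model: discretized Brownian motion (cumulative sum of Gaussian increments).
    brownian = np.cumsum(rng.normal(scale=1.0, size=n))

    # Sparse model: compound-Poisson process (piecewise constant, rare jumps).
    # The 2% jump rate and the jump-amplitude scale are assumptions for illustration.
    jumps = (rng.random(n) < 0.02) * rng.normal(scale=5.0, size=n)
    compound_poisson = np.cumsum(jumps)

    # A first-order finite difference decouples (whitens) both processes.
    for name, x in [("Brownian motion", brownian), ("compound-Poisson", compound_poisson)]:
        increments = np.diff(x, prepend=0.0)
        nonzero_fraction = np.mean(np.abs(increments) > 1e-12)
        print(f"{name}: fraction of nonzero increments = {nonzero_fraction:.3f}")

With these settings, the printed fraction of nonzero increments is essentially 1.0 for the Brownian path and close to the assumed jump rate of roughly 0.02 for the compound-Poisson path, which is the kind of transform-domain sparsity the book formalizes.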


Table of Contents

Preface xiii

Notation xv

1 Introduction 1

1.1 Sparsity: Occam’s razor of modern signal processing? 1

1.2 Sparse stochastic models: the step beyond Gaussianity 2

1.3 From splines to stochastic processes, or when Schoenberg meets Lévy 5

      1.3.1 Splines and Legos revisited 5

      1.3.2 Higher-degree polynomial splines 8

      1.3.3 Random splines, innovations, and Lévy processes 9

      1.3.4 Wavelet analysis of Lévy processes and M-term approximations 12

      1.3.5 Lévy’s wavelet-based synthesis of Brownian motion 15

1.4 Historical notes: Paul Lévy and his legacy 16

2 Roadmap to the book 19

2.1 On the implications of the innovation model 20

      2.1.1 Linear combination of sampled values 20

      2.1.2 Wavelet analysis 21

2.2 Organization of the book 22

3 Mathematical context and background 25

3.1 Some classes of function spaces 25

      3.1.1 About the notation: mathematics vs. engineering 28

      3.1.2 Normed spaces 28

      3.1.3 Nuclear spaces 29

3.2 Dual spaces and adjoint operators 32

      3.2.1 The dual of lp spaces 33

      3.2.2 The duals of 𝒟 and 𝒮 33

      3.2.3 Distinction between Hermitian and duality products 34

3.3 Generalized functions 35

      3.3.1 Intuition and definition 35

      3.3.2 Operations on generalized functions 36

      3.3.3 The Fourier transform of generalized functions 37

      3.3.4 The kernel theorem 38

      3.3.5 Linear shift-invariant operators and convolutions 39

      3.3.6 Convolution operators on Lp (Rd) 40

3.4 Probability theory 43

      3.4.1 Probability measures 43

      3.4.2 Joint probabilities and independence 44

      3.4.3 Characteristic functions in finite dimensions 45

      3.4.4 Characteristic functionals in infinite dimensions 46

3.5 Generalized random processes and fields 47

      3.5.1 Generalized random processes as collections of random variables 47

      3.5.2 Generalized random processes as random generalized functions 49

      3.5.3 Determination of statistics from the characteristic functional 49

      3.5.4 Operations on generalized stochastic processes 51

      3.5.5 Innovation processes 52

      3.5.6 Example: filtered white Gaussian noise 53

3.6 Bibliographical pointers and historical notes 54

4 Continuous-domain innovation models 57

4.1 Introduction: from Gaussian to sparse probability distributions 58

4.2 Lévy exponents and infinitely divisible distributions 59

      4.2.1 Canonical Lévy-Khintchine representation 60

      4.2.2 Deciphering the Lévy-Khintchine formula 64

      4.2.3 Gaussian vs. sparse categorization 68

      4.2.4 Proofs of Theorems 4.1 and 4.2 69

4.3 Finite-dimensional innovation model 71

4.4 White Lévy noises or innovations 73

      4.4.1 Specification of white noise in Schwartz’ space S’ 73

      4.4.2 Impulsive Poisson noise 76

      4.4.3 Properties of white noise 78

4.5 Generalized stochastic processes and linear models 84

      4.5.1 Innovation models 84

      4.5.2 Existence and characterization of the solution 84

4.6 Bibliographical notes 87

5 Operators and their inverses 89

5.1 Introductory example: first-order differential equation 90

5.2 Shift-invariant inverse operators 92

5.3 Stable differential systems in 1-D 95

      5.3.1 First-order differential operators with stable inverses 96

      5.3.2 Higher-order differential operators with stable inverses 96

5.4 Unstable Nth-order differential systems 97

      5.4.1 First-order differential operators with unstable shift-invariant inverses 97

      5.4.2 Higher-order differential operators with unstable shift-invariant inverses 101

      5.4.3 Generalized boundary conditions 102

5.5 Fractional-order operators 104

      5.5.1 Fractional derivatives in one dimension 104

      5.5.2 Fractional Laplacians 107

      5.5.3 Lp-stable inverses 108

5.6 Discrete convolution operators 109

5.7 Bibliographical notes 111

6 Splines and wavelets 113

6.1 From Legos to wavelets 113

6.2 Basic concepts and definitions 118

      6.2.1 Spline-admissible operators 118

      6.2.2 Splines and operators 120

      6.2.3 Riesz bases 121

      6.2.4 Admissible wavelets 124

6.3 First-order exponential B-splines and wavelets 124

      6.3.1 B-spline construction 125

      6.3.2 Interpolator in augmented-order spline space 126

      6.3.3 Differential wavelets 126

6.4 Generalized B-spline basis 127

      6.4.1 B-spline properties 128

      6.4.2 B-spline factorization 136

      6.4.3 Polynomial B-splines 137

      6.4.4 Exponential B-splines 138

      6.4.5 Fractional B-splines 139

      6.4.6 Additional brands of univariate B-splines 141

      6.4.7 Multidimensional B-splines 141

6.5 Generalized operator-like wavelets 142

      6.5.1 Multiresolution analysis of L2(Rd) 142

      6.5.2 Multiresolution B-splines and the two-scale relation 143

      6.5.3 Construction of an operator-like wavelet basis 144

6.6 Bibliographical notes 147

7 Sparse stochastic processes 150

7.1 Introductory example: non-Gaussian AR(1) processes 150

7.2 General abstract characterization 152

7.3 Non-Gaussian stationary processes 158

      7.3.1 Autocorrelation function and power spectrum 159

      7.3.2 Generalized increment process 160

      7.3.3 Generalized stationary Gaussian processes 161

      7.3.4 CARMA processes 162

7.4 Lévy processes and their higher-order extensions 163

      7.4.1 Lévy processes 163

      7.4.2 Higher-order extensions of Lévy processes 166

      7.4.3 Non-stationary Lévy correlations 167

      7.4.4 Removal of long-range dependencies 169

      7.4.5 Examples of sparse processes 172

      7.4.6 Mixed processes 175

7.5 Self-similar processes 176

      7.5.1 Stable fractal processes 177

      7.5.2 Fractional Brownian motion through the looking-glass 180

      7.5.3 Scale-invariant Poisson processes 185

7.6 Bibliographical notes 187

8 Sparse representations 191

8.1 Decoupling of Lévy processes: finite differences vs. wavelets 191

8.2 Extended theoretical framework 194

      8.2.1 Discretization mechanism: sampling vs. projection 194

      8.2.2 Analysis of white noise with non-smooth functions 195

8.3 Generalized increments for the decoupling of sample values 197

      8.3.1 First-order statistical characterization 199

      8.3.2 Higher-order statistical dependencies 200

      8.3.3 Generalized increments and stochastic difference equations 201

      8.3.4 Discrete whitening filter 202

      8.3.5 Robust localization 202

8.4 Wavelet analysis 205

      8.4.1 Wavelet-domain statistics 206

      8.4.2 Higher-order wavelet dependencies and cumulants 208

8.5 Optimal representation of Lévy and AR(1) processes 210

      8.5.1 Generalized increments and first-order linear prediction 211

      8.5.2 Vector-matrix formulation 212

      8.5.3 Transform-domain statistics 212

      8.5.4 Comparison of orthogonal transforms 216

8.6 Bibliographical notes 222

9 Infinite divisibility and transform-domain statistics 223

9.1 Composition of id laws, spectral mixing, and analysis of white noise 224

9.2 Class C and unimodality 230

9.3 Self-decomposable distributions 232

9.4 Stable distributions 234

9.5 Rate of decay 235

9.6 Lévy exponents and cumulants 237

9.7 Semigroup property 239

      9.7.1 Gaussian case 241

      9.7.2 SαS case 241

      9.7.3 Compound-Poisson case 241

      9.7.4 General iterated-convolution interpretation 241

9.8 Multiscale analysis 242

      9.8.1 Scale evolution of the pdf 243

      9.8.2 Scale evolution of the moments 244

      9.8.3 Asymptotic convergence to a Gaussian/stable distribution 246

9.9 Notes and pointers to the literature 247

10 Recovery of sparse signals 248

10.1 Discretization of linear inverse problems 249

      10.1.1 Shift-invariant reconstruction subspace 249

      10.1.2 Finite-dimensional formulation 252

10.2 MAP estimation and regularization 255

      10.2.1 Potential function 256

      10.2.2 LMMSE/Gaussian solution 258

      10.2.3 Proximal operators 259

      10.2.4 MAP estimation 261

10.3 MAP reconstruction of biomedical images 263

      10.3.1 Scale-invariant image model and common numerical setup 264

      10.3.2 Deconvolution of fluorescence micrographs 265

      10.3.3 Magnetic resonance imaging 269

      10.3.4 X-ray tomography 272

      10.3.5 Discussion 276

10.4 The quest for the minimum-error solution 277

      10.4.1 MMSE estimators for first-order processes 278

      10.4.2 Direct solution by belief propagation 279

      10.4.3 MMSE vs. MAP denoising of Lévy processes 283

10.5 Bibliographical notes 286

11 Wavelet-domain methods 290

11.1 Discretization of inverse problems in a wavelet basis 291

      11.1.1 Specification of wavelet-domain MAP estimator 292

      11.1.2 Evolution of the potential function across scales 293

11.2 Wavelet-based methods for solving linear inverse problems 294

      11.2.1 Preliminaries 295

      11.2.2 Iterative shrinkage/thresholding algorithm 296

      11.2.3 Fast iterative shrinkage/thresholding algorithm 297

      11.2.4 Discussion of wavelet-based image reconstruction 298

11.3 Study of wavelet-domain shrinkage estimators 300

      11.3.1 Pointwise MAP estimators for AWGN 301

      11.3.2 Pointwise MMSE estimators for AWGN 301

      11.3.3 Comparison of shrinkage functions: MAP vs. MMSE 303

      11.3.4 Conclusion on simple wavelet-domain shrinkage estimators 312

11.4 Improved denoising by consistent cycle spinning 313

      11.4.1 First-order wavelets: design and implementation 313

      11.4.2 From wavelet bases to tight wavelet frames 315

      11.4.3 Iterative MAP denoising 318

      11.4.4 Iterative MMSE denoising 320

11.5 Bibliographical notes 324

12 Conclusion 326

Appendix A Singular integrals 328

A.1 Regularization of singular integrals by analytic continuation 329

A.2 Fourier transform of homogeneous distributions 331

A.3 Hadamard’s finite part 332

A.4 Some convolution integrals with singular kernels 334

Appendix B Positive definiteness 336

B.1 Positive definiteness and Bochner’s theorem 336

B.2 Conditionally positive-definite functions 339

B.3 Lévy-Khintchine formula from the point of view of generalized functions 342

Appendix C Special functions 344

C.1 Modified Bessel functions 344

C.2 Gamma function 344

C.3 Symmetric-alpha-stable distributions 346

References 347

Index 363


Holding institution

中科院文献情报中心