Bayesian approach for inverse problems in imaging systems


Inverse problems in signal and image processing and Bayesian inference framework: from basic to advanced Bayesian computation

Ali Mohammad-Djafari
Laboratoire des Signaux et Systèmes (L2S), UMR8506 CNRS-CentraleSupélec-Univ Paris-Sud
SUPELEC, 91192 Gif-sur-Yvette, France
http://lss.centralesupelec.fr
Email: [email protected]
http://djafari.free.fr
http://publicationslist.org/djafari

A. Mohammad-Djafari, Inverse problems and Bayesian inference, Scube seminar, L2S, CentraleSupelec, March 27, 2015

Contents
1. Signal and Image Processing: Classical/Inverse problems approaches
2. Inverse problems examples
   - Instrumentation
   - Imaging systems to see outside of a body
   - Imaging systems to see inside of a body
   - Other imaging systems (Acoustics, Radar, SAR, ...)
3. Analytical/Algebraic methods
4. Deterministic regularization methods and their limitations
5. Bayesian approach
6. Two main steps: Priors and Computational aspects
7. Case studies: Instrumentation, X ray Computed Tomography, Microwave imaging, Acoustic source localisation, Ultrasound imaging, Satellite image restoration, etc.

Signal and Image Processing: Classical/Inverse problems approach
- Classical: you are given a signal or an image; process it. Examples:
  - Signal: detect periodicities or changes; model it for prediction; AR, MA, ARMA modeling; parameter estimation, ...
  - Image: enhancement, restoration, segmentation, contour detection, compression, ...
- Model based or Inverse problem approach:
  - What does the observed signal or image represent?
  - How is it related to the desired unknowns?
  - Forward modelling / Inversion
  - Examples: deconvolution, image restoration, image reconstruction in Computed Tomography (CT), ...
- PCA, ICA / Blind Source Separation, Compressed Sensing / L1 regularization, Bayesian sparsity enforcing

Inverse Problems examples
- Example 1: Instrumentation: measuring the temperature with a thermometer: Deconvolution
  - f(t): input of the instrument
  - g(t): output of the instrument
- Example 2: Seeing outside of a body: making an image using a camera, a microscope or a telescope: Image restoration
  - f(x, y): real scene
  - g(x, y): observed image
- Example 3: Seeing inside of a body: Computed Tomography using X rays, US, Microwave, etc.: Image reconstruction
  - f(x, y): a section of a real 3D body f(x, y, z)
  - gφ(r): a line of the observed radiograph gφ(r, z)
- Example 4: Seeing differently: MRI, Radar, SAR, Infrared, etc.: Fourier Synthesis
  - f(x, y): a section of a body or a scene
  - g(u, v): partial data in the Fourier domain

Measuring the variation of temperature with a thermometer
- f(t): variation of temperature over time
- g(t): variation of the length of the liquid in the thermometer
- Forward model: convolution
  g(t) = ∫ f(t′) h(t − t′) dt′ + ε(t)
  h(t): impulse response of the measurement system
- Inverse problem: deconvolution
  Given the forward model H (impulse response h(t)) and a set of data g(tᵢ), i = 1, ..., M, find f(t)
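As a concrete sketch (the discrete signal, impulse response, and noise level here are illustrative assumptions, not from the talk), the forward convolution can be written as a matrix equation g = Hf + ε, and plain least-squares inversion compared with a Tikhonov-regularized one:

```python
import numpy as np

# Discrete sketch of g(t) = ∫ f(t') h(t - t') dt' + ε(t):
# a slowly responding instrument smooths the true input f.
rng = np.random.default_rng(0)
n = 100
f_true = np.zeros(n)
f_true[30:70] = 1.0                       # true step-like input
h = np.exp(-np.arange(20) / 5.0)
h /= h.sum()                              # causal impulse response, unit gain
g = np.convolve(f_true, h)[:n] + 0.01 * rng.standard_normal(n)

# Convolution as a lower-triangular matrix: g = H f + ε.
H = np.zeros((n, n))
for i in range(n):
    for j in range(max(0, i - len(h) + 1), i + 1):
        H[i, j] = h[i - j]

# Plain least squares amplifies noise; Tikhonov regularization damps it.
f_ls = np.linalg.solve(H.T @ H, H.T @ g)
f_reg = np.linalg.solve(H.T @ H + 0.1 * np.eye(n), H.T @ g)
```

The matrix form makes the later slides concrete: every linear inverse problem in this talk reduces to such a g = Hf + ε after discretization.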


Measuring the variation of temperature with a thermometer
Forward model: convolution
  g(t) = ∫ f(t′) h(t − t′) dt′ + ε(t)
  f(t) −→ [ Thermometer h(t) ] −→ g(t)
Inversion: deconvolution
  g(t) −→ f(t)

Seeing outside of a body: making an image with a camera, a microscope or a telescope
- f(x, y): real scene
- g(x, y): observed image
- Forward model: convolution
  g(x, y) = ∬ f(x′, y′) h(x − x′, y − y′) dx′ dy′ + ε(x, y)
  h(x, y): Point Spread Function (PSF) of the imaging system
- Inverse problem: image restoration
  Given the forward model H (PSF h(x, y)) and a set of data g(xᵢ, yᵢ), i = 1, ..., M, find f(x, y)
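A minimal numerical sketch of 2D restoration (hypothetical scene and Gaussian PSF; periodic boundaries are assumed so that the convolution diagonalizes in the Fourier domain):

```python
import numpy as np

# 2D counterpart: g(x,y) = ∬ f(x',y') h(x−x', y−y') dx'dy' + ε(x,y).
# With periodic boundaries the PSF diagonalizes in Fourier, so a
# Tikhonov-regularized restoration is a pointwise division.
rng = np.random.default_rng(1)
f = np.zeros((32, 32))
f[10:22, 10:22] = 1.0                          # true scene: a bright square
x = np.arange(32)
X, Y = np.meshgrid(x, x)
h = np.exp(-((X - 16.0) ** 2 + (Y - 16.0) ** 2) / 8.0)
h = np.fft.ifftshift(h / h.sum())              # centered, normalized Gaussian PSF
Hf = np.fft.fft2(h)
g = np.real(np.fft.ifft2(np.fft.fft2(f) * Hf)) + 0.01 * rng.standard_normal(f.shape)

lam = 1e-3                                     # regularization weight
F_hat = np.conj(Hf) * np.fft.fft2(g) / (np.abs(Hf) ** 2 + lam)
f_hat = np.real(np.fft.ifft2(F_hat))
```

The pointwise filter conj(H)/( |H|² + λ ) is exactly the Fourier-domain form of the regularized solutions discussed later in the deck.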


Making an image with an unfocused camera
Forward model: 2D convolution
  g(x, y) = ∬ f(x′, y′) h(x − x′, y − y′) dx′ dy′ + ε(x, y)
  f(x, y) −→ [ h(x, y) ] −→ (+ ε(x, y)) −→ g(x, y)
Inversion: image deconvolution or restoration


Seeing inside of a body: Computed Tomography
- f(x, y): a section of a real 3D body f(x, y, z)
- gφ(r): a line of the observed radiography gφ(r, z)
- Forward model: line integrals or Radon Transform
  gφ(r) = ∫_{L_{r,φ}} f(x, y) dl + εφ(r)
        = ∬ f(x, y) δ(r − x cos φ − y sin φ) dx dy + εφ(r)
- Inverse problem: image reconstruction
  Given the forward model H (Radon Transform) and a set of data gφᵢ(r), i = 1, ..., M, find f(x, y)
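For intuition, the discrete line integrals at φ = 0° and φ = 90° are just column and row sums of the pixelized image (a toy object, not from the talk):

```python
import numpy as np

# Discrete Radon projections of a pixelized image at two angles:
# line integrals at φ = 0° and φ = 90° reduce to sums along columns and rows.
f = np.zeros((8, 8))
f[2:6, 3:5] = 1.0            # small rectangular object

g_0 = f.sum(axis=0)          # projection at φ = 0° (sum over y)
g_90 = f.sum(axis=1)         # projection at φ = 90° (sum over x)
```

Every projection integrates the same total mass, a basic consistency condition that real tomographic data must satisfy.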


2D and 3D Computed Tomography
  2D: gφ(r) = ∫_{L_{r,φ}} f(x, y) dl
  3D: gφ(r₁, r₂) = ∫_{L_{r₁,r₂,φ}} f(x, y, z) dl
Forward problem: f(x, y) or f(x, y, z) −→ gφ(r) or gφ(r₁, r₂)
Inverse problem: gφ(r) or gφ(r₁, r₂) −→ f(x, y) or f(x, y, z)


Computed Tomography: Radon Transform
  Forward: f(x, y) −→ g(r, φ)
  Inverse: f(x, y) ←− g(r, φ)

Microwave or ultrasound imaging
Measurements: wave diffracted by the object, g(rᵢ)
Unknown quantity: f(r) = k₀²(n²(r) − 1)
Intermediate quantity: φ(r)
  g(rᵢ) = ∬_D Gm(rᵢ, r′) φ(r′) f(r′) dr′,  rᵢ ∈ S
  φ(r) = φ₀(r) + ∬_D Go(r, r′) φ(r′) f(r′) dr′,  r ∈ D
Born approximation (φ(r′) ≃ φ₀(r′)):
  g(rᵢ) = ∬_D Gm(rᵢ, r′) φ₀(r′) f(r′) dr′,  rᵢ ∈ S
Discretization:
  g = Gm F φ,  φ = φ₀ + Go F φ,  with F = diag(f)
  −→ g = H(f) with H(f) = Gm F (I − Go F)⁻¹ φ₀


Fourier Synthesis in X ray Tomography
  g(r, φ) = ∬ f(x, y) δ(r − x cos φ − y sin φ) dx dy
  G(Ω, φ) = ∫ g(r, φ) exp[−jΩr] dr
  F(u, v) = ∬ f(x, y) exp[−j(ux + vy)] dx dy
  F(u, v) = G(Ω, φ) for u = Ω cos φ and v = Ω sin φ
  g(r, φ) --FT--> G(Ω, φ)
[Figure: a projection at angle φ in the (x, y) plane and the corresponding radial line F(ωx, ωy) at the same angle φ in the (u, v) plane.]

Fourier Synthesis in X ray tomography
  G(u, v) = ∬ f(x, y) exp[−j(ux + vy)] dx dy
Forward problem: given f(x, y), compute G(u, v)
Inverse problem: given G(u, v) on those lines, estimate f(x, y)


Fourier Synthesis in Diffraction tomography


Fourier Synthesis in Diffraction tomography
  G(u, v) = ∬ f(x, y) exp[−j(ux + vy)] dx dy
Forward problem: given f(x, y), compute G(u, v)
Inverse problem: given G(u, v) on those semi-circles, estimate f(x, y)


Fourier Synthesis in different imaging systems
  G(u, v) = ∬ f(x, y) exp[−j(ux + vy)] dx dy
  X ray Tomography, Diffraction, Eddy current, SAR & Radar
Forward problem: given f(x, y), compute G(u, v)
Inverse problem: given G(u, v) on those algebraic lines, circles or curves, estimate f(x, y)


Linear inverse problems
- Deconvolution:
  g(t) = ∫ f(τ) h(t − τ) dτ
- Image restoration:
  g(x, y) = ∬ f(x′, y′) h(x − x′, y − y′) dx′ dy′
- Image reconstruction in X ray CT:
  g(r, φ) = ∬ f(x, y) δ(r − x cos φ − y sin φ) dx dy
- Fourier synthesis:
  g(u, v) = ∬ f(x, y) exp[−j(ux + vy)] dx dy
- Unified linear relation:
  g(s) = ∫ f(r) h(s, r) dr

Linear Inverse Problems
  g(sᵢ) = ∫ h(sᵢ, r) f(r) dr + ε(sᵢ),  i = 1, ..., M
- f(r) is assumed to be well approximated by
  f(r) ≃ Σ_{j=1}^{N} f_j φ_j(r)
  with {φ_j(r)} a basis or any other set of known functions, so that
  g(sᵢ) = gᵢ ≃ Σ_{j=1}^{N} f_j ∫ h(sᵢ, r) φ_j(r) dr,  i = 1, ..., M
  g = Hf + ε  with  H_{ij} = ∫ h(sᵢ, r) φ_j(r) dr
- H has huge dimensions: 1D: 10³ × 10³;  2D: 10⁶ × 10⁶;  3D: 10⁹ × 10⁹
- Due to the ill-posedness of inverse problems, least squares (LS) methods,
  f̂ = arg min_f {J(f)} with J(f) = ‖g − Hf‖², do not give satisfactory results.
  Need for regularization methods, e.g. J(f) = ‖g − Hf‖² + λ‖f‖²
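The noise amplification can be seen numerically on a synthetic ill-conditioned H (all sizes, the singular-value decay, and λ here are illustrative assumptions):

```python
import numpy as np

# Ill-conditioned forward matrix built from rapidly decaying singular values.
rng = np.random.default_rng(2)
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 0.7 ** np.arange(n)                    # cond(H) = 0.7**-(n-1), about 4e7
H = U @ np.diag(s) @ V.T

f_true = rng.standard_normal(n)
g = H @ f_true + 1e-3 * rng.standard_normal(n)

# Plain LS: J(f) = ||g - Hf||²  →  noise wildly amplified.
f_ls = np.linalg.solve(H.T @ H, H.T @ g)
# Regularized: J(f) = ||g - Hf||² + λ||f||²  →  stable solution.
lam = 1e-4
f_reg = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ g)

err_ls = np.linalg.norm(f_ls - f_true)
err_reg = np.linalg.norm(f_reg - f_true)
```

The tiny noise (10⁻³) dominates the LS solution because inversion divides by the smallest singular values; the λ term bounds that division.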


Regularization theory
Inverse problems = ill-posed problems −→ need for prior information
Functional space (Tikhonov): g = H(f) + ε
  J(f) = ‖g − H(f)‖²₂ + λ‖Df‖²₂
Finite dimensional space (Phillips & Twomey): g = Hf + ε
  J(f) = ‖g − Hf‖² + λ‖f‖²
- Minimum norm LS (MNLS): J(f) = ‖g − H(f)‖² + λ‖f‖²
- Classical regularization: J(f) = ‖g − H(f)‖² + λ‖Df‖²
- More general regularization: J(f) = Q(g − H(f)) + λΩ(Df)
  or J(f) = Δ₁(g, H(f)) + λΔ₂(Df, f₀)
Limitations:
- Errors are implicitly assumed white and Gaussian
- Limited prior information on the solution
- Lack of tools for the determination of the hyperparameters


Inversion: Probabilistic methods
Taking account of errors and uncertainties −→ probability theory
- Maximum Likelihood (ML)
- Minimum Inaccuracy (MI)
- Probability Distribution Matching (PDM)
- Maximum Entropy (ME) and Information Theory (IT)
- Bayesian Inference (Bayes)
Advantages:
- Explicit account of the errors and noise
- A large class of priors via explicit or implicit modeling
- A coherent approach to combine the information content of the data and the priors
Limitations:
- Practical implementation and cost of computation


Bayesian estimation approach
M: g = Hf + ε
- Observation model M + hypothesis on the noise ε −→ p(g|f; M) = p_ε(g − Hf)
- A priori information: p(f|M)
- Bayes: p(f|g; M) = p(g|f; M) p(f|M) / p(g|M)
Link with regularization:
- Maximum A Posteriori (MAP):
  f̂ = arg max_f {p(f|g)} = arg max_f {p(g|f) p(f)}
    = arg min_f {J(f) = − ln p(g|f) − ln p(f)}
- Regularization:
  f̂ = arg min_f {J(f) = Q(g, Hf) + λΩ(f)}
  with Q(g, Hf) = − ln p(g|f) and λΩ(f) = − ln p(f)

Case of linear models and Gaussian priors
g = Hf + ε
- Prior knowledge on the noise: ε ∼ N(0, σε² I)
  −→ p(g|f) ∝ exp[−(1/2σε²) ‖g − Hf‖²]
- Prior knowledge on f: f ∼ N(0, σf² (D′D)⁻¹)
  −→ p(f) ∝ exp[−(1/2σf²) ‖Df‖²]
- A posteriori:
  p(f|g) ∝ exp[−(1/2σε²) ‖g − Hf‖² − (1/2σf²) ‖Df‖²]
- MAP: f̂ = arg max_f {p(f|g)} = arg min_f {J(f)}
  with J(f) = ‖g − Hf‖² + λ‖Df‖²,  λ = σε²/σf²
- Advantage: characterization of the solution
  p(f|g) = N(f̂, P̂) with f̂ = P̂ H′g,  P̂ = (H′H + λD′D)⁻¹
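A small numerical sketch of this closed form (dimensions, operators, and noise levels are illustrative assumptions; the posterior covariance is written here with the σε² factor made explicit):

```python
import numpy as np

# Posterior for the linear Gaussian model:
#   f_map = (H'H + λ D'D)⁻¹ H' g,   Cov(f|g) = σε² (H'H + λ D'D)⁻¹,
#   λ = σε²/σf².
rng = np.random.default_rng(3)
m, n = 15, 20
H = rng.standard_normal((m, n))                 # underdetermined operator
D = np.eye(n) - np.eye(n, k=1)                  # first-difference operator
f_true = np.cumsum(rng.standard_normal(n)) * 0.1
sigma_eps, sigma_f = 0.05, 1.0
g = H @ f_true + sigma_eps * rng.standard_normal(m)

lam = sigma_eps ** 2 / sigma_f ** 2
A = np.linalg.inv(H.T @ H + lam * D.T @ D)
f_map = A @ H.T @ g                             # MAP = posterior mean
P = sigma_eps ** 2 * A                          # posterior covariance
post_std = np.sqrt(np.diag(P))                  # pixelwise uncertainty bars
```

This is the "characterization of the solution" advantage: beyond the point estimate, the diagonal of P quantifies the residual uncertainty on each unknown.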


MAP estimation with other priors:
  f̂ = arg min_f {J(f)} with J(f) = ‖g − Hf‖² + λΩ(f)
Separable priors:
- Gaussian: p(f_j) ∝ exp[−α|f_j|²] −→ Ω(f) = α Σ_j |f_j|²
- Gamma: p(f_j) ∝ f_j^α exp[−βf_j] −→ Ω(f) = Σ_j (−α ln f_j + β f_j)
- Beta: p(f_j) ∝ f_j^α (1 − f_j)^β −→ Ω(f) = −Σ_j [α ln f_j + β ln(1 − f_j)]
- Generalized Gaussian: p(f_j) ∝ exp[−α|f_j|^p], 1 < p < 2 −→ Ω(f) = α Σ_j |f_j|^p
Markovian models:
  p(f_j|f) ∝ exp[−α Σ_{i∈N_j} φ(f_j, f_i)] −→ Ω(f) = α Σ_j Σ_{i∈N_j} φ(f_j, f_i)

MAP estimation with Markovian priors:
  f̂ = arg min_f {J(f)} with J(f) = ‖g − Hf‖² + λΩ(f),  Ω(f) = Σ_j φ(f_j − f_{j−1})
with φ(t) a convex function:
  |t|^α,  √(1 + t²) − 1,  log(cosh(t)),
  φ(t) = t² if |t| ≤ T, 2T|t| − T² if |t| > T  (Huber)
or a non-convex function:
  log(1 + t²),  t²/(1 + t²),  arctan(t²),
  φ(t) = t² if |t| ≤ T, T² if |t| > T
- A great number of methods and optimization algorithms exist for these criteria.


Main advantages of the Bayesian approach
- MAP = regularization
- Posterior mean? Marginal MAP?
- More information in the posterior law than only its mode or its mean
- Tools for estimating hyperparameters
- Tools for model selection
- More specific and specialized priors, particularly through hidden variables and hierarchical models
- More computational tools:
  - Expectation-Maximization for computing the maximum likelihood parameters
  - MCMC for posterior exploration
  - Variational Bayes for analytical computation of the posterior marginals
  - ...

Bayesian Estimation: Simple priors
- Linear model: g = Hf + ε
- Gaussian case:
  p(g|f, θ₁) = N(Hf, θ₁I),  p(f|θ₂) = N(0, θ₂I) −→ p(f|g, θ) = N(f̂, P̂)
  with P̂ = (H′H + λI)⁻¹,  f̂ = P̂ H′g,  λ = θ₁/θ₂
  f̂ = arg min_f {J(f)} with J(f) = ‖g − Hf‖²₂ + λ‖f‖²₂
- Generalized Gaussian prior & MAP:
  f̂ = arg min_f {J(f)} with J(f) = ‖g − Hf‖²₂ + λ‖f‖_β^β
- Double Exponential (β = 1):
  f̂ = arg min_f {J(f)} with J(f) = ‖g − Hf‖²₂ + λ‖f‖₁


Full (Unsupervised) Bayesian approach
M: g = Hf + ε
- Forward & errors model: −→ p(g|f, θ₁; M)
- Prior models: −→ p(f|θ₂; M)
- Hyperparameters θ = (θ₁, θ₂): −→ p(θ|M)
- Bayes: p(f, θ|g; M) = p(g|f, θ; M) p(f|θ; M) p(θ|M) / p(g|M)
- Joint MAP: (f̂, θ̂) = arg max_{(f,θ)} {p(f, θ|g; M)}
- Marginalization:
  p(f|g; M) = ∫ p(f, θ|g; M) dθ
  p(θ|g; M) = ∫ p(f, θ|g; M) df
- Posterior means:
  f̂ = ∬ f p(f, θ|g; M) dθ df
  θ̂ = ∬ θ p(f, θ|g; M) df dθ
- Evidence of the model:
  p(g|M) = ∬ p(g|f, θ; M) p(f|θ; M) p(θ|M) df dθ


Two main steps in the Bayesian approach
- Prior modeling
  - Separable: Gaussian, Gamma; sparsity enforcing: Generalized Gaussian, mixture of Gaussians, mixture of Gammas, ...
  - Markovian: Gauss-Markov, GGM, ...
  - Markovian with hidden variables (contours, region labels)
- Choice of the estimator and computational aspects
  - MAP, posterior mean, marginal MAP
  - MAP needs optimization algorithms
  - Posterior mean needs integration methods
  - Marginal MAP and hyperparameter estimation need integration and optimization
  - Approximations:
    - Gaussian approximation (Laplace)
    - Numerical exploration: MCMC
    - Variational Bayes (separable approximation)


Different prior models for signals and images: Separable
- Gaussian: p(f_j) ∝ exp[−α|f_j|²]
- Generalized Gaussian: p(f_j) ∝ exp[−α|f_j|^p], 1 ≤ p ≤ 2
- Gamma: p(f_j) ∝ f_j^α exp[−βf_j]
- Beta: p(f_j) ∝ f_j^α (1 − f_j)^β


Sparsity enforcing prior models
- Sparse signals: direct sparsity
- Sparse signals: sparsity in a transform domain


Sparsity enforcing prior models
- Simple heavy-tailed models:
  - Generalized Gaussian, Double Exponential
  - Symmetric Weibull, Symmetric Rayleigh
  - Student-t, Cauchy
  - Generalized hyperbolic
  - Elastic net
- Hierarchical mixture models:
  - Mixture of Gaussians
  - Bernoulli-Gaussian
  - Mixture of Gammas
  - Bernoulli-Gamma
  - Mixture of Dirichlet
  - Bernoulli-Multinomial


Which images am I looking for?


Which image am I looking for?
  Gauss-Markov, Generalized GM, Piecewise Gaussian, Mixture of GM
[Figure: sample images drawn from each of these prior models.]


Different prior models for signals and images: Separable
- Simple separable: Gaussian, Gamma, Generalized Gaussian
  p(f) ∝ exp[−Σ_j φ(f_j)]
- Simple Markovian models: Gauss-Markov, Generalized Gauss-Markov
  p(f) ∝ exp[−Σ_j Σ_{i∈N(j)} φ(f_j − f_i)]
- Hierarchical models with hidden variables: Bernoulli-Gaussian, Gaussian-Gamma
  p(f|z) = Π_j p(f_j|z_j) and p(z) = Π_j p(z_j)
  with different choices for p(f_j|z_j) and p(z_j)


Hierarchical models and hidden variables
- Student-t model:
  St(f|ν) ∝ exp[−((ν+1)/2) log(1 + f²/ν)]
- Infinite Scaled Gaussian Mixture (ISGM) equivalence:
  St(f|ν) = ∫₀^∞ N(f|0, 1/z) G(z|α, β) dz,  with α = β = ν/2
- Hierarchical form:
  p(f|z) = Π_j p(f_j|z_j) = Π_j N(f_j|0, 1/z_j) ∝ exp[−½ Σ_j z_j f_j²]
  p(z|α, β) = Π_j G(z_j|α, β) ∝ Π_j z_j^{α−1} exp[−βz_j] ∝ exp[Σ_j (α−1) ln z_j − βz_j]
  p(f, z|α, β) ∝ exp[−½ Σ_j z_j f_j² + Σ_j ((α−1) ln z_j − βz_j)]
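The ISGM identity can be checked by simulation (ν and the sample size below are arbitrary illustrative choices):

```python
import numpy as np

# Simulation check of the ISGM identity:
# z ~ Gamma(ν/2, rate ν/2), f|z ~ N(0, 1/z)  ⟹  f ~ Student-t(ν).
rng = np.random.default_rng(5)
nu, N = 5.0, 200_000
z = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=N)  # numpy's scale = 1/rate
f = rng.normal(0.0, 1.0 / np.sqrt(z))                  # scale mixture of Gaussians

var_emp = f.var()                      # Student-t(ν) variance is ν/(ν−2) = 5/3 here
tail_frac = np.mean(np.abs(f) > 3.0)   # tails heavier than the Gaussian's
```

This is why the hierarchy is useful for sparsity: conditionally on z everything is Gaussian (so updates stay tractable), yet the marginal on f is heavy-tailed.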

Gauss-Markov-Potts prior models for images
  f(r): image;  z(r): hidden label;  c(r) = 1 − δ(z(r) − z(r′)): contour variable
  p(f(r)|z(r) = k, m_k, v_k) = N(m_k, v_k)
  p(f(r)) = Σ_k P(z(r) = k) N(m_k, v_k)   (Mixture of Gaussians)
- Separable iid hidden variables: p(z) = Π_r p(z(r))
- Markovian hidden variables: p(z) Potts-Markov:
  p(z(r)|z(r′), r′ ∈ V(r)) ∝ exp[γ Σ_{r′∈V(r)} δ(z(r) − z(r′))]
  p(z) ∝ exp[γ Σ_{r∈R} Σ_{r′∈V(r)} δ(z(r) − z(r′))]
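A sketch of what this prior rewards: the unnormalized Potts log-prior counts equal-label neighbor pairs, so homogeneous segmentations score higher than fragmented ones (a 4-neighborhood V(r) and toy label images are assumed here):

```python
import numpy as np

# Unnormalized Potts log-prior: γ × (number of equal-label neighbor pairs)
# over a 4-neighbor 2D grid.
def potts_log_prior(z, gamma=1.0):
    same_h = np.sum(z[:, 1:] == z[:, :-1])   # horizontal neighbor pairs
    same_v = np.sum(z[1:, :] == z[:-1, :])   # vertical neighbor pairs
    return gamma * float(same_h + same_v)

smooth = np.zeros((8, 8), dtype=int)
smooth[:, 4:] = 1                             # two large homogeneous regions
checker = np.indices((8, 8)).sum(axis=0) % 2  # maximally fragmented labels
```

Here potts_log_prior(smooth) = 104 (all 112 neighbor pairs agree except the 8 across the region boundary), while the checkerboard scores 0: the prior strongly favors large compact regions.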


Four different cases
To each pixel of the image are associated two variables, f(r) and z(r):
- f|z Gaussian iid, z iid: Mixture of Gaussians
- f|z Gauss-Markov, z iid: Mixture of Gauss-Markov
- f|z Gaussian iid, z Potts-Markov: Mixture of Independent Gaussians (MIG with hidden Potts)
- f|z Markov, z Potts-Markov: Mixture of Gauss-Markov (MGM with hidden Potts)


Bayesian Computation and Algorithms
- Joint posterior probability law of all the unknowns f, z, θ:
  p(f, z, θ|g) ∝ p(g|f, θ₁) p(f|z, θ₂) p(z|θ₃) p(θ)
- Often the expression of p(f, z, θ|g) is complex; its optimization (for joint MAP) or its marginalization/integration (for marginal MAP or PM) is not easy
- Two main techniques: MCMC and Variational Bayesian Approximation (VBA)
- MCMC: needs the expressions of the conditionals p(f|z, θ, g), p(z|f, θ, g), and p(θ|f, z, g)
- VBA: approximate p(f, z, θ|g) by a separable law
  q(f, z, θ|g) = q₁(f) q₂(z) q₃(θ)
  and do all computations with these separable ones.


Hierarchical models
Simple case (1 layer): g = Hf + ε
  p(f|g, θ) ∝ p(g|f, θ₁) p(f|θ₂)
  Objective: infer f
  MAP: f̂ = arg max_f {p(f|g, θ)}
  Posterior Mean (PM): f̂ = ∫ f p(f|g, θ) df
[Graphical model: θ₂ −→ f, θ₁ −→ ε; f −→ H −→ g.]
Unsupervised case (2 layers):
  p(f, θ|g) ∝ p(g|f, θ₁) p(f|θ₂) p(θ)
  Objective: infer (f, θ)
  JMAP: (f̂, θ̂) = arg max_{(f,θ)} {p(f, θ|g)}
  Marginalization: p(θ|g) = ∫ p(f, θ|g) df
  VBA: approximate p(f, θ|g) by q₁(f) q₂(θ)
[Graphical model: (α₀, β₀) −→ (θ₁, θ₂) −→ f −→ H −→ g.]




Hierarchical models (3 layers)
  p(f, z, θ|g) ∝ p(g|f, θ₁) p(f|z, θ₂) p(z|θ₃) p(θ)
  p(θ) = p(θ₁|α₀) p(θ₂|β₀) p(θ₃|γ₀)
Objective: infer (f, z, θ)
  JMAP: (f̂, ẑ, θ̂) = arg max_{(f,z,θ)} {p(f, z, θ|g)}
  Marginalization:
    p(z, θ|g) = ∫ p(f, z, θ|g) df,  p(θ|g) = ∫ p(z, θ|g) dz
    or p(f|g) = ∬ p(f, z, θ|g) dz dθ
  VBA: approximate p(f, z, θ|g) by q₁(f) q₂(z) q₃(θ)
[Graphical model: (α₀, β₀, γ₀) −→ (θ₁, θ₂, θ₃) −→ (f, z) −→ H −→ g.]


MCMC based algorithm
  p(f, z, θ|g) ∝ p(g|f, z, θ₁) p(f|z, θ₂) p(z|θ₃) p(θ)
General scheme:
  f̂ ∼ p(f|ẑ, θ̂, g) −→ ẑ ∼ p(z|f̂, θ̂, g) −→ θ̂ ∼ p(θ|f̂, ẑ, g)
- Estimate f using p(f|ẑ, θ̂, g) ∝ p(g|f, θ) p(f|ẑ, θ̂).
  When Gaussian, this can be done via optimization of a quadratic criterion.
- Estimate z using p(z|f̂, θ̂, g) ∝ p(g|f̂, ẑ, θ̂) p(z).
  Often needs sampling (hidden discrete variable).
- Estimate θ using p(θ|f̂, ẑ, g) ∝ p(g|f̂, σε² I) p(f̂|ẑ, (m_k, v_k)) p(θ).
  Use of conjugate priors −→ analytical expressions.
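A minimal Gibbs sampler in this spirit, for the toy model g = f + ε with f ~ N(0, I) known and an unknown noise precision τ with conjugate Gamma prior (the hyperparameters a0, b0, sizes, and the omission of the z layer are all simplifying assumptions for illustration):

```python
import numpy as np

# Toy Gibbs sampler for g = f + ε: alternately draw
#   f ~ p(f | g, τ)   (Gaussian)   and   τ ~ p(τ | f, g)   (Gamma),
# the same alternating-conditional scheme as above, without the z layer.
rng = np.random.default_rng(6)
M = 500
tau_true = 1.0                                   # noise precision (σε = 1)
g = rng.normal(0.0, 1.0, M) + rng.normal(0.0, 1.0 / np.sqrt(tau_true), M)

a0, b0 = 1.0, 1.0                                # assumed hyperparameters
tau = 1.0
tau_samples = []
for it in range(600):
    # f | g, τ : Gaussian with precision τ + 1 and mean τ g / (τ + 1)
    v = 1.0 / (tau + 1.0)
    f = rng.normal(tau * v * g, np.sqrt(v))
    # τ | f, g : Gamma(a0 + M/2, rate b0 + ||g − f||²/2)  (conjugacy)
    rate = b0 + 0.5 * np.sum((g - f) ** 2)
    tau = rng.gamma(a0 + M / 2.0, 1.0 / rate)
    if it >= 100:                                # discard burn-in
        tau_samples.append(tau)

tau_hat = float(np.mean(tau_samples))
```

Both conditionals are available in closed form precisely because of the conjugate choices, which is the point made on this slide.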


Variational Bayesian Approximation
- Approximate p(f, θ|g) by q(f, θ|g) = q₁(f|g) q₂(θ|g) and then continue computations.
- Criterion: KL(q(f, θ|g) : p(f, θ|g))
  KL(q : p) = ∬ q ln(q/p) = ∬ q₁q₂ ln(q₁q₂/p)
            = ∫ q₁ ln q₁ + ∫ q₂ ln q₂ − ∬ q ln p
            = −H(q₁) − H(q₂) − ⟨ln p⟩_q
- Iterative algorithm: q₁ −→ q₂ −→ q₁ −→ q₂ −→ ...
  q₁(f) ∝ exp[⟨ln p(g, f, θ; M)⟩_{q₂(θ)}]
  q₂(θ) ∝ exp[⟨ln p(g, f, θ; M)⟩_{q₁(f)}]
  p(f, θ|g) −→ VBA −→ q₁(f) −→ f̂;  q₂(θ) −→ θ̂
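For the same toy model used above (g = f + ε, f ~ N(0, I), noise precision τ ~ Gamma(a0, b0); all settings illustrative), the two VBA updates have closed forms and can be iterated to a fixed point:

```python
import numpy as np

# VBA for the toy model: q1(f) = Π_i N(μ_i, v), q2(τ) = Gamma(a, b).
# Each update is the exponentiated expected log-joint, in closed form.
rng = np.random.default_rng(7)
M = 500
g = rng.normal(0.0, 1.0, M) + rng.normal(0.0, 1.0, M)   # τ_true = 1

a0, b0 = 1.0, 1.0                      # assumed hyperparameters
tau_mean = 1.0                         # ⟨τ⟩ under the current q2
for _ in range(50):
    v = 1.0 / (tau_mean + 1.0)         # q1 variance of each f_i
    mu = tau_mean * v * g              # q1 mean
    a = a0 + M / 2.0                   # q2 Gamma shape
    # q2 Gamma rate uses ⟨||g − f||²⟩ = Σ(g_i − μ_i)² + M·v under q1:
    b = b0 + 0.5 * (np.sum((g - mu) ** 2) + M * v)
    tau_mean = a / b
```

Note the M·v term: unlike JMAP, the update propagates the posterior uncertainty of f into the estimate of τ, which is VBA's main advantage over alternating point estimation.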


JMAP
  (f̂, θ̂) = arg max_{(f,θ)} {p(f, θ|g)}
Alternate optimization:
  θ̂ = arg max_θ {p(f̂, θ|g)}
  f̂ = arg max_f {p(f, θ̂|g)}
Main drawbacks:
- Convergence
- Uncertainties in each step are not accounted for


Marginalization
- Marginal MAP: θ̂ = arg max_θ {p(θ|g)} where
  p(θ|g) = ∫ p(f, θ|g) df ∝ p(g|θ) p(θ)
  and then f̂ = arg max_f {p(f|θ̂, g)} or
  Posterior Mean: f̂ = ∫ f p(f|θ̂, g) df
- Main drawback: needs the expression of the likelihood
  p(g|θ) = ∫ p(g|f, θ₁) p(f|θ₂) df
  which is not always analytically available −→ EM, SEM and GEM algorithms


EM and GEM algorithms
- f as hidden variable, g as incomplete data, (g, f) as complete data
  ln p(g|θ): incomplete data log-likelihood
  ln p(g, f|θ): complete data log-likelihood
- Iterative algorithm:
  E-step: Q(θ, θ̂⁽ᵏ⁾) = E_{p(f|g,θ̂⁽ᵏ⁾)} {ln p(g, f|θ)}
  M-step: θ̂⁽ᵏ⁾ = arg max_θ {Q(θ, θ̂⁽ᵏ⁻¹⁾)}
- GEM (Bayesian) algorithm:
  E-step: Q(θ, θ̂⁽ᵏ⁾) = E_{p(f|g,θ̂⁽ᵏ⁾)} {ln p(g, f|θ) + ln p(θ)}
  M-step: θ̂⁽ᵏ⁾ = arg max_θ {Q(θ, θ̂⁽ᵏ⁻¹⁾)}
  p(f, θ|g) −→ EM, GEM −→ θ̂ −→ p(f|θ̂, g) −→ f̂
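EM for the same toy model used earlier (g = f + ε with f hidden and θ = τ the noise precision; all settings illustrative): the E-step needs only the Gaussian posterior moments of f, and the M-step has a closed form:

```python
import numpy as np

# EM sketch: E-step computes ⟨||g − f||²⟩ under p(f | g, τ̂);
# M-step maximizes the expected complete-data log-likelihood in τ.
rng = np.random.default_rng(8)
M = 500
g = rng.normal(0.0, 1.0, M) + rng.normal(0.0, 1.0, M)   # τ_true = 1

tau = 10.0                              # deliberately poor initialization
for _ in range(100):
    v = 1.0 / (tau + 1.0)               # E-step: posterior variance of each f_i
    mu = tau * v * g                    #         posterior mean
    expected_res = np.sum((g - mu) ** 2) + M * v
    tau = M / expected_res              # M-step: closed-form argmax of Q(τ, τ̂)
```

The fixed point of this iteration is the maximum of the marginal likelihood p(g|τ), which is exactly what the marginalization slide asked for without requiring p(g|τ) in closed form at every step.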


JMAP, Marginalization, VBA
- JMAP: p(f, θ|g) −→ optimization −→ f̂, θ̂
- Marginalization: p(f, θ|g) −→ marginalize over f −→ p(θ|g) −→ θ̂ −→ p(f|θ̂, g) −→ f̂
- Variational Bayesian Approximation: p(f, θ|g) −→ VBA −→ q₁(f) −→ f̂;  q₂(θ) −→ θ̂


Variational Bayesian Approximation
- Approximate p(f, θ|g) by q(f, θ) = q₁(f) q₂(θ) and then use them for any inferences on f and θ respectively.
- Criterion: KL(q(f, θ|g) : p(f, θ|g))
  KL(q : p) = ∬ q ln(q/p) = ∬ q₁q₂ ln(q₁q₂/p)
- Iterative algorithm: q₁ −→ q₂ −→ q₁ −→ q₂ −→ ...
  q̂₁(f) ∝ exp[⟨ln p(g, f, θ; M)⟩_{q̂₂(θ)}]
  q̂₂(θ) ∝ exp[⟨ln p(g, f, θ; M)⟩_{q̂₁(f)}]
  p(f, θ|g) −→ VBA −→ q₁(f) −→ f̂;  q₂(θ) −→ θ̂


Variational Bayesian Approximation
  p(f, θ|g; M) = p(f, θ, g|M) / p(g|M)
  p(f, θ, g|M) = p(g|f, θ, M) p(f|θ, M) p(θ|M)
  KL(q : p) = ∬ q(f, θ) ln [q(f, θ) / p(f, θ|g; M)] df dθ ≥ 0
Free energy:
  F(q) = ∬ q(f, θ) ln [p(g, f, θ|M) / q(f, θ)] df dθ
Evidence of the model M:
  ln p(g|M) = F(q) + KL(q : p)
Minimizing KL(q : p) = maximizing F(q)

VBA: Separable Approximation
  ln p(g|M) = F(q) + KL(q : p),  q(f, θ) = q₁(f) q₂(θ)
  Minimizing KL(q : p) = maximizing F(q):
  (q̂₁, q̂₂) = arg min_{(q₁,q₂)} {KL(q₁q₂ : p)} = arg max_{(q₁,q₂)} {F(q₁q₂)}
  KL(q₁q₂ : p) is convex with respect to q₁ when q₂ is fixed, and vice versa:
  q̂₁ = arg min_{q₁} {KL(q₁q̂₂ : p)} = arg max_{q₁} {F(q₁q̂₂)}
  q̂₂ = arg min_{q₂} {KL(q̂₁q₂ : p)} = arg max_{q₂} {F(q̂₁q₂)}
  q̂₁(f) ∝ exp[⟨ln p(g, f, θ; M)⟩_{q̂₂(θ)}]
  q̂₂(θ) ∝ exp[⟨ln p(g, f, θ; M)⟩_{q̂₁(f)}]


VBA: Choice of the families of laws q₁ and q₂
- Case 1: degenerate (Dirac) laws −→ Joint MAP:
  q̂₁(f|f̃) = δ(f − f̃),  q̂₂(θ|θ̃) = δ(θ − θ̃)
  −→ f̃ = arg max_f {p(f, θ̃|g; M)},  θ̃ = arg max_θ {p(f̃, θ|g; M)}
- Case 2: free q₁, Dirac q₂ −→ EM:
  q̂₁(f) ∝ p(f|θ̃, g),  Q(θ, θ̃) = ⟨ln p(f, θ|g; M)⟩_{q₁(f|θ̃)}
  q̂₂(θ|θ̃) = δ(θ − θ̃) −→ θ̃ = arg max_θ {Q(θ, θ̃)}
- Appropriate choice for inverse problems:
  q̂₁(f) ∝ p(f|θ̃, g; M),  q̂₂(θ) ∝ p(θ|f̃, g; M)
  −→ accounts for the uncertainties of θ̂ in f̂ and vice versa.
  Exponential families, conjugate priors.


JMAP, EM and VBA
JMAP, alternate optimization algorithm:
  θ̂⁽⁰⁾ −→ θ̃ −→ f̃ = arg max_f {p(f, θ̃|g)} −→ f̂
  θ̂ ←− θ̃ = arg max_θ {p(f̃, θ|g)} ←− f̃
EM:
  θ̂⁽⁰⁾ −→ θ̃ −→ q₁(f) = p(f|θ̃, g), Q(θ, θ̃) = ⟨ln p(f, θ|g)⟩_{q₁(f)} −→ q₁(f) −→ f̂
  θ̂ ←− θ̃ = arg max_θ {Q(θ, θ̃)} ←− q₁(f)
VBA:
  θ̂⁽⁰⁾ −→ q₂(θ) −→ q₁(f) ∝ exp[⟨ln p(f, θ|g)⟩_{q₂(θ)}] −→ q₁(f) −→ f̂
  θ̂ ←− q₂(θ) ∝ exp[⟨ln p(f, θ|g)⟩_{q₁(f)}] ←− q₁(f)


Computed Tomography: Discretization
  gφ(r) = ∫_L f(x, y) dl
  f(x, y) = Σ_j f_j b_j(x, y),  with b_j(x, y) = 1 if (x, y) ∈ pixel j, 0 else
  g_i = Σ_{j=1}^{N} H_{ij} f_j + ε_i
  g = Hf + ε
[Figure: a ray crossing the pixel grid (f₁, ..., f_N); H_{ij} is the contribution of pixel j to ray i.]

Case study: Reconstruction from 2 projections
  g₁(x) = ∫ f(x, y) dy,  g₂(y) = ∫ f(x, y) dx
  Very ill-posed inverse problem:
  f(x, y) = g₁(x) g₂(y) Ω(x, y)
  where Ω(x, y) is a copula: ∫ Ω(x, y) dx = 1, ∫ Ω(x, y) dy = 1

Simple example
[Numerical example: a small image observed through its row and column sums g = Hf.]
- Hf = g −→ f̂ = H⁻¹g if H is invertible
- Here H is rank deficient: rank(H) = 3
- The problem has an infinite number of solutions
- How to find all those solutions?
- Which one is the good one? This needs prior information.
- To find a unique solution, one needs either more data or prior information.
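The rank deficiency is easy to exhibit numerically for a 2×2 image observed through its row and column sums (a hypothetical instance of the same structure, not necessarily the exact numbers of the slide):

```python
import numpy as np

# A 2×2 image f = (f1, f2, f3, f4) observed through row and column sums.
H = np.array([
    [1.0, 1.0, 0.0, 0.0],   # row 1 sum
    [0.0, 0.0, 1.0, 1.0],   # row 2 sum
    [1.0, 0.0, 1.0, 0.0],   # column 1 sum
    [0.0, 1.0, 0.0, 1.0],   # column 2 sum
])
f_true = np.array([1.0, 2.0, 3.0, 4.0])
g = H @ f_true

# rank(H) = 3 < 4: the direction [1, -1, -1, 1] is invisible to the data,
# so f_true + t·[1, -1, -1, 1] fits g exactly for every t.
null_dir = np.array([1.0, -1.0, -1.0, 1.0])
f_alt = f_true + 2.0 * null_dir
# The pseudo-inverse picks the minimum-norm solution among infinitely many.
f_mn = np.linalg.pinv(H) @ g
```

Any prior (positivity, smoothness, a Gauss-Markov-Potts model, ...) acts precisely by selecting one point along this unobserved direction.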


Application in CT: Reconstruction from 2 projections
  g = Hf + ε
  g|f: iid Gaussian, g|f ∼ N(Hf, σε² I)
  f|z: Gaussian or Gauss-Markov;  z: iid or Potts
  c(r) = 1 − δ(z(r) − z(r′)): binary contour variable
  p(f, z, θ|g) ∝ p(g|f, θ₁) p(f|z, θ₂) p(z|θ₃) p(θ)


Proposed algorithms
  p(f, z, θ|g) ∝ p(g|f, θ₁) p(f|z, θ₂) p(z|θ₃) p(θ)
- MCMC based general scheme:
  f̂ ∼ p(f|ẑ, θ̂, g) −→ ẑ ∼ p(z|f̂, θ̂, g) −→ θ̂ ∼ p(θ|f̂, ẑ, g)
  Iterative algorithm:
  - Estimate f using p(f|ẑ, θ̂, g) ∝ p(g|f, θ) p(f|ẑ, θ̂): needs optimization of a quadratic criterion.
  - Estimate z using p(z|f̂, θ̂, g) ∝ p(g|f̂, ẑ, θ̂) p(z): needs sampling of a Potts Markov field.
  - Estimate θ using p(θ|f̂, ẑ, g) ∝ p(g|f̂, σε² I) p(f̂|ẑ, (m_k, v_k)) p(θ): conjugate priors −→ analytical expressions.
- Variational Bayesian Approximation:
  - Approximate p(f, z, θ|g) by q₁(f) q₂(z) q₃(θ)


Results
[Figure: reconstructions from 2 projections compared: Original, Backprojection, Filtered BP, LS, Gauss-Markov+pos, GM+Line process (with estimated c), GM+Label process (with estimated z and c).]

Application in Acoustic source localization (Ning Chu et al.)
[Figure: source power maps over (x, y): Source powers, Beamforming powers, Bayesian MAP inversion, Proposed VBA inversion.]

Application in Acoustic source localization (Ning Chu et al.)
[Figure: wind tunnel setup and imaging results: Beamforming, DAMAS, Proposed VBA inference.]

Inverse problems and Bayesian inference,

Scube seminar, L2S, CentraleSupelec, March 27, 2015 59/77

Application in Microwave imaging

g(ω) = ∫ f(r) exp[−j(ω·r)] dr + ε(ω)

g(u, v) = ∫∫ f(x, y) exp[−j(ux + vy)] dx dy + ε(u, v)

g = Hf + ε

[Figure: f(x, y) | g(u, v) | f̂ IFT | f̂ Proposed method]
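Since this forward model is (to a first approximation) a Fourier transform, it can be sketched with a discrete FFT; the object, noise level and aperture below are illustrative assumptions, not the experimental setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Discretized forward model: on a regular grid, H is the 2-D DFT
n = 32
f = np.zeros((n, n))
f[10:20, 12:22] = 1.0                           # simple object f(x, y)

g = np.fft.fft2(f)                              # g = H f
g = g + rng.normal(0, 0.1, g.shape) + 1j * rng.normal(0, 0.1, g.shape)  # + eps

# Naive inversion: f_hat = inverse FT of the (noisy) data
f_ift = np.real(np.fft.ifft2(g))
err_full = np.abs(f_ift - f).max()

# With only partial frequency coverage (limited aperture), direct IFT degrades;
# this is where regularized / Bayesian inversion pays off.
mask = np.zeros((n, n), dtype=bool)
mask[:8, :8] = mask[:8, -8:] = mask[-8:, :8] = mask[-8:, -8:] = True  # low freqs only
f_partial = np.real(np.fft.ifft2(np.where(mask, g, 0)))
err_partial = np.abs(f_partial - f).max()
```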

Microwave Imaging for Breast Cancer detection (L. Gharsalli et al.)

[Figure: measurement configuration — source S and receivers around the object; domains D1 (background), D2 (skin), D3 (breast, D = 12.2 cm), D4 (tumor, 2 cm); source at 10 cm, depth 7.5 cm]

Microwave Imaging for Breast Cancer detection

CSI: Contrast Source Inversion
VBA: Variational Bayesian Approach
MGI: Independent Gaussian mixture
MGM: Gauss-Markov mixture

Images fusion and joint segmentation (with O. Féron)

  gi(r) = fi(r) + εi(r)
  p(fi(r)|z(r) = k) = N(mik, σik²)
  p(f|z) = ∏i p(fi|z)

[Figure: g1, g2 −→ f̂1, f̂2, ẑ]
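A generative sketch of this fusion model (with assumed class means and noise levels): a single hidden label image z drives both observed channels, and even a crude joint nearest-class rule shows why fusing the channels helps segmentation.

```python
import numpy as np

rng = np.random.default_rng(2)

# One common hidden segmentation z(r), shared by both channels
n, K = 64, 2
z = np.zeros((n, n), dtype=int)
z[16:48, 16:48] = 1

m = np.array([[0.0, 1.0],        # m_ik: per-channel, per-class means (assumed)
              [2.0, -1.0]])
f = np.stack([m[i][z] + rng.normal(0, 0.1, z.shape) for i in range(2)])
g = f + rng.normal(0, 0.2, f.shape)            # g_i = f_i + eps_i

# Crude joint estimate of z: nearest class, distances summed over both channels
d = sum((g[i][None, :, :] - m[i][:, None, None]) ** 2 for i in range(2))
z_hat = d.argmin(axis=0)
```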

Data fusion in medical imaging (with O. Féron)

  gi(r) = fi(r) + εi(r)
  p(fi(r)|z(r) = k) = N(mik, σik²)
  p(f|z) = ∏i p(fi|z)

[Figure: g1, g2 −→ f̂1, f̂2, ẑ]

Joint segmentation of hyper-spectral images (with N. Bali & A. Mohammadpour)

  gi(r) = fi(r) + εi(r)
  p(fi(r)|z(r) = k) = N(mik, σik²), k = 1, ..., K
  p(f|z) = ∏i p(fi|z)
  mik follow a Markovian model along the index i

Segmentation of a video sequence of images (with P. Brault)

  gi(r) = fi(r) + εi(r)
  p(fi(r)|zi(r) = k) = N(mik, σik²), k = 1, ..., K
  p(f|z) = ∏i p(fi|zi)
  zi(r) follow a Markovian model along the index i

Image separation in Satellite imaging (with H. Snoussi & M. Ichir)

  gi(r) = Σj=1..N Aij fj(r) + εi(r)
  p(fj(r)|zj(r) = k) = N(mjk, σjk²)
  p(Aij) = N(A0ij, σ0ij²)

[Figure: f, g −→ f̂, ẑ]
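The instantaneous mixing model can be illustrated on synthetic 1-D sources; the mixing matrix A and noise level here are assumed for the sketch, and unmixing is done by least squares with A known (the Bayesian treatment instead puts the Gaussian prior N(A0ij, σ0ij²) on A and infers it jointly with the sources and labels).

```python
import numpy as np

rng = np.random.default_rng(3)

# Two synthetic sources f_j(r) on a 1-D grid
t = np.linspace(0.0, 1.0, 500)
f = np.stack([np.sin(2 * np.pi * 5 * t),              # smooth source
              np.sign(np.sin(2 * np.pi * 3 * t))])    # piecewise source

A = np.array([[1.0, 0.6],                             # assumed mixing matrix
              [0.4, 1.0]])
g = A @ f + rng.normal(0, 0.05, (2, t.size))          # g_i = sum_j A_ij f_j + eps_i

# With A known, least-squares unmixing recovers the sources
f_hat = np.linalg.solve(A, g)
```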

Conclusions

I Inverse problems arise in many science and engineering applications
I Deterministic algorithms: optimization of a two-term criterion (data fit + penalty/regularization term)
I Probabilistic: Bayesian approach
I Hierarchical prior models with hidden variables are very powerful tools for the Bayesian approach to inverse problems
I Gauss-Markov-Potts models for images, incorporating hidden regions and contours
I Main Bayesian computation tools: JMAP, MCMC and VBA
I Applications in different imaging systems (X ray CT, Microwaves, PET, Ultrasound, Optical Diffusion Tomography (ODT), Acoustic source localization, ...)

Current Projects:
I Efficient implementation in 2D and 3D cases

Thanks to:

Present PhD students:
I L. Gharsali (Microwave imaging for Cancer detection)
I M. Dumitru (Multivariate time series analysis for biological signals)
I S. AlAli (Diffraction imaging for geophysical applications)

Freshly Graduated PhD students:
I C. Cai (2013: Multispectral X ray Tomography)
I N. Chu (2013: Acoustic sources localization)
I Th. Boulay (2013: Non Cooperative Radar Target Recognition)
I R. Prenon (2013: Proteomics and Mass Spectrometry)
I Sh. Zhu (2012: SAR Imaging)
I D. Fall (2012: Positron Emission Tomography, Non Parametric Bayesian)
I D. Pougaza (2011: Copula and Tomography)
I H. Ayasso (2010: Optical Tomography, Variational Bayes)

Older Graduated PhD students:
I S. Fékih-Salem (2009: 3D X ray Tomography)
I N. Bali (2007: Hyperspectral imaging)
I O. Féron (2006: Microwave imaging)
I F. Humblot (2005: Super-resolution)
I M. Ichir (2005: Image separation in Wavelet domain)
I P. Brault (2005: Video segmentation using Wavelet domain)

Thanks to:

Older Graduated PhD students:
I H. Snoussi (2003: Sources separation)
I Ch. Soussen (2000: Geometrical Tomography)
I G. Montémont (2000: Detectors, Filtering)
I H. Carfantan (1998: Microwave imaging)
I S. Gautier (1996: Gamma ray imaging for NDT)
I M. Nikolova (1994: Piecewise Gaussian models and GNC)
I D. Prémel (1992: Eddy current imaging)

Post-Docs:
I J. Lapuyade (2011: Dimensionality Reduction and multivariate analysis)
I S. Su (2006: Color image separation)
I A. Mohammadpour (2004-2005: HyperSpectral image segmentation)

Colleagues:
I B. Duchêne, A. Joisel & G. Perruson (L2S) (Inverse scattering and Microwave Imaging)
I N. Gac (L2S) (GPU Implementation)
I Th. Rodet (L2S) (Computed Tomography)

Thanks to:

National Collaborators:
I A. Vabre & S. Legoupil (CEA-LIST) (3D X ray Tomography)
I E. Barat (CEA-LIST) (Positron Emission Tomography, Non Parametric Bayesian)
I C. Comtat (SHFJ, CEA) (PET, Spatio-Temporal Brain activity)
I J. Picheral (SSE, Supélec) (Acoustic sources localization)
I D. Blacodon (ONERA) (Acoustic sources separation)
I J. Lagoutte (Thales Air Systems) (Non Cooperative Radar Target Recognition)
I P. Grangeat (LETI, CEA, Grenoble) (Proteomics and Mass Spectrometry)
I F. Lévi (CNRS-INSERM, Hopital Paul Brousse) (Biological rhythms and Chronotherapy of Cancer)

International Collaborators:
I K. Sauer (Notre Dame University, IN, USA) (Computed Tomography, Inverse problems)
I F. Marvasti (Sharif University) (Sparse signal processing)
I M. Aminghafari (Amir Kabir University) (Independent Components Analysis)
I A. Mohammadpour (AKU) (Statistical inference)
I Gh. Yari (Tehran Technological University) (Probability and Analysis)

References 1

I A. Mohammad-Djafari, "Bayesian approach with prior models which enforce sparsity in signal and image processing," EURASIP Journal on Advances in Signal Processing, special issue on Sparse Signal Processing, 2012.
I A. Mohammad-Djafari (Ed.), Problèmes inverses en imagerie et en vision (Vol. 1 et 2), Hermes-Lavoisier, Traité Signal et Image, IC2, 2009.
I A. Mohammad-Djafari (Ed.), Inverse Problems in Vision and 3D Tomography, ISTE, Wiley and Sons, ISBN 9781848211728, December 2009, hardback, 480 pp.
I A. Mohammad-Djafari, "Gauss-Markov-Potts priors for images in computer tomography resulting to joint optimal reconstruction and segmentation," International Journal of Tomography & Statistics, 11(W09):76-92, 2008.
I A. Mohammad-Djafari, "Super-resolution: a short review, a new method based on hidden Markov modeling of HR image and future challenges," The Computer Journal, doi:10.1093/comjnl/bxn005, 2008.
I H. Ayasso and A. Mohammad-Djafari, "Joint NDT image restoration and segmentation using Gauss-Markov-Potts prior models and variational Bayesian computation," IEEE Trans. on Image Processing, 2010.
I H. Ayasso, B. Duchêne and A. Mohammad-Djafari, "Bayesian inversion for optical diffraction tomography," Journal of Modern Optics, 2008.
I N. Bali and A. Mohammad-Djafari, "Bayesian approach with hidden Markov modeling and mean field approximation for hyperspectral data analysis," IEEE Trans. on Image Processing, 17(2):217-225, Feb. 2008.
I H. Snoussi and J. Idier, "Bayesian blind separation of generalized hyperbolic processes in noisy and underdeterminate mixtures," IEEE Trans. on Signal Processing, 2006.

References 2

I O. Féron, B. Duchêne and A. Mohammad-Djafari, "Microwave imaging of inhomogeneous objects made of a finite number of dielectric and conductive materials from experimental data," Inverse Problems, 21(6):95-115, Dec. 2005.
I M. Ichir and A. Mohammad-Djafari, "Hidden Markov models for blind source separation," IEEE Trans. on Signal Processing, 15(7):1887-1899, Jul. 2006.
I F. Humblot and A. Mohammad-Djafari, "Super-resolution using hidden Markov model and Bayesian detection estimation framework," EURASIP Journal on Applied Signal Processing, special issue on Super-Resolution Imaging, article ID 36971, 16 pages, 2006.
I O. Féron and A. Mohammad-Djafari, "Image fusion and joint segmentation using an MCMC algorithm," Journal of Electronic Imaging, 14(2), paper no. 023014, Apr. 2005.
I H. Snoussi and A. Mohammad-Djafari, "Fast joint separation and segmentation of mixed images," Journal of Electronic Imaging, 13(2):349-361, Apr. 2004.
I A. Mohammad-Djafari, J.-F. Giovannelli, G. Demoment and J. Idier, "Regularization, maximum entropy and probabilistic methods in mass spectrometry data processing problems," Int. Journal of Mass Spectrometry, 215(1-3):175-193, Apr. 2002.
I H. Snoussi and A. Mohammad-Djafari, "Estimation of structured Gaussian mixtures: the inverse EM algorithm," IEEE Trans. on Signal Processing, 55(7):3185-3191, Jul. 2007.
I N. Bali and A. Mohammad-Djafari, "A variational Bayesian algorithm for BSS problem with hidden Gauss-Markov models for the sources," in Independent Component Analysis and Signal Separation (ICA 2007), M.E. Davies, Ch.J. James, S.A. Abdallah, M.D. Plumbley (Eds.), 137-144, Springer (LNCS 4666), 2007.

References 3

I N. Bali and A. Mohammad-Djafari, "Hierarchical Markovian models for joint classification, segmentation and data reduction of hyperspectral images," ESANN 2006, September 4-8, Belgium, 2006.
I M. Ichir and A. Mohammad-Djafari, "Hidden Markov models for wavelet-based blind source separation," IEEE Trans. on Image Processing, 15(7):1887-1899, Jul. 2005.
I S. Moussaoui, C. Carteret, D. Brie and A. Mohammad-Djafari, "Bayesian analysis of spectral mixture data using Markov Chain Monte Carlo methods sampling," Chemometrics and Intelligent Laboratory Systems, 81(2):137-148, 2005.
I H. Snoussi and A. Mohammad-Djafari, "Fast joint separation and segmentation of mixed images," Journal of Electronic Imaging, 13(2):349-361, Apr. 2004.
I H. Snoussi and A. Mohammad-Djafari, "Bayesian unsupervised learning for source separation with mixture of Gaussians prior," Journal of VLSI Signal Processing Systems, 37(2/3):263-279, Jun./Jul. 2004.
I F. Su and A. Mohammad-Djafari, "An hierarchical Markov random field model for Bayesian blind image separation," International Congress on Image and Signal Processing (CISP 2008), Sanya, Hainan, China, May 27-30, 2008.
I N. Chu, J. Picheral and A. Mohammad-Djafari, "A robust super-resolution approach with sparsity constraint for near-field wideband acoustic imaging," IEEE International Symposium on Signal Processing and Information Technology, pp. 286-289, Bilbao, Spain, Dec. 14-17, 2011.

Questions, Discussions, Open mathematical problems

I Sparsity representation, low rank matrix decomposition
  I Sparsity and positivity or other constraints
  I Group sparsity
  I Algorithmic and implementation issues for great dimensional applications (Big Data)
  I Joint estimation of Dictionary and coefficients
I Optimization of the KL divergence for Variational Bayesian Approximation
  I Convergence of alternate optimization
  I Other possible algorithms
I Properties of the obtained approximation
  I Do the moments of the q's correspond to the moments of p?
  I How about other statistics: entropy, ...?
  I Other divergence or distance measures?
I Using Sparsity as a prior in Inverse Problems
I Applications in Biological data and signal analysis, Medical imaging, Non Destructive Testing (NDT), Industrial Imaging, Communication, Geophysical imaging, Radio Astronomy, ...

Blind deconvolution (1)

g = h ∗ f + ε = Hf + ε = F h + ε

Simple priors:
  p(f, h|g, θ) ∝ p(g|f, h, θ1) p(f|θ2) p(h|θ3)
  Objective: infer (f, h)
  JMAP: (f̂, ĥ) = arg max(f,h) {p(f, h|g, θ)}
  VBA: approximate p(f, h|g) by q1(f) q2(h)

Unsupervised:
  p(f, h, θ|g) ∝ p(g|f, h, θ1) p(f|θ2) p(h|θ3) p(θ)
  Objective: infer (f, h, θ)
  JMAP: (f̂, ĥ, θ̂) = arg max(f,h,θ) {p(f, h, θ|g)}
  VBA: approximate p(f, h, θ|g) by q1(f) q2(h) q3(θ)

[Graphical models: hyperparameters θ1, θ2, θ3 (with hyper-priors γ0, α0 in the unsupervised case) −→ f, h −→ g]
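The simple-prior JMAP formulation can be sketched as an alternating minimization: with Gaussian (quadratic) priors, each conditional maximization in f or h is a Wiener-type filter in the Fourier domain. Everything below (signal, blur, regularization weight) is an illustrative assumption, not the hierarchical method of these slides.

```python
import numpy as np

rng = np.random.default_rng(4)

# Circular convolution model g = h * f + eps on a 1-D grid
n = 128
x = np.arange(n)
f_true = (np.abs(x - n / 2) < 10).astype(float)       # boxcar signal
dist = np.minimum(x, n - x)                           # circular distance to index 0
h_true = np.exp(-0.5 * (dist / 2.0) ** 2)
h_true /= h_true.sum()                                # normalized blur kernel
g = np.real(np.fft.ifft(np.fft.fft(h_true) * np.fft.fft(f_true)))
g += rng.normal(0, 0.01, n)

# Alternating JMAP-style updates (quadratic priors -> Wiener filters)
lam = 1e-2                                            # regularization weight
h = np.exp(-0.5 * (dist / 4.0) ** 2)
h /= h.sum()                                          # deliberately wrong width
f = g.copy()
G = np.fft.fft(g)
for it in range(20):
    H = np.fft.fft(h)
    f = np.real(np.fft.ifft(np.conj(H) * G / (np.abs(H) ** 2 + lam)))  # f | h
    F = np.fft.fft(f)
    h = np.real(np.fft.ifft(np.conj(F) * G / (np.abs(F) ** 2 + lam)))  # h | f
    h = np.clip(h, 0.0, None)
    h /= h.sum()                                      # positivity + normalization
```

Plain alternating updates like this are known to drift toward trivial solutions (h → δ, f → g), which is exactly the motivation for the sparsity-enforcing and hierarchical priors on f and h discussed on the next slide.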

Blind deconvolution (2)

g = h ∗ f + ε = Hf + ε = F h + ε

Simple priors:
  p(f, h|g, θ) ∝ p(g|f, h, θ1) p(f|θ2) p(h|θ3)

Unsupervised:
  p(f, h, θ|g) ∝ p(g|f, h, θ1) p(f|θ2) p(h|θ3) p(θ)

Sparsity enforcing prior for f:
  p(f, zf, h, θ|g) ∝ p(g|f, h, θ1) p(f|zf) p(zf|θ2) p(h|θ3) p(θ)

Sparsity enforcing prior for h:
  p(f, h, zh, θ|g) ∝ p(g|f, h, θ1) p(f|θ2) p(h|zh) p(zh|θ3) p(θ)

Hierarchical models for both f and h:
  p(f, zf, h, zh, θ|g) ∝ p(g|f, h, θ1) p(f|zf) p(zf|θ2) p(h|zh) p(zh|θ3) p(θ)

JMAP: (f̂, ẑf, ĥ, ẑh, θ̂) = arg max(f,zf,h,zh,θ) {p(f, zf, h, zh, θ|g)}
VBA: approximate p(f, zf, h, zh, θ|g) by q1(f) q2(zf) q3(h) q4(zh) q5(θ)

[Graphical model: hyper-priors γf0, γh0, α0 −→ zf, zh, θ −→ f, h −→ g]