Bayesian inference framework for inverse problems - Ali Mohammad-Djafari


Bayesian inference framework for inverse problems
Ali Mohammad-Djafari
Laboratoire des Signaux et Systèmes (L2S), UMR 8506 CNRS-CentraleSupélec-Univ Paris-Sud
SUPELEC, 91192 Gif-sur-Yvette, France
http://lss.centralesupelec.fr
Email: [email protected]
http://djafari.free.fr
http://publicationslist.org/djafari

A. Mohammad-Djafari, Bayesian inference framework for inverse problems, ICEEE 2015, Sharif Univ. Tehran, Iran

Contents

1. Inverse problems examples
   - Instrumentation
   - Imaging systems to see outside of a body
   - Imaging systems to see inside of a body
   - Other imaging systems (Acoustics, Radar, SAR, ...)
2. Analytical/Algebraic methods
3. Deterministic regularization methods and their limitations
4. Bayesian approach
5. Two main steps: priors and computational aspects
6. Case studies: instrumentation, X-ray Computed Tomography, microwave imaging, acoustic source localization, ultrasound imaging, satellite image restoration, etc.


Inverse Problems examples

Example 1: Instrumentation: measuring the temperature with a thermometer: Deconvolution
- f(t): input of the instrument
- g(t): output of the instrument

Example 2: Seeing outside of a body: making an image using a camera, a microscope or a telescope: Image restoration
- f(x, y): real scene
- g(x, y): observed image

Example 3: Seeing inside of a body: Computed Tomography using X rays, US, microwaves, etc.: Image reconstruction
- f(x, y): a section of a real 3D body f(x, y, z)
- gφ(r): a line of the observed radiograph gφ(r, z)

Example 4: Seeing differently: MRI, Radar, SAR, Infrared, etc.: Fourier synthesis
- f(x, y): a section of a body or a scene
- g(u, v): partial data in the Fourier domain


Measuring the variation of temperature with a thermometer
- f(t): variation of temperature over time
- g(t): variation of the length of the liquid in the thermometer
- Forward model: convolution
      g(t) = ∫ f(t′) h(t − t′) dt′ + ε(t)
  where h(t) is the impulse response of the measurement system
- Inverse problem: deconvolution
  Given the forward model H (impulse response h(t)) and a set of data g(ti), i = 1, ..., M, find f(t)
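The forward model above is easy to simulate numerically. The sketch below assumes a hypothetical exponential impulse response h(t) (not one used in the talk) and a hypothetical temperature signal f(t), just to show the structure g = h * f + ε:

```python
import numpy as np

# Simulate the thermometer forward model g = h * f + epsilon.
# h(t) is an assumed exponential impulse response (hypothetical choice).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 200)
f = 1.5 + np.sin(0.5 * t)                    # hypothetical temperature variation f(t)
h = np.exp(-np.arange(50) * (t[1] - t[0]))   # assumed impulse response of the instrument
h /= h.sum()                                 # unit gain, so g tracks the scale of f
g = np.convolve(f, h, mode="same") + 0.01 * rng.standard_normal(t.size)
```

The deconvolution problem is then: given g and h, recover f, which is the ill-posed step the rest of the talk addresses.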


Measuring the variation of temperature with a thermometer

Forward model: convolution
    g(t) = ∫ f(t′) h(t − t′) dt′ + ε(t)

    f(t) −→ [ Thermometer h(t) ] −→ g(t)

Inversion (deconvolution): given g(t), recover f(t)

Seeing outside of a body: making an image with a camera, a microscope or a telescope
- f(x, y): real scene
- g(x, y): observed image
- Forward model: convolution
      g(x, y) = ∫∫ f(x′, y′) h(x − x′, y − y′) dx′ dy′ + ε(x, y)
  where h(x, y) is the Point Spread Function (PSF) of the imaging system
- Inverse problem: image restoration
  Given the forward model H (PSF h(x, y)) and a set of data g(xi, yi), i = 1, ..., M, find f(x, y)


Making an image with an unfocused camera

Forward model: 2D convolution
    g(x, y) = ∫∫ f(x′, y′) h(x − x′, y − y′) dx′ dy′ + ε(x, y)

    f(x, y) −→ [ h(x, y) ] −→ (+) ←− ε(x, y) −→ g(x, y)

Inversion: image deconvolution or restoration


Seeing inside of a body: Computed Tomography
- f(x, y): a section of a real 3D body f(x, y, z)
- gφ(r): a line of the observed radiograph gφ(r, z)
- Forward model: line integrals, or the Radon Transform
      gφ(r) = ∫_{L_{r,φ}} f(x, y) dl + εφ(r)
            = ∫∫ f(x, y) δ(r − x cos φ − y sin φ) dx dy + εφ(r)
- Inverse problem: image reconstruction
  Given the forward model H (Radon Transform) and a set of data gφi(r), i = 1, ..., M, find f(x, y)
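A minimal numerical illustration of this forward model (a sketch under the simplifying assumption of only two angles, φ = 0° and φ = 90°, where the line integrals of a discretized image reduce to row and column sums):

```python
import numpy as np

# Discretized Radon forward model at phi = 0 and phi = 90 degrees:
# the line integrals are simply sums along rows and columns.
f = np.zeros((8, 8))
f[2:5, 3:6] = 1.0          # hypothetical object: a 3x3 square
g_0 = f.sum(axis=1)        # projection at phi = 0: integrate along x
g_90 = f.sum(axis=0)       # projection at phi = 90: integrate along y
# Both projections conserve the total "mass" of the object.
```

The reconstruction problem asks for f given many such projections; with only a few angles it is severely underdetermined, as the two-projection case study later in the talk shows.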


2D and 3D Computed Tomography

2D:  gφ(r) = ∫_{L_{r,φ}} f(x, y) dl
3D:  gφ(r1, r2) = ∫_{L_{r1,r2,φ}} f(x, y, z) dl

Forward problem: f(x, y) or f(x, y, z) −→ gφ(r) or gφ(r1, r2)
Inverse problem: gφ(r) or gφ(r1, r2) −→ f(x, y) or f(x, y, z)


Computed Tomography: Radon Transform

Forward:  f(x, y) −→ g(r, φ)
Inverse:  f(x, y) ←− g(r, φ)

Microwave or ultrasound imaging

Measurements: the wave diffracted by the object, g(ri)
Unknown quantity: f(r) = k0²(n²(r) − 1)
Intermediate quantity: φ(r)

    g(ri) = ∫∫_D Gm(ri, r′) φ(r′) f(r′) dr′,  ri ∈ S
    φ(r) = φ0(r) + ∫∫_D Go(r, r′) φ(r′) f(r′) dr′,  r ∈ D

Born approximation (φ(r′) ≈ φ0(r′)):

    g(ri) = ∫∫_D Gm(ri, r′) φ0(r′) f(r′) dr′,  ri ∈ S

Discretization, with F = diag(f):

    g = Gm F φ,  φ = φ0 + Go F φ
    −→ g = H(f)  with  H(f) = Gm F (I − Go F)⁻¹ φ0


Fourier Synthesis in X-ray Tomography

    g(r, φ) = ∫∫ f(x, y) δ(r − x cos φ − y sin φ) dx dy
    G(Ω, φ) = ∫ g(r, φ) exp[−jΩr] dr
    F(u, v) = ∫∫ f(x, y) exp[−j(ux + vy)] dx dy

    F(u, v) = G(Ω, φ)  for  u = Ω cos φ and v = Ω sin φ

(the projection-slice theorem: the 1D Fourier transform of a projection, g(r, φ) −FT→ G(Ω, φ), is a central slice of the 2D Fourier transform F(u, v) of f(x, y))

Fourier Synthesis in X-ray tomography

    G(u, v) = ∫∫ f(x, y) exp[−j(ux + vy)] dx dy

Forward problem: given f(x, y), compute G(u, v)
Inverse problem: given G(u, v) on a set of radial lines in the Fourier domain, estimate f(x, y)


Fourier Synthesis in Diffraction tomography


Fourier Synthesis in Diffraction tomography

    G(u, v) = ∫∫ f(x, y) exp[−j(ux + vy)] dx dy

Forward problem: given f(x, y), compute G(u, v)
Inverse problem: given G(u, v) on those semicircles (arcs in the Fourier domain), estimate f(x, y)


Fourier Synthesis in different imaging systems

    G(u, v) = ∫∫ f(x, y) exp[−j(ux + vy)] dx dy

X-ray Tomography | Diffraction | Eddy currents | SAR & Radar

Forward problem: given f(x, y), compute G(u, v)
Inverse problem: given G(u, v) on the corresponding algebraic lines, circles or curves, estimate f(x, y)


Linear inverse problems
- Deconvolution:
      g(t) = ∫ f(τ) h(t − τ) dτ
- Image restoration:
      g(x, y) = ∫∫ f(x′, y′) h(x − x′, y − y′) dx′ dy′
- Image reconstruction in X-ray CT:
      g(r, φ) = ∫∫ f(x, y) δ(r − x cos φ − y sin φ) dx dy
- Fourier synthesis:
      g(u, v) = ∫∫ f(x, y) exp[−j(ux + vy)] dx dy
- Unified linear relation:
      g(s) = ∫ f(r) h(s, r) dr


Linear Inverse Problems

    g(si) = ∫ h(si, r) f(r) dr + ε(si),   i = 1, ..., M

- f(r) is assumed to be well approximated by
      f(r) ≈ Σ_{j=1}^{N} fj φj(r)
  with {φj(r)} a basis or any other set of known functions; then
      g(si) = gi ≈ Σ_{j=1}^{N} fj ∫ h(si, r) φj(r) dr,   i = 1, ..., M
      g = Hf + ε   with   Hij = ∫ h(si, r) φj(r) dr
- H has huge dimensions: 1D: 10³ × 10³, 2D: 10⁶ × 10⁶, 3D: 10⁹ × 10⁹
- Due to the ill-posedness of inverse problems, least-squares (LS) methods
      f̂ = arg min_f J(f)  with  J(f) = ||g − Hf||²
  do not give satisfactory results. Hence the need for regularization methods:
      J(f) = ||g − Hf||² + λ||f||²
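The regularized criterion has the closed-form minimizer f̂ = (H′H + λI)⁻¹H′g, which can be verified on a small underdetermined system (a sketch; all sizes and values below are hypothetical):

```python
import numpy as np

# Tikhonov-regularized least squares on an underdetermined system (M < N),
# where plain LS has no unique solution.
rng = np.random.default_rng(1)
M, N = 20, 30
H = rng.standard_normal((M, N))
f_true = np.zeros(N)
f_true[[3, 12, 25]] = [2.0, -1.0, 1.5]
g = H @ f_true + 0.01 * rng.standard_normal(M)
lam = 0.1
# Minimizer of ||g - H f||^2 + lam ||f||^2: solve (H'H + lam I) f = H' g.
f_hat = np.linalg.solve(H.T @ H + lam * np.eye(N), H.T @ g)
```

Even though H′H is singular here (rank at most M = 20), the added λI makes the system invertible, which is the whole point of the penalty term.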


Regularization theory

Inverse problems = ill-posed problems −→ need for prior information

Functional space (Tikhonov):
    g = H(f) + ε,   J(f) = ||g − H(f)||₂² + λ||Df||₂²
Finite-dimensional space (Phillips & Twomey):
    g = Hf + ε,   J(f) = ||g − Hf||² + λ||f||²
- Minimum norm LS (MNLS): J(f) = ||g − H(f)||² + λ||f||²
- Classical regularization: J(f) = ||g − H(f)||² + λ||Df||²
- More general regularization:
      J(f) = Q(g − H(f)) + λΩ(Df)
  or
      J(f) = Δ₁(g, H(f)) + λΔ₂(Df, f₀)

Limitations:
- Errors are implicitly assumed white and Gaussian
- Limited prior information on the solution
- Lack of tools for determining the hyperparameters


Inversion: probabilistic methods

Taking account of errors and uncertainties −→ probability theory
- Maximum Likelihood (ML)
- Minimum Inaccuracy (MI)
- Probability Distribution Matching (PDM)
- Maximum Entropy (ME) and Information Theory (IT)
- Bayesian Inference (Bayes)

Advantages:
- Explicit account of the errors and noise
- A large class of priors via explicit or implicit modeling
- A coherent approach to combining the information content of the data and the priors

Limitations:
- Practical implementation and cost of computation


Bayesian estimation approach

M:  g = Hf + ε
- Observation model M + hypothesis on the noise ε −→ p(g|f; M) = pε(g − Hf)
- A priori information: p(f|M)
- Bayes: p(f|g; M) = p(g|f; M) p(f|M) / p(g|M)

Link with regularization:
- Maximum A Posteriori (MAP):
      f̂ = arg max_f p(f|g) = arg max_f p(g|f) p(f)
        = arg min_f { J(f) = − ln p(g|f) − ln p(f) }
- Regularization:
      f̂ = arg min_f { J(f) = Q(g, Hf) + λΩ(f) }
  with Q(g, Hf) = − ln p(g|f) and λΩ(f) = − ln p(f)


Case of linear models and Gaussian priors

    g = Hf + ε
- Prior knowledge on the noise: ε ∼ N(0, σε² I)
      −→ p(g|f) ∝ exp[ −(1/(2σε²)) ||g − Hf||² ]
- Prior knowledge on f: f ∼ N(0, σf² (D′D)⁻¹)
      −→ p(f) ∝ exp[ −(1/(2σf²)) ||Df||² ]
- A posteriori:
      p(f|g) ∝ exp[ −(1/(2σε²)) ||g − Hf||² − (1/(2σf²)) ||Df||² ]
- MAP: f̂ = arg max_f p(f|g) = arg min_f J(f)
  with J(f) = ||g − Hf||² + λ||Df||²,  λ = σε²/σf²
- Advantage: full characterization of the solution:
      p(f|g) = N(f̂, P̂)  with  f̂ = P̂H′g,  P̂ = (H′H + λD′D)⁻¹
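This characterization is cheap to compute explicitly on small problems. The sketch below uses hypothetical sizes and takes D = I for simplicity; it returns both the estimate and the componentwise posterior standard deviations that a bare point estimate would not provide:

```python
import numpy as np

# Gaussian posterior p(f|g) = N(f_hat, P_hat) with D = I (an assumption):
#   P_hat = (H'H + lam I)^{-1},  f_hat = P_hat H' g.
rng = np.random.default_rng(2)
H = rng.standard_normal((15, 10))
g = rng.standard_normal(15)
lam = 0.5
P_hat = np.linalg.inv(H.T @ H + lam * np.eye(10))
f_hat = P_hat @ (H.T @ g)
post_std = np.sqrt(np.diag(P_hat))   # componentwise posterior uncertainty (up to the noise variance scale)
```

Large entries of post_std flag components of f that the data constrain poorly, which is exactly the extra information the Bayesian solution carries over the regularized point estimate.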


Regularization versus Bayesian


Main advantages of the Bayesian approach
- MAP = regularization
- Posterior mean? Marginal MAP?
- More information in the posterior law than only its mode or its mean
- Tools for estimating hyperparameters
- Tools for model selection
- More specific and specialized priors, in particular through hidden variables and hierarchical models
- More computational tools:
    - Expectation-Maximization (EM) for computing the maximum-likelihood parameters
    - MCMC for posterior exploration
    - Variational Bayes for approximate analytical computation of the posterior marginals
    - ...


Bayesian estimation: simple priors
- Linear model: g = Hf + ε
- Gaussian case:
      p(g|f, θ1) = N(Hf, θ1 I),  p(f|θ2) = N(0, θ2 I)
      −→ p(f|g, θ) = N(f̂, P̂)  with  P̂ = (H′H + λI)⁻¹,  f̂ = P̂H′g,  λ = θ1/θ2
      f̂ = arg min_f J(f)  with  J(f) = ||g − Hf||₂² + λ||f||₂²
- Generalized Gaussian prior & MAP:
      f̂ = arg min_f J(f)  with  J(f) = ||g − Hf||₂² + λ||f||_β^β
- Double exponential (β = 1):
      f̂ = arg min_f J(f)  with  J(f) = ||g − Hf||₂² + λ||f||₁
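For β = 1 the criterion is convex but non-smooth. One standard solver for it is iterative soft thresholding (ISTA), a sketch of which follows; this is a choice made here for brevity, the slides do not prescribe a particular algorithm, and all problem sizes below are hypothetical:

```python
import numpy as np

def ista(H, g, lam, n_iter=500):
    """Minimize ||g - H f||^2 + lam * ||f||_1 by iterative soft thresholding."""
    L = np.linalg.norm(H, 2) ** 2        # grad of ||g - Hf||^2 is 2 H'(Hf - g), Lipschitz 2L
    step = 1.0 / (2.0 * L)
    f = np.zeros(H.shape[1])
    for _ in range(n_iter):
        u = f - step * 2.0 * (H.T @ (H @ f - g))                  # gradient step on the data term
        f = np.sign(u) * np.maximum(np.abs(u) - lam * step, 0.0)  # soft threshold (prox of l1)
    return f

rng = np.random.default_rng(3)
H = rng.standard_normal((40, 80))
f_true = np.zeros(80)
f_true[[5, 30, 60]] = [1.0, -2.0, 1.5]   # hypothetical sparse signal
g = H @ f_true                           # noiseless data for the sketch
f_hat = ista(H, g, lam=0.05)
```

The soft-threshold step is what makes the double-exponential (Laplace) prior sparsity enforcing: small coefficients are set exactly to zero.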


Full (unsupervised) Bayesian approach

M:  g = Hf + ε
- Forward & error model −→ p(g|f, θ1; M)
- Prior model −→ p(f|θ2; M)
- Hyperparameters θ = (θ1, θ2) −→ p(θ|M)
- Bayes: p(f, θ|g; M) = p(g|f, θ1; M) p(f|θ2; M) p(θ|M) / p(g|M)
- Joint MAP: (f̂, θ̂) = arg max_{(f,θ)} p(f, θ|g; M)
- Marginalization:
      p(f|g; M) = ∫ p(f, θ|g; M) dθ
      p(θ|g; M) = ∫ p(f, θ|g; M) df
- Posterior means:
      f̂ = ∫∫ f p(f, θ|g; M) dθ df
      θ̂ = ∫∫ θ p(f, θ|g; M) df dθ
- Evidence of the model:
      p(g|M) = ∫∫ p(g|f, θ; M) p(f|θ; M) p(θ|M) df dθ


Hierarchical models

Simple case (1 layer): g = Hf + ε
    p(f|g, θ) ∝ p(g|f, θ1) p(f|θ2)
Objective: infer f
    MAP: f̂ = arg max_f p(f|g, θ)
    Posterior mean (PM): f̂ = ∫ f p(f|g, θ) df

Unsupervised case (2 layers), with hyperpriors p(θ1|α0) and p(θ2|β0):
    p(f, θ|g) ∝ p(g|f, θ1) p(f|θ2) p(θ)
Objective: infer (f, θ)
    JMAP: (f̂, θ̂) = arg max_{(f,θ)} p(f, θ|g)
    Marginalization: p(θ|g) = ∫ p(f, θ|g) df
    VBA: approximate p(f, θ|g) by q1(f) q2(θ)


Two main steps in the Bayesian approach
- Prior modeling:
    - Separable: Gaussian, Gamma; sparsity enforcing: Generalized Gaussian, mixture of Gaussians, mixture of Gammas, ...
    - Markovian: Gauss-Markov, GGM, ...
    - Markovian with hidden variables (contours, region labels)
- Choice of the estimator and computational aspects:
    - MAP, posterior mean, marginal MAP
    - MAP needs optimization algorithms
    - Posterior mean needs integration methods
    - Marginal MAP and hyperparameter estimation need both integration and optimization
    - Approximations:
        - Gaussian approximation (Laplace)
        - Numerical exploration: MCMC
        - Variational Bayes (separable approximation)


Different prior models for signals and images: separable
- Gaussian: p(fj) ∝ exp[−α|fj|²]
- Generalized Gaussian: p(fj) ∝ exp[−α|fj|^p], 1 ≤ p ≤ 2
- Gamma: p(fj) ∝ fj^α exp[−βfj]
- Beta: p(fj) ∝ fj^α (1 − fj)^β


Sparsity enforcing prior models
- Sparse signals: direct sparsity
- Sparse signals: sparsity in a transform domain


Sparsity enforcing prior models
- Simple heavy-tailed models:
    - Generalized Gaussian, double exponential
    - Symmetric Weibull, symmetric Rayleigh
    - Student-t, Cauchy
    - Generalized hyperbolic
    - Elastic net
- Hierarchical mixture models:
    - Mixture of Gaussians, Bernoulli-Gaussian
    - Mixture of Gammas, Bernoulli-Gamma
    - Mixture of Dirichlet, Bernoulli-Multinomial


Which images am I looking for?


Which image am I looking for?

Gauss-Markov | Generalized GM | Piecewise Gaussian | Mixture of GM


Different prior models for signals and images: separable
- Simple: Gaussian, Gamma, Generalized Gaussian
      p(f) ∝ exp[ Σj φ(fj) ]
- Simple Markovian models: Gauss-Markov, Generalized Gauss-Markov
      p(f) ∝ exp[ Σj Σ_{i∈N(j)} φ(fj − fi) ]
- Hierarchical models with hidden variables: Bernoulli-Gaussian, Gaussian-Gamma
      p(f|z) = Πj p(fj|zj)  and  p(z) = Πj p(zj)
  with different choices for p(fj|zj) and p(zj)


Hierarchical models and hidden variables
- Student-t model:
      St(f|ν) ∝ exp[ −((ν + 1)/2) log(1 + f²/ν) ]
- Infinite Scaled Gaussian Mixture (ISGM) equivalence:
      St(f|ν) ∝ ∫₀^∞ N(f|0, 1/z) G(z|α, β) dz,  with α = β = ν/2
- Joint model:
      p(f|z) = Πj p(fj|zj) = Πj N(fj|0, 1/zj) ∝ exp[ −(1/2) Σj zj fj² ]
      p(z|α, β) = Πj G(zj|α, β) ∝ Πj zj^(α−1) exp[−βzj] ∝ exp[ Σj ((α − 1) ln zj − βzj) ]
      p(f, z|α, β) ∝ exp[ −(1/2) Σj zj fj² + Σj ((α − 1) ln zj − βzj) ]

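The ISGM identity can be checked by Monte Carlo (a sketch; ν = 5 is an arbitrary choice): sampling z ~ Gamma(ν/2, rate ν/2) and then f | z ~ N(0, 1/z) should reproduce the Student-t variance ν/(ν − 2):

```python
import numpy as np

# Monte Carlo check of St(f|nu) as an infinite scale mixture of Gaussians:
#   z ~ Gamma(nu/2, rate = nu/2),  f | z ~ N(0, 1/z)  =>  f ~ Student-t(nu).
rng = np.random.default_rng(4)
nu = 5.0
z = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=200_000)  # numpy's scale = 1/rate
f = rng.normal(0.0, 1.0 / np.sqrt(z))
# For nu > 2, Var(f) = nu / (nu - 2) = 5/3 here, which the sample variance approaches.
```

This augmentation is what makes the Student-t prior computationally convenient: conditionally on z the model is Gaussian, so f-updates stay in closed form.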

Gauss-Markov-Potts prior models for images

Variables: f(r), z(r), and the contour variable c(r) = 1 − δ(z(r) − z(r′))

    p(f(r)|z(r) = k, mk, vk) = N(mk, vk)
    p(f(r)) = Σk P(z(r) = k) N(mk, vk)   (mixture of Gaussians)
- Separable iid hidden variables: p(z) = Πr p(z(r))
- Markovian hidden variables: p(z) Potts-Markov:
      p(z(r)|z(r′), r′ ∈ V(r)) ∝ exp[ γ Σ_{r′∈V(r)} δ(z(r) − z(r′)) ]
      p(z) ∝ exp[ γ Σ_{r∈R} Σ_{r′∈V(r)} δ(z(r) − z(r′)) ]


Four different cases

To each pixel of the image are associated two variables, f(r) and z(r):
- f|z Gaussian iid, z iid: mixture of Gaussians
- f|z Gauss-Markov, z iid: mixture of Gauss-Markov
- f|z Gaussian iid, z Potts-Markov: mixture of independent Gaussians (MIG with hidden Potts)
- f|z Markov, z Potts-Markov: mixture of Gauss-Markov (MGM with hidden Potts)

Hierarchical models (3 layers)

    p(f, z, θ|g) ∝ p(g|f, θ1) p(f|z, θ2) p(z|θ3) p(θ)
    p(θ) = p(θ1|α0) p(θ2|β0) p(θ3|γ0)

Objective: infer (f, z, θ)
    JMAP: (f̂, ẑ, θ̂) = arg max_{(f,z,θ)} p(f, z, θ|g)
    Marginalization:
        p(z, θ|g) = ∫ p(f, z, θ|g) df,   p(θ|g) = ∫ p(z, θ|g) dz
     or p(f|g) = ∫∫ p(f, z, θ|g) dz dθ
    VBA: approximate p(f, z, θ|g) by q1(f) q2(z) q3(θ)


JMAP, Marginalization, VBA
- JMAP: optimize the joint posterior:
      p(f, θ|g) −→ f̂, θ̂
- Marginalization: marginalize the joint posterior over f:
      p(f, θ|g) −→ p(θ|g) −→ θ̂ −→ p(f|θ̂, g) −→ f̂
- Variational Bayesian Approximation:
      p(f, θ|g) −→ q1(f) q2(θ) −→ f̂, θ̂

VBA: choice of the family of laws q1 and q2
- Case 1: q1(f|f̃) = δ(f − f̃), q2(θ|θ̃) = δ(θ − θ̃) −→ Joint MAP:
      f̃ = arg max_f p(f, θ̃|g; M)
      θ̃ = arg max_θ p(f̃, θ|g; M)
- Case 2: q1(f) ∝ p(f|θ̃, g), q2(θ|θ̃) = δ(θ − θ̃) −→ EM:
      Q(θ, θ̃) = ⟨ln p(f, θ|g; M)⟩_{q1(f|θ̃)}
      θ̃ = arg max_θ Q(θ, θ̃)
- Appropriate choice for inverse problems:
      q1(f) ∝ p(f|θ̃, g; M),   q2(θ) ∝ p(θ|f̃, g; M)
  accounts for the uncertainties of θ̂ when estimating f̂ and vice versa.
  Exponential families, conjugate priors.


JMAP, EM and VBA

JMAP alternate optimization algorithm: starting from θ(0), iterate
    f̃ = arg max_f p(f, θ̃|g)
    θ̃ = arg max_θ p(f̃, θ|g)
and read off f̂ and θ̂ at convergence.

EM: starting from θ(0), iterate
    q1(f) = p(f|θ̃, g)
    Q(θ, θ̃) = ⟨ln p(f, θ|g)⟩_{q1(f)}
    θ̃ = arg max_θ Q(θ, θ̃)

VBA: starting from an initial q2(θ), iterate
    q1(f) ∝ exp[ ⟨ln p(f, θ|g)⟩_{q2(θ)} ]
    q2(θ) ∝ exp[ ⟨ln p(f, θ|g)⟩_{q1(f)} ]

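In the Gaussian case the JMAP loop reduces to alternating a ridge update of f with closed-form variance updates. The sketch below uses hypothetical sizes and flat hyperpriors (the slides allow a general p(θ)):

```python
import numpy as np

# JMAP alternate optimization for g = Hf + eps, eps ~ N(0, theta1 I),
# f ~ N(0, theta2 I): alternate the f-update (ridge) and the variance updates.
rng = np.random.default_rng(5)
M, N = 50, 30
H = rng.standard_normal((M, N))
f_true = rng.standard_normal(N)
g = H @ f_true + 0.1 * rng.standard_normal(M)

theta1, theta2 = 1.0, 1.0                    # initial noise / signal variances
for _ in range(20):
    lam = theta1 / theta2
    f = np.linalg.solve(H.T @ H + lam * np.eye(N), H.T @ g)
    theta1 = np.sum((g - H @ f) ** 2) / M    # noise variance update
    theta2 = np.sum(f ** 2) / N              # signal variance update
```

Each half-step increases the joint posterior, so the loop is a coordinate ascent on p(f, θ|g); EM and VBA replace the point updates by expectations, as sketched above.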

Computed Tomography: discretization

    g(r, φ) = ∫_L f(x, y) dl
    f(x, y) = Σj fj bj(x, y),  with  bj(x, y) = 1 if (x, y) ∈ pixel j, 0 else
    gi = Σ_{j=1}^{N} Hij fj + εi
    g = Hf + ε

Case study: reconstruction from 2 projections

    g1(x) = ∫ f(x, y) dy,   g2(y) = ∫ f(x, y) dx

A very ill-posed inverse problem. One family of solutions:
    f(x, y) = g1(x) g2(y) Ω(x, y)
where Ω(x, y) is a copula: ∫ Ω(x, y) dx = 1 and ∫ Ω(x, y) dy = 1.


Simple example
- A 2×2 image f = (f1, f2, f3, f4) is observed through its two row sums and two column sums g = (g1, g2, g3, g4). For the image [1 2; 3 4], the row sums are (3, 7) and the column sums are (4, 6):
      H = [1 1 0 0; 0 0 1 1; 1 0 1 0; 0 1 0 1]
- Hf = g −→ f̂ = H⁻¹g if H is invertible.
- Here H is rank deficient: rank(H) = 3.
- The problem has an infinite number of solutions.
- How to find all those solutions?
- Which one is the good one? This needs prior information.
- To find a unique solution, one needs either more data or prior information.
(The slide also sketches a larger example, f1...f9 observed through six sums g1...g6, which behaves the same way.)
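The 2×2 example can be checked numerically (a sketch): H has rank 3, its null space is spanned by (1, −1, −1, 1), and the pseudo-inverse picks only one of the infinitely many consistent solutions:

```python
import numpy as np

# Row-sum / column-sum operator for a 2x2 image f = (f1, f2, f3, f4).
H = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1]], dtype=float)
f_true = np.array([1.0, 2.0, 3.0, 4.0])
g = H @ f_true                              # (3, 7, 4, 6): row and column sums
null = np.array([1.0, -1.0, -1.0, 1.0])     # H @ null = 0: the invisible direction
f_mn = np.linalg.pinv(H) @ g                # minimum-norm solution among infinitely many
# f_mn + c * null fits g exactly for every c: the data alone cannot decide.
```

Any prior that penalizes the null-space direction (smoothness, positivity, a Potts label field, ...) is what selects a single reconstruction.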


Application in CT: reconstruction from 2 projections

    g = Hf + ε,  g|f ∼ N(Hf, σε² I)  (Gaussian)
    f|z: iid Gaussian or Gauss-Markov;  z: iid or Potts
    binary contour variable: c(r) = 1 − δ(z(r) − z(r′)) ∈ {0, 1}

    p(f, z, θ|g) ∝ p(g|f, θ1) p(f|z, θ2) p(z|θ3) p(θ)


Results

Panels: Original, Backprojection, Filtered BP, LS, Gauss-Markov + positivity, GM + line process, GM + label process, and the estimated hidden fields z and c.

Application in acoustic source localization (Ning Chu et al.)

Panels: wind tunnel setup, Beamforming, DAMAS, and the proposed VBA inference.


Microwave Imaging for Breast Cancer detection (L. Gharsalli et al.)


Microwave Imaging for Breast Cancer detection

Abbreviations: CSI: Contrast Source Inversion; VBA: Variational Bayesian Approach; MGI: independent Gaussian mixture; MGM: Gauss-Markov mixture.


Image fusion and joint segmentation (with O. Féron)

    gi(r) = fi(r) + εi(r)
    p(fi(r)|z(r) = k) = N(mik, σik²)
    p(f|z) = Πi p(fi|z)

    (g1, g2) −→ (f̂1, f̂2, ẑ)


Data fusion in medical imaging (with O. Féron)

    gi(r) = fi(r) + εi(r)
    p(fi(r)|z(r) = k) = N(mik, σik²)
    p(f|z) = Πi p(fi|z)

    (g1, g2) −→ (f̂1, f̂2, ẑ)


Conclusions
- Inverse problems arise in many science and engineering applications
- Deterministic algorithms: optimization of a two-term criterion, a data-fit term plus a penalty (regularization) term
- Probabilistic algorithms: the Bayesian approach
- Hierarchical prior models with hidden variables are very powerful tools for the Bayesian approach to inverse problems
- Gauss-Markov-Potts models for images incorporate hidden regions and contours
- Main Bayesian computation tools: JMAP, MCMC and VBA
- Applications in different imaging systems (X-ray CT, microwaves, PET, ultrasound, Optical Diffusion Tomography (ODT), acoustic source localization, ...)

Current projects:
- Efficient implementation in 2D and 3D cases


Bayesian blind deconvolution

    g = h ∗ f + ε = Hf + ε = Fh + ε

Simple priors:
    p(f, h|g, θ) ∝ p(g|f, h, θ1) p(f|θ2) p(h|θ3)
Objective: infer (f, h)
    JMAP: (f̂, ĥ) = arg max_{(f,h)} p(f, h|g)
    VBA: approximate p(f, h|g) by q1(f) q2(h)

Unsupervised:
    p(f, h, θ|g) ∝ p(g|f, h, θ1) p(f|θ2) p(h|θ3) p(θ)
Objective: infer (f, h, θ)
    JMAP: (f̂, ĥ, θ̂) = arg max_{(f,h,θ)} p(f, h, θ|g)
    VBA: approximate p(f, h, θ|g) by q1(f) q2(h) q3(θ)


Bayesian blind deconvolution with hierarchical models

    g = h ∗ f + ε = Hf + ε = Fh + ε

Simple priors:
    p(f, h|g, θ) ∝ p(g|f, h, θ1) p(f|θ2) p(h|θ3)
Unsupervised:
    p(f, h, θ|g) ∝ p(g|f, h, θ1) p(f|θ2) p(h|θ3) p(θ)
Sparsity enforcing prior for f:
    p(f, zf, h, θ|g) ∝ p(g|f, h, θ1) p(f|zf) p(zf|θ2) p(h|θ3) p(θ)
Sparsity enforcing prior for h:
    p(f, h, zh, θ|g) ∝ p(g|f, h, θ1) p(f|θ2) p(h|zh) p(zh|θ3) p(θ)
Hierarchical models for both f and h:
    p(f, zf, h, zh, θ|g) ∝ p(g|f, h, θ1) p(f|zf) p(zf|θ2) p(h|zh) p(zh|θ3) p(θ)

JMAP: (f̂, ẑf, ĥ, ẑh, θ̂) = arg max p(f, zf, h, zh, θ|g)
VBA: approximate p(f, zf, h, zh, θ|g) by q1(f) q2(zf) q3(h) q4(zh) q5(θ)


Thanks to:

Present PhD students:
- L. Gharsalli (microwave imaging for cancer detection)
- M. Dumitru (multivariate time series analysis for biological signals)
- S. AlAli (diffraction imaging for geophysical applications)
Recently graduated PhD students:
- C. Cai (2013: multispectral X-ray tomography)
- N. Chu (2013: acoustic source localization)
- Th. Boulay (2013: non-cooperative radar target recognition)
- R. Prenon (2013: proteomics and mass spectrometry)
- Sh. Zhu (2012: SAR imaging)
- D. Fall (2012: positron emission tomography, non-parametric Bayesian)
- D. Pougaza (2011: copulas and tomography)
- H. Ayasso (2010: optical tomography, variational Bayes)
Older graduated PhD students:
- S. Fékih-Salem (2009: 3D X-ray tomography)
- N. Bali (2007: hyperspectral imaging)
- O. Féron (2006: microwave imaging)


Thanks to:

Older graduated PhD students:
- H. Snoussi (2003: source separation)
- Ch. Soussen (2000: geometrical tomography)
- G. Montémont (2000: detectors, filtering)
- H. Carfantan (1998: microwave imaging)
- S. Gautier (1996: gamma-ray imaging for NDT)
- M. Nikolova (1994: piecewise Gaussian models and GNC)
- D. Prémel (1992: eddy current imaging)
Post-docs:
- J. Lapuyade (2011: dimensionality reduction and multivariate analysis)
- S. Su (2006: color image separation)
- A. Mohammadpour (2004-2005: hyperspectral image segmentation)
Colleagues:
- B. Duchêne, A. Joisel & G. Perruson (L2S) (inverse scattering and microwave imaging)
- N. Gac (L2S) (GPU implementation)
- Th. Rodet (L2S) (computed tomography)

Thanks to:

National collaborators:
- A. Vabre & S. Legoupil (CEA-LIST) (3D X-ray tomography)
- E. Barat (CEA-LIST) (positron emission tomography, non-parametric Bayesian)
- C. Comtat (SHFJ, CEA) (PET, spatio-temporal brain activity)
- J. Picheral (SSE, Supélec) (acoustic source localization)
- D. Blacodon (ONERA) (acoustic source separation)
- J. Lagoutte (Thales Air Systems) (non-cooperative radar target recognition)
- P. Grangeat (LETI, CEA, Grenoble) (proteomics and mass spectrometry)
- F. Lévi (CNRS-INSERM, Hôpital Paul Brousse) (biological rhythms and chronotherapy of cancer)
International collaborators:
- K. Sauer (Notre Dame University, IN, USA) (computed tomography, inverse problems)
- F. Marvasti (Sharif University) (sparse signal processing)
- M. Aminghafari (Amir Kabir University) (independent component analysis)


References 1
- A. Mohammad-Djafari, "Bayesian approach with prior models which enforce sparsity in signal and image processing," EURASIP Journal on Advances in Signal Processing, special issue on Sparse Signal Processing, 2012.
- A. Mohammad-Djafari (Ed.), Problèmes inverses en imagerie et en vision (Vol. 1 et 2), Hermes-Lavoisier, Traité Signal et Image, IC2, 2009.
- A. Mohammad-Djafari (Ed.), Inverse Problems in Vision and 3D Tomography, ISTE, Wiley and Sons, ISBN 9781848211728, December 2009, hardback, 480 pp.
- A. Mohammad-Djafari, "Gauss-Markov-Potts priors for images in computer tomography resulting to joint optimal reconstruction and segmentation," International Journal of Tomography & Statistics 11:W09, 76-92, 2008.
- A. Mohammad-Djafari, "Super-resolution: a short review, a new method based on hidden Markov modeling of HR image and future challenges," The Computer Journal, doi:10.1093/comjnl/bxn005, 2008.
- H. Ayasso and A. Mohammad-Djafari, "Joint NDT image restoration and segmentation using Gauss-Markov-Potts prior models and variational Bayesian computation," IEEE Trans. on Image Processing, TIP-04815-2009.R2, 2010.
- H. Ayasso, B. Duchêne and A. Mohammad-Djafari, "Bayesian inversion for optical diffraction tomography," Journal of Modern Optics, 2008.
- N. Bali and A. Mohammad-Djafari, "Bayesian approach with hidden Markov modeling and mean field approximation for hyperspectral data analysis," IEEE Trans. on Image Processing 17(2):217-225, Feb. 2008.
- H. Snoussi and J. Idier, "Bayesian blind separation of generalized hyperbolic processes in noisy and underdeterminate mixtures," IEEE Trans. on Signal Processing, 2006.


References 2
- O. Féron, B. Duchêne and A. Mohammad-Djafari, "Microwave imaging of inhomogeneous objects made of a finite number of dielectric and conductive materials from experimental data," Inverse Problems 21(6):95-115, Dec. 2005.
- M. Ichir and A. Mohammad-Djafari, "Hidden Markov models for blind source separation," IEEE Trans. on Signal Processing 15(7):1887-1899, Jul. 2006.
- F. Humblot and A. Mohammad-Djafari, "Super-resolution using hidden Markov model and Bayesian detection estimation framework," EURASIP Journal on Applied Signal Processing, special issue on Super-Resolution Imaging: Analysis, Algorithms, and Applications, ID 36971, 16 pages, 2006.
- O. Féron and A. Mohammad-Djafari, "Image fusion and joint segmentation using an MCMC algorithm," Journal of Electronic Imaging 14(2): paper no. 023014, Apr. 2005.
- H. Snoussi and A. Mohammad-Djafari, "Fast joint separation and segmentation of mixed images," Journal of Electronic Imaging 13(2):349-361, Apr. 2004.
- A. Mohammad-Djafari, J.-F. Giovannelli, G. Demoment and J. Idier, "Regularization, maximum entropy and probabilistic methods in mass spectrometry data processing problems," Int. Journal of Mass Spectrometry 215(1-3):175-193, Apr. 2002.
- H. Snoussi and A. Mohammad-Djafari, "Estimation of structured Gaussian mixtures: the inverse EM algorithm," IEEE Trans. on Signal Processing 55(7):3185-3191, Jul. 2007.
- N. Bali and A. Mohammad-Djafari, "A variational Bayesian algorithm for BSS problem with hidden Gauss-Markov models for the sources," in Independent Component Analysis and Signal Separation (ICA 2007), M.E. Davies, Ch.J. James, S.A. Abdallah, M.D. Plumbley (Eds.), 137-144, Springer (LNCS 4666), 2007.


References 3
- N. Bali and A. Mohammad-Djafari, "Hierarchical Markovian models for joint classification, segmentation and data reduction of hyperspectral images," ESANN 2006, September 4-8, Belgium, 2006.
- M. Ichir and A. Mohammad-Djafari, "Hidden Markov models for wavelet-based blind source separation," IEEE Trans. on Image Processing 15(7):1887-1899, Jul. 2005.
- S. Moussaoui, C. Carteret, D. Brie and A. Mohammad-Djafari, "Bayesian analysis of spectral mixture data using Markov Chain Monte Carlo methods sampling," Chemometrics and Intelligent Laboratory Systems 81(2):137-148, 2005.
- H. Snoussi and A. Mohammad-Djafari, "Fast joint separation and segmentation of mixed images," Journal of Electronic Imaging 13(2):349-361, Apr. 2004.
- H. Snoussi and A. Mohammad-Djafari, "Bayesian unsupervised learning for source separation with mixture of Gaussians prior," Journal of VLSI Signal Processing Systems 37(2/3):263-279, Jun./Jul. 2004.
- F. Su and A. Mohammad-Djafari, "An hierarchical Markov random field model for Bayesian blind image separation," International Congress on Image and Signal Processing (CISP 2008), 27-30 May 2008, Sanya, Hainan, China.
- N. Chu, J. Picheral and A. Mohammad-Djafari, "A robust super-resolution approach with sparsity constraint for near-field wideband acoustic imaging," IEEE International Symposium on Signal Processing and Information Technology, pp. 286-289, Bilbao, Spain, Dec. 14-17, 2011.
