
On the distribution of cloud sizes as the limit of the renormalization

Florin Spineanu and Madalina Vlad
National Institute of Laser, Plasma and Radiation Physics, Bucharest, Romania


Statistical properties are useful information. The distribution function of some fluctuating variable exhibits a substantial change: intrinsic scale vs. power law.

1. Cloud size distribution: supposed to be Poisson, it appears to be a power law.

2. Renormalization Group transformations of the distribution functions. The Poisson distribution is a fixed point (unfortunately, it is wrong).

3. Processes that generate clustering. The Weierstrass function.

4. Continuous Time Random Walk (CTRW).


Cloud distribution

(Figures: Plank 2a, Plank 3a, Plank 5a.)


The distribution of intensity of convective events. Possible change of analysis: from a cross-over to a smooth variation.

Figure 1: Benner, Curry and Corral.


1 Introduction

Observation has shown a cross-over in the power-law distribution relating the logarithm of the cloud size to the logarithm of the number of observed clouds of that size. The exponent changes from approximately −0.89 to −3. Other results have been reported. The distributions of height, horizontal area and duration are lognormal according to Lopez. Craig and Cohen: the distribution is Poisson.


Figure 2: Cahalan and Joseph.

Is it so important whether there is a smooth universal function instead of a cross-over with two straight lines? Yes, it is: air has memory. First: what should we investigate? The size of the clouds? The cover area? The intensity of the convective event?


Choose: the intensity of the convective event, since from it one can infer the size, the height, etc. The Poisson distribution implies (or is derived under the assumption of) independence between isolated events. But the statistical analysis of series of observations shows that the distribution of sizes is NOT Poisson. After normalization and rescaling, the full set of data collapses onto a single curve. That curve is a universal function (universal: irrespective of details and of the physical nature of the processes behind the stochastic series; the result holds for several systems). The curve shows two visibly different regimes: short scale and long scale. But there is no cross-over; it is smooth. Poisson processes are still relevant to the physical random system. But the enlargement of the set of stochastic observations (such as to suppress the localised effects, known to be biased), followed by a particular, local normalization, must be done in order to get a robust concentration of points on the "universal function". Then: why does the curve differ from a Poisson distribution? The answer is clear: due to correlations. A Poisson process would be totally uncorrelated: its events are independent. If the series is temporal, time-correlation means memory. Since the dynamics is not directly exploited (we do not know how), and the analysis is done essentially with statistical methods, we never fully understand the results of the statistical analysis. Some results are strange: if we had a massive rain a long time ago and we are afraid that a massive one is imminent, it is not so: if a long time has passed since the last massive event, then we should expect the next one very far in the future. The effect of the memory is the clustering of events that belong to distinct classes.


2 Clouds are like earthquakes, aren't they?

The work of Corral. Take as the object of statistical analysis the magnitude M of something (a convective event). The Gutenberg-Richter law: in a certain region, the number of events over a long period of time decreases exponentially with the magnitude,

N(Mc) ∼ 10^(−b Mc)

where

N(Mc) ≡ number of events of magnitude ≥ Mc,  b = const ≈ 1

Definitions.


The rate of the events' occurrence:

r(t, Mc) ≡ number of events of magnitude M > Mc per unit time interval, at time t

Mean rate of occurrence:

R(Mc) ≡ (1/T) ∫_0^T r(t, Mc) dt = N(Mc)/T = R0 × 10^(−b Mc)

Here T is the total time of observation. This average is taken over the whole interval of observation T, during which the number of events of magnitude M ≥ Mc is N(Mc). The total number of events of all possible magnitudes M > 0 (i.e. with the threshold Mc = 0) leads to the mean rate of occurrence of events, R0.

Another expression of the Gutenberg-Richter law is in terms of probabilities: the event magnitudes follow an exponential distribution,

Prob[M ≥ Mc] = N(Mc) / N(total number of events of any magnitude) ∼ exp[−(ln 10) b Mc]

The dissipation of energy during an event is an exponential function of the magnitude,

Energy ∼ exp(M), hence Mc ∼ ln(Energy)


and replacing this in the expression of the probability, we obtain an algebraic decay of the probability in terms of the energy:

Prob[M ≥ Mc] ∼ exp[ln(Energy)]^(−b ln 10) ∼ 1/(Energy)^(b ln 10)

This is a power-law distribution and indicates that there is NO intrinsic scale in the problem, or: there is scale invariance. The events (convection, earthquakes, etc.) have NO characteristic size. Now, let us ask: what happens after a large event which, we know, does NOT exhaust the free-energy reservoir?

The Omori law. After a strong event (mainshock) the rate of events with magnitude higher than a threshold, M ≥ Mc, increases abruptly and then decays in time as a power law:

r(t, Mc) = r0(Mc) / (1 + t/c)^p

where the time is measured from the mainshock and r0(Mc) is the rate immediately after the mainshock, at t = 0. The constant c differs according to the region. The exponent is p ≈ 1. There is actually no precise separation between mainshock and aftershock, and the Omori law is valid even if the "mainshock" is considered one of the aftershocks. Again: the power-law decay suggests that there is no characteristic time scale in the relaxation.

A different objective in the statistical analysis: the time interval between consecutive events. The method adopted is to consider all events of all magnitudes (Bak) and of all regions (Corrál). Consider events that have a magnitude M ≥ Mc, occurring at times ti, i = 1, ..., N(Mc). We introduce the "waiting time", or "recurrence time", τi, as

τi = ti − ti−1

One can calculate the mean recurrence time τ̄(Mc), which is actually the inverse of the rate of occurrence:

τ̄(Mc) = 1/R(Mc)
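Looking back at the Gutenberg-Richter step above: the chain "exponential distribution of magnitudes → power-law distribution of energies" can be checked numerically. A minimal sketch (a synthetic catalog with b = 1 assumed; illustrative only, not the observational analysis):

    import numpy as np

    # Synthetic Gutenberg-Richter catalog (assumption: b = 1).
    # Prob[M >= Mc] ~ 10^(-b*Mc)  <=>  M exponential with rate b*ln(10).
    rng = np.random.default_rng(0)
    b = 1.0
    M = rng.exponential(scale=1.0 / (b * np.log(10)), size=200_000)

    # Energy ~ exp(M), so Prob[E >= E0] should decay as E0^(-b*ln(10)).
    E = np.exp(M)
    E0 = np.logspace(0.2, 1.5, 8)
    survival = np.array([(E >= e).mean() for e in E0])

    # Log-log slope of the survival function; expect about -b*ln(10) = -2.30.
    slope = np.polyfit(np.log(E0), np.log(survival), 1)[0]
    print(f"measured exponent {slope:.2f}, predicted {-b * np.log(10):.2f}")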


For a full characterization of the stochastic variable τ ≡ {τi} we need the probability distribution of the recurrence times:

D(τ; Mc) = Prob[τ < recurrence time < τ + dτ] / dτ

Choosing a particular region, the function D(τ; Mc) has been retrieved from observational data, and plots have been made of the dependence of D(τ; Mc) on the recurrence time τ, taking the magnitude Mc as a parameter. Then it is normalized:

1. the probability is divided by the mean rate, D(τ; Mc)/R(Mc) (i.e. multiplied by the average recurrence time);

2. the recurrence time τ is normalized by dividing by the average recurrence time: τ R(Mc).


Then it is found that all curves are almost superposed, indicating the universal relationship

D(τ; Mc) = R(Mc) f[R(Mc) τ]
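A minimal numerical sketch of this rescaling (purely illustrative, not the observational analysis: here each "threshold" is modeled as a plain Poisson process with its own rate R(Mc), for which the collapse onto f(θ) = exp(−θ) is exact by construction):

    import numpy as np

    rng = np.random.default_rng(1)

    # Recurrence times for three "thresholds", i.e. three mean rates R(Mc).
    for R in (0.1, 1.0, 10.0):
        tau = rng.exponential(scale=1.0 / R, size=50_000)
        # Rescale: estimate D(tau; Mc)/R(Mc) versus theta = R(Mc)*tau.
        f_est, edges = np.histogram(R * tau, bins=np.linspace(0, 5, 26),
                                    density=True)
        # For a Poisson process the rescaled density is f(theta) = exp(-theta),
        # independent of R: all three curves superpose.
        print(f"R={R:5.1f}  f(0.1)~{f_est[0]:.2f}  f(1.1)~{f_est[5]:.2f}")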

2.1 Nonhomogeneous Poisson process

The rate of occurrence of the events (for example the seismic rate) is r(t, Mc). We find the average R(Mc) ≡ ⟨r(t, Mc)⟩. We define the stochastic variable recurrence time (the time between two successive events), τ. We introduce the distribution function of the recurrence time, D(τ; Mc).


Normalizing D by R, and τ by R^(−1), we find a universal law:

D(τ; Mc) [R(Mc)]^(−1) = f[R(Mc) τ],  f ≡ scaling function

The process that generates the events is NOT stationary. Then the scaling function f will NOT be the same along the whole interval of observation. What should be done to return to a universal scaling? We have to remove this dependence on local states by finding a transformation that extracts the nonhomogeneity.

For example, on a small window of observation, the independence of the generation of one event from the generation of other events leads to the Poisson distribution. It contains a parameter, the rate, denoted r. Moving the window of observation along the time axis, the parameter of the Poisson distribution will change: this means that the stochastic process is nonhomogeneous. What should be done to return to a universal scaling? We will describe the stochastic process as a mixture of Poisson processes (since each is locally correct), but with a parameter that is itself a stochastic variable. The parameter will be described by a distribution law, given as a density of rates ρ(r). This is the weight for mixing the different Poisson processes, each coming with its own parameter r.


The recurrence time density is a temporal mixture of Poisson processes, with the density

δ(τ; r(Mc)) = r exp(−rτ)

The probability that a density δ(τ; r) contributes to the distribution D is proportional to r:

D(τ; Mc) = (1/μ) ∫_{rmin}^{r0} r δ(τ; r) ρ(r; Mc) dr
         = (1/μ) ∫_{rmin}^{r0} r² exp(−rτ) ρ(r; Mc) dr

Here the density of rates ρ(r; Mc) has been introduced, and μ is a normalization factor:

μ = ⟨r(Mc)⟩ = ∫ r ρ(r; Mc) dr


We know r(t, Mc) from the Omori law, so we find the density of rates from

ρ(r; Mc) = (dr/dt)^(−1)

which gives

ρ(r; Mc) = C / r^(1+1/p)

Then

D(τ; Mc) = (C/μ) ∫_{rmin}^{r0} r^(1−1/p) exp(−rτ) dr
         = (C/μ) [Γ(2 − 1/p; rmin τ) − Γ(2 − 1/p; r0 τ)] / τ^(2−1/p)

where Γ is the upper incomplete Gamma function,

Γ(α, z) = ∫_z^∞ z′^(α−1) exp(−z′) dz′
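This step can be checked numerically; a hedged sketch (arbitrary illustrative parameters p = 0.9, rmin = 0.01, r0 = 10, with the prefactor C/μ set to 1), comparing the direct integral with the incomplete-Gamma expression:

    import numpy as np
    from scipy.integrate import quad
    from scipy.special import gamma, gammaincc

    # Illustrative parameters (assumptions, not fitted to any data).
    p, r_min, r0 = 0.9, 0.01, 10.0
    alpha = 2.0 - 1.0 / p

    def D_direct(tau):
        # Direct evaluation of the rate-mixture integral.
        return quad(lambda r: r**(1 - 1 / p) * np.exp(-r * tau), r_min, r0)[0]

    def D_closed(tau):
        # [Gamma(alpha, r_min*tau) - Gamma(alpha, r0*tau)] / tau^alpha.
        # scipy's gammaincc is the regularized upper incomplete Gamma,
        # so multiply by Gamma(alpha).
        G = lambda z: gammaincc(alpha, z) * gamma(alpha)
        return (G(r_min * tau) - G(r0 * tau)) / tau**alpha

    for tau in (0.5, 2.0, 20.0):
        print(f"tau={tau:5.1f}  direct={D_direct(tau):.6f}  closed={D_closed(tau):.6f}")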


Two extreme regimes. For

1/r0 ≪ τ ≪ 1/rmin

we have

D(τ; Mc) ≈ (C/μ) Γ(2 − 1/p) / τ^(2−1/p)

which is a power law. For rmin τ ≫ 1, expanding the Γ function one finds

D(τ; Mc) ≈ (C/μ) rmin^(1−1/p) exp(−rmin τ) / τ

which is an exponential decay.

The universal function f that has been used to describe the connection between the probability distribution of the recurrence times and the normalized recurrence time θ = τ R(Mc) can be written

f(θ) ∼ exp(−θ/a) / [θ^(2−1/p) (1 + θ)^(1/p−1)]

Another expression for f is obtained using the Γ function:

f(θ) = C (1/(a Γ(γ))) (a/θ)^(1−γ) exp(−θ/a)

where

θ ≡ τ R
a ≡ dimensionless scale parameter
⟨θ⟩ = γa, which means a = 1/γ (since ⟨θ⟩ = 1)

and the only fit parameter is γ.
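With C = 1 this is precisely the Gamma probability density with shape γ and scale a, so the normalization and the mean ⟨θ⟩ = γa = 1 can be checked directly; a small sketch using the fitted value γ ≈ 0.22 quoted below:

    import numpy as np
    from scipy.integrate import quad
    from scipy.special import gamma as Gamma

    gam = 0.22        # shape parameter (the fit value quoted in the text)
    a = 1.0 / gam     # scale fixed by requiring <theta> = 1

    def f(theta):
        # f(theta) = 1/(a*Gamma(gam)) * (a/theta)^(1-gam) * exp(-theta/a)
        return (a / theta)**(1 - gam) * np.exp(-theta / a) / (a * Gamma(gam))

    # Split at theta = 1 to handle the integrable singularity at 0 cleanly.
    norm = quad(f, 0, 1)[0] + quad(f, 1, np.inf)[0]
    mean = quad(lambda t: t * f(t), 0, 1)[0] + quad(lambda t: t * f(t), 1, np.inf)[0]
    print(f"normalization = {norm:.4f} (expect 1),  <theta> = {mean:.4f} (expect 1)")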


For the Omori sequence:

1 − γ = 2 − 1/p,  a = R/rmin

A fit for one region leads to γ ≈ 0.22 and a ≈ 3, which gives

p = 1/(1 + γ) ≈ 0.82

Stationarity. The definition of the adequate variable that will allow stationarity to be exhibited in the evolution of a stochastic process.


In general, we expect that the accumulated number of events higher than a threshold Mc, up to the current time t, is

N(Mc, t) = N(Mc, 0) + ∫_0^t [r0(Mc) / (1 + t′/c)^p] dt′

This growth should be linear, but it is NOT. There are, however, periods of time where the growth is linear. This means stationarity.
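A minimal sketch (illustrative Omori parameters, not fitted values) showing that the accumulated count grows sub-linearly rather than linearly:

    import numpy as np

    # Illustrative Omori parameters (assumptions).
    r0, c, p = 100.0, 1.0, 1.2

    def N_acc(t):
        # Closed form of int_0^t r0/(1 + t'/c)^p dt' for p != 1.
        return r0 * c / (p - 1) * (1.0 - (1.0 + t / c)**(1 - p))

    for t in (1, 10, 100, 1000):
        # For stationary (linear) growth N(t)/t would be constant; it is not.
        print(f"t={t:5d}  N={N_acc(t):8.1f}  N/t={N_acc(t) / t:8.3f}")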


(Corral 9a) As said just above, there is no full stationarity, but a collection of stationary intervals. To get global stationarity (if it exists) one has to put together all these stationary sequences.


Two aspects of the statistics of the large events:

1. the recurrence time increases with the time passed since the last event;

2. there is clustering of small and, respectively, large events.

2.2 Renormalization of the stochastic process and the distribution at the fixed point

Representing the amplitudes of events above a threshold Mc at the moments of time they have occurred gives the apparently very irregular picture of a time series of a stochastic process.


(Corral) Changing the threshold Mc → Mc′, the picture changes, since there may be fewer events and they occur more rarely. Then one changes the time interval, enlarging the window of observation. The picture returns to the very irregular form, similar to the initial one. This transformation, consisting of two steps:

1. increasing the threshold, such that low-magnitude events are eliminated;

2. enlarging the interval of time,

is similar to the transformations of the Renormalization Group in statistical physics: the elimination from the model of the small-scale fluctuations (k > k0) and the scaling of the space-time variables. A sequence of such transformations can identify, in some cases, a kind of behavior of the system (described by the equations with modified parameters) which is invariant to further transformations: a fixed point. If this is possible, then the system will exhibit properties that are independent of the degree of detail. An example is the description of fluid turbulence.


Assume the stochastic variable is the recurrence (waiting) time τi = ti − ti−1, and assume also that the occurrence of the event at time ti (including the waiting time τi) depends only on the magnitude Mi−1 of the previous event. (This may allow us to refer later to the convection event of mass flux Mi−1, etc.) We will try to find a relation between the distribution functions of the recurrence time for two values of the threshold parameter, Mc′ > Mc:

D(τ; Mc) and D(τ; Mc′)


We define new distribution functions, which reflect a condition to be fulfilled by the events:

D(τ | Mprev ≥ Mc′; Mc) = the distribution function of recurrence times τ of events with magnitude greater than Mc, under the condition that the magnitude Mprev of the previous event was greater than the higher threshold Mc′;

D(τ | Mprev < Mc′; Mc) = the distribution function of recurrence times τ of events with magnitude greater than Mc, under the condition that the magnitude Mprev of the previous event was smaller than the higher threshold Mc′.


Now define p, the probability that an event from the class of events with magnitude greater than the first threshold Mc has magnitude greater than the second threshold Mc′:

p ≡ Prob[M ≥ Mc′ | M ≥ Mc] ≈ 10^(−b(Mc′−Mc))

invoking the Gutenberg-Richter law: the number of events of magnitude greater than a threshold decreases exponentially. We also introduce the complementary probability

q = 1 − p

which is the probability that an event from the class of events with magnitude greater than the first threshold Mc has magnitude lower than the second threshold Mc′. This means that the magnitude is in the interval Mc ≤ M < Mc′, and the probability q is

q ≡ Prob[M < Mc′ | M ≥ Mc]

With these two (complementary) probabilities p and q, and with the two conditional distribution functions D(τ | Mprev ≥ Mc′; Mc) and D(τ | Mprev < Mc′; Mc), we will construct the relationship between the two distribution functions of recurrence times, D(τ; Mc) and D(τ; Mc′). The strategy is quite usual: we have introduced an intermediate parameter Mprev, so we note that we have three parameters (magnitudes) in the problem:

Mc (fixed)
Mprev (can be anywhere relative to Mc and Mc′)
Mc′ (fixed)


The presence of Mprev is obligatory, since we have assumed that the occurrence of a particular value of the waiting time τ depends on the magnitude of the previous event, Mprev. We analyse the probability of the event by placing the magnitude Mprev of the previous event in all possible situations relative to the two fixed parameters Mc < Mc′. There are probabilities for these situations, p and q. For each situation we invoke the distribution function with the conditioning that corresponds to the respective situation. In this way we construct the distribution function for the recurrence time of events greater than the second threshold:


D(τ; Mc′) = p D(τ | Mprev ≥ Mc′; Mc)
  + p D(τ | Mprev ≥ Mc′; Mc) ∗ q D(τ | Mprev < Mc′; Mc)
  + p D(τ | Mprev ≥ Mc′; Mc) ∗ q D(τ | Mprev < Mc′; Mc) ∗ q D(τ | Mprev < Mc′; Mc)
  + ...

The operation ∗ is convolution. Compactly,

D(τ; Mc′) = p D(τ | Mprev ≥ Mc′; Mc) ∗ Σ_{k=0}^∞ q^k [D(τ | Mprev < Mc′; Mc)]^(∗k)

The Laplace transform

F(s) = ∫_0^∞ exp(−sτ) F(τ) dτ


converts the convolutions into products:

D(s; Mc′) = p D(s | Mprev ≥ Mc′; Mc) × Σ_{k=0}^∞ q^k [D(s | Mprev < Mc′; Mc)]^k
          = p D(s | Mprev ≥ Mc′; Mc) / [1 − D(s; Mc) + p D(s | Mprev ≥ Mc′; Mc)]

where, after replacing q = 1 − p, we have noted that

D(s; Mc) = p D(s | Mprev ≥ Mc′; Mc) + q D(s | Mprev < Mc′; Mc)

This is the transformation we looked for. Raising the threshold from Mc to Mc′, the distribution function of the recurrence time transforms as shown. Now we note that the recurrence time has, of course, changed: the interval between two events must increase, since we chose to retain the higher magnitudes. Then we scale the recurrence time τ with the probability p:

D(τ; Mc′) → p^(−1) D(p^(−1) τ; Mc′),  i.e.  D(s; Mc′) → D(ps; Mc′)

Then the transformation is

T{D(s; Mc)} = p D(ps | Mprev ≥ Mc′; Mc) / [1 − D(ps; Mc) + p D(ps | Mprev ≥ Mc′; Mc)]

The fixed point is

T{D(s; Mc)} = D(s; Mc)


2.2.1 The Poisson distribution is a fixed point of the renormalization transformation

The basic assumption that characterises the Poisson distribution is the independence of the events relative to any parameter:

D(τ | Mprev ≥ Mc′; Mc) = D(τ | Mprev < Mc′; Mc) = D(τ; Mc) ≡ D0(τ; Mc)

The renormalization transformation becomes

T{D0(s; Mc)} = p D0(ps; Mc) / [1 − q D0(ps; Mc)]

The fixed point equation has the solution

D0(s; Mc) = 1/(1 + ks)


where k results from the separation of the variables s and ps. The inverse Laplace transform gives

D0(τ; Mc) = k^(−1) exp(−τ/k)

with k = τ̄(Mc). The Poisson process has no correlations and is invariant under the renormalization group.
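The fixed-point property can also be verified symbolically; a short check (sympy) that D0(s) = 1/(1 + ks) satisfies T{D0} = D0 for any p:

    import sympy as sp

    s, k, p = sp.symbols('s k p', positive=True)
    q = 1 - p

    D0 = lambda x: 1 / (1 + k * x)

    # Renormalization transformation under the independence assumption:
    # T{D0}(s) = p*D0(p*s) / (1 - q*D0(p*s))
    T = p * D0(p * s) / (1 - q * D0(p * s))

    print(sp.simplify(T - D0(s)))   # prints 0: the exponential law is a fixed point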


2.2.2 Comment: the process identified as the fixed point of the renormalization group transformation (Poisson) is not confirmed by reality

The function that describes the distribution of the recurrence time τ is NOT Poisson. It is the universal function f. This shows that the assumption of independence of events (absence of correlations) is not correct.

2.3 Memory and clustering

Again using the language of convection events.


(Havlin, review.) There are two basic laws. One governs the rate (number per unit time) of aftershocks with magnitude greater than M, following a main event. The law is an algebraic decay,

n(t) ∼ aM t^(−α), with α ≈ 1.


The other is the Gutenberg-Richter law, giving the number of earthquakes of magnitude greater than M by log10 N(M) ≈ −bM. Other quantities can be defined for the description of the statistics of the events. If the requirements for details adapted to specific classes of events (regions, ranges of magnitudes) are weak, the statistics exhibit universalities. For example, considering all areas and all threshold ranges of M without restriction, the distribution function of the recurrence time between two successive events, τ, can be expressed, after normalization, by a universal function:

DM(τ) = (1/τ̄M) f(τ/τ̄M)


The function f(Θ) can be approximated by a Gamma-type function:

f(Θ) ∼ Θ^(−(1−γ)) exp(−Θ^δ / B),  γ ∼ 0.6,  δ ≈ 1

To see whether the function f(Θ) is indeed universal, one searches for a possible dependence on memory. One then defines the conditional probability

DM(τ | τ0) = the distribution of the recurrence times τ that immediately follow a recurrence time τ0


If there is NO memory, the dependence on τ0 does not exist and

DM(τ | τ0) ≡ DM(τ)

In reality, the dependence on τ0 is strong. The way the effect of memory appears is very strange: small recurrence times follow small recurrence times, and large recurrence times follow large recurrence times. The distribution is clustered. Examination of the database of earthquakes in four areas confirms the conclusion: small and large recurrence times are more likely to be followed by small and large ones, respectively. This is not the result of an analytical model; it is the result of database investigation. A few consequences (with a numerical sketch after the list):


1. TRUE: the bigger the size of an event, the shorter the time until the next one.

2. TRUE: the shorter the time between two events, the shorter the recurrence time of the next one.

3. FALSE: the longer the recurrence time before an event, the bigger its size.
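A hedged sketch of how such memory shows up in a series: generate a toy recurrence sequence with clustering (here simply by slowly modulating a Poisson rate; an assumption for illustration, not the earthquake data) and compare the mean recurrence time after small versus large τ0:

    import numpy as np

    rng = np.random.default_rng(2)

    # Toy clustered series: a slowly varying random rate modulates the process,
    # so consecutive recurrence times are correlated (memory).
    rates = np.repeat(rng.lognormal(sigma=1.0, size=400), 50)
    tau = rng.exponential(scale=1.0 / rates)

    tau0, tau1 = tau[:-1], tau[1:]            # consecutive pairs (tau0, tau)
    median = np.median(tau)
    after_small = tau1[tau0 < median].mean()  # recurrence following small tau0
    after_large = tau1[tau0 >= median].mean() # recurrence following large tau0
    print(f"mean tau after small tau0: {after_small:.3f}")
    print(f"mean tau after large tau0: {after_large:.3f}")   # markedly larger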

2.4 The CTRW (Continuous Time Random Walk) and the Weierstrass function

The prototype of a stochastic system is the random walk: a sequence of independent random displacements, of arbitrary direction and random magnitude.


A random time interval (waiting time) is assumed between two jumps. This process is called CTRW. It consists of trajectories that are self-similar in space and have self-similar intermittency. The prototype is the Weierstrass function, which appears in the following context. Consider the simplest, one-dimensional lattice for the random walk. Consider N steps. Consider the jumps on this lattice as being organized in this way:

±1 with probability C (nearest neighbor)
±b with probability C/N
±b² with probability C/N²
...
±b^j with probability C/N^j
...

with the restriction that the total number of steps is N.


The normalization is

C = (N − 1)/(2N)

The probability that a given displacement is l is

p(l) = ((N − 1)/(2N)) Σ_{j=0}^∞ (1/N^j) [δ(l − b^j) + δ(l + b^j)]
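A minimal simulation sketch of this jump hierarchy (sampling directly from p(l); the values b = 4, N = 8 are an arbitrary choice with b² > N, i.e. the clustered regime):

    import numpy as np

    rng = np.random.default_rng(3)
    b, N = 4, 8                  # illustrative choice with b**2 > N
    jmax, steps = 12, 20_000

    # P(|l| = b^j) = ((N-1)/N) * N^(-j), truncated at jmax for sampling.
    j = np.arange(jmax)
    w = (N - 1) / N * N**(-1.0 * j)
    w /= w.sum()

    scales = rng.choice(b**j, size=steps, p=w)   # jump magnitudes b^j
    signs = rng.choice([-1, 1], size=steps)      # symmetric directions
    x = np.cumsum(signs * scales)                # the walk itself

    # The trajectory visits self-similar clusters: rare long jumps separate
    # dense groups of nearest-neighbor moves.
    print("fraction of unit jumps:", (scales == 1).mean())  # ~ (N-1)/N
    print("largest jump taken:   ", scales.max())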


(Weierstrass function) In plain terms: when N is large, most of the jumps are to the nearest neighbor, with basic length ±1. Seldom is there a jump of length ±b. Even more rarely there is a jump of distance ±b², etc.


One can say that the system will make about N jumps of ±1 (nearest neighbor) before a jump of length ±b occurs. The jumps of length ±1 create a cluster, and the cluster is (probabilistically) abandoned after N steps for a longer jump, of ±b. About N such clusters will be created, each separated by a distance of order b from the first cluster of nearest-neighbor jumps. After that, a jump of length ±b² occurs, etc. The trajectory traces out a set of self-similar clusters, under the conditions:

1. the mean square displacement per jump is infinite, ⟨l²⟩ = ∞;

2. the effective walk dimension is greater than 2.


For the Weierstrass function,

⟨l²⟩ = Σ_{l=−∞}^{+∞} l² p(l) = ((N − 1)/N) Σ_{j=0}^∞ (b²/N)^j

which diverges if b² > N.

The Fourier transform of the probability p(l) is

p(k) = ((N − 1)/N) Σ_{j=0}^∞ (1/N^j) cos(k b^j)

This is actually the Weierstrass function: continuous everywhere and nowhere differentiable.


For small k, we have

p(k) ≈ 1 − (1/2) ⟨l²⟩ k² + ...

which is not valid when b² > N.

The function p(k) satisfies the scaling equation

p(k) = (1/N) p(bk) + ((N − 1)/N) cos(k)
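The scaling relation can be checked numerically; a short sketch (series truncated at a large j; the same illustrative b, N as in the walk sketch above):

    import numpy as np

    b, N, jmax = 4.0, 8.0, 200   # truncation; terms decay as N^(-j)

    def p_k(k):
        # Truncated Weierstrass series ((N-1)/N) * sum_j N^(-j) cos(k b^j).
        j = np.arange(jmax)
        return (N - 1) / N * np.sum(N**(-1.0 * j) * np.cos(k * b**j))

    for k in (0.3, 1.0, 2.7):
        lhs = p_k(k)
        rhs = p_k(b * k) / N + (N - 1) / N * np.cos(k)
        print(f"k={k}:  p(k)={lhs:.6f}  (1/N)p(bk)+((N-1)/N)cos(k)={rhs:.6f}")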


Conclusions

The starting point was the examination of the database of amplitudes of convective events (∼ cloud size distributions), finding that the process behind it is NOT Poisson. Hence there is a memory effect. We have assumed the existence of two laws: (1) the exponential decay of the distribution of magnitudes (equivalent to the Gutenberg-Richter law) and (2) the power-law decay of the frequency of events after each major event (equivalent to the Omori law). Then the data organize into a universal law, approximated by a Gamma function. This means correlations (memory); in particular, the system can exhibit anomalous statistics, similar to Lévy flights. The signature is the clustering of events.


The instrument is the Continuous Time Random Walk.