Our book introduces a method for evaluating the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. The method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book presents several automatic algorithms for trend estimation and time series partitioning. The source code of the computer programs implementing these original automatic algorithms is given in the appendix and will be freely available on the web. The book contains a clear statement of the conditions and approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method used extensively in our book is already common in computational and statistical physics.

2 Monte Carlo Experiments

[Figure: plot (log–log axes, $N = 10^1$ to $10^4$) of the quantities for Gaussian white noises $\{x_n\}$ (*), for their absolute values $\{|x_n|\}$, and for the logarithm of their absolute values $\{\ln |x_n|\}$ ($\nabla$).] As mentioned before, the three quantities are practically identical.

polynomial fitting does not. The values of $\eta$ for the estimated trends in Fig. 4.5 confirm the interpretation given in Sect. 3.2 to the values of the index $\eta$: $\eta < 0.2$ indicates a good resemblance between the estimated and real trends, whereas $0.2 < \eta < 0.5$ indicates an acceptable resemblance. The results in Fig. 4.6 are obtained by processing statistical ensembles of 100 numerically generated time series with $N = 1000$, $P = 5$, and various values of $\phi$ and $r$. We analyze first the case of white noise
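The Monte Carlo procedure described above can be sketched as follows. This is a minimal illustration, not the book's implementation: the index $\eta$ is assumed here to be the RMS difference between estimated and real trend normalized by the spread of the real trend, $r$ is assumed to be the noise-to-trend amplitude ratio, and the random degree-$P$ trend is a stand-in for the book's trend generator.

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1(N, phi, rng):
    """Finite AR(1) noise Z_n = phi * Z_{n-1} + W_n with Z_1 = W_1."""
    w = rng.standard_normal(N)
    z = np.empty(N)
    z[0] = w[0]
    for i in range(1, N):
        z[i] = phi * z[i - 1] + w[i]
    return z

def eta(f_est, f_true):
    """Resemblance index (assumed definition): RMS estimation error
    normalized by the standard deviation of the real trend."""
    return np.sqrt(np.mean((f_est - f_true) ** 2)) / np.std(f_true)

# Ensemble parameters from the text: 100 series, N = 1000, P = 5.
N, P, phi, n_series = 1000, 5, 0.6, 100
t = np.linspace(0.0, 1.0, N)
f_true = np.polyval(rng.standard_normal(P + 1), t)  # hypothetical degree-P trend
f_true /= np.std(f_true)                            # unit trend spread
r = 1.0                                             # noise-to-trend ratio (assumption)

etas = []
for _ in range(n_series):
    noise = ar1(N, phi, rng)
    x = f_true + r * noise / np.std(noise)          # noisy series
    f_est = np.polyval(np.polyfit(t, x, P), t)      # polynomial trend estimate
    etas.append(eta(f_est, f_true))

print(round(float(np.mean(etas)), 3))
```

With these (illustrative) settings the ensemble-averaged $\eta$ falls well below 0.2, consistent with the "good resemblance" regime quoted in the text.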

, $X_{n_2}, \ldots, X_{n_m}$ is the probability that their values are smaller than some given values, $F_{\mathbf{n}}(\mathbf{x}) = P(X_{n_1} \le x_1, \ldots, X_{n_m} \le x_m)$, where $\mathbf{n} = (n_1, \ldots, n_m)$ and $\mathbf{x} = (x_1, \ldots, x_m)$. For absolutely continuous random variables there exists the joint pdf $p_{\mathbf{n}}(\mathbf{x})$. A stochastic process is (strictly) stationary if for every index vector $\mathbf{n}$ and integer $d$ we have $F_{\mathbf{n} + d\mathbf{1}} = F_{\mathbf{n}}$ or $p_{\mathbf{n} + d\mathbf{1}} = p_{\mathbf{n}}$, where $\mathbf{1} = (1, 1, \ldots, 1)$, i.e., its joint probabilities do not change under temporal
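The shift-invariance in this definition can be illustrated empirically for the simplest stationary process, Gaussian white noise: sampling many realizations and comparing the empirical one-dimensional CDF at two different time indices shows no dependence on the index. This is only a sketch of the idea, not a formal statistical test; all numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Gaussian white noise: i.i.d. N(0, 1) terms, hence strictly stationary.
M, N, d = 20000, 50, 17          # realizations, series length, time shift d
x = rng.standard_normal((M, N))

# Empirical CDFs of X_n across realizations at indices n = 5 and n = 5 + d.
grid = np.linspace(-3.0, 3.0, 61)
ecdf_a = np.mean(x[:, 5][:, None] <= grid, axis=0)
ecdf_b = np.mean(x[:, 5 + d][:, None] <= grid, axis=0)

# Sup-distance between the two empirical CDFs: small, shrinking with M.
print(round(float(np.max(np.abs(ecdf_a - ecdf_b))), 3))
```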

avoid such errors when we have more information on the specific properties of the noise or the trend. The trend definition is often confused with trend estimation. The definition is based on the different nature of the two terms in Eq. (1.10): the trend is the deterministic part of a time series with additive noise. It can then be determined by means of Eq. (1.13) if we have at our disposal a sufficient number of time series (1.12) generated under the same conditions by the same
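The ensemble-average idea behind this determination can be sketched numerically: when many independent realizations generated under the same conditions are available, averaging them at each time step cancels the zero-mean noise and leaves the deterministic trend. The trend shape and sizes below are hypothetical, chosen only to illustrate the mechanism.

```python
import numpy as np

rng = np.random.default_rng(2)

M, N = 500, 200                                   # realizations, series length
n = np.arange(N)
trend = 0.01 * n + np.sin(2 * np.pi * n / 100)    # hypothetical deterministic trend
x = trend + rng.standard_normal((M, N))           # M realizations with additive noise

trend_est = x.mean(axis=0)                        # ensemble average at each time step
err = float(np.max(np.abs(trend_est - trend)))    # shrinks like 1/sqrt(M)
print(round(err, 2))
```

With a single realization, by contrast, the noise cannot be averaged out, which is why trend estimation (as opposed to this idealized determination) is the practical problem.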

$+\,\phi^4 + \cdots + \phi^{2(n-2)} + \sigma_1^2\,\phi^{2(n-1)}$. Applying the formula for the sum of a geometric series we have

$$\sigma_n^2 = \sigma_s^2 + \bigl(\sigma_1^2 - \sigma_s^2\bigr)\,\phi^{2(n-1)}, \qquad (1.26)$$

where we have used Eq. (1.22).

[Fig. 1.2: The standard deviation $\sigma_n$ of a finite AR(1) process for different values of $\phi$ ($\phi = 0.3$, $0.6$, $0.9$, $0.99$) when the first term coincides with the white noise, $Z_1 = W_1$ (log–log axes, $n = 10^0$ to $10^3$).]

The variance of the finite AR(1) process has a
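Eq. (1.26) can be checked by simulating a large ensemble of finite AR(1) processes and comparing the sample variance at each step $n$ with the formula, using $\sigma_s^2 = \sigma^2/(1 - \phi^2)$ and $\sigma_1^2 = \sigma^2$ for $Z_1 = W_1$. A minimal sketch, with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(3)

phi, sigma, N, M = 0.9, 1.0, 200, 20000        # AR(1) parameter, noise std, length, ensemble size
w = sigma * rng.standard_normal((M, N))        # white noise W_n
z = np.empty((M, N))
z[:, 0] = w[:, 0]                              # first term coincides with the noise: Z_1 = W_1
for i in range(1, N):
    z[:, i] = phi * z[:, i - 1] + w[:, i]      # Z_n = phi * Z_{n-1} + W_n

sigma_s2 = sigma**2 / (1.0 - phi**2)           # stationary variance
sigma_1_2 = sigma**2                           # variance of the first term
n = np.arange(1, N + 1)
theory = sigma_s2 + (sigma_1_2 - sigma_s2) * phi ** (2 * (n - 1))   # Eq. (1.26)
empirical = z.var(axis=0)

rel_err = float(np.max(np.abs(empirical - theory) / theory))
print(round(rel_err, 2))
```

The sample variances track Eq. (1.26) to within the ensemble sampling error, showing the monotone approach from $\sigma_1^2$ to the stationary value $\sigma_s^2$ that Fig. 1.2 displays for the standard deviation.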