STATISTICS FOR STOCHASTIC PROCESSES
- material on skuola.net
- 26 hours of lectures + 6 hours of lab (R) + lectures by Prof. Rimotta
- Thursday 9.00 - 10.30
Exam
- analysis of a data set
- essay on one of the documents by Prof. Rimotti right after the first part (30 min)
- oral examination (one question) a couple of days later
After the written part, graded as: ottimo / distinto / buono / sufficiente (excellent / very good / good / pass)
books + web page
install the R packages "astsa" and "TSA" from the web page
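A minimal setup sketch in R; both packages are also on CRAN, so install.packages works as an alternative to the course web page:

    install.packages(c("astsa", "TSA"))   # both packages are available on CRAN
    library(astsa)                        # companion package of Shumway & Stoffer's book
    library(TSA)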
12 March 2018
What is a TIME SERIES?
Statistics are computed on observations (data): x1, ..., xn
Simple model: realizations of iid random variables X1, ... , Xn
There are correlations among the data, particularly when the observations are ordered through time.
The data are observations at discrete times of a stochastic dynamical system, usually described by a stochastic process: a discrete-time series.
Notation: {Xt}, t ∈ ℤ
We can no longer assume independence ⇒ time series analysis
- Goals: description of data (plots, descriptive statistics, ...)
- looking at trend, seasonality, random terms
Time series (1 & 2): deterministic function (exponential) + stochastic process whose variance grows with time
Correlations shown as scatter diagrams; log-log plot
Global temperature: trend + seasonality
Speech data: periodic time series
Prof Rimbot: spectral analysis (Fourier analysis) to understand how to detect periodicity
We're going to study linear time series sampled at discrete times
Difference between periodicity and seasonality
Another tool is seasonal and trend decomposition; the command in R is decompose (see the sketch below).
It splits the series into seasonal + trend + remainder; the remainder is a stochastic error.
In the additive (linear) model, adding seasonal, trend and remainder gives back the original data plot.
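A minimal sketch of decompose in R; the monthly co2 series that ships with R is just an illustrative choice, not a data set from the course:

    dec <- decompose(co2)   # additive decomposition of a monthly series
    plot(dec)               # panels: observed, trend, seasonal, random (remainder)
    head(dec$random)        # the remainder term we will try to model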
- We will focus on finding a stochastic model for the remainder term, for example with descriptive statistics (tests for normality, qq-plots, histograms)
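A sketch of such descriptive checks on the remainder of the decomposition above (shapiro.test is one possible normality test, chosen here as an assumption, not necessarily the one used in the course):

    r <- na.omit(decompose(co2)$random)   # remainder, dropping the NAs at the ends
    hist(r)                               # histogram of the remainder
    qqnorm(r); qqline(r)                  # qq-plot against the normal distribution
    shapiro.test(r)                       # formal test of normality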
We'll mostly work on stationary time series, where things are easier:
mean and variance are constant over time.
Non-stationary time series can be transformed into stationary ones.
For the random walk with drift Xt = μt + X0 + ∑_{j=1}^t Wj, with X0 = 0 and {Wj} iid, E(Wj) = 0, Var(Wj) = σ², we have that
E(Xt) = μt + E(X0) + ∑_{j=1}^t E(Wj) = μt
and, writing γ(t,s) for the autocovariance,
γ(t,s) = E[(Xt − μt)(Xs − μs)] =
= E[(μt + X0 + ∑_{j=1}^t Wj − μt)(μs + X0 + ∑_{j=1}^s Wj − μs)]
= E[(X0 + ∑_{j=1}^t Wj)(X0 + ∑_{j=1}^s Wj)]
= E(X0²) + E(X0) E(∑_{j=1}^s Wj) + E(∑_{j=1}^t Wj) E(X0) + ∑_{j=1}^t ∑_{k=1}^s E(Wj Wk) = σ² min{s,t}
and also we have that
- E(Wj Wk) = E(Wj) E(Wk) = 0 if j ≠ k
- E(Wj²) = σ² if j = k
and if s ≤ t, there are s pairs such that j = k,
while if t < s, there are t pairs such that j = k
⇒ γ(t,s) = σ² min{s,t}
The autocorrelation is
ρ(t,s) = γ(t,s) / √(γ(t,t) γ(s,s)) = σ² min{s,t} / (σ² √(t s)) = √(s/t) if s ≤ t, √(t/s) if t < s
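A quick numerical check of γ(t,t) = σ²t by simulation, as a sketch (the drift μ = 0.2, σ = 1 and the sample sizes are arbitrary choices, not values from the notes):

    set.seed(1)
    n <- 200; nsim <- 1000; mu <- 0.2; sigma <- 1
    paths <- replicate(nsim, mu * (1:n) + cumsum(rnorm(n, 0, sigma)))  # one random walk with drift per column
    var_hat <- apply(paths, 1, var)              # empirical Var(X_t) across the simulated paths
    plot(1:n, var_hat, type = "l", xlab = "t", ylab = "Var(X_t)")
    lines(1:n, sigma^2 * (1:n), lty = 2)         # theoretical variance sigma^2 * t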
EXAMPLE
Let us consider a periodic signal
Xt = R sin(2πωt + φ) + Wt with t ∈ ℤ,
ω, φ ∈ ℝ and
{Wt} iid ∼ N(0,σ²).
In the case R = 1, 2πω = 1 and φ = 0
we have
Xt = sin t + Wt
R is the AMPLITUDE, φ is the PHASE/SHIFT,
ω is the FREQUENCY (1/ω is the period) and 2πω is the angular frequency;
−φ/(2πω) is the shift with respect to the first zero crossing.
{Xt} is stationary
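A small simulation of the signal-plus-noise case Xt = sin t + Wt, as a sketch (σ = 0.5 is an arbitrary choice):

    set.seed(2)
    n <- 200
    x <- ts(sin(1:n) + rnorm(n, 0, 0.5))   # R = 1, 2*pi*omega = 1, phi = 0
    plot(x)                                # the sinusoid is partly hidden by the noise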
Kolmogorov Theorem
For {Xt} a stochastic process, the finite-dimensional distributions are
F_{Xt₁,...,Xtm}(x₁, ..., xm) = P(Xt₁ ≤ x₁, ..., Xtm ≤ xm)
and they must be consistent: if x₁ → ∞ I get that
F_{Xt₁,...,Xtm}(x₁, x₂, ..., xm) → F_{Xt₂,...,Xtm}(x₂, ..., xm) = P(Xt₂ ≤ x₂, ..., Xtm ≤ xm)
EXERCISE: check whether γ(h) = cos(h) is an ACVF
{Xt} is STRICTLY STATIONARY if
(Xt₁, ..., Xtm) = (Xt₁₊ₕ, ..., Xtm₊ₕ)
in distribution
∀ m ∈ ℕ, ∀ h ∈ ℤ, ∀ t₁, ..., tm ∈ ℤ
This means that the stochastic behaviour of {Xt}
over (t₁, ..., tm) is the same as over (t₁+h, ..., tm+h)
Theorem
If {Xt} is strictly stationary
and {Xt} ∈ ℒ² ⇒ {Xt} is (weakly) stationary
Proof
Note that for m = 1, Xt₁ = Xt₁₊ₕ in distribution
∀ h ∈ ℤ, ∀ t₁ ∈ ℤ,
so if E(Xt²) < ∞ then we have E(Xt₁) = E(Xt₁₊ₕ)
and Var(Xt₁) = Var(Xt₁₊ₕ), i.e. the mean is constant.
For m = 2 we have Cov(Xt, Xs) = Cov(Xt₊ₕ, Xs₊ₕ),
since (Xt, Xs) = (Xt₊ₕ, Xs₊ₕ) in distribution ∀ t, s, h ∈ ℤ,
so the autocovariance depends only on t − s and {Xt} is (weakly) stationary.
EXAMPLE 11
Consider Xt = μt + γt, where γt is stationary and μt is
a random walk with drift: μt = δ + μt-1 + wt;
then ∇Xt = Xt − Xt-1 = μt − μt-1 + (γt − γt-1)
= δ + wt + ∇γt
which is stationary
The difference operator at LAG d is
∇_d Xt = Xt − Xt-d = (1 − B^d)Xt, where B is the backshift operator (BXt = Xt-1),
while ∇^d Xt = (1 − B)^d Xt is the difference of order d (see the R sketch below)
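Both operators are available in R through diff(); a sketch on the monthly co2 series (again just an illustrative choice):

    diff(co2, lag = 12)           # lag-12 difference: X_t - X_{t-12}, removes a period-12 seasonality
    diff(co2, differences = 2)    # second-order difference: (1 - B)^2 X_t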
EXAMPLE 12
If Xt = mt + at + γt with at such that at = at-d (d is the period), then
∇_d Xt = Xt − Xt-d =
= mt + at + γt − mt-d − at-d − γt-d =
= (mt − mt-d) + (γt − γt-d)
so the lag-d difference removes the seasonal component
EXAMPLE 13
TREND AND SEASONAL COMPONENTS
Assume Xt = mt + at + γt with
- E(γt) = 0
- mt = a·t + b (linear trend)
- at = at+d (d is the period) such that
  d = 2q + 1 and ∑_{j=1}^d aj = 0
Consider Zt = 1/(1 + 2q) (Xt-q + Xt-q+1 + ... + Xt+q-1 + Xt+q),
a two-sided moving average (q steps backwards and q steps forward).
Let's plug the model into Zt and we get
Zt = 1/(1 + 2q) ∑_{j=-q}^q mt+j + 1/(1 + 2q) ∑_{j=-q}^q at+j + 1/(1 + 2q) ∑_{j=-q}^q γt+j
where ∑_{j=-q}^q at+j = 0 (from the hypothesis on at, since the window covers one full period d);
since mt is linear, 1/(1 + 2q) ∑_{j=-q}^q mt+j = mt, so Zt = mt + 1/(1 + 2q) ∑_{j=-q}^q γt+j ≈ mt,
i.e. Zt estimates the trend (see the R sketch below).
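A sketch of the moving-average trend estimate Zt in R, on a simulated series with period d = 7 (q = 3); all the constants are arbitrary choices for illustration:

    set.seed(3)
    q <- 3
    n <- 140
    a <- sin(2 * pi * (1:n) / (2 * q + 1))        # seasonal component with period d = 2q + 1 (sums to 0 over a period)
    x <- 0.05 * (1:n) + 2 + a + rnorm(n, 0, 0.3)  # linear trend + seasonal + noise
    w <- rep(1 / (2 * q + 1), 2 * q + 1)          # equal weights 1/(2q+1)
    z <- stats::filter(x, w, sides = 2)           # Z_t = (1/(2q+1)) * sum over j = -q..q of X_{t+j}
    plot(x, type = "l")                           # observed series
    lines(z, lty = 2, lwd = 2)                    # Z_t tracks the linear trend m_t = 0.05 t + 2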