
Time series analysis

Definitions

By a time series (TS) we consider a given phenomenon in relation to its evolution in time; broadly speaking, it is a collection of observations made sequentially in time.

Examples: daily Dow Jones stock market closes over a given period, monthly unemployment data for a given country, annual global temperature data for the past 100 years, monthly sunflower oil prices, average monthly temperature in Tuscany, number of flu cases in a given region, etc.

Def: a TS is a sequence of observations related to a given phenomenon,

{y_t : t = 1, …, T}

ordered in time t.

Note: such a definition is valid for equally spaced data.

Time series plot: the observations are plotted against the time of

observation, where consecutive observations are joined by straight lines.

We can decompose the TS into the following components:

- Trend
- Cycle (cyclic component)
- Trend-cycle
- Seasonality (seasonal component)
- Random component

Components

Trend T_t: when there is a long-term increase/decrease in the data, usually related to technological development; in general, it is represented by a simple mathematical function, for example a polynomial in t or an exponential function.

Cycle C_t (business cycle): when the data exhibit rises and falls of variable and unknown length (see the remark below on cyclic versus seasonal patterns).

Trend-cycle TC_t: the sum, or the product, of the trend and the cycle. Trend and cycle are merged together because they are often not easily separable.

Seasonality S_t: when a series is influenced by seasonal factors (e.g. the quarter of the year, the month, or the day of the week).

Random component u_t: the residual, irregular part of the series.

It is important to distinguish cyclic patterns and seasonal patterns. Seasonal

patterns have a fixed and known length, while cyclic patterns have variable

and unknown length. The average length of a cycle is usually longer than that

of seasonality, and the magnitude of cyclic variation is usually more variable

than that of seasonal variation.

Time series decomposition

Given a TS {y_t : t = 1, …, T}:

- Additive decomposition: it is assumed that the three components are independent, and the TS is represented as y_t = TC_t + S_t + u_t.

- Multiplicative decomposition: the TS is directly proportional to each component, y_t = TC_t · S_t · u_t, with y_t > 0 ∀t.

- Logs turn a multiplicative relationship into an additive relationship: y_t = TC_t · S_t · u_t ⇒ ln(y_t) = ln(TC_t) + ln(S_t) + ln(u_t).
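As an illustration, a minimal sketch of the additive decomposition using statsmodels' seasonal_decompose; the monthly series y is simulated here, and all parameter values are hypothetical:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Simulated monthly series: trend-cycle + seasonal (period 12) + noise
rng = np.random.default_rng(42)
t = np.arange(120)
y = pd.Series(
    10 + 0.05 * t                        # trend-cycle TC_t (hypothetical)
    + 2 * np.sin(2 * np.pi * t / 12)     # seasonal component S_t
    + rng.normal(0, 0.5, size=t.size),   # random component u_t
    index=pd.date_range("2015-01", periods=t.size, freq="MS"),
)

result = seasonal_decompose(y, model="additive", period=12)  # y_t = TC_t + S_t + u_t
# model="multiplicative" implements y_t = TC_t * S_t * u_t (requires y_t > 0)
print(result.trend.dropna().head())
print(result.seasonal.head())
print(result.resid.dropna().head())
```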

t t t t t t t t

Forecasting

Starting from a TS, how can we produce forecasts?

TS: {y_1, y_2, …, y_T}

Aim: provide forecasts at times T+1, T+2, …, T+H, where H is called the "horizon".

"Heuristic" methods: for example, Exponential Smoothing.
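A minimal sketch of one such heuristic, simple exponential smoothing via statsmodels; the series below is simulated and the horizon H = 6 is an arbitrary choice:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import SimpleExpSmoothing

# Simulated series y_1, ..., y_T with T = 50
rng = np.random.default_rng(0)
y = pd.Series(100 + rng.normal(0, 2, size=50).cumsum())

H = 6                                                  # forecast horizon
fit = SimpleExpSmoothing(y, initialization_method="estimated").fit()
print(fit.forecast(H))                                 # forecasts for T+1, ..., T+H
```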

Stochastic processes (in discrete time)

Definition: given a probability space (Ω, A, P), a Stochastic Process (SP) in discrete time is a sequence of random variables

{Y_t : t ∈ Z}, with Y_t : Ω → R

- Ω: sample space
- A: algebra of events (set of events)
- P: probability measure

Each outcome ω ∈ Ω generates a trajectory:

ω_1 ∈ Ω → (…, y_{−1}(ω_1), y_0(ω_1), y_1(ω_1), …)
ω_2 ∈ Ω → (…, y_{−1}(ω_2), y_0(ω_2), y_1(ω_2), …)
ω_3 ∈ Ω → (…, y_{−1}(ω_3), y_0(ω_3), y_1(ω_3), …)

Remarks:

- If we fix t, for example t = 3, Y_3 is a single random variable.
- If we fix ω, we obtain a trajectory of the SP (time goes from −∞ to +∞): (…, y_{−1}(ω_1), y_0(ω_1), y_1(ω_1), …)
- If we fix both t and ω, Y_t(ω) is a single number.

Definitions (Stochastic Processes)

- Index set: the set in which the time index t lies; for us, the index set will be limited to the set of integers Z.
- Domain in which each random variable is defined: we will concentrate on the set of all real numbers R, or on subsets of R, for instance R^+.
- Dependence structure between the random variables, that is, how each random variable depends on the "past" ones.

SP 1: zero-mean White Noise (WN)

{Y_t} ~ WN(0, σ²) ↔ Y_t i.i.d.(0, σ²), ∀t ∈ Z

*** i.i.d.: independent and identically distributed

Normal White Noise:

{Y_t} ~ NWN(0, σ²) ↔ Y_t i.i.d. N(0, σ²), ∀t ∈ Z

White noise + constant:

{Y_t} ~ WN(μ, σ²) ↔ Y_t i.i.d.(μ, σ²), ∀t ∈ Z

which can be re-written as Y_t = μ + u_t, with u_t ~ WN(0, σ²), ∀t ∈ Z.

Random Walk (RW)

{Y_t} ~ RW ↔ Y_t = Y_{t−1} + u_t, ∀t ∈ Z, where u_t ~ WN i.i.d.(0, σ²)

At time t−1, the movement to time t is totally determined by the random component u_t.

Some applications: the behavior of the stock market, the path travelled by a molecule as it moves through a liquid.

Random walk + drift

{Y_t} ~ RW + drift ↔ Y_t = δ + Y_{t−1} + u_t, ∀t ∈ Z, where u_t ~ WN i.i.d.(0, σ²) and δ is the drift.

At time t−1, the movement to time t is determined by δ plus the random component u_t.
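A minimal NumPy sketch simulating one trajectory of each process defined so far, assuming Y_0 = 0 and hypothetical values for σ and δ:

```python
import numpy as np

rng = np.random.default_rng(1)
T, sigma, delta = 200, 1.0, 0.2          # hypothetical parameters

u = rng.normal(0.0, sigma, size=T)       # u_t ~ WN(0, sigma^2)
wn = u                                   # zero-mean white noise
rw = np.cumsum(u)                        # RW: Y_t = Y_{t-1} + u_t, with Y_0 = 0
rw_drift = np.cumsum(delta + u)          # RW + drift: Y_t = delta + Y_{t-1} + u_t

print(rw[:5], rw_drift[:5], sep="\n")
```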

AR(1) Auto-Regressive of order 1

{Y_t} ~ AR(1) ↔ Y_t = ω + α·Y_{t−1} + u_t, ∀t ∈ Z, where u_t ~ WN i.i.d.(0, σ²), and ω and α are coefficients.

MA(1) Moving Average of order 1

{Y_t} ~ MA(1) ↔ Y_t = μ + β_1·u_{t−1} + u_t, ∀t ∈ Z, where u_t ~ WN i.i.d.(0, σ²).

Linear Trend (LT)

{Y_t} ~ LT ↔ Y_t = β_0 + β_1·t + u_t, ∀t ∈ Z, where u_t ~ WN i.i.d.(0, σ²).
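Similarly, a sketch that simulates the AR(1), MA(1) and LT processes with hypothetical coefficients (starting the AR(1) recursion at its stationary mean ω/(1−α) is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(2)
T, sigma = 200, 1.0
u = rng.normal(0.0, sigma, size=T + 1)   # one extra draw for u_{t-1} at t = 1

# AR(1): Y_t = omega + alpha * Y_{t-1} + u_t
omega, alpha = 0.5, 0.8                  # hypothetical coefficients
ar = np.empty(T)
y_prev = omega / (1 - alpha)             # start at the stationary mean
for t in range(T):
    y_prev = omega + alpha * y_prev + u[t + 1]
    ar[t] = y_prev

# MA(1): Y_t = mu + beta_1 * u_{t-1} + u_t
mu, beta_1 = 0.0, 0.6
ma = mu + u[1:] + beta_1 * u[:-1]

# LT: Y_t = beta_0 + beta_1 * t + u_t
b0, b1 = 1.0, 0.05
lt = b0 + b1 * np.arange(1, T + 1) + u[1:]
```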

Stationarity

- Motivation: in time series analysis, we often have at most one observation for each random variable of the SP. This makes inference impossible in practice, since at most one observation is not enough.

In general, stationarity assumes that certain statistical properties of the TS do not change over time. That is, it ensures that each observation of the TS "contains" information not only on the corresponding random variable, but also on the other ones (including those for which we have no observations).

Stationarity can be related to all the moments or only to some of them. Strong stationarity is beyond the scope of this course. Weak (second-order) stationarity relates to the first two moments, that is, the mean (first-order moment) and the second-order moments.

Weak stationarity

{Y_t} is weakly stationary ↔

1. Mean stationarity: E(Y_t) (expected value) exists and does not depend on t (time):
E(Y_t) = μ, ∀t ∈ Z

2. Variance stationarity: V(Y_t) (variance) exists and does not depend on t:
V(Y_t) = γ_0, ∀t ∈ Z

3. Covariance stationarity: C(Y_t, Y_{t+h}) (covariance) exists, with h ≠ 0; it can depend on h, where h is called the lag, but it does not depend on t:
C(Y_t, Y_{t+h}) = γ_h, ∀t ∈ Z

where μ, γ_0 and γ_h are constants.

For the process to be weakly stationary, all three conditions must be satisfied. Therefore, if the first condition is not met, you can skip checking the other two and conclude that the process is not stationary.

Remarks

1. With h = 0 it follows that C(Y_t, Y_t) = V(Y_t) = γ_0, ∀t ∈ Z.

2. If {Y_t} is weakly stationary, then:

ρ(Y_t, Y_{t+h}) = C(Y_t, Y_{t+h}) / √(V(Y_t)·V(Y_{t+h})) = γ_h / √(γ_0·γ_0) = γ_h / γ_0 = ρ_h, ∀t ∈ Z

Thus, the correlation coefficient ρ(Y_t, Y_{t+h}) does not depend on t, but it can depend on h. The correlation coefficient ranges between −1 and 1.

3. Furthermore:

ρ_0 = 1
γ_{−h} = γ_h
ρ_{−h} = ρ_h

Symmetry of the autocovariance and of the correlation coefficient. In fact:

γ_{−h} = C(Y_t, Y_{t−h}) = C(Y_{t−h}, Y_t) = γ_h

ρ_{−h} = γ_{−h} / γ_0 = γ_h / γ_0 = ρ_h

Example: is the White Noise (WN) weakly stationary?

{Y_t} ~ WN(0, σ²) ↔ Y_t i.i.d.(0, σ²), ∀t ∈ Z

1. E(Y_t) = 0, ∀t ∈ Z → mean stationarity, since E(Y_t) exists and does not depend on t
2. V(Y_t) = σ², ∀t ∈ Z → variance stationarity, since V(Y_t) exists and does not depend on t
3. With h ≠ 0, C(Y_t, Y_{t+h}) = 0 → covariance stationarity, since C(Y_t, Y_{t+h}) does not depend on t

The zero-mean White Noise (WN) is weakly stationary. Furthermore, the WN + constant is also weakly stationary (proof omitted).

Is the Random Walk (RW) weakly stationary?

Recall: {Y_t} ~ RW ↔ Y_t = Y_{t−1} + u_t, ∀t ∈ Z, where u_t ~ WN i.i.d.(0, σ²)

1. Mean stationarity: E(Y_t) = E(Y_{t−1} + u_t) = E(Y_{t−1}) + E(u_t) = E(Y_{t−1}) = …

2. V(Y_t) = V(Y_{t−1} + u_t) = V(Y_{t−1}) + V(u_t) + 2C(Y_{t−1}, u_t) = V(Y_{t−1}) + σ²

- where V(Y_{t−1} + u_t) = V(Y_{t−1}) + V(u_t) + 2C(Y_{t−1}, u_t) is a property of the variance
- V(u_t) = σ²
- 2C(Y_{t−1}, u_t) = 0

Since V(Y_t) = V(Y_{t−1}) + σ² > V(Y_{t−1}) → NO variance stationarity

→ the RW is not weakly stationary.
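The growing variance V(Y_t) = V(Y_{t−1}) + σ² can be checked by simulation: across many trajectories, the sample variance of Y_t increases roughly as t·σ². A sketch (the sample sizes and σ are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, T, sigma = 10_000, 100, 1.0

# n_paths random-walk trajectories with Y_0 = 0
paths = np.cumsum(rng.normal(0.0, sigma, size=(n_paths, T)), axis=1)

for t in (10, 50, 100):
    print(t, round(paths[:, t - 1].var(), 2))   # close to t * sigma^2
```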

Is the random walk + drift weakly stationary?

{Y_t} ~ RW + drift ↔ Y_t = δ + Y_{t−1} + u_t, ∀t ∈ Z, where u_t ~ WN i.i.d.(0, σ²)

1. E(Y_t) = E(δ + Y_{t−1} + u_t) = δ + E(Y_{t−1}) + E(u_t) = δ + E(Y_{t−1}), which is different from E(Y_{t−1}) if δ ≠ 0. Thus, NO mean stationarity.

The RW + drift is not weakly stationary.

Is the AR(1) Auto-Regressive of order 1 weakly stationary?

{Y_t} ~ AR(1) ↔ Y_t = ω + α·Y_{t−1} + u_t, ∀t ∈ Z, where u_t ~ WN i.i.d.(0, σ²)

Special cases of AR(1):

1. If ω = α = 0: Y_t = u_t ~ WN(0, σ²), which is weakly stationary (the WN(0, σ²) is a special case of AR(1))
2. If α = 0: Y_t = ω + u_t, a WN + constant, which is weakly stationary
3. If ω = 0 and α = 1: Y_t = Y_{t−1} + u_t → RW, which is not weakly stationary
4. If α = 1: Y_t = ω + Y_{t−1} + u_t → RW + drift, which is not weakly stationary

So the weak stationarity of the AR(1) depends on α.

It can be shown that the AR(1) is stationary ↔ −1 < α < 1.

The MA(1) is always weakly stationary.

Is the LT weakly stationary?

{Y_t} ~ LT ↔ Y_t = β_0 + β_1·t + u_t, ∀t ∈ Z, where u_t ~ WN i.i.d.(0, σ²)

1. E(Y_t) = E(β_0 + β_1·t + u_t) = β_0 + β_1·t + E(u_t) = β_0 + β_1·t, where E(u_t) = 0.

This depends on t → NO mean stationarity.

→ The LT is not weakly stationary.

Estimators of E(Y_t), V(Y_t), C(Y_t, Y_{t+h}) under weak stationarity

These three quantities are not known in practice, so we use estimators.

Assumptions: {Y_t} is a weakly stationary stochastic process, and we know that E(Y_t) = μ, Var(Y_t) = γ_0 and Cov(Y_t, Y_{t+h}) = γ_h.

1. Estimator of μ = E(Y_t) (T = sample size):
μ̂ = Ȳ = (1/T) Σ_{t=1}^{T} Y_t

2. Estimator of γ_0 = V(Y_t):
γ̂_0 = (1/T) Σ_{t=1}^{T} (Y_t − Ȳ)²

3. Estimator of γ_h = C(Y_t, Y_{t+h}):
γ̂_h = (1/T) Σ_{t=1}^{T−h} (Y_t − Ȳ)(Y_{t+h} − Ȳ)

4. Estimator of ρ_h = ρ(Y_t, Y_{t+h}) = γ_h / γ_0:
ρ̂_h = γ̂_h / γ̂_0 = [Σ_{t=1}^{T−h} (Y_t − Ȳ)(Y_{t+h} − Ȳ)] / [Σ_{t=1}^{T} (Y_t − Ȳ)²]
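These estimators translate directly into NumPy; a minimal sketch (the function name sample_moments is an arbitrary choice, and the 1/T normalization follows the formulas above):

```python
import numpy as np

def sample_moments(y, H):
    """mu_hat, gamma_hat_h and rho_hat_h for h = 0, ..., H,
    using the 1/T normalization of the estimators above."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    mu_hat = y.mean()                           # estimator of mu
    d = y - mu_hat
    # gamma_hat_h = (1/T) * sum_{t=1}^{T-h} (Y_t - Ybar)(Y_{t+h} - Ybar)
    gam = np.array([np.dot(d[: T - h], d[h:]) / T for h in range(H + 1)])
    rho = gam / gam[0]                          # rho_hat_0 = 1
    return mu_hat, gam, rho
```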

Autocorrelation Function: ACF

The ACF is the plot of ρ̂_h for varying h, h = 1, 2, 3, …, H. In general, H should be much smaller than T (H ≪ T). Usually, H ≤ T/5.
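For plotting, statsmodels provides a ready-made ACF plot; a sketch on simulated white noise, whose ACF should be close to zero for every h ≥ 1 (T = 250 is an arbitrary choice):

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf

rng = np.random.default_rng(4)
y = rng.normal(size=250)      # simulated zero-mean white noise

plot_acf(y, lags=250 // 5)    # H = T/5, the rule of thumb above
plt.show()
```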

MA(∞)

Definition: {Y_t} ~ MA(∞) ↔ Y_t = μ + u_t + ψ_1·u_{t−1} + ψ_2·u_{t−2} + ψ_3·u_{t−3} + … = μ + Σ_{j=0}^{∞} ψ_j·u_{t−j}

with ψ_0 = 1, where u_t ~ WN i.i.d.(0, σ²) and μ is a constant.

Is the MA(∞) weakly stationary?

1. E(Y_t) = E(μ + u_t + ψ_1·u_{t−1} + ψ_2·u_{t−2} + …)
= E(μ) + E(u_t) + ψ_1·E(u_{t−1}) + ψ_2·E(u_{t−2}) + … = μ → mean stationarity

where E(μ) = μ, E(u_t) = 0, ψ_1·E(u_{t−1}) = 0, ψ_2·E(u_{t−2}) = 0.

2. V(Y_t) = V(μ + u_t + ψ_1·u_{t−1} + ψ_2·u_{t−2} + …)
= V(μ) + V(u_t) + ψ_1²·V(u_{t−1}) + ψ_2²·V(u_{t−2}) + …
= 0 + σ² + ψ_1²·σ² + ψ_2²·σ² + …
= σ²·(1 + ψ_1² + ψ_2² + …) = σ²·(1 + Σ_{j=1}^{∞} ψ_j²)

This expression results from the property Var(a·x) = a²·Var(x), where x is a random variable and a is a constant. It does not depend on t, but it is not necessarily finite. It is finite ↔ Σ_{j=1}^{∞} ψ_j² < ∞, that is, the ψ_j are mean square convergent.

3. C(Y_t, Y_{t−h}) = C(μ + u_t + ψ_1·u_{t−1} + ψ_2·u_{t−2} + …, μ + u_{t−h} + ψ_1·u_{t−h−1} + ψ_2·u_{t−h−2} + …)
= σ²·Σ_{j=0}^{∞} ψ_j·ψ_{j+h}

It does not depend on t, but can depend on h. The sum Σ_{j=0}^{∞} ψ_j·ψ_{j+h} does not necessarily converge; if the ψ_j are mean square convergent, then it converges.

In practice we should have a finite number of coefficients.

The SP {Y_t} ~ MA(∞) is weakly stationary ↔ the ψ_j are mean square convergent. Thus, the moments are:

- E(Y_t) = μ
- V(Y_t) = σ²·(1 + Σ_{j=1}^{∞} ψ_j²) = γ_0
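As a numerical check, take geometric weights ψ_j = φ^j with |φ| < 1 (a hypothetical choice): then Σ_{j=0}^{∞} ψ_j² = 1/(1−φ²) is finite, so the formula above gives γ_0 = σ²/(1−φ²). A truncated simulation sketch agrees:

```python
import numpy as np

rng = np.random.default_rng(5)
phi, sigma, J, T = 0.7, 1.0, 200, 100_000   # truncation J and T are hypothetical

psi = phi ** np.arange(J + 1)               # psi_0 = 1, psi_j = phi^j
u = rng.normal(0.0, sigma, size=T + J)
y = np.convolve(u, psi, mode="valid")       # Y_t = sum_j psi_j * u_{t-j} (truncated)

print(y.var())                              # simulated gamma_0
print(sigma**2 / (1 - phi**2))              # theoretical sigma^2 * (1 + sum psi_j^2)
```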
