Markov Chains - Stochastic Hydrology - Lecture Notes

The main points of stochastic hydrology covered in these notes are: Markov Chains, Conditional Probability, Transition Probability, Independence of Time, Transition Probability Matrix, Steady State, Probability Vector, Multiple Linear Regression, General Linear Model, Matrix Notation.


MARKOV CHAINS

  • A Markov chain is a stochastic process with the property that the value of the process X_t at time t depends on its value at time t-1 and not on the sequence of other values (X_{t-2}, X_{t-3}, ……, X_0) that the process passed through in arriving at X_{t-1}.

    P[X_t | X_{t-1}, X_{t-2}, ……, X_0] = P[X_t | X_{t-1}]    …. Single step Markov chain

  • The transition probability is usually written as

    P_{ij}^t = P[X_t = a_j | X_{t-1} = a_i]

    indicating the transition of the process from state a_i at time t-1 to a_j at time t.

  • If P_{ij}^t is independent of time, then the Markov chain is said to be homogeneous, i.e.,

    P_{ij}^t = P_{ij}^{t+τ}    ∀ t and τ

    The transition probabilities remain the same across time.

Transition Probability Matrix (TPM):

                        state at time t
                     1      2      3    ..    m

              1   | P_11   P_12   P_13  ..   P_1m |
              2   | P_21   P_22   P_23  ..   P_2m |
    P    =    3   | P_31    .      .    ..    .   |
              .   |  .      .      .    ..    .   |
              m   | P_m1   P_m2    .    ..   P_mm |
         (state at
         time t-1)                                   m x m
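As a quick numerical illustration, here is a minimal sketch in Python/NumPy (the library choice and the 2-state values, taken from the rainfall example later in these notes, are assumptions for illustration); every element of a TPM is a probability and every row must sum to 1.

```python
import numpy as np

# 2-state TPM: rows = state at time t-1, columns = state at time t.
# Values from the rainfall example (state 1 = non-rainy day, state 2 = rainy day).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Basic validity checks for a transition probability matrix.
assert np.all((P >= 0) & (P <= 1)), "entries must be probabilities"
assert np.allclose(P.sum(axis=1), 1.0), "each row must sum to 1"

print(P)
```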

  • p_j^{(0)} is the probability of being in state j in period t = 0.
  • If p^{(0)} and the TPM are given, the probability vector at any later time can be obtained as follows.

    p^{(0)} = [ p_1^{(0)}  p_2^{(0)}  ….  p_m^{(0)} ]_{1 x m}    …. Probability vector at time 0

    p^{(n)} = [ p_1^{(n)}  p_2^{(n)}  ….  p_m^{(n)} ]_{1 x m}    …. Probability vector at time n

    p^{(1)} = p^{(0)} × P

    p^{(1)} = [ p_1^{(0)}  p_2^{(0)}  ….  p_m^{(0)} ]  ×  | P_11   P_12   P_13  ..   P_1m |
                                                          | P_21   P_22   P_23  ..   P_2m |
                                                          | P_31    .      .    ..    .   |
                                                          | P_m1   P_m2    .    ..   P_mm |

             = [ p_1^{(0)} P_11 + p_2^{(0)} P_21 + …. + p_m^{(0)} P_m1 ,    …. Probability of going to state 1
                 p_1^{(0)} P_12 + p_2^{(0)} P_22 + …. + p_m^{(0)} P_m2 ,    …. Probability of going to state 2
                 …. ]
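A minimal sketch of this propagation in Python/NumPy (the 2-state rainfall TPM from the example below and the initial vector are assumptions used only for illustration):

```python
import numpy as np

# TPM of the rainfall example (state 1 = non-rainy day, state 2 = rainy day).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Assumed initial probability vector p(0): day 0 is known to be non-rainy.
p0 = np.array([1.0, 0.0])

# p(1) = p(0) x P: each element is the probability of being in that state at time 1.
p1 = p0 @ P
print(p1)   # [0.7 0.3]
```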

  • As the process advances in time, p_j^{(n)} becomes less dependent on p^{(0)}.
  • The probability of being in state j after a large number of time steps becomes independent of the initial state of the process.
  • The process reaches a steady state at large n.
  • As the process reaches steady state, the probability vector remains constant, i.e.,

    p = p × P
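A minimal sketch of locating this steady-state vector by repeated application of p = p × P (the starting vector, iteration limit and tolerance are assumptions):

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Start from an assumed initial vector and apply p = p x P until it stops changing.
p = np.array([1.0, 0.0])
for _ in range(1000):
    p_next = p @ P
    if np.allclose(p_next, p, atol=1e-12):
        break
    p = p_next

print(np.round(p, 4))   # [0.5714 0.4286] -- the steady-state probability vector
```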

Example – 1

Consider the TPM for a 2-state first order homogeneous Markov chain as

    TPM  =  | 0.7   0.3 |
            | 0.4   0.6 |

State 1 is a non-rainy day and state 2 is a rainy day.

Obtain the

  1. probability that day 1 is a non-rainy day given that day 0 is a rainy day
  2. probability that day 2 is a rainy day given that day 0 is a non-rainy day
  3. probability that day 100 is a rainy day given that day 0 is a non-rainy day

Example – 1 (contd.)

The probability is 0.39.

  3. probability that day 100 is a rainy day given that day 0 is a non-rainy day

    p^{(n)} = p^{(0)} × P^n

Example – 1 (contd.)

    P^2 = P × P = | 0.7  0.3 | × | 0.7  0.3 |  =  | 0.61  0.39 |
                  | 0.4  0.6 |   | 0.4  0.6 |     | 0.52  0.48 |

    P^4 = P^2 × P^2 = | 0.5749  0.4251 |
                      | 0.5668  0.4332 |

    P^8 = P^4 × P^4 = | 0.5715  0.4285 |
                      | 0.5714  0.4286 |

    P^16 = P^8 × P^8 = | 0.5714  0.4286 |
                       | 0.5714  0.4286 |

Difficulties in using Markov chains in hydrology

  • Determining the number of states to use.
  • Determining the intervals of the variable under study to associate with each state.
  • Assigning a number to the magnitude of an event once the state is determined.
  • Estimating the large number of parameters involved in even a moderate size Markov chain model.
  • Handling situations where some transitions are dependent on several previous time periods while others are dependent on only one prior time period.


Ref: Statistical Methods in Hydrology by C. T. Haan, Iowa State University Press

MULTIPLE LINEAR REGRESSION

A general linear model is of the form

    y = β_1 x_1 + β_2 x_2 + β_3 x_3 + …….. + β_p x_p

where y is the dependent variable, x_1, x_2, x_3, ……, x_p are independent variables and β_1, β_2, β_3, ……, β_p are unknown parameters.

  • n observations are required on y with the corresponding n observations on each of the p independent variables.

  • n equations are written, one for each observation, as

    y_1 = β_1 x_1,1 + β_2 x_1,2 + …….. + β_p x_1,p
    y_2 = β_1 x_2,1 + β_2 x_2,2 + …….. + β_p x_2,p
    ……
    y_n = β_1 x_n,1 + β_2 x_n,2 + …….. + β_p x_n,p

  • The n equations are solved for obtaining the p parameters.
  • n must be equal to or greater than p; in practice, n must be at least 3 to 4 times as large as p.
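A minimal sketch of fitting such a model by least squares (the data values, the use of NumPy's lstsq, and the omission of an intercept term are assumptions for illustration, not part of the original notes):

```python
import numpy as np

# Hypothetical data: n = 12 observations on p = 3 independent variables (n >= 3-4 times p).
rng = np.random.default_rng(0)
X = rng.random((12, 3))                   # columns hold x_1, x_2, x_3
beta_true = np.array([2.0, -1.0, 0.5])    # assumed "unknown" parameters
y = X @ beta_true + 0.01 * rng.standard_normal(12)

# Solve the n equations y = X * beta for the p parameters in the least-squares sense.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta_hat, 3))              # close to [2.0, -1.0, 0.5]
```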
