MATHEMATICAL TRIPOS Part III
Tuesday, 7 June, 2011 1:30 pm to 3:30 pm
PAPER 34
TIME SERIES AND MONTE CARLO INFERENCE
Attempt no more than THREE questions.
There are FOUR questions in total.
The questions carry equal weight.
STATIONERY REQUIREMENTS: Cover sheet, Treasury Tag, Script paper
SPECIAL REQUIREMENTS: None

You may not start to read the questions printed on the subsequent pages until instructed to do so by the Invigilator.

1 Time Series

(a) Define the autocovariances γ_k, k ∈ Z, for a weakly stationary process {X_t}. Assume that a spectral density function f(λ) exists for {X_t}. Write down an expression for f(λ) in terms of the autocovariances. Find the autocovariances and the spectral density function for a white noise process {ε_t} with mean zero and variance σ².
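[Illustrative sketch, not part of the paper: the snippet below simulates white noise and checks numerically that γ_0 = σ², γ_k = 0 for k ≠ 0, so that the spectral density f(λ) = (2π)⁻¹ Σ_k γ_k e^{−ikλ} is the constant σ²/(2π). The values of σ and the sample size are arbitrary choices.]

```python
# A minimal check (hypothetical parameter choices): white noise has
# autocovariance gamma_0 = sigma^2, gamma_k = 0 for k != 0, hence the
# constant spectral density f(lambda) = sigma^2 / (2*pi).
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.5
eps = rng.normal(0.0, sigma, size=100_000)

def acov(x, k):
    """Sample autocovariance at lag k for a mean-zero series."""
    n = len(x)
    return np.dot(x[: n - k], x[k:]) / n

print(acov(eps, 0))                 # ~ sigma^2 = 2.25
print(acov(eps, 1), acov(eps, 7))   # ~ 0 at nonzero lags
print(sigma**2 / (2 * np.pi))       # the flat spectral density level
```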

(b) Suppose that {X_t} satisfies

X_t = φ_1 X_{t−1} + · · · + φ_p X_{t−p} + ε_t,   (∗)

where φ_1, . . . , φ_p are real constants and {ε_t} is as in (a). Write down a condition which implies that there is a unique weakly stationary solution to (∗). Assume that this condition is satisfied. State the Filter Theorem, and show that {X_t} has spectral density function

f_X(λ) = σ² / (2π |D(e^{iλ})|²),

where D(z) is a polynomial that you should specify in terms of the φ_j.
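[Illustrative sketch: evaluating the AR(p) spectral density numerically. It takes D(z) = 1 − φ_1 z − · · · − φ_p z^p, the usual AR convention; deriving this polynomial is the point of the question, so treat that choice as an assumption. The example coefficients are arbitrary.]

```python
# Spectral density of an AR(p), assuming the usual convention
# D(z) = 1 - phi_1 z - ... - phi_p z^p (stated here as an assumption;
# the question asks you to derive it). Coefficients and evaluation
# points below are arbitrary.
import numpy as np

def ar_spectral_density(lam, phi, sigma2=1.0):
    """f_X(lambda) = sigma2 / (2*pi*|D(e^{i*lambda})|^2)."""
    z = np.exp(1j * np.asarray(lam))
    D = 1.0 - sum(p * z ** (j + 1) for j, p in enumerate(phi))
    return sigma2 / (2 * np.pi * np.abs(D) ** 2)

lam = np.linspace(0.0, np.pi, 5)
print(ar_spectral_density(lam, phi=[0.5, -0.3]))  # an illustrative AR(2)
```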

(c) Consider the process Y_t = c Y_{t−12} + ε_t, where c is a real constant and {ε_t} is as in (a). Find the range of values of c that satisfies the condition for the existence and uniqueness of a weakly stationary solution. Assume that c lies in this range. Find the spectral density function of {Y_t}. Use the Yule–Walker equations for the process {Y_t} to find its autocorrelation function.
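[Illustrative sketch: simulating Y_t = c Y_{t−12} + ε_t for a value of c with |c| < 1 (the familiar root-outside-the-unit-circle condition suggests this range, though deriving it is part of the question) and inspecting the sample autocorrelations, which should vanish except at lags that are multiples of 12.]

```python
# Simulate Y_t = c*Y_{t-12} + eps_t with |c| < 1 and inspect sample
# autocorrelations: they should be ~0 except at lags 12, 24, 36, ...,
# where they decay like c, c^2, c^3, ... (c and n are arbitrary choices).
import numpy as np

rng = np.random.default_rng(1)
c, n, burn = 0.6, 50_000, 1_000
eps = rng.normal(size=n + burn)
Y = np.zeros(n + burn)
for t in range(12, n + burn):
    Y[t] = c * Y[t - 12] + eps[t]
Y = Y[burn:]                        # discard burn-in

def acf(x, k):
    """Sample autocorrelation at lag k."""
    x = x - x.mean()
    return np.dot(x[: len(x) - k], x[k:]) / np.dot(x, x)

print([round(acf(Y, k), 3) for k in (1, 6, 12, 24, 36)])  # ~0, 0, .6, .36, .216
```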

[Results from lectures may be quoted and used without proof.]


3 Monte Carlo Inference

Describe the importance sampling estimator θ*_Q for estimating θ := E_P{φ(X)} from n independent draws from a proposal distribution Q. Under what conditions is it unbiased?

Assuming these conditions, show that var(θ*_Q) is minimised when q(x) = q_0(x) := p(x)|φ(x)|/k, where p, q are the densities of P, Q respectively, and k = E_P{|φ(X)|}; and that the minimised variance is n⁻¹(k² − θ²).
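[Illustrative sketch of the estimator described above: θ*_Q = n⁻¹ Σ_i φ(X_i) p(X_i)/q(X_i) with X_i drawn independently from Q. The toy target, test function, and proposal below (P = N(0,1), φ(x) = x², Q = N(0, 2²), so θ = 1) are arbitrary choices, not from the exam.]

```python
# Importance sampling estimator: average of phi(X_i) * p(X_i) / q(X_i)
# over X_i ~ Q. Toy setup (not from the exam): P = N(0,1), phi(x) = x^2,
# Q = N(0, 2^2), so the true value is theta = E_P(X^2) = 1.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 100_000
phi = lambda x: x ** 2

x = rng.normal(0.0, 2.0, size=n)            # n independent draws from Q
w = norm.pdf(x) / norm.pdf(x, scale=2.0)    # importance weights p(x)/q(x)
theta_hat = np.mean(phi(x) * w)
print(theta_hat)                            # ~ 1
```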

Using e.g. the Box–Muller method, we generate independent standard normal variables Y, Z. A new variable X is then constructed as

X := Y sign(λY − Z)

where λ ∈ R. Let P denote the distribution of X. Show that

E(X²) = 1,   (1)

E(|X|) = √(2/π),   (2)

E(X) = λ√(2/(π(1 + λ²))).   (3)

[Hint for (2): Use xφ(x) = −φ′(x), where φ here denotes the standard normal density, and integration by parts.] [Hint for (3): You may use the identity

(1 + λ²)X = λ|λY − Z| + (Y + λZ) sign(λY − Z),   (4)

and that (Y + λZ) and (λY − Z) are independent.]

For large λ, compare the performance of the optimal importance sampling estimator of E_P(X) with that of the sample mean of n independent draws from P.
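[Illustrative sketch: a Monte Carlo check of moments (1)–(3). The normals Y, Z are drawn directly from numpy's generator; the Box–Muller construction mentioned in the question would yield the same distribution. λ = 2 is an arbitrary choice.]

```python
# Monte Carlo check of (1)-(3): X = Y * sign(lambda*Y - Z) with Y, Z
# independent standard normals (lambda = 2 is an arbitrary choice).
import numpy as np

rng = np.random.default_rng(3)
lam, n = 2.0, 1_000_000
Y = rng.normal(size=n)
Z = rng.normal(size=n)
X = Y * np.sign(lam * Y - Z)

print(np.mean(X ** 2), 1.0)                                      # (1)
print(np.mean(np.abs(X)), np.sqrt(2 / np.pi))                    # (2)
print(np.mean(X), lam * np.sqrt(2 / (np.pi * (1 + lam ** 2))))   # (3)
```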


4 Monte Carlo Inference

Describe the general form of the Metropolis–Hastings (M–H) algorithm for Markov chain Monte Carlo simulation from a distribution P with density p(x). Show that if the initial distribution is P then the chain is reversible, and that P is a stationary distribution of the chain.

An attempt is made to simulate from P by rejection sampling using a proposal distribution Q, having density q(x). However, we cannot establish an upper bound on p(x)/q(x). Let u(x) := p(x)/{c q(x)}, where c > 0 is some constant. We repeatedly and independently generate Y from Q, and independent U ∼ Unif(0, 1), until U ≤ u(Y); then return X = Y. Show that the density of X is

f(x) ∝ min{p(x), c q(x)}.   (1)

In order to correct the discrepancy between f and p, the above procedure is used as the basis of a Markov chain Monte Carlo routine, as follows. If the current state is x, a new proposal value y is generated by rejection sampling, as above, from the density f in (1), independently of x. This is then accepted or rejected as the next state, according to the M–H procedure with target density p. Write down the acceptance probability α(x, y), distinguishing the cases in which each of u(x), u(y) is or is not less than 1. What happens when in fact p(x) ≤ c q(x) for all x?
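[Illustrative sketch of the scheme in this question, under an assumed toy target/proposal pair (P = N(0,1), Q = N(0, 0.8²), for which p/q is unbounded) and an arbitrary c > 0. The rejection loop returns draws from f(x) ∝ min{p(x), c q(x)}; since the proposal is drawn independently of the current state, the M–H acceptance probability reduces to min{1, w(y)/w(x)} with w(x) = p(x)/min{p(x), c q(x)} = max{1, u(x)}, which is what the code uses.]

```python
# Sketch under assumed choices: target P = N(0,1), proposal Q = N(0, 0.8^2)
# (so p/q is unbounded and plain rejection sampling fails), c = 1.2.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
p = lambda x: norm.pdf(x)
q = lambda x: norm.pdf(x, scale=0.8)
c = 1.2
u = lambda x: p(x) / (c * q(x))

def draw_f():
    """Repeat: Y ~ Q, U ~ Unif(0,1), until U <= u(Y).

    Gives draws with density proportional to min{p, c q}."""
    while True:
        y = rng.normal(scale=0.8)
        if rng.uniform() <= u(y):
            return y

w = lambda x: max(1.0, u(x))        # w(x) = p(x) / min{p(x), c q(x)}
x = draw_f()
chain = []
for _ in range(20_000):
    y = draw_f()                    # proposal drawn independently of x
    if rng.uniform() <= min(1.0, w(y) / w(x)):   # independence M-H step
        x = y
    chain.append(x)
print(np.mean(chain), np.var(chain))  # ~ 0 and ~ 1 for the N(0,1) target
```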

END OF PAPER
