
Notes on Complexity

Peter J. Cameron


Preface

These notes have been developed for the first part of the course MAS223, Complexity and Optimisation in Operations Research, at Queen Mary, University of London. The description for this part of the course reads:

The course begins with an outline of complexity theory, which gives a more precise meaning to the statement that some problems (such as minimal spanning tree) are easy to solve whereas others (such as travelling salesman) are hard.

The key objectives for this part of the course are:

- Decision problems; how to express a problem with an integer solution as a decision problem.

- The O and o notation; arranging functions in order of value for large argument.

- Input data representation and simple algorithms for arithmetic, matrix, and graph problems.

- Turing machines; ability to translate instructions into the action of the machine.

- Solutions of decision problems on deterministic and nondeterministic Turing machines. Definition of P and NP. Interpretation of NP in terms of certificates.

- Polynomial transformations, NP-completeness. Examples of NP-complete problems.

- Randomised and approximation algorithms. The class RP and its relation to P and NP.

The notes provide a self-contained introduction to decision, optimisation and counting problems, Turing machines, the definitions of complexity classes including P and NP, and the relations between them. All of the above key objectives are covered here. The notes also include a number of worked exercises, many of which were set as homework problems in the course. The textbook for this part of the course was

M. R. Garey and D. S. Johnson, Computers and Intractability: A Guide to the Theory of NP-Completeness, Freeman, 1979.


Contents

1 Introduction 1
   1.1 Minimal connector, travelling salesman 1
   1.2 Graphs, trees and circuits 6
   1.3 Proofs 11

2 Problems, algorithms, computations 17
   2.1 Decision, counting, optimisation 18
   2.2 Input and output 19
   2.3 Orders of magnitude 21
   2.4 Examples 22

3 Complexity: P and NP
   3.1 Turing machines 37
   3.2 P and NP
   3.3 Polynomial transformations 46
   3.4 Cook's Theorem; NP-completeness 47
   3.5 Examples 52

4 Other complexity classes 57
   4.1 Harder problems 57
   4.2 Counting problems 59
   4.3 Parallel algorithms 60
   4.4 Randomised algorithms 60
   4.5 Approximation algorithms 62
   4.6 Quantum computation 64


2 CHAPTER 1. INTRODUCTION

[Figure 1.1 (a map of Great Britain) marks the twelve towns: Aberdeen, Birmingham, Cardiff, Dover, Exeter, Fort William, Glasgow, Harwich, Inverness, John o'Groats, Kyle of Lochalsh, London.]

Figure 1.1: Map of Great Britain

total distance travelled to be as small as possible.

Although these two problems look quite similar, we will see that the minimal connector problem is “easy”, but the travelling salesman problem is “hard”. There is no difficulty in principle in solving the travelling salesman problem – we could simply look at all possible cyclic tours through the towns – but the number of possibilities to check grows very rapidly, and for even a moderate number of towns it is not practicable to check all possibilities.

First we attack the minimal connector problem in a simple-minded way. We first choose the shortest possible link between any two towns. We continue doing this until all the towns are connected, except that, if two towns already have an indirect connection, we do not need to link them again. Thus, for example, if we have already chosen edges AB, BC and CD, there is no need to include AD.

1.1. MINIMAL CONNECTOR, TRAVELLING SALESMAN 3

         B     C     D     E     F     G     H     I     J     K     L
    A   676   813   947   916   240   233   861   169   373   304   832
    B         166   312   253   631   470   269   737   924   758   188
    C               383   195   781   620   396   884  1094   908   253
    D                     399   959   786   201  1001  1201  1080   114
    E                           901   723   449   995  1197  1011   291
    F                                 163   874   106   314   127   821
    G                                       695   267   475   288   639
    H                                             916  1116   983   122
    I                                                   208   135   885
    J                                                         304  1067
    K                                                               943

Table 1.1: Distances in km

More formally, the procedure (known as the greedy algorithm for minimal connector) works as follows:

- Arrange the pairs of towns in a list L in order of increasing distance. Take an empty list T.

- Repeat the following step until the edges in T connect all the towns:

  - Take the first pair in the list L, say {t1, t2}.
  - If the two towns t1 and t2 in this pair are not connected by a sequence of edges in T, add the edge {t1, t2} to T.
  - Delete the pair {t1, t2} from L.

- Return the list T.

Let’s work this algorithm on our example. The list L begins 106 (FI), 114 (DL), 122 (HL), 127 (FK), 135 (IK), 163 (FG), 166 (BC), 169 (AI), 188 (BL), 195 (CE), 201 (DH), 208 (IJ), 233 (AG), 240 (AF), 253 (BE), 253 (CL), 267 (GI), 269 (BH), 291 (EL), 304 (AK), 304 (JK), 312 (BD), 314 (FJ), 373 (AJ), 383 (CD), 396 (CH), 399 (DE), 449 (EH), 470 (BG), 475 (JG),... So we choose first the edges FI, DL, HL, FK. We do not choose IK, since I and K are already connected via F. Continuing, we choose FG, BC, AI, BL, CE. We do not choose DH. We choose IJ. Then every additional edge is skipped over until we reach BG, at which point we have connected all the towns and the algorithm terminates.
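The greedy algorithm is short enough to write out in full. The sketch below is a toy Python implementation; the four towns and distances are a made-up illustrative instance, not the table above.

```python
def greedy_minimal_connector(towns, dist):
    """Greedy algorithm for minimal connector: scan pairs of towns in
    order of increasing distance, adding an edge only when its two towns
    are not already connected by the edges chosen so far."""
    L = sorted(dist, key=dist.get)   # the sorted list L of the text
    rep = {t: t for t in towns}      # representative of each town's component

    def find(t):                     # follow pointers to the representative
        while rep[t] != t:
            t = rep[t]
        return t

    T = []
    for t1, t2 in L:
        r1, r2 = find(t1), find(t2)
        if r1 != r2:                 # t1 and t2 not yet connected by T
            T.append((t1, t2))
            rep[r1] = r2             # merge the two components
    return T

# Toy instance (hypothetical distances):
dist = {('A', 'B'): 1, ('B', 'C'): 2, ('A', 'C'): 3,
        ('C', 'D'): 4, ('B', 'D'): 5, ('A', 'D'): 6}
print(greedy_minimal_connector('ABCD', dist))  # [('A', 'B'), ('B', 'C'), ('C', 'D')]
```

Note that the edge AC is skipped, exactly as AD was skipped in the discussion above: by the time it is examined, its endpoints are already connected through B.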


Let us try the same technique for the travelling salesman. Adapting the above, we have the greedy algorithm for travelling salesman. We assume that the number n of towns is greater than 2, else there is not much choice about the travelling salesman’s itinerary.

- Let L be the list of all pairs of towns (sorted by increasing distance), and T the empty list.

- Take the first pair in L; add it to T. At this and all subsequent stages except the last, the edges in T will form a path, so we can talk about the ends of the path.

- Repeat the following step until the edges in T connect all the towns:

  - Take the first pair in the list L having the property that one of its towns is an end of the path T and the other is not on the path, say {t1, t2}. Add {t1, t2} to T.

- Add to T the edge joining its two endpoints, creating a cycle. Return the list T.

Although this looks superficially similar to Prim’s algorithm, and it does produce an itinerary for the travelling salesman, it does not produce a tour of smallest length. In our example, the edges are chosen in the order

FI, FK, AI, AG, KJ, BG, BC, CE, EL, DL, DH, HJ,

giving the tour AGBCELDHJKFIA of length 3492. However, the tour AHDLECBGFKJIA has length only 3269. (In fact this is the shortest possible tour, as can be confirmed by checking all the possibilities.)
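The path-building heuristic can be sketched in the same style as before. The instance below is again a made-up toy example; it only illustrates the mechanism, not the suboptimality seen above.

```python
def greedy_salesman(towns, dist):
    """Greedy heuristic of the text: start with the shortest edge, then
    repeatedly attach to an end of the current path the nearest town not
    yet on it. The edge closing the cycle is left implicit."""
    def d(a, b):                          # symmetric distance lookup
        return dist[(a, b)] if (a, b) in dist else dist[(b, a)]

    path = list(min(dist, key=dist.get))  # the shortest pair starts the path
    while len(path) < len(towns):
        end, town = min(
            ((e, t) for e in (path[0], path[-1])
             for t in towns if t not in path),
            key=lambda pair: d(*pair))
        if end == path[0]:
            path.insert(0, town)          # extend at the front of the path
        else:
            path.append(town)             # extend at the back
    return path                           # closing edge: path[-1] back to path[0]

# Toy instance (hypothetical distances):
dist = {('A', 'B'): 1, ('C', 'D'): 2, ('A', 'C'): 3,
        ('B', 'D'): 4, ('A', 'D'): 5, ('B', 'C'): 6}
tour = greedy_salesman('ABCD', dist)
print(tour)  # ['D', 'C', 'A', 'B']
```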

Faced with the difficulty of the problem, we must be prepared to compromise. There are various kinds of compromise that we could make in an optimisation problem: we could be content with an efficient algorithm that does one of the following:

- it guarantees to find a solution which is not too far from the optimal;

- it makes some random choices, and guarantees to find the optimal with not-too-small probability;

- it makes some random choices, and guarantees to find a solution which is not too far from the optimal with not-too-small probability.


As an example of the first compromise, we give an algorithm which, under an assumption which is physically reasonable, finds a travelling salesman tour which is guaranteed to be "not too bad". This is the twice-round-the-tree algorithm for the travelling salesman:

- Find a minimal connector (e.g. using the greedy algorithm or Prim's algorithm).

- Find a tour visiting all the towns and returning to its starting point, using each edge of the tree twice. (We will discuss later how this is done.)

- Take this tour, and modify it as follows: at each stage, go directly to the next town on the tour which has not yet been visited. Return the result.

In our example, the tour in the second stage of the algorithm can be chosen to be AIJIFKFGBCECBLDLHLBGFIA, and the final travelling salesman's tour is then AIJFKGBCELDHA, with length 3404, not too far from optimal!

A list of distances between pairs of towns is said to satisfy the triangle inequality if, for any three towns x, y, z, we have

    d(x, y) + d(y, z) >= d(x, z);

in other words, going directly from x to z is no further than detouring via y. We will show later that, if the distances satisfy the triangle inequality, then the length of the tour found by the algorithm is less than twice the minimum possible length.

Finally, we discuss how to construct the tour in the second stage of the algorithm, in the case where the towns are represented in the plane (for example, on a map). Take the minimal connector, and replace each edge by a pair of edges. Now, if we enter a town by an edge, we leave it by one of the edges immediately adjacent in the anticlockwise sense. Since the minimal connector is a tree, after exploring the branch along this edge, we return to the town along the other edge of this pair, when we again move to the next edge in the anticlockwise sense. So, when we leave the town along the pair of edges by which we originally entered, all edges through that town have been used twice.
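The shortcut step amounts to listing the towns of the minimal connector in depth-first order: walking twice round the tree and always skipping towns already visited produces exactly a depth-first traversal. A sketch, on a hypothetical five-town minimal connector:

```python
from collections import defaultdict

def twice_round_the_tree(tree_edges, start):
    """Twice-round-the-tree tour with shortcuts: traverse the minimal
    connector depth-first, going directly to the next unvisited town,
    and finally return to the start."""
    adj = defaultdict(list)
    for a, b in tree_edges:
        adj[a].append(b)
        adj[b].append(a)

    tour, seen = [], set()

    def visit(v):
        seen.add(v)
        tour.append(v)
        for w in adj[v]:
            if w not in seen:
                visit(w)

    visit(start)
    return tour + [start]   # return to the starting town

# Hypothetical minimal connector on five towns:
tree = [('A', 'B'), ('A', 'C'), ('C', 'D'), ('C', 'E')]
print(twice_round_the_tree(tree, 'A'))  # ['A', 'B', 'C', 'D', 'E', 'A']
```

Under the triangle inequality, each shortcut (e.g. jumping from B directly to C rather than returning through A) can only shorten the doubled-tree walk, which is the key to the factor-of-two guarantee proved in Section 1.3.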

1.2 Graphs, trees and circuits

In this section we introduce the notation and terminology of graph theory, in order to state the problems more precisely. A graph consists of a set V of vertices and a set E of edges, each edge being incident with a pair of vertices. We denote the graph G with vertex set V and edge


Proposition 1.2.1 Let G = (V, E) be a graph. Define a relation ~ on V by the rule that v ~ v' if there is a path from v to v'. Then ~ is an equivalence relation.

Proof The relation ~ is reflexive (since v is joined to itself by the path with one vertex and no edges), and symmetric (since, if we have a path from v to w, then reversing it gives a path from w to v). We have to prove that it is transitive.

So suppose that v ~ v' and v' ~ v'', so that there is a path P = (v, X, v') from v to v', and a path P' = (v', X', v'') from v' to v'', where X and X' are sequences of edges and vertices. Then (v, X, v', X', v'') is a walk from v to v''. By the remark before the proof, there is a path from v to v''; so v ~ v''. This completes the proof.

The equivalence classes of the relation in the preceding proposition are called the connected components of the graph G ; and we say that G is connected if it has just one connected component. In other words, a graph is connected if there is a path between any two of its vertices. A forest is a graph with no cycles; a tree is a connected forest. (So the con- nected components of forests are trees.) Note that a forest is a simple graph, since loops and multiple edges give rise to circuits of length 1 and 2 respectively.
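The connected components just defined can be computed directly by breadth-first search; a minimal Python sketch:

```python
from collections import deque

def connected_components(vertices, edges):
    """Connected components of a graph: the equivalence classes of the
    relation 'there is a path from v to w', found by breadth-first search."""
    adj = {v: [] for v in vertices}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)

    seen, components = set(), []
    for v in vertices:
        if v in seen:
            continue
        queue, comp = deque([v]), []
        seen.add(v)
        while queue:
            u = queue.popleft()
            comp.append(u)
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
        components.append(comp)
    return components

# A graph with two components:
print(connected_components([1, 2, 3, 4, 5], [(1, 2), (2, 3), (4, 5)]))
# [[1, 2, 3], [4, 5]]
```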

Proposition 1.2.2 Suppose that a forest has n vertices, m edges, and r connected components. Then n = m + r.

Proof A forest has the property that, if one edge is removed, the number of connected components increases by 1 (see below). Using this fact, the proposition is easily proved by induction on m, the number of edges:

- If there are no edges, then each connected component is a single vertex, so r = n, m = 0, and the induction starts.

- Suppose that the proposition is true for forests with m - 1 edges, and let G be a forest with m edges, n vertices, and r components. Removing an edge gives a graph with m - 1 edges and r + 1 components. By induction,

      n = (m - 1) + (r + 1) = m + r,

  and we are done.

Now let e be an edge of a forest, and C the connected component containing e. Let v and w be the vertices incident with e. Now each vertex of C is joined to either v or w by a path not containing e. (Suppose that x is joined to v by a path including e. Then e must be the last edge in the path, else v would occur twice. Deleting v and e gives a path from x to w not using e.) So when e is deleted, C splits into at most two components. But it must split: for if v and w were joined by a path not containing e, then adding e would produce a circuit, contradicting the assumption that the graph is a forest. The other components are unaffected by the deletion of e. So the number of components increases by one, as claimed.
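The proposition is easy to spot-check numerically. The forest below is an arbitrary made-up example with n = 7 vertices, m = 4 edges and r = 3 components; the component count uses a simple union-find.

```python
def num_components(vertices, edges):
    """Count connected components with a simple union-find."""
    parent = {v: v for v in vertices}

    def find(v):
        while parent[v] != v:
            v = parent[v]
        return v

    for a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
    return sum(1 for v in vertices if find(v) == v)

# A forest with n = 7 vertices and m = 4 edges:
vertices = list(range(7))
edges = [(0, 1), (1, 2), (3, 4), (5, 6)]
r = num_components(vertices, edges)
print(len(vertices) == len(edges) + r)  # n = m + r, as the proposition states
```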

1.2. GRAPHS, TREES AND CIRCUITS 9

Proposition 1.2.3 Let G = (V, E) be a connected graph. Then there is a subset S of E such that (V, S) is a tree.

Proof We give an algorithm for finding such a tree.

- Initialise by setting S = E.

- While the graph (V, S) contains a circuit, delete from S any edge which lies in at least one circuit.

- Return the set S.

The algorithm clearly produces a graph containing no circuit. To show that it is connected, we observe that the original graph is connected, and prove that the edge-deletion step preserves connectedness. Let the deleted edge e be incident with v and w. Since e lies in a circuit, there is a path P from v to w not using e. So, if a path from x to y uses e, we can replace e by P to find a walk from x to y not using e, and then shorten this walk to a path as usual.
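The edge-deletion algorithm of this proof can be sketched directly. The test "does e lie in a circuit?" becomes "do the endpoints of e remain connected when e is removed?", checked here by a deliberately naive depth-first search; the example graph is a made-up 4-cycle with a chord.

```python
def spanning_tree_by_deletion(vertices, edges):
    """The proof's algorithm: while some edge lies in a circuit, delete it.
    An edge lies in a circuit exactly when its endpoints remain connected
    after it is removed."""
    def connected(a, b, edge_set):
        # depth-first search from a towards b, using only edge_set
        adj = {v: [] for v in vertices}
        for x, y in edge_set:
            adj[x].append(y)
            adj[y].append(x)
        stack, seen = [a], {a}
        while stack:
            u = stack.pop()
            if u == b:
                return True
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        return False

    S = list(edges)
    changed = True
    while changed:
        changed = False
        for e in S:
            rest = [f for f in S if f != e]
            if connected(e[0], e[1], rest):   # e lies in a circuit
                S = rest
                changed = True
                break
    return S

# A 4-cycle with a chord: two edges get deleted, leaving a spanning tree.
edges = [('A', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'A'), ('A', 'C')]
print(spanning_tree_by_deletion('ABCD', edges))
```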

A tree with the properties given in this proposition is called a spanning tree of the graph G. If G is a weighted graph, then a spanning tree of G with smallest possible total weight is a minimal connector.

If a graph consists of a circuit, then removing any edge gives rise to a spanning tree.

Let G = (V, E) be a graph. A Hamiltonian circuit in G is a circuit containing all the vertices of V (each exactly once). Clearly a graph containing a Hamiltonian circuit is connected. The converse is false, and there is no simple test known for recognising Hamiltonian graphs (those containing Hamiltonian circuits). As we will see, this is a hard problem.

Exercise 1.2.1 Prove that Kn is Hamiltonian if and only if n >= 3.

Solution Any circuit passing through all vertices in any order is Hamilto- nian, since each pair of vertices is joined by an edge.

Exercise 1.2.2 Is the graph shown in Figure 1.3 (the so-called Petersen graph ) Hamiltonian?

1.3. PROOFS 11

A minimal connector for Kn has weight n - 1 and is a spanning tree for G. G is Hamiltonian if and only if a minimal travelling salesman tour for Kn has length n.

If G is itself a weighted graph, we can use the same trick, choosing a very large weight for non-edges, to reduce questions about G to questions about Kn.

Exercise 1.2.3 Let G be a simple graph. Give each edge {v, w} of Kn the weight 1 if it is an edge of G, and 2 if not.

(a) Prove that the weight of a minimal connector for Kn is n + r - 2, where r is the number of connected components of G.

(b) Prove that the weight of a minimal travelling salesman tour for Kn is n + s, where s is the smallest number of edges whose addition to G gives a Hamiltonian graph.

Solution In this question we have two things to do in each part: construct a connector or travelling salesman tour of the specified weight, and show that no smaller weight is possible.

(a) Choose a spanning tree in each connected component of G. In each component we have one fewer edge than vertices, so altogether we will have n - r edges, with total weight n - r. Now enlarge this to a spanning tree for Kn. We have to add r - 1 more edges, each of weight 2 (since they do not belong to G), giving a connector of weight n - r + 2(r - 1) = n + r - 2.

Now take any minimal connector for Kn. Since it is a tree, its edges which belong to G will form a forest, with at least as many components as G; say s components, where s >= r. Thus we use n - s edges of G. The remaining s - 1 edges are not in G, and have weight 2(s - 1). The total weight is thus n - s + 2(s - 1) = n + s - 2 >= n + r - 2. So the weight of a minimal connector is indeed n + r - 2.

(b) Suppose that adding s edges to G gives a Hamiltonian graph. Then a travelling salesman tour can be constructed using at most s of these edges and at least n - s edges of G; its total weight is at most (n - s) + 2s = n + s.

On the other hand, a travelling salesman tour of weight n + t would use at most t edges not in G, and clearly their addition to G would form a Hamiltonian graph.

1.3 Proofs

In this section we give the proofs of two results from the first section: the greedy algorithm always finds a minimal connector; and, if the triangle inequality holds,


then the twice-round-the-tree algorithm always finds a travelling salesman tour whose length is less than twice the minimum.

Theorem 1.3.1 The greedy algorithm, applied to any weighted complete graph, always finds a minimal connector.

Proof The greedy algorithm finds a subgraph which is connected (this is the termination condition) and has no cycles (this is the condition for an edge to be added), that is, a weighted spanning tree, with edge set S (say). We have to show that S is a spanning tree of smallest weight.

Let e1, e2, ..., en-1 be the edges in S, in the order in which the greedy algorithm chooses them. Note that

    d(e1) <= d(e2) <= ... <= d(en-1),

since if d(ej) < d(ei) for j > i, then at the ith stage, ej would join points in different components, and should have been chosen in preference to ei.

different components, and should have been chosen in preference to ei. Suppose, for a contradiction, that there is a spanning tree of smaller weight,

with edges f 1  fn  1 , ordered so that

d f 1 -,  , d fn  1 

Thus,

n  1

i 4 1

d fi -

n  1

i 4 1

d ei 

Choose k as small as possible so that

k

i 4 1

d fi -

k

i 4 1

d ei 

Note that k 3 1, since the greedy algorithm chooses first an edge of smallest

weight. Then we have

k  1

i 4 1

d fi -

k  1

i 4 1

d ei ;

hence

d f 1 5,6...0, d fk -1 d ek 

Now, at stage k , the greedy algorithm chooses ek rather than any of the edges

f 1  fk of strictly smaller weight; so all of these edges must fail the condition

that they join points in different components of V  S , where S  e 1  ek  1 .

It follows that the connected components of V  S  , where S   f 1  fk , are

subsets of those of V  S ; so V  S  has at least as many components as V  S.

But this is a contradiction, since both V  S and V  S  are forests, and their

numbers of components are n  7 k  1 and n  k respectively; it is false that

n  k n  8 k  1.


Solution Let M be the weight of the minimal connector. Then the algorithm produces a travelling salesman tour of weight at most 2M. Now removing any edge from a travelling salesman tour gives a spanning tree, whose weight is thus not smaller than M. Suppose we remove the edge of largest weight x in the tour. Then at least n edges of the complete graph have weight at most x, so x >= t, where t is the nth smallest edge weight. Thus

    M <= L - x <= L - t.

Similarly, if we pick a vertex v and remove the edge of the travelling salesman tour containing v and of larger weight x, then x is at least the second smallest weight of an edge through v, and the argument proceeds as before.

This chapter ends with something a bit different. We often use the principle that, if one of N possibilities can be determined uniquely as a result of n binary choices, then N <= 2^n. (This is sometimes called the "Twenty Questions" principle, after the panel game in which the panellists are allowed to ask twenty questions with "yes" or "no" answers and have to identify some object. Since 2^20 is a little greater than a million, in theory one of a million objects can be identified.) The following exercise shows that there is a ternary version as well, where each question is allowed to have one of three possible answers.

Exercise 1.3.3 (a) I have twelve coins, which are identical except that one of the coins is either lighter or heavier than the others. I have a balance which can compare the weight of two sets of coins. Show that, in three weighings, I can determine which coin is different, and whether it is lighter or heavier than the others.

(b) Each weighing can have three results (left-hand side heavier, right-hand side heavier, or exact balance). So in three weighings I can distinguish 3^3 = 27 possibilities. If I had 13 coins C1, ..., C13, I might expect to be able to determine which of the possible cases "Ci light", "Ci heavy" (for 1 <= i <= 13) or "all coins the same" holds, since there are 2 . 13 + 1 = 27 possibilities. (This argument shows that we certainly can't deal with more than 13 coins in just three weighings.) Is there a scheme for determining which coin out of 13 is different in only three weighings?

Solution (a) The following three weighings can be checked to work:

    C5, C6, C8, C9 against C7, C10, C11, C12;
    C2, C3, C4, C10 against C8, C9, C11, C12;
    C1, C3, C6, C7 against C4, C9, C10, C12.
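The claim that these weighings work can be confirmed by brute force: each of the 24 cases (any of the 12 coins, light or heavy) must produce a different triple of balance results. A minimal sketch, assuming the right-hand pan of the first weighing is C7, C10, C11, C12 (the composition of that pan is garbled in the printed text):

```python
# The three weighings as (left pan, right pan) lists of coin numbers.
# C7 in the first right-hand pan is an assumption of this sketch.
weighings = [
    ([5, 6, 8, 9], [7, 10, 11, 12]),
    ([2, 3, 4, 10], [8, 9, 11, 12]),
    ([1, 3, 6, 7], [4, 9, 10, 12]),
]

def outcome(odd_coin, heavy):
    """Triple of balance results: +1 left pan heavier, -1 right pan
    heavier, 0 exact balance."""
    delta = 1 if heavy else -1
    return tuple(delta * ((odd_coin in left) - (odd_coin in right))
                 for left, right in weighings)

# Every one of the 24 cases must give a distinct outcome triple.
outcomes = {outcome(c, h) for c in range(1, 13) for h in (False, True)}
print(len(outcomes))  # 24: the weighings identify the coin and its bias
```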


(b) If we have 13 coins, then there are 27 possibilities (each coin could be either light or heavy, or they might all be the same) to be determined by three

weighings, each with three possible outcomes. Since 3^3 = 27, this would only be possible if the first weighing reduced the number of possibilities to 9, the second weighing to 3, and the third weighing to just one. But consider the first weighing, and suppose that we put m coins in each pan. If the left-hand pan is heavier, then we have 2m possibilities (a coin in the left-hand pan may be heavy, or a coin in the

possible if the first weighing reduced the number of possibilities to 9, the second weighing to 3, and the third weighing to just one. But consider the first weighing, and suppose that we put m coins in each pan. If the left-hand pan is heavier, then we have 2 m possibilities (a coin in the left-hand pan may be heavy, or a coin in the

right-hand pan may be light). There is no integer m satisfying 2m = 9, however,

so the weighing is not possible.