
Antonios Antoniadis and Marvin Künnemann, Winter 2018/19

Exercises for Randomized and Approximation Algorithms

www.mpi-inf.mpg.de/departments/algorithms-complexity/teaching/winter18/rand-apx-algo/

Exercise Sheet 3: Local Search Algorithms

To be handed in by November 6th, 2018 via e-mail to André Nusser (CC to Antonios Antoniadis and Marvin Künnemann)

Exercise 1 (10 Points)

Prove the following lemma (Lemma 2.8 from the book).

Lemma. For any input to the problem of minimizing the makespan on identical parallel machines for which the processing requirement for each job is more than one-third the optimal makespan, the longest processing time rule computes an optimal schedule.

Recall that the longest processing time (LPT) rule performs list scheduling, but instead of assigning the jobs in an arbitrary order it does so by always assigning the job with the longest processing requirement amongst the remaining jobs.
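To make the rule concrete, here is a minimal Python sketch of the LPT rule; the function name and the example instance are ours for illustration and are not part of the exercise.

import heapq

def lpt_schedule(processing_times, num_machines):
    """Assign each job, longest first, to a currently least-loaded machine."""
    # The LPT order: job indices sorted by non-increasing processing requirement.
    order = sorted(range(len(processing_times)),
                   key=lambda j: processing_times[j], reverse=True)
    loads = [(0.0, m) for m in range(num_machines)]  # (current load, machine id)
    heapq.heapify(loads)
    assignment = [[] for _ in range(num_machines)]
    for j in order:
        load, m = heapq.heappop(loads)               # a least-loaded machine
        assignment[m].append(j)
        heapq.heappush(loads, (load + processing_times[j], m))
    makespan = max(load for load, _ in loads)
    return makespan, assignment

# Example instance on three machines: the optimal makespan is 6.5, every job is
# longer than one third of it, and LPT indeed returns a schedule of makespan 6.5.
print(lpt_schedule([4, 4, 3, 3, 2.5, 2.5], 3))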

Exercise 2 (10 Points)

Consider the problem of scheduling jobs on identical machines, as seen in the lecture, but where the jobs are subject to precedence constraints. We say that i ≺ j if in any feasible schedule, job i must be completely processed before the processing of job j begins. We slightly modify the list scheduling algorithm so that instead of picking an arbitrary job to assign, it picks an arbitrary available job, where a job j is said to be available if all jobs i such that i ≺ j have already been completely processed.
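For illustration only, a minimal Python sketch of this modified list scheduling rule, assuming the precedence constraints are given as pairs (i, j) with i ≺ j and form a DAG; all names are ours and not part of the exercise.

import heapq

def list_schedule_with_precedences(processing_times, precedences, num_machines):
    """Whenever a machine is idle, start an arbitrary available job on it."""
    n = len(processing_times)
    preds_left = [0] * n                     # unfinished predecessors per job
    successors = [[] for _ in range(n)]
    for i, j in precedences:                 # i must finish before j may start
        preds_left[j] += 1
        successors[i].append(j)

    available = [j for j in range(n) if preds_left[j] == 0]  # arbitrary order
    idle_machines = num_machines
    running = []                             # min-heap of (finish time, job)
    now = 0.0
    started = 0
    while started < n or running:
        # Greedily start available jobs as long as some machine is idle.
        while available and idle_machines > 0:
            j = available.pop()
            heapq.heappush(running, (now + processing_times[j], j))
            idle_machines -= 1
            started += 1
        # Advance time to the next completion and release its successors.
        now, j = heapq.heappop(running)
        idle_machines += 1
        for k in successors[j]:
            preds_left[k] -= 1
            if preds_left[k] == 0:
                available.append(k)
    return now                               # makespan of the produced schedule

# Example: a chain 0 ≺ 1 ≺ 2 plus an independent job 3, on two machines.
print(list_schedule_with_precedences([2, 1, 3, 4], [(0, 1), (1, 2)], 2))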

Prove that this algorithm is a 2-approximation algorithm for the problem of scheduling jobs on identical machines with precedence constraints.

Exercise 3 (20 Points)

Consider the following local search algorithm for the Max Cut problem.


Algorithm: Local Search Max Cut

· Start with some arbitrary partition/cut (A, B) of the vertex set.

· Pick an arbitrary vertex v and move it from its current side of the partition to the other side if this increases the value of the cut.

· If no such vertex v exists, return the current (last) partition (a code sketch follows below).
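A minimal Python sketch of this local search, assuming the positive edge weights are stored in a symmetric adjacency dictionary; the function name and the small example are ours for illustration.

def local_search_max_cut(adj):
    """adj[u][v] is the positive weight of edge {u, v} (stored symmetrically).

    Returns a partition (A, B) such that no single-vertex move increases the cut.
    """
    in_A = {v: True for v in adj}   # arbitrary initial partition: everything in A
    improved = True
    while improved:
        improved = False
        for v in adj:
            same = sum(w for u, w in adj[v].items() if in_A[u] == in_A[v])
            cut = sum(w for u, w in adj[v].items() if in_A[u] != in_A[v])
            # Moving v changes the cut value by (same - cut), so the move
            # helps exactly when 'same' exceeds 'cut'.
            if same > cut:
                in_A[v] = not in_A[v]
                improved = True
    A = {v for v in adj if in_A[v]}
    B = set(adj) - A
    return A, B

# Example: a weighted 4-cycle 1-2-3-4-1; the algorithm ends with all edges cut.
adj = {
    1: {2: 3, 4: 1},
    2: {1: 3, 3: 2},
    3: {2: 2, 4: 4},
    4: {3: 4, 1: 1},
}
print(local_search_max_cut(adj))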

(i) Prove that the cut output by the algorithm has weight at least 1/2 times the weight of an optimal cut. You may assume that the edge weights are positive integers. (8 Points)

(ii) Argue about the runtime of the algorithm. Why is it not polynomial in the input size? Would it be polynomial time if all edges had unit weight? (4 Points)

(iii) Can you adapt the algorithm so that it runs in polynomial time (at perhaps a slight loss in the approximation guarantee)?

Hint: Only move a vertex from its current side if the increase in the cut value is “big”.

Can you define “big” so that the algorithm runs in polynomial time and achieves an approximation ratio of 1/2 − ε for any given 0 < ε < 1/2?

(8 Points)
