Chapter 11: Approximation Algorithms
Slides by Kevin Wayne. Copyright 2005 Pearson-Addison Wesley. All rights reserved.

Approximation Algorithms
Q. Suppose I need to solve an NP-hard problem. What should I do?
A. Theory says you're unlikely to find a poly-time algorithm. You must sacrifice one of three desired features:
- Solve the problem to optimality.
- Solve the problem in poly-time.
- Solve arbitrary instances of the problem.
ρ-approximation algorithm.
- Guaranteed to run in poly-time.
- Guaranteed to solve arbitrary instances of the problem.
- Guaranteed to find a solution within a ratio ρ of the true optimum.
Challenge. Need to prove that a solution's value is close to the optimum, without even knowing what the optimum value is!
11.1 Load Balancing

Load Balancing
Input. m identical machines; n jobs, where job j has processing time t_j.
- Job j must run contiguously on one machine.
- A machine can process at most one job at a time.
Def. Let J(i) be the subset of jobs assigned to machine i. The load of machine i is L_i = Σ_{j ∈ J(i)} t_j.
Def. The makespan is the maximum load on any machine: L = max_i L_i.
Load balancing. Assign each job to a machine so as to minimize the makespan.

Load Balancing: List Scheduling
List-scheduling algorithm.
- Consider the n jobs in some fixed order.
- Assign job j to the machine whose load is smallest so far.
Implementation. O(n log m), e.g., with a priority queue of machine loads.
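A minimal Python sketch of the list-scheduling rule above, using a min-heap keyed on current machine load (giving the O(n log m) bound). The function and variable names are illustrative, not from the slides.

```python
# List scheduling: each job goes to the currently least-loaded machine.
import heapq

def list_schedule(times, m):
    """Assign each job (in the given order) to the least-loaded machine.
    Returns (makespan, assignment), where assignment[j] is the machine of job j."""
    heap = [(0, i) for i in range(m)]      # (load, machine id)
    heapq.heapify(heap)
    assignment = []
    for t in times:
        load, i = heapq.heappop(heap)      # machine with smallest load so far
        assignment.append(i)
        heapq.heappush(heap, (load + t, i))
    makespan = max(load for load, _ in heap)
    return makespan, assignment

if __name__ == "__main__":
    # The tight instance from the slides with m = 3:
    # m(m-1) unit jobs followed by one job of length m.
    m = 3
    jobs = [1] * (m * (m - 1)) + [m]
    print(list_schedule(jobs, m))          # makespan 2m - 1 = 5; optimum is m = 3
```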
Load Balancing: List Scheduling Analysis
Theorem. [Graham 1966] The greedy (list-scheduling) algorithm is a 2-approximation.
- First worst-case analysis of an approximation algorithm.
- Need to compare the resulting solution with the optimal makespan L*.
Lemma 1. The optimal makespan L* ≥ max_j t_j.
Pf. Some machine must process the most time-consuming job.
Lemma 2. The optimal makespan L* ≥ (1/m) Σ_j t_j.
Pf. The total processing time is Σ_j t_j, and one of the m machines must do at least a 1/m fraction of the total work.
Pf (of theorem). Consider the load L_i of the bottleneck machine i.
- Let j be the last job scheduled on machine i.
- When job j was assigned to machine i, machine i had the smallest load, so its load before the assignment satisfies L_i − t_j ≤ L_k for all 1 ≤ k ≤ m.
- Sum these inequalities over all k and divide by m: L_i − t_j ≤ (1/m) Σ_k L_k = (1/m) Σ_j t_j ≤ L* (by Lemma 2).
- Now L_i = (L_i − t_j) + t_j ≤ L* + L* = 2 L*, using Lemma 1 to bound t_j ≤ L*.
Load Balancing: List Scheduling Analysis
Q. Is our analysis tight?
A. Essentially yes. Ex: m machines, m(m − 1) jobs of length 1, and one job of length m.
[Figure: m = 10; list scheduling gives makespan 2m − 1 = 19, with machines 2–10 idle at the end, while the optimal makespan is m = 10.]

Load Balancing: LPT Rule
Longest processing time (LPT). Sort the n jobs in descending order of processing time, and then run the list-scheduling algorithm.
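As a small usage example, LPT is just a thin wrapper over the hypothetical list_schedule sketch above: sort the jobs in descending order first.

```python
# LPT rule: list scheduling applied to jobs sorted by decreasing processing time.
def lpt_schedule(times, m):
    return list_schedule(sorted(times, reverse=True), m)
```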
Load Balancing: LPT Rule
Observation. If there are at most m jobs, then list scheduling is optimal.
Pf. Each job is put on its own machine.
Lemma 3. If there are more than m jobs, then L* ≥ 2 t_{m+1}.
Pf.
- Consider the first m + 1 jobs t_1, ..., t_{m+1}.
- Since the t_i's are in descending order, each takes at least t_{m+1} time.
- There are m + 1 jobs and m machines, so by the pigeonhole principle at least one machine gets two of them.
Theorem. The LPT rule is a 3/2-approximation algorithm.
Pf. Same basic approach as for list scheduling. If the bottleneck machine has a single job, its load is at most L*. Otherwise its last job j is not among the first m jobs, so t_j ≤ t_{m+1} ≤ ½ L* by Lemma 3 (by the observation, we may assume there are more than m jobs), and L_i = (L_i − t_j) + t_j ≤ L* + ½ L* = (3/2) L*.

Load Balancing: LPT Rule
Q. Is our 3/2 analysis tight?
A. No.
Theorem. [Graham 1969] The LPT rule is a 4/3-approximation.
Pf. More sophisticated analysis of the same algorithm.
Q. Is Graham's 4/3 analysis tight?
A. Essentially yes. Ex: m machines, n = 2m + 1 jobs: two jobs each of length m + 1, m + 2, ..., 2m − 1, and one job of length m.

11.2 Center Selection

Center Selection Problem
Input. Set of n sites s_1, ..., s_n.
Center selection problem. Select k centers C so that the maximum distance from a site to its nearest center is minimized. [Figure: sites, k = 4 centers, covering radius r(C).]
Notation.
- dist(x, y) = distance between x and y.
- dist(s_i, C) = min_{c ∈ C} dist(s_i, c) = distance from s_i to the closest center.
- r(C) = max_i dist(s_i, C) = smallest covering radius.
Goal. Find a set of centers C that minimizes r(C), subject to |C| = k.
Distance function properties.
- dist(x, x) = 0 (identity)
- dist(x, y) = dist(y, x) (symmetry)
- dist(x, y) ≤ dist(x, z) + dist(z, y) (triangle inequality)
Greedy Algorithm: A False Start
Greedy algorithm. Put the first center at the best possible location for a single center, and then keep adding centers so as to reduce the covering radius each time by as much as possible.
Remark. Arbitrarily bad! [Figure: k = 2 centers; the first greedy center lands between two clusters of sites.]

Center Selection: Greedy Algorithm
Greedy algorithm. Repeatedly choose the next center to be the site farthest from any existing center.
Observation. Upon termination, all centers in C are pairwise at least r(C) apart.
Pf. By construction of the algorithm.
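A minimal Python sketch of the farthest-site greedy rule just described. It assumes the caller supplies a distance function dist obeying identity, symmetry, and the triangle inequality; the first center is an arbitrary site, and all names are mine.

```python
# Greedy center selection: repeatedly pick the site farthest from the chosen centers.
def greedy_centers(sites, k, dist):
    centers = [sites[0]]                   # first center: an arbitrary site
    # d[i] = distance from site i to its closest chosen center so far
    d = [dist(s, centers[0]) for s in sites]
    while len(centers) < k:
        i = max(range(len(sites)), key=lambda i: d[i])   # farthest site
        centers.append(sites[i])
        d = [min(d[j], dist(sites[j], sites[i])) for j in range(len(sites))]
    return centers, max(d)                 # chosen centers and covering radius r(C)

if __name__ == "__main__":
    # Example with points in the plane and Euclidean distance.
    euclid = lambda p, q: ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    pts = [(0, 0), (1, 0), (10, 0), (11, 1), (20, 5)]
    print(greedy_centers(pts, k=2, dist=euclid))
```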
Center Selection: Analysis of Greedy Algorithm
Theorem. Let C* be an optimal set of centers. Then r(C) ≤ 2 r(C*).
Pf. (by contradiction) Assume r(C*) < ½ r(C).
- For each center c_i in C, consider the ball of radius ½ r(C) around it; these balls are disjoint, since the centers in C are pairwise at least r(C) apart.
- Exactly one optimal center c_i* lies in each ball (the one covering site c_i, at distance ≤ r(C*) < ½ r(C)); pair c_i with that c_i*.
- Consider any site s and its closest center c_i* in C*.
- dist(s, C) ≤ dist(s, c_i) ≤ dist(s, c_i*) + dist(c_i*, c_i) ≤ r(C*) + r(C*) = 2 r(C*),
  using the triangle inequality in the middle step; dist(s, c_i*) ≤ r(C*) since c_i* is the closest optimal center to s, and dist(c_i*, c_i) ≤ r(C*) since c_i* covers c_i.
- Thus r(C) ≤ 2 r(C*), contradicting the assumption.

Center Selection
Theorem. The greedy algorithm is a 2-approximation for the center selection problem.
Question. Is there hope of a 3/2-approximation? 4/3?
Theorem. Unless P = NP, there is no ρ-approximation for the center selection problem for any ρ < 2.
11.4 The Pricing Method: Vertex Cover

Weighted Vertex Cover
Weighted vertex cover. Given a graph G with vertex weights, find a vertex cover of minimum weight. [Figure: a 4-node example with a cover of weight 2 + 2 + 4.]

Weighted Vertex Cover
Pricing method. Each edge must be covered by some vertex. Edge e pays a price p_e ≥ 0 to use vertex i.
Fairness. Edges incident to vertex i should pay at most w_i in total: Σ_{e = (i, j)} p_e ≤ w_i.
Lemma. For any vertex cover S and any fair prices p_e: Σ_e p_e ≤ w(S).
Pf. Σ_e p_e ≤ Σ_{i ∈ S} Σ_{e = (i, j)} p_e ≤ Σ_{i ∈ S} w_i = w(S).
The first inequality holds because each edge e is covered by at least one node in S; the second sums the fairness inequalities for each node in S.

Pricing Method
Pricing method. Set prices and find a vertex cover simultaneously: start with all prices p_e = 0; while there is an edge (i, j) such that neither endpoint is tight (vertex i is tight when Σ_{e = (i, j)} p_e = w_i), increase p_e as much as possible without violating fairness; finally return the set of all tight nodes. [Figure 11.8: vertex weights and the price of edge a-b over a sample run.]

Pricing Method: Analysis
Theorem. The pricing method is a 2-approximation.
Pf.
- The algorithm terminates, since at least one new node becomes tight after each iteration of the while loop.
- Let S = the set of all tight nodes upon termination of the algorithm. S is a vertex cover: if some edge (i, j) were uncovered, then neither i nor j would be tight, and the while loop would not have terminated.
- Let S* be an optimal vertex cover. We show w(S) ≤ 2 w(S*):
  w(S) = Σ_{i ∈ S} w_i = Σ_{i ∈ S} Σ_{e = (i, j)} p_e ≤ Σ_{i ∈ V} Σ_{e = (i, j)} p_e = 2 Σ_e p_e ≤ 2 w(S*).
  (All nodes in S are tight; S ⊆ V and prices are ≥ 0; each edge is counted twice when summing over all nodes; the last inequality is the fairness lemma applied to S*.)
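A minimal Python sketch of the pricing method as analyzed above, assuming a graph given as a weight dict plus an edge list (the encoding and names are mine). A single pass over the edges implements the while loop, since prices only increase and a skipped edge already has a tight endpoint.

```python
# Pricing method: raise each edge's price until one endpoint is tight; return tight nodes.
def pricing_vertex_cover(weights, edges):
    """weights: dict node -> w_i; edges: list of (i, j) pairs.
    Returns (cover, prices)."""
    price = {e: 0 for e in edges}
    paid = {v: 0 for v in weights}             # total price currently paid to each node
    tight = lambda v: paid[v] >= weights[v]
    for e in edges:                            # any edge order works
        i, j = e
        if tight(i) or tight(j):
            continue                           # edge already covered by a tight node
        slack = min(weights[i] - paid[i], weights[j] - paid[j])
        price[e] += slack                      # raise p_e until i or j becomes tight
        paid[i] += slack
        paid[j] += slack
    cover = {v for v in weights if tight(v)}
    return cover, price

if __name__ == "__main__":
    # Example: a path a-b-c with weights 4, 9, 2; the returned cover is {a, c}.
    w = {"a": 4, "b": 9, "c": 2}
    print(pricing_vertex_cover(w, [("a", "b"), ("b", "c")]))
```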
11.6 LP Rounding: Vertex Cover

Weighted Vertex Cover
Weighted vertex cover. Given an undirected graph G = (V, E) with vertex weights w_i ≥ 0, find a minimum-weight subset of nodes S such that every edge is incident to at least one vertex in S. [Figure: a 10-node example (A–J) with total weight 55.]

Weighted Vertex Cover: IP Formulation
Integer programming formulation.
- Model inclusion of each vertex i using a 0/1 variable x_i. Vertex covers are in 1-1 correspondence with 0/1 assignments: S = {i ∈ V : x_i = 1}.
- Objective function: minimize Σ_i w_i x_i.
- If (i, j) ∈ E, we must take either i or j: x_i + x_j ≥ 1.
(ILP)  min Σ_{i ∈ V} w_i x_i   s.t.   x_i + x_j ≥ 1 for all (i, j) ∈ E;   x_i ∈ {0, 1} for all i ∈ V.
Observation. If x* is an optimal solution to (ILP), then S = {i ∈ V : x*_i = 1} is a minimum-weight vertex cover.

Linear Programming
Linear programming. Max/min a linear objective function subject to linear inequalities.
- Input: integers c_j, b_i, a_ij.
- Output: real numbers x_j.
Simplex algorithm. [Dantzig 1947] Can solve LP in practice.
Ellipsoid algorithm. [Khachian 1979] Can solve LP in poly-time.
Interior point methods. [Karmarkar 1984] Can solve LP in poly-time and in practice.
Weighted Vertex Cover: LP Relaxation
Weighted vertex cover. Linear programming formulation:
(LP)  min Σ_{i ∈ V} w_i x_i   s.t.   x_i + x_j ≥ 1 for all (i, j) ∈ E;   x_i ≥ 0 for all i ∈ V.
Observation. Optimal value of (LP) ≤ optimal value of (ILP).
Pf. The LP has fewer constraints.
Note. The LP is not equivalent to vertexex cover: its optimal solution can be fractional.
Q. How can solving the LP help us find a small vertex cover?
A. Solve the LP and round the fractional values.

Weighted Vertex Cover
Theorem. If x* is an optimal solution to (LP), then S = {i ∈ V : x*_i ≥ 1/2} is a vertex cover whose weight is at most twice the minimum possible weight.
Pf. [S is a vertex cover]
- Consider an edge (i, j) ∈ E.
- Since x*_i + x*_j ≥ 1, either x*_i ≥ 1/2 or x*_j ≥ 1/2, so edge (i, j) is covered.
Pf. [S has the desired cost]
- Let S* be an optimal vertex cover. Then
  w(S*) = Σ_{i ∈ S*} w_i ≥ Σ_{i ∈ V} w_i x*_i ≥ Σ_{i ∈ S} w_i x*_i ≥ ½ Σ_{i ∈ S} w_i = ½ w(S),
  where the first inequality holds because the LP is a relaxation, the second because weights are nonnegative, and the third because x*_i ≥ 1/2 for every i ∈ S.

Weighted Vertex Cover
Theorem. LP rounding gives a 2-approximation algorithm for weighted vertex cover.
Theorem. [Dinur-Safra 2001] If P ≠ NP, then there is no ρ-approximation for any ρ < 1.3607.
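A hedged sketch of the LP-rounding algorithm above in Python, using SciPy's linprog to solve the relaxation (assumes SciPy is installed; the graph encoding, tolerance, and function name are mine).

```python
# LP rounding for weighted vertex cover: solve the relaxation, keep x*_i >= 1/2.
import numpy as np
from scipy.optimize import linprog

def lp_rounding_vertex_cover(weights, edges):
    """weights: list of w_i for nodes 0..n-1; edges: list of (i, j) pairs.
    Returns the rounded cover S = {i : x*_i >= 1/2}."""
    n, m = len(weights), len(edges)
    # Each constraint x_i + x_j >= 1 becomes -(x_i + x_j) <= -1 for linprog.
    A_ub = np.zeros((m, n))
    for row, (i, j) in enumerate(edges):
        A_ub[row, i] = A_ub[row, j] = -1.0
    b_ub = -np.ones(m)
    res = linprog(c=weights, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n)
    # Small tolerance guards against floating-point error around exactly 1/2.
    return {i for i, x in enumerate(res.x) if x >= 0.5 - 1e-9}

if __name__ == "__main__":
    # A triangle with unit weights: the LP optimum is x_i = 1/2 everywhere,
    # so rounding returns all three vertices (weight 3 vs. optimal 2).
    print(lp_rounding_vertex_cover([1, 1, 1], [(0, 1), (1, 2), (0, 2)]))
```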
11.8 Knapsack Problem

Polynomial-Time Approximation Scheme
PTAS. A (1 + ε)-approximation algorithm for any constant ε > 0.
Consequence. A PTAS produces an arbitrarily high-quality solution, but trades off accuracy for time.
This section. A PTAS for the knapsack problem via rounding and scaling.

Knapsack Problem
Knapsack problem.
- Given n objects and a knapsack.
- Item i has value v_i > 0 and weight w_i > 0.
- The knapsack can carry weight up to W.
- Goal: fill the knapsack so as to maximize the total value.
Ex: items {3, 4} have value 40.
  Item  Value  Weight
  1     1      1
  2     6      2
  3     18     5
  4     22     6
  5     28     7
W = 11 (we will assume each w_i ≤ W).

Knapsack is NP-Complete
KNAPSACK: Given a finite set X, positive weights w_i, positive values v_i, a weight limit W, and a target value V, is there a subset S ⊆ X such that Σ_{i ∈ S} w_i ≤ W and Σ_{i ∈ S} v_i ≥ V?
SUBSET-SUM: Given a finite set X, positive values u_i, and an integer U, is there a subset S ⊆ X whose elements sum to exactly U?
Claim. SUBSET-SUM ≤_P KNAPSACK.
Pf. Given an instance (u_1, ..., u_n, U) of SUBSET-SUM, create the KNAPSACK instance with v_i = w_i = u_i and V = W = U. Then Σ_{i ∈ S} u_i ≤ U and Σ_{i ∈ S} u_i ≥ U hold simultaneously if and only if Σ_{i ∈ S} u_i = U.

Knapsack Problem: Dynamic Programming I
Def. OPT(i, w) = max value of a subset of items 1, ..., i with weight limit w.
- Case 1: OPT does not select item i. OPT selects the best of 1, ..., i − 1 using weight limit w.
- Case 2: OPT selects item i. The new weight limit is w − w_i, and OPT selects the best of 1, ..., i − 1 using weight limit w − w_i.
  OPT(i, w) = max{ OPT(i − 1, w), v_i + OPT(i − 1, w − w_i) }, the second option only if w_i ≤ w.
Running time. O(n W).
- W = weight limit.
- Not polynomial in the input size!

Knapsack Problem: Dynamic Programming II
Def. OPT(i, v) = min weight of a subset of items 1, ..., i that yields value exactly v.
- Case 1: OPT does not select item i. OPT selects the best of 1, ..., i − 1 that achieves value exactly v.
- Case 2: OPT selects item i. This consumes weight w_i, and the new value needed is v − v_i; OPT selects the best of 1, ..., i − 1 that achieves value exactly v − v_i.
  OPT(i, v) = min{ OPT(i − 1, v), w_i + OPT(i − 1, v − v_i) }.
Running time. O(n V*) = O(n² v_max).
- V* = optimal value = maximum v such that OPT(n, v) ≤ W.
- Not polynomial in the input size! (V* ≤ n v_max.)
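A minimal Python sketch of Dynamic Programming II above: a value-indexed table of minimum weights, for integer item values, with a simple traceback. The names are illustrative and the table ranges over all values up to Σ_i v_i ≤ n·v_max, matching the O(n² v_max) bound.

```python
# Value-indexed knapsack DP: opt[v] = min weight achieving value exactly v.
INF = float("inf")

def knapsack_by_value(values, weights, W):
    """Returns (best_value, chosen_items) for positive integer item values."""
    n = len(values)
    V = sum(values)                          # upper bound on achievable value
    opt = [0] + [INF] * V
    choice = [[False] * (V + 1) for _ in range(n)]   # for reconstructing the set
    for i in range(n):
        for v in range(V, values[i] - 1, -1):        # downward so item i is used once
            cand = opt[v - values[i]] + weights[i]
            if cand < opt[v]:
                opt[v] = cand
                choice[i][v] = True
    best_value = max(v for v in range(V + 1) if opt[v] <= W)
    items, v = [], best_value                # trace back which items were taken
    for i in reversed(range(n)):
        if choice[i][v]:
            items.append(i)
            v -= values[i]
    return best_value, items[::-1]

if __name__ == "__main__":
    # The small instance from the slides: the optimum is items {3, 4} (indices 2, 3
    # here) with value 40 and weight 11.
    print(knapsack_by_value([1, 6, 18, 22, 28], [1, 2, 5, 6, 7], W=11))
```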
Knapsack: FPTAS
Intuition for the approximation algorithm.
- Round all values up to lie in a smaller range.
- Run the dynamic programming algorithm on the rounded instance.
- Return the best of: the optimal items in the rounded instance, and the single item of largest value.
Original instance (W = 11):
  Item  Value       Weight
  1     134,221     1
  2     656,342     2
  3     1,810,013   5
  4     22,217,800  6
  5     28,343,199  7
Rounded instance (W = 11):
  Item  Value  Weight
  1     2      1
  2     7      2
  3     19     5
  4     23     6
  5     29     7

Knapsack: FPTAS
Knapsack FPTAS. Round up all values:
  v̄_i = ⌈v_i / θ⌉ · θ,   v̂_i = ⌈v_i / θ⌉,
where v_max = largest value in the original instance, ε = precision parameter, and θ = scaling factor = ε v_max / n.
Observation. Optimal solutions to the problem with values v̄ and to the problem with values v̂ are equivalent.
Intuition. v̄ is close to v, so an optimal solution using v̄ is nearly optimal; v̂ is small and integral, so the dynamic programming algorithm is fast.
Running time. O(n³ / ε).
- Dynamic program II runs in time O(n² v̂_max), where v̂_max = ⌈v_max / θ⌉ = ⌈n / ε⌉.

Knapsack: FPTAS
Knapsack FPTAS. Round up all values: v̄_i = ⌈v_i / θ⌉ · θ.
Theorem. If S is the solution found by our algorithm and S* is any other feasible solution, then (1 + ε) Σ_{i ∈ S} v_i ≥ Σ_{i ∈ S*} v_i.
Pf. Let S* be any feasible solution satisfying the weight constraint. Then
  Σ_{i ∈ S*} v_i ≤ Σ_{i ∈ S*} v̄_i        (we always round up)
                ≤ Σ_{i ∈ S} v̄_i          (S is optimal for the rounded instance)
                ≤ Σ_{i ∈ S} (v_i + θ)     (we never round up by more than θ)
                ≤ Σ_{i ∈ S} v_i + n θ     (|S| ≤ n)
                = Σ_{i ∈ S} v_i + ε v_max (n θ = ε v_max)
                ≤ (1 + ε) Σ_{i ∈ S} v_i   (the algorithm can always take the item of value v_max, so v_max ≤ Σ_{i ∈ S} v_i).
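A hedged Python sketch of the rounding-and-scaling FPTAS just analyzed, with θ = ε v_max / n as above. It reuses the hypothetical knapsack_by_value function from the previous sketch; the names are mine, and it assumes every single item fits (w_i ≤ W).

```python
# Knapsack FPTAS: scale values by theta, solve the rounded instance exactly,
# and also consider the single most valuable item.
import math

def knapsack_fptas(values, weights, W, eps):
    """Returns a feasible subset whose value is within a (1 + eps) factor of optimal."""
    n = len(values)
    vmax = max(values)
    theta = eps * vmax / n                              # scaling factor
    scaled = [math.ceil(v / theta) for v in values]     # v_hat_i = ceil(v_i / theta)
    # Solve the rounded instance exactly; scaled values are at most ceil(n / eps),
    # so the DP runs in roughly O(n^3 / eps) time.
    _, items = knapsack_by_value(scaled, weights, W)
    best_single = [values.index(vmax)]                  # the item of value vmax fits
    candidates = [items, best_single]
    return max(candidates, key=lambda S: sum(values[i] for i in S))

if __name__ == "__main__":
    vals = [134_221, 656_342, 1_810_013, 22_217_800, 28_343_199]
    wts = [1, 2, 5, 6, 7]
    S = knapsack_fptas(vals, wts, W=11, eps=0.5)
    print(S, sum(vals[i] for i in S))
```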