A copy of this work was available on the public web and has been preserved in the Wayback Machine; the capture dates from 2019. The file type is `application/pdf`.


### Local search breaks 1.75 for Graph Balancing
[article]

2018 · arXiv · pre-print

Graph Balancing is the problem of orienting the edges of a weighted multigraph so as to minimize the maximum weighted in-degree. Since the introduction of the problem, the best known algorithm achieves an approximation ratio of 1.75, and it is based on rounding a linear program with this exact integrality gap. It is also known that there is no (1.5 - ϵ)-approximation algorithm unless P=NP. Can we do better than 1.75? We prove that a different LP formulation, the configuration LP, has a strictly smaller integrality gap. Graph Balancing was the last one in a group of related problems from the literature for which it was open whether the configuration LP is stronger than previous, simple LP relaxations. We base our proof on a local search approach that has been applied successfully to the more general Restricted Assignment problem, which in turn is a prominent special case of makespan minimization on unrelated machines. With a number of technical novelties we are able to obtain a bound of 1.749 for the case of Graph Balancing. It is not clear whether the local search algorithm we present terminates in polynomial time, which means that the bound is non-constructive. However, it is strong evidence that a better approximation algorithm is possible using the configuration LP, and it allows the optimum to be estimated within a factor better than 1.75. A particularly interesting aspect of our techniques is the way we handle small edges in the local search. We manage to exploit the configuration constraints enforced on small edges in the LP. This may be of interest to other problems such as Restricted Assignment as well.

arXiv:1811.00955v1
fatcat:alvrverlrvdpxdpos6px4opqdy
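To make the objective above concrete, here is a minimal brute-force sketch of Graph Balancing (not the paper's local search algorithm): it tries every orientation of a tiny weighted multigraph and returns the best maximum weighted in-degree. The function name and instances are invented for this illustration.

```python
from itertools import product

def graph_balancing_opt(edges):
    """Brute-force the optimal Graph Balancing orientation.

    edges: list of (u, v, w) triples of a weighted multigraph. Each
    edge is oriented towards one of its endpoints; the objective is
    the maximum weighted in-degree over all vertices. Exponential in
    len(edges), so only meant for tiny illustrative instances.
    """
    best = float("inf")
    for choice in product((0, 1), repeat=len(edges)):
        load = {}
        for (u, v, w), c in zip(edges, choice):
            head = v if c else u  # orient edge towards v iff c == 1
            load[head] = load.get(head, 0) + w
        best = min(best, max(load.values()))
    return best

# A unit-weight triangle can be oriented cyclically, giving optimum 1.
```

An approximation algorithm for the problem must get within its guaranteed factor of this brute-force optimum on every instance.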
### Robust Algorithms under Adversarial Injections
[article]

2020 · arXiv · pre-print

In this paper, we study streaming and online algorithms in the context of randomness in the input. For several problems, a random order of the input sequence (as opposed to the worst-case order) appears to be a necessary evil in order to prove satisfying guarantees. However, algorithmic techniques that work under this assumption tend to be vulnerable to even small changes in the distribution. For this reason, we propose a new adversarial injections model, in which the input is ordered randomly, but an adversary may inject misleading elements at arbitrary positions. We believe that studying algorithms under this much weaker assumption can lead to new insights and, in particular, more robust algorithms. We investigate two classical combinatorial-optimization problems in this model: maximum matching and cardinality-constrained monotone submodular function maximization. Our main technical contribution is a novel streaming algorithm for the latter that computes a 0.55-approximation. While the algorithm itself is clean and simple, an involved analysis shows that it emulates a subdivision of the input stream, which can be used to greatly limit the power of the adversary.

arXiv:2004.12667v1
fatcat:nzmok3bbdjeg3iatdo6jtyvime
### Additive Approximation Schemes for Load Balancing Problems
[article]

2020 · arXiv · pre-print

In this paper we introduce the concept of additive approximation schemes and apply it to load balancing problems. Additive approximation schemes aim to find a solution with an absolute error in the objective of at most ϵ· h for some suitable parameter h. In the case that the parameter h provides a lower bound, an additive approximation scheme implies a standard multiplicative approximation scheme and can be much stronger when h ≪ OPT. On the other hand, when no PTAS exists (or is unlikely to exist), additive approximation schemes can provide a different notion of approximation. We consider the problem of assigning jobs to identical machines with lower and upper bounds for the loads of the machines. This setting generalizes problems like makespan minimization, the Santa Claus problem (on identical machines), and the envy-minimizing Santa Claus problem. For the last problem, in which the objective is to minimize the difference between the maximum and minimum load, the optimal objective value may be zero and hence it is NP-hard to obtain any multiplicative approximation guarantee. For this class of problems we present additive approximation schemes for h = p_max, the maximum processing time of the jobs. Our technical contribution is two-fold. First, we introduce a new relaxation based on integrally assigning slots to machines and fractionally assigning jobs to the slots (the slot-MILP). We identify structural properties of (near-)optimal solutions of the slot-MILP, which allow us to solve it efficiently, assuming that there are O(1) different lower and upper bounds on the machine loads (which is the relevant setting for the three problems mentioned above). The second technical contribution is a local-search based algorithm which rounds a solution to the slot-MILP, introducing an additive error on the target load intervals of at most ϵ· p_max.

arXiv:2007.09333v1
fatcat:qpq4gs24zvfxnca75yrp7fytle
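To see why the envy-minimizing objective defeats multiplicative guarantees, a tiny brute-force sketch (illustrative only, not the paper's slot-MILP approach; names are invented) exhibits instances where the optimum is exactly zero, so any finite multiplicative ratio would demand an exact solution:

```python
from itertools import product

def min_envy(jobs, m):
    """Brute-force the envy-minimizing objective on m identical
    machines: minimize (maximum load - minimum load) over all
    assignments of jobs to machines. Exponential; illustration only.
    """
    best = float("inf")
    for assignment in product(range(m), repeat=len(jobs)):
        loads = [0] * m
        for p, machine in zip(jobs, assignment):
            loads[machine] += p
        best = min(best, max(loads) - min(loads))
    return best

# jobs [2, 2, 1, 1, 1, 1] on 2 machines split as {2, 2} vs
# {1, 1, 1, 1}: both loads are 4, so the optimum envy is 0.
```

An additive scheme with error ϵ·p_max stays meaningful on such instances, whereas a c-approximation in the multiplicative sense would have to output an optimum-zero solution exactly.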
### Cardinality Constrained Scheduling in Online Models
[article]

2022 · arXiv · pre-print

Makespan minimization on parallel identical machines is a classical and intensively studied problem in scheduling, and a classic example for online algorithm analysis, with Graham's famous list scheduling algorithm dating back to the 1960s. In this problem, jobs arrive over a list and, upon an arrival, the algorithm needs to assign the job to a machine. The goal is to minimize the makespan, that is, the maximum machine load. In this paper, we consider the variant with an additional cardinality constraint: the algorithm may assign at most k jobs to each machine, where k is part of the input. While the offline (strongly NP-hard) variant of cardinality constrained scheduling is well understood and an EPTAS exists, no non-trivial results are known for the online variant. We fill this gap with a comprehensive study of various online models. First, we show that there is a constant competitive algorithm for the problem and, further, present a lower bound of 2 on the competitive ratio of any online algorithm. Motivated by the lower bound, we consider a semi-online variant where, upon arrival of a job of size p, we are allowed to migrate jobs of total size at most a constant times p. This constant is called the migration factor of the algorithm. Algorithms with small migration factors are a common approach to bridge the performance of online algorithms and offline algorithms. One can obtain algorithms with a constant migration factor by rounding the size of each incoming job and then applying an ordinal algorithm to the resulting rounded instance. With this in mind, we also consider the framework of ordinal algorithms and characterize the competitive ratio that can be achieved using the aforementioned approaches.

arXiv:2201.05113v1
fatcat:dv7aifrttfdrres5bokunk3ria
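A natural baseline in this model is Graham-style list scheduling restricted to machines that still have capacity. The sketch below is an illustrative greedy of this kind (function name and tie-breaking are assumptions, not taken from the paper):

```python
import heapq

def capacitated_list_scheduling(jobs, m, k):
    """Greedy list scheduling with a cardinality constraint: each
    arriving job goes to the least-loaded machine that still holds
    fewer than k jobs. Returns the resulting makespan. Assumes the
    instance is feasible, i.e. len(jobs) <= m * k.
    """
    if len(jobs) > m * k:
        raise ValueError("infeasible: more than m * k jobs")
    # Min-heap of machines that can still accept jobs: (load, count, id).
    open_machines = [(0, 0, i) for i in range(m)]
    heapq.heapify(open_machines)
    full_loads = []  # loads of machines that already hold k jobs
    for p in jobs:
        load, count, i = heapq.heappop(open_machines)
        load, count = load + p, count + 1
        if count == k:
            full_loads.append(load)  # machine is full, retire it
        else:
            heapq.heappush(open_machines, (load, count, i))
    return max(full_loads + [load for load, _, _ in open_machines])
```

Without the constraint (k ≥ number of jobs) this degenerates to Graham's classic list scheduling; the entry's lower bound of 2 shows no online algorithm, greedy or otherwise, can beat that competitive ratio here.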
### Learning Augmented Energy Minimization via Speed Scaling
[article]

2020 · arXiv · pre-print

As power management has become a primary concern in modern data centers, computing resources are being scaled dynamically to minimize energy consumption. We initiate the study of a variant of the classic online speed scaling problem, in which machine learning predictions about the future can be integrated naturally. Inspired by recent work on learning-augmented online algorithms, we propose an algorithm which incorporates predictions in a black-box manner and outperforms any online algorithm if the accuracy is high, yet maintains provable guarantees if the prediction is very inaccurate. We provide both theoretical and experimental evidence to support our claims.

arXiv:2010.11629v1
fatcat:wfhjel2mxbehxozxs5s4tn2sg4
### Online Bin Covering with Limited Migration
[article]

2019 · arXiv · pre-print

Semi-online models where decisions may be revoked in a limited way have been studied extensively in recent years. This is motivated by the fact that the pure online model is often too restrictive to model real-world applications, where some changes might be allowed. A well-studied measure of the amount of decisions that can be revoked is the migration factor β: when an object o of size s(o) arrives, the decisions for objects of total size at most β· s(o) may be revoked. Usually β should be a constant. This means that a small object only leads to small changes. This measure has been successfully investigated for different classic problems such as bin packing or makespan minimization. The dual of makespan minimization, the Santa Claus or machine covering problem, has also been studied, whereas the dual of bin packing, the bin covering problem, has not been looked at from such a perspective. In this work, we extensively study the bin covering problem with migration in different scenarios. We develop algorithms both for the static case, where only insertions are allowed, and for the dynamic case, where items may also depart. We also develop lower bounds for these scenarios, both for amortized migration and for worst-case migration, showing that our algorithms have a nearly optimal migration factor and asymptotic competitive ratio (up to an arbitrarily small ϵ). We thereby resolve the competitiveness of the bin covering problem with migration.

arXiv:1904.06543v1
fatcat:zek736g6jjbidlrpblpytzx5ke
### The Submodular Santa Claus Problem in the Restricted Assignment Case
[article]

2020 · arXiv · pre-print

Klaus Jansen and Lars Rohwedder. A note on the integrality gap of the configuration LP for restricted Santa Claus. Information Processing Letters, 164:106025, 2020. ...

arXiv:2011.06939v1
fatcat:qh4ysuy2ufcg3cwvqowiy6mkwa
### Near-Linear Time Algorithm for n-fold ILPs via Color Coding
[article]

2018 · arXiv · pre-print

We study an important case of ILPs, max{c^T x : Ax = b, ℓ ≤ x ≤ u, x ∈ ℤ^{nt}}, with n·t variables and lower and upper bounds ℓ, u ∈ ℤ^{nt}. In n-fold ILPs, non-zero entries only appear in the first r rows of the matrix A and in small blocks of size s×t along the diagonal underneath. Despite this restriction, many optimization problems can be expressed in this form. It is known that n-fold ILPs can be solved in FPT time regarding the parameters s, r, and Δ, where Δ is the greatest absolute value of an entry in A. The state-of-the-art technique is a local search algorithm that repeatedly moves in an improving direction. Both the number of iterations and the search for such an improving direction take time Ω(n), leading to a quadratic running time in n. We introduce a technique based on Color Coding, which allows us to compute these improving directions in logarithmic time after a single initialization step. This leads to the first algorithm for n-fold ILPs with a running time that is near-linear in the number nt of variables, namely (rsΔ)^{O(r^2 s + s^2)}· L^2 · nt ·log^{O(1)}(nt), where L is the encoding length of the largest integer in the input. In contrast to the algorithms in recent literature, we do not need to solve the LP relaxation in order to handle unbounded variables. Instead, we give a structural lemma to introduce appropriate bounds. If, on the other hand, we are given such an LP solution, the running time can be decreased by a factor of L.

arXiv:1811.00950v1
fatcat:3gkpu5p3gvdqpk6z4aeriqqej4
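The block structure described in this abstract can be made concrete with a small helper that assembles an n-fold constraint matrix from its two blocks. The parameter names follow the abstract (an r×t block repeated across the top, an s×t block on the diagonal); the helper itself is an illustration, not part of the paper's algorithm.

```python
def nfold_matrix(A, B, n):
    """Assemble the n-fold constraint matrix from an r x t block A
    (repeated n times across the top rows) and an s x t block B
    (placed n times along the diagonal underneath). Returns an
    (r + n*s) x (n*t) matrix as a list of lists.
    """
    r, t = len(A), len(A[0])
    top = [row * n for row in A]  # each top row is A's row repeated n times
    diag = []
    for i in range(n):  # i-th diagonal copy of B, offset by i*t columns
        for row in B:
            diag.append([0] * (i * t) + row + [0] * ((n - 1 - i) * t))
    return top + diag
```

Every column of the assembled matrix intersects only the r top rows and one diagonal block, which is exactly the sparsity the FPT algorithms exploit.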
### Towards Non-Uniform k-Center with Constant Types of Radii
[article]

2021 · arXiv · pre-print

In the Non-Uniform k-Center problem we need to cover a finite metric space using k balls of different radii that can be scaled uniformly. The goal is to minimize the scaling factor. If the number of different radii is unbounded, the problem does not admit a constant-factor approximation algorithm, but it has been conjectured that such an algorithm exists if the number of radii is constant. Yet, this is known only for the case of two radii. Our first contribution is a simple black-box reduction which shows that if one can handle the variant of t-1 radii with outliers, then one can also handle t radii. Together with an algorithm by Chakrabarty and Negahbani for two radii with outliers, this immediately implies a constant-factor approximation algorithm for three radii, thus making further progress on the conjecture. Furthermore, using algorithms for the k-center with outliers problem, that is, the case of one radius with outliers, we also get a simple algorithm for two radii. The algorithm by Chakrabarty and Negahbani uses a top-down approach, starting with the larger radius and then proceeding to the smaller one. Our reduction, on the other hand, looks only at the smallest radius and eliminates it, which suggests that a bottom-up approach is promising. In this spirit, we devise a modification of the Chakrabarty and Negahbani algorithm which runs in a bottom-up fashion, and in this way we recover their result with the advantage of a simpler analysis.

arXiv:2110.02688v1
fatcat:kvdzplxljjdx7jb5coi3siiv5a
### On Integer Programming, Discrepancy, and Convolution
[article]

2019 · arXiv · pre-print

Integer programs with a constant number of constraints are solvable in pseudo-polynomial time. We give a new algorithm with a better pseudo-polynomial running time than previous results. Moreover, we establish a strong connection to the problem (min, +)-convolution. (min, +)-convolution has a trivial quadratic time algorithm and it has been conjectured that this cannot be improved significantly. We show that further improvements to our pseudo-polynomial algorithm for any fixed number of constraints are equivalent to improvements for (min, +)-convolution. This is strong evidence that our algorithm's running time is the best possible. We also present a faster specialized algorithm for testing feasibility of an integer program with few constraints, and for this we also give a tight lower bound, which is based on the SETH.

arXiv:1803.04744v3
fatcat:2kteb4oetnaojbpjgkgnvbf64u
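The "trivial quadratic time algorithm" for (min, +)-convolution mentioned in this abstract is simply the double loop computing c[k] = min over i + j = k of a[i] + b[j]; a short sketch:

```python
def min_plus_convolution(a, b):
    """The trivial O(len(a) * len(b)) (min, +)-convolution:
    c[k] = min over all i + j = k of a[i] + b[j].
    """
    n, m = len(a), len(b)
    c = [float("inf")] * (n + m - 1)
    for i in range(n):
        for j in range(m):
            c[i + j] = min(c[i + j], a[i] + b[j])
    return c
```

The conjecture referenced above is that no algorithm computes this in time O(n^(2-δ)) for any δ > 0; the paper ties the running time of its integer programming algorithm to exactly this barrier.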
### Load Balancing: The Long Road from Theory to Practice
[article]

2021 · arXiv · pre-print

We can then formulate the problems as an integer program and solve it via the algorithm of Jansen and Rohwedder [16]. ... Applying the JR-algorithm: Jansen and Rohwedder [16] described an algorithm for integer programming and applied it to the configuration IP for P||C_max. ...

arXiv:2107.13638v1
fatcat:hlsgrbjynbg75jcppbmkcwhkry
### On Integer Programming and Convolution

2018 · Innovations in Theoretical Computer Science

Rohwedder 43:15 ... without an objective function, i.e., the problem of finding a multiset of items whose weights sum up to exactly C. We assume that no two items have the same weight. ...
### A note on the integrality gap of the configuration LP for restricted Santa Claus
[article]

2018 · arXiv · pre-print

In the restricted Santa Claus problem we are given resources R and players P. Every resource j ∈ R has a value v_j and every player i desires a set R(i) of resources. We are interested in distributing the resources to players that desire them. The quality of a solution is measured by the least happy player, i.e., the lowest sum of resource values. This value should be maximized. The local search algorithm by Asadpour et al. and its connection to the configuration LP has proved itself to be an influential technique for this and related problems. In the original proof, a local search was used to obtain a bound of 4 for the ratio of the fractional to the integral optimum of the configuration LP (integrality gap). This bound is non-constructive, since the local search has not been shown to terminate in polynomial time. On the negative side, the worst instance known has an integrality gap of 2. Although much progress was made in this area, neither bound has been improved since. We present a better analysis that shows the integrality gap is not worse than 3 + 5/6 ≈ 3.8333.

arXiv:1807.03626v1
fatcat:ymbla2q3tzdvlgkqscs7lmqhou
### A (2+ε)-approximation algorithm for preemptive weighted flow time on a single machine
[article]

2020 · arXiv · pre-print

Weighted flow time is a fundamental and very well-studied objective function in scheduling. In this paper, we study the setting of a single machine with preemptions. The input consists of a set of jobs, characterized by their processing times, release times, and weights, and we want to compute a (possibly preemptive) schedule for them. The objective is to minimize the sum of the weighted flow times of the jobs, where the flow time of a job is the time between its release date and its completion time. It had been a long-standing open problem to find a polynomial-time O(1)-approximation algorithm for this setting. In a recent breakthrough result, Batra, Garg, and Kumar (FOCS 2018) found such an algorithm if the input data are polynomially bounded integers, and Feige, Kulkarni, and Li (SODA 2019) presented a black-box reduction to this setting. The resulting approximation ratio is a (not explicitly stated) constant which is at least 10,000. In this paper we improve this ratio to 2+ϵ. The algorithm by Batra, Garg, and Kumar (FOCS 2018) reduces the problem to Demand MultiCut on trees and solves the resulting instances via LP-rounding and a dynamic program. Instead, we first reduce the problem to a (different) geometric problem while losing only a factor 1+ϵ, and then solve its resulting instances up to a factor of 2+ϵ by a dynamic program. In particular, our reduction ensures certain structural properties, thanks to which we do not need LP-rounding methods. We believe that our result makes substantial progress towards finding a PTAS for weighted flow time on a single machine.

arXiv:2011.05676v1
fatcat:ed2cp2c3njb5tldl6ypbcerq6m
### A Quasi-Polynomial Approximation for the Restricted Assignment Problem
[article]

2019 · arXiv · pre-print

The Restricted Assignment Problem is a prominent special case of Scheduling on Parallel Unrelated Machines. For the strongest known linear programming relaxation, the configuration LP, we improve the non-constructive bound on its integrality gap from 1.9142 to 1.8334 and significantly simplify the proof. Then we give a constructive variant, yielding a 1.8334-approximation in quasi-polynomial time. This is the first quasi-polynomial algorithm for this problem improving on the long-standing approximation rate of 2.

arXiv:1701.07208v2
fatcat:wcp6rzfcwjgqjmfw3hjqfd7jyi
*Showing results 1 — 15 out of 89 results*