NYCAC 5 - Abstracts

Friday, November 16, 2012, New York City
The Graduate Center, CUNY
Rooms 5414 and 4102


The New York Colloquium on Algorithms and Complexity


Jin-Yi Cai

Computational complexity theory --- The world of P and NP
Computational Complexity Theory is the study of the intrinsic difficulty of computational problems. The most prominent open problem is the conjecture that P is not equal to NP. In essence, this conjecture states that it is intrinsically harder to find proofs than to verify them. It is of fundamental importance in many areas, from computer science to mathematics to our basic understanding of nature.
Valiant's new theory of holographic algorithms is one of the most beautiful ideas in algorithm design in recent memory, and it offers a fresh perspective on the P versus NP problem. In this theory, information is represented by a superposition of linear vectors in a holographic mix. This mixture creates the possibility of exponential-sized cancellations among fragments of local computations. The underlying computation is carried out by invoking the Fisher-Kasteleyn-Temperley (FKT) method for counting perfect matchings in planar graphs (the Dimer Problem). Holographic algorithms challenge our conception of what polynomial-time computation can do, in view of the P vs. NP question.
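To make the FKT connection concrete, here is a minimal sketch (our own illustration, not from the talk): given a Kasteleyn orientation of a planar graph, the number of perfect matchings equals the square root of the determinant of the signed, skew-symmetric adjacency matrix. Below this is checked on a 4-cycle, which has exactly two perfect matchings.

    import numpy as np

    # 4-cycle on vertices 0-1-2-3 with a Kasteleyn orientation
    # (each bounded face has an odd number of clockwise edges):
    # 0 -> 1, 1 -> 2, 2 -> 3, 0 -> 3.
    oriented_edges = [(0, 1), (1, 2), (2, 3), (0, 3)]

    A = np.zeros((4, 4))
    for u, v in oriented_edges:
        A[u, v], A[v, u] = 1, -1   # skew-symmetric signing

    # FKT: #perfect matchings = sqrt(det(A)) under a Kasteleyn orientation.
    print(round(np.sqrt(np.linalg.det(A))))   # -> 2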
In this talk we will survey the developments in holographic algorithms. No specialized background is assumed.


Vasilis Gkatzelis

Mechanism Design for Fair Division
Joint work with Richard Cole and Gagan Goel.
We revisit the classic problem of fair division from a mechanism design perspective and provide an elegant truthful mechanism that yields surprisingly good approximation guarantees for the widely used solution concept of Proportional Fairness. This solution concept, which is very closely related to Nash bargaining and the competitive equilibrium, is known not to be implementable truthfully, which has been its main drawback. To alleviate this issue, we propose a new mechanism, which we call the Partial Allocation mechanism, that discards a carefully chosen fraction of the allocated resources in order to incentivize the agents to report their valuations truthfully.
For a multi-dimensional domain with an arbitrary number of agents and items, and for the very large class of homothetic valuation functions, we prove that our mechanism provides every agent with at least a 1/e fraction of her Proportionally Fair valuation. To the best of our knowledge, this is the first result that gives a constant factor approximation to every agent for the Proportionally Fair solution. We also uncover a connection between this mechanism and VCG-based mechanism design, which introduces a way to implement interesting truthful mechanisms in settings where monetary payments are not an option.
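In one standard formulation (notation ours, not the abstract's), with feasible allocations x = (x_1, ..., x_n), the Proportionally Fair solution maximizes the Nash product, and the guarantee above reads:

    x^{PF} \in \arg\max_{x} \prod_{i=1}^{n} v_i(x_i),
    \qquad
    v_i\left(x^{PA}_i\right) \ge \frac{1}{e}\, v_i\left(x^{PF}_i\right)
    \quad \text{for every agent } i,

where x^{PA} denotes the allocation produced by the Partial Allocation mechanism.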
We also ask whether better approximation ratios are possible in more restricted settings. In particular, motivated by the massive privatization auction in the Czech Republic in the early 1990s, we provide another mechanism for additive linear valuations that performs very well when all items are in high demand.


Stuart Haber

No looking back: Efficient submatch extraction for extended regular expressions
Modern regular expression implementations use "capturing groups" to specify a subexpression of a regular expression. Given a string that matches an entire regular expression, "submatch extraction" is the process of extracting the substrings corresponding to those subexpressions. "Greedy" and "reluctant" closures are variants of the standard closure operator that specify how submatches are extracted. The state of the art and practice in submatch extraction consists of algorithms based either on finite automata or on backtracking. In theory, the number of states in an automata-based approach can be exponential in the size of the given regular expression, and backtracking algorithms can run in exponential time. But in practice, worst-case behavior rarely occurs, so achieving a practical speed-up over state-of-the-art methods is of significant interest.
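As a quick illustration of how greedy and reluctant closures affect submatch extraction (using Python's re module, not the authors' engine):

    import re

    # Greedy closure: the first group captures as much as possible.
    print(re.match(r"(a*)(a*)", "aaa").groups())    # ('aaa', '')

    # Reluctant closure: the first group captures as little as possible.
    print(re.match(r"(a*?)(a*)", "aaa").groups())   # ('', 'aaa')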
In this talk, we present an automata-based algorithm for extracting submatches from a string that matches a regular expression, with running time O(lc), where l is the length of the string and c is the (positive) number of capturing groups. The previously fastest automata-based algorithm runs in time O(nlc), where n is the number of states in the automaton corresponding to the regular expression. Both our approach and the previous one require worst-case exponential compile time. Our experimental results, on a large set of regular expressions used in practice, show an approximately 2.5-fold speed-up over Java's backtracking-based regular expression library, and speed-ups of between five and fifty-nine times over Google's RE2.
This is joint work with Bill Horne, Pratyusa Manadhata, Miranda Mowbray, and Prasad Rao.


Lane A. Hemaspaandra

Search versus Decision for Election Manipulation Problems
Most theoretical definitions of the complexity of manipulating elections focus on the decision problem of recognizing which instances can be successfully manipulated, rather than the search problem of finding the successful manipulative actions. Since the latter is a far more natural goal for manipulators, that definitional focus may be misguided if these two complexities can differ. Our main result is that they probably do differ: if integer factoring is hard, then for election manipulation, election bribery, and some types of election control, there are election systems for which recognizing which instances can be successfully manipulated is in polynomial time, but producing the successful manipulations cannot be done in polynomial time. This is joint work with Edith Hemaspaandra and Curtis Menton.
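The decision/search gap that the result exploits can be seen in miniature with factoring itself (our analogy, not the paper's construction): primality testing, a decision problem, is in polynomial time, while producing the factors, the corresponding search problem, is believed to be hard.

    from sympy import isprime, factorint

    n = 3 * 1000003   # a small semiprime; hard instances have hundreds of digits
    print(isprime(n))     # decision: fast even for huge n (polynomial time)
    print(factorint(n))   # search: no polynomial-time algorithm is known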


Aggelos Kiayias

Secure Computation with Corruptible Setups
Most functionalities involving two or more parties are impossible to implement securely when the underlying protocol is to be arbitrarily composed with other protocols. To circumvent this impossibility, various forms of "trusted setups" have been shown to be sufficient (and even "complete") and can thus be leveraged to securely carry out any desired task in arbitrary composition settings.
With only a few notable exceptions, past work has viewed these setup assumptions as being implemented by some ideal, incorruptible entity. In reality, however, setups are likely to be carried out by some mechanism that could be subverted, or by some party that could be compromised. Most prior work provides no guarantees in such cases.
In this talk we put forth a new general approach for modeling the corruption of setups in the context of cryptographic protocol design and we provide constructions and impossibility results.
Joint work with Jonathan Katz, Hong-Sheng Zhou, and Vasilis Zikas.


Michael Lampis

Improved inapproximability for TSP: the role of bounded occurrence CSPs
The Traveling Salesman Problem is one of the most famous problems in combinatorial optimization, and pinning down its approximability is a major open problem. Though there have been recent breakthroughs on the algorithmic side, until recently the best inapproximability constant was only 220/219, due to Papadimitriou and Vempala. In this talk we will sketch a much simpler reduction which achieves a slightly better inapproximability bound. The main ingredient is inapproximable constraint satisfaction problems in which each variable appears only a bounded number of times. We will discuss the role of expanders and expander-like graphs in the inapproximability of such problems and outline a simple (but hard to analyze) idea which allows us to obtain better expander graphs using the probabilistic method, improving on a 25-year-old result by Bollobás. Finally, we will discuss whether it is realistic to hope that further work in this direction might eventually lead to strong inapproximability results for TSP (spoiler: probably not!).
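As a toy illustration of the probabilistic-method theme (our own sketch, unrelated to the specific construction in the talk), one can sample random regular graphs and observe that their second-largest adjacency eigenvalue is typically far below the degree, which certifies good expansion:

    import networkx as nx
    import numpy as np

    d, n = 3, 200
    for seed in range(3):
        G = nx.random_regular_graph(d, n, seed=seed)
        eigs = np.sort(np.linalg.eigvalsh(nx.to_numpy_array(G)))
        # the largest eigenvalue is d; a small second eigenvalue
        # means the sampled graph is a good expander
        print(f"lambda_2 = {eigs[-2]:.3f} (degree d = {d})")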


George Tourlakis

A proof theoretic tool for two first-order modal logics


Mihalis Yannakakis

Computation of Least Fixed Points
Many problems from different areas can be formulated as problems of computing a fixed point of a suitable function. For many of these problems, the function in the fixed point formulation is monotone, and the objects we want to compute are given by a specific fixed point, namely the least fixed point of the function. Many models and problems from a broad variety of areas can thus be expressed as least fixed point computation problems, including in particular the analysis of various probabilistic models, problems in stochastic optimization, and games. It turns out that for some classes of functions we can compute the least fixed point in polynomial time, and thus we can analyze several of these models efficiently; for others there are strong indications that they cannot be solved in polynomial time; and for yet others the question remains open. In this talk we will discuss progress in this area. The talk is based on a series of works with Kousha Etessami and Alistair Stewart.
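A textbook instance of this setup (our sketch, in the spirit of the probabilistic models mentioned above): the extinction probability of a branching process is the least fixed point of a monotone polynomial map, and Kleene iteration from 0 converges to it.

    # Offspring distribution: 0 children w.p. 1/4, 1 w.p. 1/4, 2 w.p. 1/2.
    # The extinction probability q is the least fixed point of
    # f(x) = 1/4 + x/4 + x^2/2, whose fixed points are 1/2 and 1.
    def f(x):
        return 0.25 + 0.25 * x + 0.5 * x * x

    x = 0.0                  # start below every fixed point
    for _ in range(100):     # Kleene iteration: x <- f(x)
        x = f(x)
    print(x)                 # -> 0.4999..., the least fixed point 1/2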


Stathis Zachos

Tribute to Alan Turing
A group of brilliant innovators, spanning three centuries, was concerned with the nature of human reason. This endeavor led to the all-purpose digital computer.
Except for Turing, none of them had any idea that his work might have such a tremendous application to our modern world. Leibniz saw far, but not that far. Boole could hardly have imagined that his algebra of logic would be used to design complex electric circuits. Frege would have been amazed to find equivalents of his logical rules incorporated into computer programs for carrying out deductions. Cantor certainly never anticipated the ramifications of his diagonal method. Hilbert's program to secure the foundations of mathematics was pointed in a very different direction. And Gödel, living his life of the mind, hardly thought of applications to mechanical devices. (M. Davis) However, Turing's great vision has now been realized.