Abstracts

Martin Fürer

Combinatorial Methods for the Graph Isomorphism Problem
The strongest successes towards polynomial-time isomorphism tests for significant classes of graphs come from group theory. Given the combinatorial flavor of the graph isomorphism problem, we are concerned here with the question of how much can be achieved in this area with combinatorial methods. We review positive and negative results.
The k-dimensional Weisfeiler-Lehman algorithm (k-dim W-L) has established itself as the ultimate combinatorial approach to the graph isomorphism problem. Many properties based on the neighborhoods of vertices have provided useful invariants distinguishing typical non-isomorphic graphs. Usually, k-dim W-L can easily be shown to provide at least the same benefits.
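For intuition, here is a minimal sketch of the 1-dimensional case, color refinement, on which the k-dimensional algorithm builds by coloring k-tuples of vertices rather than single vertices; the graph representation and iteration bound are our own illustrative choices, not part of the talk.

    # Color refinement (1-dim W-L), a minimal sketch.
    # graph: dict mapping each vertex to a list of its neighbors.
    def color_refinement(graph):
        color = {v: 0 for v in graph}          # start with a uniform coloring
        for _ in range(len(graph)):            # the partition stabilizes within n rounds
            # New signature of v: its old color plus the multiset of neighbor colors.
            signature = {v: (color[v], tuple(sorted(color[u] for u in graph[v])))
                         for v in graph}
            # Canonically rename the signatures to small integers.
            names = {s: i for i, s in enumerate(sorted(set(signature.values())))}
            new_color = {v: names[signature[v]] for v in graph}
            if new_color == color:             # stable partition reached
                break
            color = new_color
        return color

Two graphs whose stable colorings have different histograms are certainly non-isomorphic; equal histograms are inconclusive (for instance, 1-dim W-L cannot distinguish any two regular graphs of the same degree and size).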
Yet on the negative side, k-dim W-L is strictly weaker than the group theoretic approach. Nevertheless, from a practical point of view, k-dim W-L can speed up the group theoretic method significantly for typical graphs.
On the positive side, the 2-dimensional Weisfeiler-Lehman algorithm is able to replace the somewhat unnatural numerical approximations that come with a linear algebra approach to the graph isomorphism problem for graphs of bounded eigenvalue multiplicity. 2-dim W-L also provides invariants which are at least as strong as many spectral invariants based not just on eigenvalues, but also on angles among projections of standard basis vectors into eigenspaces.
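To fix notation (ours, not the speaker's): if the adjacency matrix has the spectral decomposition

    \[ A \;=\; \sum_{j} \lambda_j P_j , \qquad \alpha_{ij} \;=\; \| P_j e_i \| , \]

where P_j is the orthogonal projection onto the eigenspace of the eigenvalue \lambda_j and e_i is the i-th standard basis vector, then the \alpha_{ij} are the angles referred to above; the claim is that the invariants computed by 2-dim W-L are at least as strong as many invariants built from the eigenvalues \lambda_j together with the multiset of angles \alpha_{ij}.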


Shafi Goldwasser

Cryptography when Secrets are Hard to Keep
The absolute privacy of the secret keys associated with cryptographic algorithms has been the cornerstone of modern cryptography. Still, in practice, keys do get compromised at times, for a variety of reasons. A particularly disturbing loss of secrecy results from side-channel attacks. These attacks exploit the fact that every cryptographic algorithm is ultimately implemented on a physical device, and such implementations enable 'observations' to be made and measured on secret data and secret keys. Indeed, side-channel observations can lead to information leakage about secret keys.
In recent years, several theories of how to achieve provable security against large families of side-channel attacks have been proposed. Examples of such families include (1) the 'computation, but only computation' attacks, where one assumes that computation proceeds in steps and the only portions of secret memory that may leak during a computational step are those that affect its outcome, and (2) the 'memory attacks', which assume that any function of the entire secret memory may leak as long as its output is of bounded length, so that keys must be refreshed periodically.
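To make model (2) concrete, here is a minimal sketch of a bounded-length leakage oracle; the class name and interface are our own invention for illustration, not a construction from the talk.

    # Hypothetical 'memory attacks' leakage oracle: the adversary may request
    # arbitrary (efficiently computable) functions of the entire secret key,
    # but the total number of leaked bits is capped, after which the key
    # must be refreshed.
    class LeakageOracle:
        def __init__(self, secret_key: int, bit_budget: int):
            self.secret_key = secret_key
            self.bit_budget = bit_budget       # total bits the adversary may learn

        def leak(self, f, output_bits: int) -> int:
            if output_bits > self.bit_budget:
                raise RuntimeError("leakage budget exhausted; refresh the key")
            self.bit_budget -= output_bits
            # Truncate f's output to the declared length.
            return f(self.secret_key) & ((1 << output_bits) - 1)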
Provably secure cryptographic constructions, under standard cryptographic intractability assumptions, have been proposed for the above attack models. These include constructions of encryption schemes, digital signature schemes, pseudorandom number generators, and even general compilers of cryptographic algorithms.
The talk will survey highlights of these developments.

This talk will be part of the CS colloquium and will take place in room 9204/9205.


Neil Immerman

A Simple Inductive Synthesis Methodology and its Applications
Given a high-level specification and a low-level programming language, our goal is to automatically synthesize an efficient program that meets the specification. We present a new algorithmic methodology for inductive synthesis that allows us to do this. We use second-order logic as our generic high-level specification logic. For our low-level languages we choose small application-specific logics that can be immediately translated into code that runs in expected linear time in the worst case. We explain our methodology and provide examples of the synthesis of several graph classifiers, e.g., linear-time tests of whether the input graph is connected, acyclic, etc. In another set of applications we automatically derive many finite-differencing expressions equivalent to ones that Bob Paige built by hand in his thesis. Finally, we describe directions for automatically combining such automatically generated building blocks to synthesize efficient code implementing more complicated specifications.
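As a taste of the target code, here is a hand-written linear-time connectivity test of the kind the methodology synthesizes from a second-order specification such as 'every pair of vertices is joined by a path'; this is our illustrative sketch, not the synthesized output.

    # Linear-time (O(V + E)) connectivity check via breadth-first search.
    from collections import deque

    def is_connected(graph):
        # graph: dict mapping each vertex to a list of its neighbors.
        if not graph:
            return True
        start = next(iter(graph))
        seen = {start}
        queue = deque([start])
        while queue:
            v = queue.popleft()
            for u in graph[v]:
                if u not in seen:
                    seen.add(u)
                    queue.append(u)
        return len(seen) == len(graph)   # connected iff BFS reaches every vertex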

This is joint work with Shachar Itzhaky, Sumit Gulwani, and Mooly Sagiv.


Dexter Kozen

Eliminating Hypotheses
The validity problem for certain Horn formulas of Kleene algebra with tests (KAT) can be efficiently reduced to the validity of equations. This reduction is known as elimination of hypotheses. Hypotheses are used to describe the interaction of atomic programs and tests and are an essential component of practical program verification with KAT. The ability to eliminate hypotheses of a certain form means that the Horn theory with premises of that form remains decidable in PSPACE. Hypothesis elimination for KA and KAT has been previously studied in (Cohen 1994, Kozen and Smith 1996, Kozen 1997, Hardin and Kozen 2002). In this talk I will describe the utility of these constructions and explicate the algebraic principles behind them.
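As one representative example, stated here from the cited papers in our own notation: hypotheses of the form r = 0 can be folded into the equation itself,

    \[ \models\; r = 0 \rightarrow p = q
       \quad\Longleftrightarrow\quad
       \models\; p + uru = q + uru , \]

where u denotes the universal expression (a_1 + \cdots + a_n)^* over the atomic actions. Validity of the Horn formula thus reduces to validity of a single equation, which keeps the problem decidable in PSPACE.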


Ken McAloon

Report from the Front: Confronting NP-hard problems in practice
TBA


Milena Mihail

High Dimensional Generalizations of Erdős-Rényi Random Graphs
Complex networks arising from real data have properties distinct from classical random graphs (and their generalizations to skewed degree distributions). These properties include cut sparsity and hierarchy. At the same time, categorical attributes abound in real datasets. We therefore focus on random graphs whose nodes are vectors (one dimension for each relevant attribute), and where the connection probability between two nodes is a function of the vectors at the corresponding endpoints.
For Random Inner Product Graphs and Stochastic Kronecker Graphs, the connection probability between two nodes can be thought of as expressing endpoint similarity. We obtain upper and lower bounds on the second eigenvalue of the normalized Laplacian of Random Inner Product Graphs and Stochastic Kronecker Graphs. Our bounds are expressed in terms of the critical parameters of each model, and they differ from the corresponding bounds for classical random graphs of the same density and expected degrees.
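A minimal generative sketch of the Random Inner Product Graph model may help; the vector distribution and the scaling parameter below are our own illustrative assumptions.

    # Random Inner Product Graph, illustrative sketch: each node carries a
    # nonnegative attribute vector, and edge {i, j} appears independently
    # with probability proportional to the inner product <x_i, x_j>.
    import random

    def random_inner_product_graph(vectors, scale=1.0):
        n = len(vectors)
        edges = set()
        for i in range(n):
            for j in range(i + 1, n):
                p = min(1.0, scale * sum(a * b for a, b in zip(vectors[i], vectors[j])))
                if random.random() < p:
                    edges.add((i, j))
        return edges

    # Example: 100 nodes with 3 attribute dimensions, entries uniform in [0, 1].
    vectors = [[random.random() for _ in range(3)] for _ in range(100)]
    G = random_inner_product_graph(vectors, scale=0.1)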
We also note that Inhomogeneous Random Graphs (of which Random Inner Product Graphs are a special case) can capture hierarchy, under suitable setting of parameters.
The above results hold for graphs with minimum average degree of order at least log n, where n is the number of nodes, in analogy to the Erdős-Rényi model G(n,p).


Mihalis Yannakakis

Computational Aspects of Equilibria
Many models from a variety of areas involve the computation of an equilibrium or fixed point of some kind. Examples include Nash equilibria in games; price equilibria in markets; optimal strategies and the values of competitive games (stochastic and other games); stable configurations of neural networks; analysis of basic stochastic models for evolution like branching processes and for language like stochastic context-free grammars; and models that incorporate the basic primitives of probability and recursion like recursive Markov chains. It is not known whether these problems can be solved in polynomial time. Despite their broad diversity, there are certain common computational principles that underlie different types of equilibria and connect many of these problems to each other. In this talk we will discuss some of these common principles, the corresponding complexity classes that capture them, and their relationship with other open questions in computation.

