| Jan 21
|Michael Mitzenmacher, Harvard Univ.||Geometrical and Multi-dimensional Generalizations of the Power of Two Choices
In this talk I present some new load-balancing results under the paradigm of the ``Power of Two Choices''. In standard analysis, such as for hash tables, load balancing is modeled as a balls-and-bins problem: items (balls) are thrown into locations (bins) uniformly at random. The power of two choices paradigm allows each ball two (or more) choices of bins, each taken uniformly at random, which is known to greatly improve the balance of the load. In peer-to-peer and sensor networks, bins may correspond to spaces defined by the underlying geometry that may not be equal in size; locations therefore cannot be modeled as being chosen uniformly at random. We therefore consider geometrical generalizations of the power of two choices. For search engines, we are concerned with word frequencies; the goal is to keep the number of documents with a given word stored at each server balanced simultaneously for all (sufficiently frequent) words. In this case, balls correspond to documents, but now a ball is really a vector (or word list) instead of a single item. We therefore consider multi-dimensional generalizations of the power of two choices.
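The basic two-choice scheme is easy to simulate. The following sketch (function names are illustrative, not from the talk) compares the maximum bin load when each ball takes one versus two uniform choices:

```python
import random

def throw_balls(n_balls, n_bins, choices, rng):
    """Throw balls into bins; each ball draws `choices` candidate bins
    uniformly at random and goes into the least loaded of them."""
    loads = [0] * n_bins
    for _ in range(n_balls):
        picks = [rng.randrange(n_bins) for _ in range(choices)]
        loads[min(picks, key=loads.__getitem__)] += 1
    return loads

n = 100_000
rng = random.Random(0)
print("1 choice,  max load:", max(throw_balls(n, n, 1, rng)))
print("2 choices, max load:", max(throw_balls(n, n, 2, rng)))
```

With one choice the maximum load of n balls in n bins grows like Theta(log n / log log n); with two choices it drops to log log n / log 2 + O(1), which is the classical "power of two choices" effect.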
| Dec 5
|Dror Weitz, IAS||Counting Down the Tree
We present a novel tree representation for the hard-core lattice gas model (independent sets) on a general graph. We use this representation to show that for any graph of maximum degree \Delta, the Gibbs measure is unique (the influence of any boundary condition decays with distance) provided that the activity parameter \lambda < \lambda_c, where \lambda_c is the critical activity for the regular tree of degree \Delta. This resolves an open conjecture in statistical physics. Also, since \lambda_c is known, this extends the known uniqueness regime for many interesting graphs, including the square integer lattice Z^2. Our proof is algorithmic in nature, consisting of an elegant recursive procedure for calculating the probabilities that a given vertex is occupied. This procedure yields an efficient deterministic approximation scheme for counting independent sets (in other words, for calculating the partition function) of any graph of maximum degree \Delta in the above regime of \lambda. This extends the regime of \lambda for which an efficient approximation scheme is known to exist, and includes the interesting case of \lambda=1 (all independent sets are equally weighted) and maximum degree \Delta=5.
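For the special case where the graph actually is a tree, the recursive calculation of occupation probabilities can be sketched as follows (the talk's contribution is reducing a general graph to such a tree computation; this fragment, with illustrative names, shows only the standard tree recursion):

```python
def occupation_ratio(children, lam):
    """R_v = P(v occupied) / P(v unoccupied) for the root of a tree in the
    hard-core model with activity lam.  A tree is represented as the list
    of its root's subtrees, so a leaf is the empty list.  The recursion is
    R_v = lam * prod over children u of 1 / (1 + R_u)."""
    r = lam
    for child in children:
        r /= 1.0 + occupation_ratio(child, lam)
    return r

def occupation_prob(tree, lam):
    """P(root occupied) = R / (1 + R)."""
    r = occupation_ratio(tree, lam)
    return r / (1.0 + r)

# Path a-b-c rooted at b (two leaf children).  At lam = 1 the independent
# sets are {}, {a}, {b}, {c}, {a,c}, so P(b occupied) = 1/5.
print(occupation_prob([[], []], 1.0))  # 0.2
```

Summing the recursion over all vertices is what turns such marginal calculations into an approximation of the partition function.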
| Dec 2
|Elitza Maneva, UC Berkeley||Probabilistic Inference Heuristics for Satisfiability
The known NP-hardness results imply that for most combinatorial optimization problems there are no efficient algorithms that find an optimal, or even a near-optimal, solution on every instance. A heuristic for an NP-hard problem is a polynomial-time algorithm that produces such solutions on some input instances, but may fail on others. One of the existing methods for evaluating heuristics is to study their performance on inputs coming from a particular distribution. I will talk about heuristics based on probabilistic inference, which have recently been applied to Boolean satisfiability problems and exhibit unprecedented success on instances drawn from the uniform distribution over formulas with a fixed ratio of clauses to variables. I will show how intuition about the structure of the space of solutions of such formulas influences the design of these heuristics.
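As a concrete illustration of the input distribution mentioned above (not of the inference heuristics themselves, which are substantially more involved), here is a sketch that samples a random 3-CNF formula at a fixed clause-to-variable ratio and applies unit propagation, the most basic propagation step; all names are my own choices:

```python
import random

def random_ksat(n_vars, ratio, k=3, seed=0):
    """Sample a formula with m = ratio * n_vars clauses, each over k
    distinct variables, each literal negated with probability 1/2.
    A literal is +v or -v for a variable v in 1..n_vars."""
    rng = random.Random(seed)
    m = int(ratio * n_vars)
    formula = []
    for _ in range(m):
        vs = rng.sample(range(1, n_vars + 1), k)
        formula.append([v if rng.random() < 0.5 else -v for v in vs])
    return formula

def unit_propagate(formula, assignment):
    """Repeatedly satisfy unit clauses; return the extended assignment
    (a dict variable -> bool) or None if a clause is falsified."""
    assignment = dict(assignment)
    changed = True
    while changed:
        changed = False
        for clause in formula:
            unassigned, satisfied = [], False
            for lit in clause:
                v = abs(lit)
                if v in assignment:
                    if assignment[v] == (lit > 0):
                        satisfied = True
                        break
                else:
                    unassigned.append(lit)
            if satisfied:
                continue
            if not unassigned:
                return None              # conflict: clause falsified
            if len(unassigned) == 1:
                lit = unassigned[0]      # unit clause: forced value
                assignment[abs(lit)] = lit > 0
                changed = True
    return assignment
```

The heuristics discussed in the talk replace this hard propagation with probabilistic message passing over the formula's factor graph, but the decimation loop (infer, fix a variable, simplify) has the same overall shape.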
| Nov 7
|Juan Vera, CMU||Variations of the preferential attachment model
The preferential attachment graph is a dynamically evolving random graph. At each time step a new node is added. This new node chooses, at random, m existing nodes to connect to. For every existing node the probability of being chosen is proportional to its degree.
This random graph was proposed as a model for large networks, and in particular for the web, by Barabási and Albert (1999), and it has been intensively studied since then. We study variations and extensions of the basic model, such as the influence of random deletions of edges and vertices, the influence of deletions made by an attacker, and the influence of search engines.
This is joint work with C. Cooper, A. Flaxman and A. Frieze.
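The basic model described above can be sketched in a few lines; this is one common formulation (the choice of seed graph and the handling of repeated choices vary across the literature, and the names here are illustrative). Degree-proportional sampling is implemented by keeping every edge endpoint in a flat list and picking a uniform element of it:

```python
import random

def preferential_attachment(n, m, seed=0):
    """Grow a preferential-attachment graph on nodes 0..n-1.  Each new
    node t attaches edges to min(m, t) distinct existing nodes, each
    chosen with probability proportional to its current degree."""
    rng = random.Random(seed)
    edges = [(0, 1)]        # seed graph: a single edge
    endpoints = [0, 1]      # node i appears here deg(i) times
    for t in range(2, n):
        chosen = set()
        while len(chosen) < min(m, t):
            # uniform pick from the endpoint list = degree-biased pick
            chosen.add(rng.choice(endpoints))
        for u in chosen:
            edges.append((t, u))
            endpoints += [t, u]
    return edges
```

This endpoint-list trick avoids recomputing the degree distribution at every step, so building the graph takes time linear in the number of edges.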
| Nov 1
|Leonard Schulman, Caltech||Error-Correcting Codes for Automatic Control
In many control-theory applications one can classify all possible states of the device by an infinite state graph with polynomially growing expansion. In order for a controller to control or estimate the state of such a device, it must receive reliable communications from its sensors; if there is channel noise, the encoding task is subject to a stringent real-time constraint. We give a constructive on-line error-correcting code that works for this class of applications. Our code is computationally efficient and enables on-line estimation and control in the presence of channel noise. It establishes a constructive (and optimal-within-constants) analog, for control applications, of the Shannon coding theorem.
Joint work with Rafail Ostrovsky (UCLA) and Yuval Rabani (Technion).