Date: May 23, 2003.

Time: 1:00-3:00pm.

Room: CoC 358

In this talk I will present the basics of lattices and some of the motivation behind studying algorithmic problems related to them: factoring polynomials over the integers and Diophantine approximation. Then I will present the LLL algorithm for finding a short vector in a given lattice and show how it is applied to the above problems. If time permits, I will outline Schnorr's algorithm.
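
The LLL algorithm is short enough to sketch. The following is a minimal, unoptimized Python illustration (not the speaker's own code) using exact rational arithmetic and the standard parameter delta = 3/4; it recomputes the Gram-Schmidt data from scratch at each step for clarity rather than efficiency.

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(basis):
    """Gram-Schmidt orthogonalization: returns the b*_i and the mu[i][j]."""
    ortho, mu = [], []
    for i, b in enumerate(basis):
        v = list(b)
        row = []
        for j in range(i):
            m = dot(b, ortho[j]) / dot(ortho[j], ortho[j])
            row.append(m)
            v = [vi - m * oj for vi, oj in zip(v, ortho[j])]
        ortho.append(v)
        mu.append(row)
    return ortho, mu

def lll_reduce(basis, delta=Fraction(3, 4)):
    """LLL-reduce an integer basis; the first output vector is short
    (within a factor 2^((n-1)/2) of the shortest lattice vector)."""
    B = [[Fraction(x) for x in b] for b in basis]
    n, k = len(B), 1
    while k < n:
        # Size-reduce b_k against b_{k-1}, ..., b_0.
        for j in range(k - 1, -1, -1):
            _, mu = gram_schmidt(B)
            q = round(mu[k][j])
            if q:
                B[k] = [bk - q * bj for bk, bj in zip(B[k], B[j])]
        ortho, mu = gram_schmidt(B)
        # Lovasz condition: either accept b_k or swap it with b_{k-1}.
        if dot(ortho[k], ortho[k]) >= (delta - mu[k][k - 1] ** 2) * dot(ortho[k - 1], ortho[k - 1]):
            k += 1
        else:
            B[k - 1], B[k] = B[k], B[k - 1]
            k = max(k - 1, 1)
    return [[int(x) for x in b] for b in B]
```

All intermediate bases remain integral, since the algorithm only subtracts integer multiples of basis vectors and swaps them.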

Date: May 30, 2003.

Time: 1:00-3:00pm.

Room: CoC 358

The SVP is the problem of finding the shortest non-zero vector in a lattice. In this talk I will show that the SVP is NP-hard (under randomized reductions) to approximate within any factor less than 2^(1/p) in the l_p norm (p >= 1). In particular, the SVP is NP-hard to approximate within any factor less than sqrt(2) in the l_2 norm. The proof will involve the problem of packing lattice points in a sphere, and a combinatorial theorem on low-degree hypergraphs, which might be of independent interest. The talk will be mostly based on the PhD thesis of D. Micciancio.

Date: June 6, 2003.

Time: 1:00-3:00pm.

Room: CoC 358

This talk is based on Regev's paper "New Lattice Based Cryptographic Constructions," to appear in STOC '03. I will present the construction and analysis of Regev's new public-key cryptosystem, which is based on the *worst-case* hardness of the unique shortest vector problem in lattices. Regev's cryptosystem significantly improves on the earlier Ajtai-Dwork cryptosystem in terms of security, and its construction is much simpler. I will also mention other results in the paper. The talk will be self-contained; the necessary background in cryptography and lattices will be briefly discussed.

Date: June 23, 2003.

Time: 2:30-4:30pm.

Room: CoC 358

In this talk I will complete the presentation of Regev's paper, begun by Yan. The main emphasis will be on harmonic analysis on lattices. This will be used to show how finding the shortest vector in an n^{1.5}-unique lattice (a lattice in which all vectors not parallel to the shortest vector have length at least n^{1.5} times that of the shortest vector) reduces to distinguishing two distributions on [0,1): one uniform, the other a "spiky periodic Gaussian." Recall that distinguishing between these distributions reduces to breaking Regev's cryptosystem, as presented by Yan. Hence this will establish a cryptosystem whose security is based on the worst-case hardness of the n^{1.5}-unique shortest vector problem.
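
To fix intuition for the distinguishing problem, here is a toy Python illustration (not Regev's actual reduction): a "spiky periodic Gaussian" on [0,1) modeled as narrow Gaussian noise around a random multiple of 1/d, versus the uniform distribution. The parameters d and sigma are illustrative only. A simple statistic, the average distance to the nearest spike center, separates the two.

```python
import random

def spiky_sample(rng, d=8, sigma=0.01):
    """One sample from a toy 'spiky periodic Gaussian' on [0,1):
    a random spike center k/d plus narrow Gaussian noise, reduced mod 1."""
    return (rng.randrange(d) / d + rng.gauss(0.0, sigma)) % 1.0

def mean_spike_distance(samples, d=8):
    """Average distance from each sample to the nearest multiple of 1/d.
    Roughly sigma * sqrt(2/pi) for the spiky distribution, but about
    1/(4d) for the uniform distribution."""
    return sum(abs(x * d - round(x * d)) / d for x in samples) / len(samples)
```

With d = 8 and sigma = 0.01 the spiky samples sit within about 0.01 of a spike center, while uniform samples average about 0.031 away, so a threshold between the two distinguishes the distributions with high confidence given a few thousand samples.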

Date: June 27, 2003.

Time: 1:00-3:00pm.

Room: CoC 358

The best known approximation algorithm for the shortest vector problem achieves a factor of about 2^{O(n)}, while the best known hardness result is sqrt(2) (for the l_2 norm). Goldreich and Goldwasser proved the following intriguing result: if the shortest vector is NP-hard to approximate within a factor of O(sqrt(n/log n)), then the polynomial hierarchy collapses (which is believed to be false). (Lagarias, Lenstra and Schnorr proved a similar result before them, with a factor of n rather than O(sqrt(n/log n)).) Jin-Yi Cai proved a similar result for the "unique" shortest vector problem: more precisely, finding the shortest vector in a lattice that is n^{0.25}-unique cannot be NP-hard unless the polynomial hierarchy collapses. In this talk I will present these remarkable results (of LLS, GG, and Cai) which, under standard complexity assumptions, establish the impossibility of NP-hardness for these lattice problems.

Date: July 1, 2003.

Time: 2:30-4:30pm.

Room: CoC 358

In this talk I will present "Extractors: Optimal Up to Constant Factors" by Lu, Reingold, Vadhan, and Wigderson, which appeared in STOC '03. The LRVW extractor is the first to achieve optimality up to constant factors in both seed length and output length simultaneously, for arbitrary min-entropy, after a decade of intense effort toward this goal. I will present the construction and analysis of the LRVW extractor.

Date: July 16, 2003.

Time: 3:00-5:00pm.

Room: CoC 354

I will present results from a paper by Luca Trevisan to be presented at the next FOCS, titled "List Decoding using the XOR Lemma". In it, Trevisan points out a relationship between direct product (or XOR) lemmas and list decoding. In particular, he shows that given a direct product lemma, one can convert a uniquely decodable code into a code that is list decodable (for large errors); the parameters of the resulting code depend on the parameters of the direct product lemma. Trevisan also proves direct product lemmas with better parameters than previously known, yielding good list-decodable codes. The presentation will be completely self-contained.

Date: July 18, 2003.

Time: 3:15-5:30pm.

Room: CoC 358

I will present results from a paper by Fischer, Kindler, Ron, Safra and Samorodnitsky which appeared in FOCS 2002. The authors give a series of algorithms for testing whether a given Boolean function on n variables is a k-junta (i.e., depends on only k variables, where k is much smaller than n). Given oracle access to f, the algorithms make a number of queries polynomial in k and 1/epsilon, where epsilon is an approximation parameter. I will also present a lower bound of sqrt(k) on the number of queries required by non-adaptive algorithms.
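
To fix ideas, here is a naive baseline in Python (not the paper's tester): it checks each coordinate for relevance by flipping it on random inputs. This makes O(n * trials) queries, far more than the poly(k, 1/epsilon) bound the talk is about, and has one-sided error (a flagged variable is certainly relevant; an unflagged one may have small but non-zero influence).

```python
import random

def influences(f, n, i, trials=200, seed=0):
    """Toy relevance check: does flipping coordinate i ever change f
    on randomly sampled inputs? One-sided error."""
    rng = random.Random(seed)
    for _ in range(trials):
        x = [rng.randrange(2) for _ in range(n)]
        y = list(x)
        y[i] ^= 1  # flip coordinate i
        if f(x) != f(y):
            return True
    return False

def relevant_vars(f, n):
    """All coordinates the check flags as relevant; if f is a k-junta,
    at most k coordinates can ever be flagged."""
    return [i for i in range(n) if influences(f, n, i)]
```

The point of the FKRSS algorithms is precisely to avoid this per-coordinate scan, grouping variables so that the query count depends on k and epsilon but not on n.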

Date: September 8, 2003.

Time: 12:00-1:00pm.

Room: CoC 358

An important property that an auction may be required to have is truthfulness: a bidder should have no benefit from lying about his or her utility for the items being auctioned. Digital goods auctions are those in which we have sufficient copies of one item (e.g., software, music files) that any number of bidders can be sold a copy, if required. We prove the equivalence of two definitions of truthfulness in the setting of randomized auctions for digital goods: truthfulness in expectation and truthfulness everywhere. The second definition seems stronger than the first, and indeed it has previously been assumed to be so. This is joint work with Vijay Vazirani.

Date: September 12, 2003.

Time: 12:00-1:00pm.

Room: CoC 109

This will be a continuation of the talk I'll be giving on Thursday. I'll present proofs of some things that I'll be stating without proof on Thursday. Also, there's a lot of fascinating material that there simply isn't room for on Thursday. I'll be selecting various topics from the following papers (all of which are related to Thursday's talk, too):

* Derandomization and Distinguishing Complexity, CCC 2003
* Power from Random Strings, FOCS 2002
* When Worlds Collide: Derandomization, Lower Bounds, and K-Complexity, FSTTCS 2001
* What can be efficiently reduced to the K-random strings?, in preparation

If you won't be able to attend Thursday's talk, fear not! I'll try to make this talk relatively self-contained. I'm also open to the idea of having this talk be driven by questions and comments from the audience, going into more depth than I'll do in the general talk on Thursday.

Date: September 19, 2003.

Time: 12:00-1:00pm.

Room: CoC 109

In this talk I will present some recent results on the degree of symmetric polynomials representing Boolean functions over Z_m. The motivation for this problem comes from a result of Barrington et al., who prove that OR can be represented over Z_6 by polynomials of degree O(sqrt(n)), while over Z_p the required degree is Omega(n). Using a classical result called Lucas' theorem, we show that over Z_m the existence of low-degree symmetric polynomials representing a function is equivalent to the existence of certain simultaneous communication protocols. For example, over Z_6, functions represented by low-degree symmetric polynomials correspond to functions computable by two-player protocols in which Player 1 and Player 2 are given only a few of the lower-order digits of the weight in base 2 and base 3, respectively. This equivalence allows us to use techniques from communication complexity and number theory to prove bounds on the degree. We show Omega(n) bounds for symmetric polynomials (weakly) representing Mod_r over Z_{pq}. We also show an interesting connection (going both ways) between the degree of symmetric polynomials representing Threshold and the number of solutions to certain classes of Diophantine equations. This is joint work with Parikshit Gopalan and Richard J. Lipton.
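
Lucas' theorem itself is easy to state computationally: C(n, k) mod p factors digit-by-digit in base p. A short Python sketch (illustrative, not from the paper):

```python
from math import comb

def binom_mod_p(n, k, p):
    """Lucas' theorem: C(n, k) mod p equals the product of C(n_i, k_i) mod p
    over the base-p digits n_i of n and k_i of k (p prime)."""
    result = 1
    while n or k:
        n_digit, k_digit = n % p, k % p
        if k_digit > n_digit:
            return 0  # some base-p digit of k exceeds the digit of n
        result = result * comb(n_digit, k_digit) % p
        n //= p
        k //= p
    return result
```

This is why only the low-order base-p digits of the Hamming weight matter for symmetric polynomials over Z_p, which is the starting point of the communication-protocol equivalence described above.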

Date: September 26, 2003.

Time: 12:00-1:00pm.

Room: CoC 109

It has been observed that the degree sequences of several communication networks, including the Internet and the WWW, follow heavy-tailed statistics. What is the impact of such heavy-tailed statistics on the performance of basic communication tasks that a network is presumed to support? How does performance scale with size for such networks? We study routing, searching, and game-theoretic questions in families of sparse random graphs whose degrees follow such distributions, by establishing strong conductance and spectral gap bounds. Joint work with C. Gkantsidis, C. Papadimitriou, and A. Saberi.

Date: 3rd October, 2003.

Time: 12:00-1:00pm.

Room: CoC 109

I will prove some of the results that Milena mentioned last week. In particular, I will prove that random graphs in the preferential attachment model have constant conductance, and hence have worst-case routing congestion that scales logarithmically with the number of nodes. I will also talk about frugality (overpayment in the Vickrey-Clarke-Groves mechanism for shortest paths) and prove that the expected frugality of a random graph is bounded by a small constant. Joint work with Milena Mihail and Christos Papadimitriou.
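
For readers unfamiliar with the model, here is a toy Python generator for preferential attachment graphs (illustrative only; the parameter m and the clique-seeded start are one common convention, not necessarily the talk's): each arriving node attaches up to m edges to existing nodes chosen with probability proportional to current degree.

```python
import random

def preferential_attachment(n, m=2, seed=0):
    """Toy preferential attachment graph on n nodes: start from a clique
    on m+1 nodes; each new node picks up to m neighbors, each chosen with
    probability proportional to its current degree."""
    rng = random.Random(seed)
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # Multiset of edge endpoints: each vertex appears once per incident
    # edge, so a uniform choice from it is degree-proportional.
    endpoints = [v for e in edges for v in e]
    for v in range(m + 1, n):
        targets = {rng.choice(endpoints) for _ in range(m)}  # dedup repeats
        for u in targets:
            edges.append((v, u))
            endpoints += [v, u]
    return edges
```

The endpoint-multiset trick is the standard way to sample degree-proportionally in O(1) per draw; early nodes accumulate high degree, producing the heavy-tailed degree sequence discussed last week.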

There is no seminar on October 10th due to FOCS 2003.

Date: 17th October, 2003.

Time: 12:00-1:00pm.

Room: CoC 109

We analyze incentive compatible (truthful) mechanisms over restricted domains of preferences, the leading example being combinatorial auctions. Our work generalizes the characterization of Roberts (1979), who showed that truthful mechanisms over unrestricted domains are exactly the affine maximizers (weighted VCG mechanisms).

Date: 5th December, 2003.

Time: 12:00-1:00pm.

Room: CoC 109

Graph reachability problems are central to the study of space-bounded complexity classes. The undirected version of the problem is better understood than the directed version, where not much is known about time-space tradeoffs or randomized algorithms. We present a spectrum of randomized time-space tradeoffs for solving directed graph connectivity in small space. We use a strategy, parametrized by a number k, that uses k pebbles to perform several short random walks. The spectrum ranges from log n to (log n)^2 in space, and from 2^poly(n) to 2^polylog(n) in time. Our approach allows us to view Savitch's recursive-doubling algorithm and Gill's random-walk-with-restarts algorithm as two extremes of the same divide-and-conquer strategy. This is joint work with Dick Lipton and Aranyak Mehta.
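
The Gill-style endpoint of the spectrum is simple to sketch. Below is a toy Python version of a random walk with restarts for directed reachability (illustrative only; it is not the talk's k-pebble algorithm, and the step budget and restart period are arbitrary choices). It uses only O(log n) bits of state beyond the input: the current vertex and a step counter.

```python
import random

def random_walk_reach(adj, s, t, steps=20000, restart_every=100, seed=0):
    """Randomized directed reachability via a random walk from s, restarting
    at s periodically and at dead ends. One-sided error: True is always
    correct; False may be wrong if the step budget is too small."""
    rng = random.Random(seed)
    v = s
    for i in range(1, steps + 1):
        if v == t:
            return True
        nbrs = adj.get(v, [])
        v = rng.choice(nbrs) if nbrs else s  # restart at dead ends
        if i % restart_every == 0:
            v = s  # periodic restart
    return v == t
```

Directed walks can get trapped far from t, which is why the worst-case time is exponential; the talk's pebbling strategy interpolates between this extreme and Savitch's recursion by spending more space to shorten the walks.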