With my students, I do research in the areas of robotics and computer vision, which present some of the most exciting challenges to anyone interested in artificial intelligence. I am especially keen on Bayesian inference approaches to the difficult inverse problems that keep popping up in these areas. In many cases, exact solutions to these problems are intractable, and as such we are interested in examining whether Monte Carlo (sampling-based) approximations are applicable in those cases. We think so.
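The idea above can be illustrated with a minimal sketch (not taken from any of the publications here): a toy inverse problem where we observe y = x^2 plus Gaussian noise, place a standard normal prior on x, and approximate a posterior expectation by importance sampling with the prior as the proposal. The model, noise level, and sample count are all illustrative assumptions.

```python
import math
import random

random.seed(0)

def likelihood(y, x, sigma=0.1):
    """Gaussian likelihood p(y | x) for the toy model y = x^2 + noise."""
    return math.exp(-0.5 * ((y - x * x) / sigma) ** 2)

def posterior_mean_mc(y, n_samples=20000):
    """Importance-sampling estimate of E[x^2 | y].

    Samples are drawn from the prior N(0, 1); each sample is weighted
    by its likelihood, and the weighted average approximates the
    posterior expectation without ever computing the (intractable)
    normalizing constant.
    """
    xs = [random.gauss(0.0, 1.0) for _ in range(n_samples)]
    ws = [likelihood(y, x) for x in xs]
    total = sum(ws)
    return sum(w * x * x for w, x in zip(ws, xs)) / total

# For an observation y = 0.5, the posterior mean of x^2 should be near 0.5.
est = posterior_mean_mc(0.5)
```

Note that the posterior over x itself is bimodal (x near plus or minus sqrt(y)), which is exactly the kind of ambiguity that makes closed-form solutions awkward and sampling attractive.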
Since coming to Georgia Tech I have explored the theme of probabilistic, model-based reasoning paired with randomized approximation methods in three main research areas.
An overview of some recently published research results follows below.
Recent:
- See 2004 Research Highlights
- ICRA 2003
- ICRA 2002
- Linear 2D Localization and Mapping for Single and Multiple Robots
- PhD Thesis
- NIPS 2001
- CVPR 2000
- ICRA 1999
- WACV 1998
- Tutorial
For the complete picture, see my publications page.
For an out-of-date "pretty pictures" index, see the BORG Research page.
My scientific interests are driven by the vision that new ways of computing will enable us to tackle problems of unprecedented scale in the coming decades. Many important open problems hinge on our ability to make sense of vast amounts of data, generated by an explosively growing number of digital interfaces to the physical world. I am primarily driven by such problems in the area of robotics and especially computer vision, as cameras are by far the highest bandwidth sensors that interface robots and computers to the real world. And in computer vision and robotics, I am especially attracted to problems where there is more of everything: more robots, more cameras, more world to interface with. How can we make sense of a year's worth of recorded behavior inside a beehive? How could we model the evolution of a city from tens of thousands of historical images? What if we insert one hundred robots in an unknown environment with the goal to model it? Increasing computing power is inadequate by itself to tackle questions of this magnitude. The answer lies in the development of new computing paradigms that are especially attuned to deal with problems of this nature.
In light of this vision, my research focuses on developing computationally efficient algorithms to construct models of physical phenomena from massive amounts of noisy, ambiguous data. It is my firm belief that this should be done by building algorithms on a strong theoretical foundation, where explicit assumptions and approximations guide the search for efficiency. In my view, the most fruitful theoretical framework in which to view problems of this type is that of probability theory, in order to deal with the imperfect nature of the data. However, data does not exist in a vacuum: there is considerable expert domain knowledge that provides a context for how the data came into existence. Thus, a key to efficient algorithms is the development of representations that exploit this knowledge. Finally, whereas exact probabilistic reasoning is often prohibitively expensive, one can devise theoretically sound approximation algorithms that come at a fraction of the cost. Hence, my research deals with finding novel solutions along these three directions.
Frank Dellaert's Homepage