Bo
Dai

General Information

Email:
bodai@cc.gatech.edu
Phone:
404-385-4785
Location - Building:
Coda
Location - Room:
E1342A
Roles:
Professor
Primary Unit:
School of Computational Science and Engineering

Details

Degrees with subject and Postdoc Experience:
Ph.D., Computational Science and Engineering, Georgia Institute of Technology, Atlanta, 2018
M.S., Computer Science and Statistics, Purdue University, West Lafayette, 2013
M.S., Computer Science, Chinese Academy of Sciences, Beijing, China, 2011
B.S., Computer Science, Nanjing University, Nanjing, China, 2008
Statement of Research Interests:

Dr. Dai's principal research interests lie in agent AI built upon generative models, representation learning, and reinforcement learning, with the aim of creating agents that can make decisions and plan by modeling the world.

Statement of Teaching Interests:

To present a coherent big picture of machine learning and situate recent progress within it, Dr. Dai has redesigned the machine learning courses at Georgia Tech, including CX 4240, CS 4641, and CSE 6243.

Selection of recent research, scholarly, and creative activities:

Representation Learning via Non-Contrastive Mutual Information 
Daniel Guo, Bernardo Avila Pires, Khimya Khetarpal, Dale Schuurmans, Bo Dai 
International Conference on Artificial Intelligence and Statistics (AISTATS) 2026. Spotlight 
[ Paper ]

MLE-Smith: Scaling MLE Tasks with Automated Multi-agent Pipeline 
Rushi Qiang, Yuchen Zhuang, Anikait Singh, Percy Liang, Chao Zhang, Sherry Yang, Bo Dai 
International Conference on Learning Representations (ICLR) 2026. 
[ Paper ]

Spectral Bellman Method: Unifying Representation and Exploration in RL 
Ofir Nabati, Shie Mannor, Bo Dai, Guy Tennenholtz 
International Conference on Learning Representations (ICLR) 2026. 
[ Paper ]

AmorLIP: Efficient Language-Image Pretraining via Amortization 
Haotian Sun, Yitong Li, Yuchen Zhuang, Niao He, Hanjun Dai, Bo Dai 
Advances in Neural Information Processing Systems (NeurIPS) 2025. 
[ Paper ]

MLE-Dojo: Interactive Environments for Empowering LLM Agents in Machine Learning Engineering 
Rushi Qiang, Yuchen Zhuang, Yinghao Li, Dingu Sagar, Rongzhi Zhang, ChangHao Li, Ian Shu-Hei Wong, Sherry Yang, Percy Liang, Chao Zhang, Bo Dai 
Advances in Neural Information Processing Systems (NeurIPS) 2025. 
[ Webpage ][ Leaderboard ][ Paper ]

Exploration from a Primal-Dual Lens: Value-Incentivized Actor-Critic Methods for Sample-Efficient Online RL 
Tong Yang, Bo Dai, Lin Xiao, Yuejie Chi 
Advances in Neural Information Processing Systems (NeurIPS) 2025. 
[ Paper ]

Matryoshka Pilot: Learning to Drive Black-Box LLMs with LLMs 
ChangHao Li, Yuchen Zhuang, Rushi Qiang, Haotian Sun, Hanjun Dai, Chao Zhang, Bo Dai 
Advances in Neural Information Processing Systems (NeurIPS) 2025. 
[ Paper ]

Efficient Online Reinforcement Learning for Diffusion Policy 
Haitong Ma, Tianyi Chen, Kai Wang, Na Li, Bo Dai 
International Conference on Machine Learning (ICML) 2025. 
[ Paper ]

Value-Incentivized Preference Optimization: A Unified Approach to Online and Offline RLHF 
Shicong Cen, Jincheng Mei, Katayoon Goshvadi, Hanjun Dai, Tong Yang, Sherry Yang, Dale Schuurmans, Yuejie Chi, Bo Dai 
International Conference on Learning Representations (ICLR) 2025. 
[ Paper ]

Faster WIND: Accelerating Iterative Best-of-N Distillation for LLM Alignment 
Tong Yang, Jincheng Mei, Hanjun Dai, Zixin Wen, Shicong Cen, Dale Schuurmans, Yuejie Chi, Bo Dai 
International Conference on Artificial Intelligence and Statistics (AISTATS) 2025. 
[ Paper ]