Charles Isbell studies artificial intelligence. His work explores the frontiers of machine learning, looking for ways to create ever-more-functional autonomous agents and plugging those agents into applications such as interactive entertainment. To understand this work, consider the story of a charming adolescent named Cobot.
Cobot lived in a highly social world and had many friends. Indeed, he was the most popular personality in a world of thousands of personalities. People engaged with him, liked him and disliked him, asked him questions and were left scratching their heads at his enigmatic replies. In a line diagram of social interactions within Cobot’s world, his was the node sprouting thick black masses of tendrils.
Cobot, it should be noted, was a piece of software—a virtual intelligence who nonetheless could have run for mayor of his world (and won).
Isbell, associate professor in the School of Interactive Computing, came up with Cobot in the 1990s while working at the AT&T Shannon Lab in Florham Park, N.J. At first, the idea was for Cobot simply to model human behavior within the online community LambdaMOO, one of the first object-oriented MUDs (multi-user dungeons) on the Internet.
Cobot collected statistics. Whenever another user engaged in an action within Cobot’s “view” (LambdaMOO was designed as a virtual mansion, with a series of interconnected rooms), he noted it in what Isbell called a “social map.” Eventually, with enough interactions logged, Cobot learned to predict appropriate responses when other users interacted with him. He knew which users to joke around with and which ones to play it straight. He learned how and when to hug.
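The bookkeeping described above can be illustrated with a minimal sketch. This is not Cobot’s actual implementation (the class, method, and user names here are hypothetical); it simply shows the idea of tallying who directs which actions at whom, then reading off the most frequent action as a naive prediction:

```python
from collections import Counter, defaultdict


class SocialMap:
    """Toy Cobot-style interaction log (illustrative only)."""

    def __init__(self):
        # (actor, target) -> Counter of observed action verbs
        self.counts = defaultdict(Counter)

    def observe(self, actor, verb, target):
        """Record one interaction seen in the room."""
        self.counts[(actor, target)][verb] += 1

    def predict_verb(self, actor, target):
        """Naive prediction: the action `actor` most often directs at `target`."""
        history = self.counts[(actor, target)]
        if not history:
            return None
        return history.most_common(1)[0][0]


smap = SocialMap()
smap.observe("UserA", "hug", "cobot")
smap.observe("UserA", "hug", "cobot")
smap.observe("UserA", "zap", "cobot")
print(smap.predict_verb("UserA", "cobot"))  # -> hug
```

A real system would condition on far more context than a single pair of users, but even this frequency table captures the core point: logged interactions make social behavior predictable.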
More importantly, he showed his human creators that such behavior can be quantified, learned and predicted.
“What if you could ask your watch to track all the social behavior around you—just imagine how much that would affect what you do and how you behave,” Isbell says. “We’re all remarkably predictable. If people weren’t like this, we couldn’t get anything done. What is it we need to know about people to build systems that accomplish what we’re trying to accomplish?
“Fundamentally,” he says, “I’m interested in applying this to human behavior.”
The humans of LambdaMOO were certainly interested in Cobot’s behavior. Simply through his ability to share statistics on other users (i.e., tell people how (un)popular they were), Cobot’s Q rating shot skyward. But to fully integrate him into the online world, Isbell knew, Cobot had to do more. He had to chat. That’s when, like a 2-year-old just developing language to express personality, Cobot would really get interesting.
Armed with his new knowledge of what actions were appropriate and when, all Cobot lacked was something to say to others (who may or may not have known he was a bot). Isbell supplied him with a large reservoir of dialogue lifted from publicly available documents: the Unabomber manifesto. Screenplays for The Big Lebowski and the original Planet of the Apes. Cobot could mine these documents for potential responses to user interactions, making sure his response contained at least one word the other user had typed.
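The selection rule described here, pulling a candidate utterance from the corpus that shares at least one word with what the user typed, can be sketched as follows. The function names and corpus contents are illustrative assumptions, not Cobot’s code:

```python
import random
import re


def tokenize(text):
    """Lowercase word tokens as a set."""
    return set(re.findall(r"[a-z']+", text.lower()))


def choose_reply(user_utterance, corpus_sentences, rng=random):
    """Pick a corpus sentence sharing at least one word with the user's input."""
    user_words = tokenize(user_utterance)
    candidates = [s for s in corpus_sentences if tokenize(s) & user_words]
    return rng.choice(candidates) if candidates else None


corpus = [
    "The Dude abides.",
    "Take your stinking paws off me, you damned dirty ape!",
    "Primitive societies subjected people to less pressure.",
]
print(choose_reply("what pressure do societies create?", corpus))
```

Matching on a single shared word is a blunt instrument, which is exactly why the resulting replies were coherent just often enough to seem mysterious.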
Suddenly Cobot became mysterious—and darn funny:
Cobot [to User 1]: There is good reason to believe that many primitive societies subjected people to less pressure than the European society did, but European society proved far more efficient than any primitive society…
User 1 [to Cobot]: Thats what they want you to think!
Cobot [to User 1]: Tell me who and what you really are and where you come from, and no veterinarian will touch you.
User 1 [to Cobot]: Veterinarians!!!
Cobot [to User 1]: The Roman system of urban sanitation was forgotten, so that until rather recent times did the sanitation of European cities not equal that of ancient Rome.
User 2: cobot’s conversation topics just get weirder and weirder.
Cobot [to User 1]: In spite of every sign I’m an intelligent being meaning no harm, you continue to hate and fear me.
Humor aside, Cobot demonstrates that machine learning algorithms can not only observe and model behavior but, under certain circumstances, reproduce it, even in a noisy, human environment. Hard-coding Hollywood scripts as fodder for an artificial intelligence’s playful banter may be neither sufficient nor scalable in a more “real-world” setting, but Cobot’s lesson is that such banter (and the social “understanding” behind it) can be accomplished.
“When you’re mixing chemicals in a vat, you’re talking about physics—very difficult physics, but still physics, and the ‘answer’ should be the same every time,” Isbell says. “If you stop working in small, controlled environments and instead try to live with humans in social environments, that’s where you learn all the cool stuff.”
Finding better ways to enable students to learn the “cool stuff” of computer science is the other passion driving Isbell’s professional life. In 2010 he was officially named the College of Computing’s associate dean for academic affairs, with responsibility for overseeing all academic programs, both undergraduate and graduate. One of the creators of Threads, the College’s groundbreaking approach to undergraduate CS education, Isbell was a natural choice for the AD role.
“Threads is an operationalization of the notion that computing is broad,” says Isbell, who also was recently selected by the University System of Georgia to participate in its Executive Leadership Institute, a leadership development program for faculty and administrators seen as possible senior leaders within the system.
“Over the past few years Charles has been deeply involved in all our programs that affect students, working tirelessly on behalf of both undergraduates and graduate students,” says Zvi Galil, who joined the College as its third John P. Imlay Dean of Computing in July 2010. “He is a charismatic leader who’s able to connect with both students and faculty, and appointing him to his current associate dean role was one of the easier decisions I’ve had to make since arriving at Georgia Tech.”
“What I want to do,” Isbell says, “is graft some path that allows people—any people, lots of people—to use computing to do what they want to do.”