College of Computing News

Professor Margaret Martonosi Delivers Mary Jean Harrold Memorial Distinguished Lecture

Professor Margaret Martonosi gave the Mary Jean Harrold Memorial Distinguished Lecture on Friday, Jan. 12. It was standing room only for the Princeton computer science professor’s talk, A “Post-ISA” Era in Computer Systems: Challenges and Opportunities.

The lecture is named after former School of Computer Science Professor Mary Jean Harrold, a leading software engineering researcher. Martonosi worked with Harrold at Computing Research Association–Women (CRA-W), a committee of the CRA that focuses on bringing more women into computing research. “Mary Jean was simultaneously very tough but very nice,” Martonosi said. “Balancing that is a lifelong challenge for most of us, but she did it masterfully.”

The renowned computer architect discussed the future of computing in this post-Moore’s law era. In 1965, Intel co-founder Gordon Moore observed that the number of transistors on an integrated circuit doubles roughly every two years. Computing trends have followed that prediction ever since, yet the gains from shrinking transistors are running out. Almost a decade later, electrical engineer Robert Dennard introduced Dennard scaling, the observation that even as transistors get smaller, their power density remains constant.
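For a rough sense of what those two trends promise, the short sketch below works through the idealized textbook numbers. It is an illustration for this article, not material from the lecture, and the starting transistor count is only approximate.

    /* A back-of-the-envelope sketch of the two trends described above.
     * Moore's law: transistor counts double roughly every two years.
     * Classical Dennard scaling: dimensions, voltage, and current all
     * shrink by the same factor k, so power density stays flat.
     * Illustrative numbers only. */
    #include <stdio.h>

    int main(void) {
        /* Moore's law projection from the Intel 4004 (~2,300 transistors, 1971). */
        double transistors = 2300.0;
        for (int year = 1971; year <= 1981; year += 2) {
            printf("%d: ~%.0f transistors\n", year, transistors);
            transistors *= 2.0;                 /* double every ~2 years */
        }

        /* Dennard scaling for one process generation (k ~ 1.4):
         * voltage and current each scale by 1/k, so power per device
         * scales by 1/k^2; device area also scales by 1/k^2. */
        double k = 1.4;
        double power_scale = (1.0 / k) * (1.0 / k);
        double area_scale  = (1.0 / k) * (1.0 / k);
        printf("power density scales by %.2f (unchanged)\n",
               power_scale / area_scale);
        return 0;
    }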

Yet today supply voltage can be lowered only so much, and Dennard scaling has broken down. Much of hardware research is now devoted to figuring out what comes after Moore’s law and Dennard scaling, and computer architects are among the chief researchers trying to discover the next phase.

“Computer architects mediate between application trends from one side and technology trends from the other,” Martonosi said. “Some technology trends are challenges like the end of Moore’s law and Dennard scaling, and some are opportunities like technologies that might be emerging. Application trends are also guiding much of what architecture does, so coming up with accelerators and ways of better supporting big data and machine learning.”

These opportunities were the focus of Martonosi’s talk. She detailed the history of attempts to solve this dilemma, such as per-module power management (not powering what a program isn’t using) and on-chip parallelism (spreading work across multiple processor cores). Yet these fixes could only go so far. Now the field is focused on heterogeneity and specialization, such as using accelerators or putting central processing units and graphics processing units on the same chip.
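As a rough illustration of the on-chip parallelism idea, the sketch below uses OpenMP to spread a simple loop across the cores of a multicore chip. It is a generic example, not code from the talk.

    /* A minimal sketch of on-chip parallelism: split independent loop
     * iterations across the cores of a multicore processor with OpenMP.
     * Compile with an OpenMP-capable compiler, e.g. `cc -fopenmp sum.c`. */
    #include <stdio.h>

    #define N 1000000

    static double a[N];

    int main(void) {
        double sum = 0.0;

        for (int i = 0; i < N; i++)
            a[i] = i * 0.5;                     /* fill the array serially */

        /* Each core processes a chunk of the loop; the reduction clause
         * combines the per-core partial sums into one total. */
        #pragma omp parallel for reduction(+:sum)
        for (int i = 0; i < N; i++)
            sum += a[i];

        printf("sum = %.1f\n", sum);
        return 0;
    }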

“What’s interesting about this is not just the degree of resourcefulness employed to keep scaling performance at manageable power,” Martonosi said. “The interesting thing — and perhaps the scary thing about it — is that over time more and more of this became exposed to software. Whether you’re architects or if you sit higher in the food chain, we’re all in this together.”

There is more complexity now, and with that complexity come more bugs and more security risks. “This is a fundamentally disruptive moment, not just for architects but for computer systems overall because the hardware-software interface is undergoing a seismic change,” Martonosi said.

She outlined the challenges in this new era:

  • How to program these heterogeneous systems in a way that’s correct
  • How to verify that correctness
  • How long such approaches will remain sustainable

Ultimately, Martonosi argued, we need new software abstractions to mitigate the problem in both the short and the long term. In the short term, Martonosi’s research focuses on memory consistency models and how to verify them. These models specify how memory operations from different cores may be ordered, guaranteeing predictable behavior as long as the programmer follows the system’s rules. Her team has created several verification tools that have found bugs. One such tool, known as TriCheck, is a full-stack memory consistency model verification tool that can analyze everything from high-level languages to microarchitectures.
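To see the kind of subtle ordering question a memory consistency model has to answer, consider the classic “store buffering” litmus test, sketched below with C11 threads and relaxed atomics. It is a standard textbook example, not one of the team’s verification tools.

    /* The classic "store buffering" litmus test. Each thread writes one
     * variable and then reads the other. Under sequential consistency the
     * outcome r1 == 0 && r2 == 0 is impossible; with the relaxed orderings
     * below (and on hardware with store buffers) it is allowed. Deciding
     * which outcomes are legal is exactly what a memory consistency model
     * specifies, and what verification tools must check. Requires a C11
     * compiler that provides <threads.h>. */
    #include <stdatomic.h>
    #include <stdio.h>
    #include <threads.h>

    atomic_int x = 0, y = 0;   /* shared variables */
    int r1, r2;                /* values each thread observes */

    int writer_x_reader_y(void *arg) {
        (void)arg;
        atomic_store_explicit(&x, 1, memory_order_relaxed);
        r1 = atomic_load_explicit(&y, memory_order_relaxed);
        return 0;
    }

    int writer_y_reader_x(void *arg) {
        (void)arg;
        atomic_store_explicit(&y, 1, memory_order_relaxed);
        r2 = atomic_load_explicit(&x, memory_order_relaxed);
        return 0;
    }

    int main(void) {
        thrd_t t0, t1;
        thrd_create(&t0, writer_x_reader_y, NULL);
        thrd_create(&t1, writer_y_reader_x, NULL);
        thrd_join(t0, NULL);
        thrd_join(t1, NULL);
        printf("r1=%d r2=%d\n", r1, r2);   /* "0 0" may appear on weak orderings */
        return 0;
    }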

“Architects can have a lot of leverage if they verify things earlier,” Martonosi said.

Yet Martonosi believes the long-term solution could be quantum computing. It may be too specialized to take over completely from Moore’s-law scaling, but it is the new paradigm. As researchers find faster quantum algorithms that need fewer qubits (the basic unit of quantum information) and improve hardware reliability, those advances could translate into scaling improvements.

“I see this as the golden age of computer systems design,” Martonosi said. “The rules are ready to be broken.”