Two New Interdisciplinary Research Centers Shaping Future of Computing
Georgia Tech is meeting the future of computing head-on as it stands up two new research centers.
Machine Learning at Georgia Tech (ML@GT) and the Center for Research into Novel Computing Hierarchies (CRNCH), both officially launched on July 1, are tackling major challenges that need to be overcome to advance computing.
“Although both are being led by College of Computing faculty members, these interdisciplinary research centers bring together some of the brightest minds from across the campus to solve extremely difficult problems,” said John P. Imlay Jr. Dean Zvi Galil. “The work these centers are undertaking is crucial to the advancement of computing, and strengthens Georgia Tech’s position at the vanguard of computer science.”
Machine Learning @ Georgia Tech
Based in the College of Computing, ML@GT represents all of Georgia Tech. It is tasked with advancing the ability of computers to learn from observations and data. As one of the fastest-growing research areas in computing, machine learning spans many disciplines that use data to discover scientific principles, infer patterns, and extract meaningful knowledge.
According to School of Interactive Computing Professor Irfan Essa, inaugural director of ML@GT, machine learning (ML) has reached a new level of maturity and is now impacting all aspects of computing, engineering, science, and business.
“We are in the era of aggregation, of collecting data,” said Essa. “However, machine learning is now propelling data analysis, and the whole concept of interpreting that data, toward a new era of making sense of the data, using it to make meaningful connections between information, and acting upon it in innovative ways that bring the most benefit to the most people.”
The new center begins with more than 100 affiliated faculty members from five Georgia Tech colleges and the Georgia Tech Research Institute, as well as some jointly affiliated with Emory University.
Machine Learning is the new calculus
In addition to leading research, ML@GT is focused on developing human capital in machine learning. According to Dean Galil, “There is extensive demand for these classes from students in all majors across campus. Machine learning is the new calculus and it is showing.”
There are currently more than 200 undergraduate and 250 graduate students enrolled in introductory ML courses.
“I meet people in finance, hardware, logistics, and other industries who either want to know how machine learning can change their business or want to hire somebody with in-depth knowledge of machine learning,” said Essa.
To meet this growing demand, students, researchers, and faculty at ML@GT focus on three areas: examining the foundations of ML theory; building upon existing technologies at the application level; and adapting ML algorithms in different domains.
With leading computing, statistics, and optimization experts as part of the team, a primary area of focus for ML@GT is delving into the foundations of ML theory.
These foundational building blocks include:
- Dynamic data and decision-making – building systems that can continuously update and process new data streams in order to make informed decisions.
- Neural Computation – creating more efficient and powerful computational processes inspired by biological systems.
- Data Mining and Anomaly Detection – developing new ways to detect anomalies that may be indicators of fraud, disease, or structural defects.
- Interactive Machine Learning – designing systems that can develop and learn from interaction with humans.
- Artificial Intelligence – strengthening the foundational roots of ML to build machines that can accomplish increasingly complex tasks and functions.
- Ethics and bias – considering the ethical implications and appropriate limitations of ML and its evolving roles in research, industry, and culture, and how biased inputs can affect outcomes.
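To make the data mining and anomaly detection thrust concrete, a minimal sketch of one classic approach, flagging points that sit far from the mean, might look like the following (the sensor readings and threshold are invented for illustration; this is not ML@GT research code):

```python
# Minimal anomaly-detection sketch: flag readings whose z-score
# (distance from the mean in standard deviations) exceeds a threshold.
# Illustrative only -- real fraud or defect detectors are far richer.
from statistics import mean, stdev

def zscore_anomalies(values, threshold=2.5):
    """Return the values lying more than `threshold` std devs from the mean."""
    mu = mean(values)
    sigma = stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Hypothetical sensor readings with one obvious outlier:
readings = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 55.0, 10.0, 9.7, 10.3]
print(zscore_anomalies(readings))  # the 55.0 reading is flagged
```

A relatively low threshold is used here because a single large outlier also inflates the standard deviation, which can mask it at stricter cutoffs.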
At the application level, ML@GT is pushing established ML tools and practices forward to create more innovative modeling and predictive capabilities. These capabilities will focus on healthcare, education, logistics and operations, sensors and detection, social computing, information systems, security and privacy, and financial markets. They will also be used to reveal informative patterns and identify abnormal behavior in these and other application areas.
ML@GT is also going beneath the application level, researching new ways of adapting ML algorithms to new disciplines.
“Sometimes it is just not sufficient to take an existing algorithm and apply a new dataset to it,” said Essa. “You have to go inside the machinery in order to understand how to adapt the algorithm to a new domain.”
“This is the starting point,” Essa added. “In the first year, we want to develop additional focal points, further strengthen the center’s educational mission, and move forward with establishing a machine learning Ph.D. program at Georgia Tech.”
As for ML@GT’s longer-term goals, Essa said, “Within five years we fully expect Georgia Tech will be a global leader, and the center will be recognized as the international home for advanced machine learning research and education.”
Center for Research into Novel Computing Hierarchies
Just as ML@GT is working to move machine learning into a new era, CRNCH is focused on getting over one of the biggest hurdles facing computing today: the impending end of Moore’s Law.
“We knew that at some point physics would come into play. We hit that wall around 2005,” said Tom Conte, inaugural director of CRNCH and professor in Georgia Tech’s schools of Computer Science and Electrical and Computer Engineering.
Since the 1960s, Moore’s Law has essentially held that, for a given price point, the number of transistors on an integrated circuit doubles roughly every two years. However, there are hard limits to building smaller integrated circuits because, as transistors get smaller, they become less energy efficient. The problem gets worse as the chips get faster.
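The growth rate Moore’s Law describes is simple exponential doubling, which a back-of-the-envelope sketch makes concrete (the 1971 starting point of roughly 2,300 transistors on Intel’s 4004 is a well-known historical figure; the projection itself is only illustrative):

```python
# Back-of-the-envelope Moore's Law projection: transistor counts double
# roughly every two years. Illustrative only; real scaling was never
# this smooth, and the point of the article is that it is now ending.
def projected_transistors(start_count, start_year, target_year, doubling_period=2):
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Starting from the ~2,300 transistors of Intel's 1971 4004:
print(f"{projected_transistors(2_300, 1971, 2005):,.0f}")  # ~300 million
```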
“We’ve gotten by since 2005 using multiple, slower-clocked cores per chip (multicore processing, in essence), but it’s only a partial solution,” said Conte. “Very soon, computing’s historic performance growth rate will no longer be sustainable.”
In fact, according to the Institute of Electrical and Electronics Engineers (IEEE) Rebooting Computing Initiative, of which Conte is a founding member and current co-chair, this inevitability means that wholesale changes are needed in both computing technologies and computer architecture if the next level of high-performance supercomputing – machines capable of 10 million trillion floating-point operations per second (10 exaflops) – is to be achieved.
To facilitate these fundamental changes, Conte, along with School of Computational Science and Engineering Chair David Bader, Charlotte B. and Roger C. Warren Chair of Computing Richard DeMillo, Frederick G. Storey Chair in Computing Richard Lipton, Georgia Research Alliance Eminent Scholar and Rhesa "Ray" S. Farmer, Jr. Distinguished Chair in Embedded Computing Systems Marilyn Wolf, and other Georgia Tech faculty proposed the establishment of a new interdisciplinary research center.
This IRC would take the lead in breaking down traditional barriers between computing’s various facets, such as devices, circuits, architecture, software, and algorithms, in order to restart the exponential growth rate embodied in Moore’s Law.
The proposal has come to fruition in the form of CRNCH.
“We didn’t want to be a close follower in the space,” said Conte. “By leveraging the broad, interdisciplinary talent that exists here, Georgia Tech is well-poised to become the international leader in novel computing hierarchies. In doing so, we firmly believe Georgia Tech will be the global epicenter of a new computing economy.”
To achieve the bold breakthroughs necessary to propel computing to the exascale level and beyond, CRNCH researchers are evaluating a number of possibilities.
“Several promising research areas have been identified,” said Conte. “However, some are more disruptive to the computer stack than others.”
Some of the possible solutions that have been identified by the IEEE Rebooting Computing Initiative for further research include:
- Quantum computing – uses properties of quantum mechanics to solve optimization problems.
- Neuromorphic computing – leverages what is known about the human brain to create new technologies.
- Approximate and stochastic computing – complementary approaches based on the observation that computers often calculate results to higher than required accuracy and precision.
- Adiabatic and reversible computing – recycles unused inputs and utilizes non-traditional devices for substantial power savings.
- Cryogenic superconducting – uses low-temperature superconducting materials to conserve energy.
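The observation behind approximate and stochastic computing, that results are often carried to more precision than an application needs, can be illustrated with a toy experiment comparing 64-bit and 32-bit arithmetic (this is illustrative only, not CRNCH research code):

```python
# Toy illustration of the idea behind approximate computing: summing
# values truncated to 32-bit precision barely changes the result,
# suggesting the extra 64-bit precision was largely wasted effort.
import struct

def to_float32(x):
    """Round-trip a float through 32-bit storage, discarding precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

values = [i / 7.0 for i in range(1, 1001)]

exact = sum(values)                          # full 64-bit arithmetic
approx = sum(to_float32(v) for v in values)  # inputs reduced to 32 bits

rel_error = abs(exact - approx) / exact
print(f"exact={exact:.10f}  approx={approx:.10f}  rel_error={rel_error:.2e}")
```

Real approximate-computing hardware pushes this much further, trading precision for energy wherever the application can tolerate it.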
The CRNCH team includes experts in each of these focus areas. However, Conte said, “It’s important to note that these general approaches are only a sampling of what may be possible and other approaches and techniques are actively being explored by the team.”
With the team in place and the center open for business, Conte expects CRNCH to quickly take a leading role in researching new computing hierarchies and to become recognized as the academic research embodiment of the IEEE Rebooting Computing Initiative.
“Within three years we firmly believe that CRNCH will be providing international leadership and have significant influence on the development of new computing hierarchies,” said Conte. “We also believe that CRNCH will establish Midtown Atlanta as a hotspot for this new computing technology.”