The latest supercomputer


McGill Reporter
December 12, 2002 - Volume 35 Number 07


In 24 hours last November, a chemist at the University of Alberta used a supercomputer to solve a sophisticated molecular modeling problem that would have taken three and a half years to calculate on a single processor. But this supercomputer did not reside down the hall or across campus. Instead, it was a string of 1,400 processors at 21 academic sites, stretching across the breadth of Canada, all harnessed together for one day's hard work to hyper-expedite a worthy research project.
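The figures above can be sanity-checked with simple arithmetic. A back-of-the-envelope sketch (the article's "three and a half years" is taken as an approximate round number):

```python
# Back-of-the-envelope check of the reported speedup. The figures come
# from the article and are approximate; "3.5 years" is read as
# 3.5 * 365 days of continuous computation on one processor.
single_processor_hours = 3.5 * 365 * 24   # about 30,660 hours serially
wall_clock_hours = 24                     # elapsed time on the grid
processors = 1400                         # processors across 21 sites

speedup = single_processor_hours / wall_clock_hours
efficiency = speedup / processors         # fraction of ideal linear speedup

print(f"speedup: roughly {speedup:.0f}x")
print(f"parallel efficiency: roughly {efficiency:.0%}")
```

The implied parallel efficiency of roughly 90 percent is consistent with the article's account: 1,400 processors delivering close to 1,400-fold acceleration.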

Photo Wagdi Habashi and Ron Haber
PHOTO: Owen Egan

And the biggest computational muscle at this digital barn-raising was the new CLUMEQ supercomputer, recently switched on in McGill's Burnside Hall. It is the fastest in Canada, among the fastest in the world, and a precious new asset to researchers from the Consortium Laval-UQAM-McGill and Eastern Québec (CLUMEQ). "With this machine, we're orders of magnitude ahead of where we were before," said Tony Masi, vice-principal (information systems and technology) at McGill.

The installation was made possible by an $8 million grant from the Canada Foundation for Innovation and the Ministère de l'Éducation du Québec, though the CLUMEQ machine would have retailed at around $23 million, according to consortium founder Wagdi Habashi, the NSERC-Bombardier Chair of Multidisciplinary Computational Fluid Dynamics at McGill.

"Instead of asking separately for our own toys," he said, "we put our heads together and asked for a much larger toy. Then we made sure that this toy was not only powerful but also that it was novel. Not only is this supercomputer a great asset for research, the computer itself is a research area."

This is because the CLUMEQ supercomputer has been custom built -- two independent systems programmed to work in parallel, distributing one large calculation over many processors simultaneously, thus accelerating the entire process. The left side of CLUMEQ's brain is a 64-processor Silicon Graphics machine with a shared memory of 128 gigabytes. The right side is a more experimental 256-processor "Beowulf" cluster with a distributed memory of 384 gigabytes. In this new mode of supercomputing, racks of ordinary PC processors are conjoined by smart software and super-fast communications to tackle mighty calculations in parallel -- thus the Beowulf metaphor (Beowulf being the legendary Scandinavian hero who slew the bloodthirsty monster Grendel with brains and steel).
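The divide-and-conquer principle behind both halves of the machine can be sketched in miniature. Here is a hypothetical illustration using Python's multiprocessing module -- not CLUMEQ's actual software -- showing one large calculation split into chunks that worker processes compute independently:

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Compute one chunk of a large summation independently."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    """Split the range [0, n) into chunks and sum them in parallel."""
    step = n // workers
    # The last chunk absorbs any remainder so every term is counted once.
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # Same answer as the serial loop, but the work is distributed.
    print(parallel_sum_of_squares(1_000_000))
```

Real message-passing clusters face the same bookkeeping at vastly larger scale: partitioning the problem, keeping communication between nodes cheap, and recombining partial results.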

When all 320 of its processors are clicking, CLUMEQ can perform nearly 400 billion operations per second. Preliminary benchmarking places this new entry firmly among the world's 150 fastest supercomputers.

But what does all this power mean? Plenty, to Habashi. For more than two decades, he has been working to increase the efficiency of aircraft design. In the old days, advances in design came only after grueling cycles of trial and error. Today, mathematical modeling made possible by supercomputers like CLUMEQ makes such research far more efficient and economical than physical testing. And, sometimes, more true to life. "Airplane models are tested in wind tunnels," Habashi explained. "But when is the last time you saw an airplane fly within four walls? Or at ground level? People think that testing is real while modeling is theoretical. But sometimes modeling can be more realistic than testing."

According to CLUMEQ director general Ron Haber, the CLUMEQ supercomputer was designed to be "polyvalent," borrowing the French adjective for "versatile" to describe the multidisciplinary nature of the machine. He sees potential applications in engineering, robotics, nanomaterials, environmental sciences, bioinformatics, computer science, medical sciences, architecture, even the arts.

"Every segment in which McGill is involved will benefit from these technologies," Haber said. "But the biggest challenge is education."

"This is a very fast computer and it has enormous potential," agreed Masi. "But the potential will only be realized if we can write software that can utilize the full power of the hardware."

Masi's office, ordinarily responsible only for infrastructure support, has injected two years of start-up funding to staff and program the machine and to educate researchers. But start-up funds are just that. "We have to make sure researchers, knowing about this computer and its capabilities, will be able to build use of the computer into their research grants."

Habashi doesn't think it will take long for researchers to catch on. But the state of the art in computing is a moving target. Moving ever forward.

"At a family party," Habashi recalled, "my cousin once asked me, 'What do you do in research?' I told her I develop mathematical models to better design airplanes. She said, 'okay, I understand that. But after you have done that, what do you do after?'

"That's research. It's incremental and never ending. And it's the same with supercomputing. Once you have launched yourself down the road of supercomputing, it's impossible to go back to a PC. Today's supercomputer will be tomorrow's PC. In three years we are going to look at this machine and smile. So we have to have the mentality and the will that, in three years, we will be back, asking for upgrades."
