Next spring, the University of Toronto will flip the switch on a machine that will rank among the top 20 supercomputers in the world.
When it’s booted up in April, the mammoth system – composed of 30,000 individual central processing units (CPUs), or cores – will blaze through calculations at a peak speed of 300 teraflops (a teraflop is one trillion floating-point operations per second). That’s about 30 times faster than Canada’s current largest research computer.
A companion supercomputer, designed to handle different types of computational problems, will include fewer but more powerful CPUs and operate at a speed of about 60 teraflops. Both computers will be housed in a new data centre that U of T and IBM are constructing in a warehouse north of Toronto.
Richard Peltier, the scientific director of the project and a U of T physics professor, explains that the larger system is designed to solve problems involving vast amounts of data, in which each piece of data is unrelated (or only moderately related) to most other pieces of data. Experimental physicists will use it to examine particle collisions produced by the Large Hadron Collider in Switzerland, for example. The smaller computer is designed for fields in which each piece of data influences many other pieces of data, making it suitable for studying such areas as climate change and fluid turbulence.
Peltier says the powerful new computers are now a standard research requirement in disciplines as disparate as genetics, chemical physics and aerodynamics. They will allow Canadian researchers to remain competitive with their international counterparts. “Most fields are now deeply involved in high-performance computation,” he says. “You really can’t do modern research without it.”