When I ask geologist Marianne Douglas what she makes of all the fuss over the coming new millennium, she breaks into a big laugh. “I don’t know. A thousand years doesn’t seem like a long time to me.” Her blasé attitude towards what others deem a momentous tick of the clock makes sense when you consider that in her field – paleolimnology – “old” means millions or even billions of years.
Douglas spends her summers in the high Arctic collecting microfossils that she later studies to determine how Earth’s environmental conditions have changed over passages of time far more grandly glacial than one piddling millennium. Her work contributes to the growing mountain of scientific evidence showing that human actions have had a profound, climate-altering effect on the planet – mostly in the last 150 years. So if her response to the human-centred time-mark of a thousand years is nothing more than a polite “whoop-de-doo,” it’s easy to understand why.
In fact, whoop-de-doo pretty much sums up the prevailing attitude of all the faculty with whom I spoke about millennium hype versus reality. If you’ve been suffering a touch of millennial paranoia, take comfort: some of Canada’s smartest people aren’t stocking up on canned goods or bottled water, and they merely shrug over predictions of Y2K-induced disaster. “I don’t believe there will be madness in the streets,” says bemused psychiatry professor George Tolomiczenko.
And while more flamboyant members of the human race are set to ring in 2000 with spiffy festivities, our sample scholars will opt for quiet evenings among family and friends, secure in the knowledge that the new millennium doesn’t start for another year anyway. So attention, party-planners looking to entertain this demographic group: go easy on the fireworks and silly lampshades; a nice bottle of champagne and some microwave popcorn will do just fine. But if partying hard to welcome the millennium doesn’t particularly excite this group, being asked to take stock of their various fields – of where they’ve been and where they’re headed – certainly does. What’s notable is their optimism; for years they’ve operated in a climate of vision-cramping funding cuts. That’s changing, especially in the sciences, thanks to major infusions of cash from a variety of sources. In this, faculty are refreshingly un-Canadian in their willingness to toot their own horns – albeit not at big New Year’s parties.
New millennium or not, time is of the essence for researchers like Michael Chazan, an archeologist in the department of anthropology, who studies ancient stone tools to gain insights into the behaviour and cognitive processes of Neanderthals and Homo erectus. But he says it will be a long while, if ever, before we understand such subtleties as how these species viewed time – although we can probably assume that they didn’t celebrate it in thousand-year blocks.
“We don’t know if they had recollections of the past or if they anticipated the future,” says Chazan. Evidence that suggests modern humans recorded lunar time cycles is found in 40,000-year-old cave paintings, but before that there are few clues to early ideas of time.
Thanks to innovations in artifact-dating techniques, in some cases pioneered at U of T, we do know more than ever about the age of many archeological finds, and researchers have discovered that many artifacts are actually a lot older than previously thought, shifting our concepts of the capacities of early members of our species. “The idea that the time line isn’t fixed is strange for students,” Chazan admits. “There is a lot of flux.”
With collaborative projects under way in the Middle East, Africa and Europe, Chazan and other U of T researchers have the potential not only to link us to the past but also to act as significant cultural bridge-builders in the contemporary world. Chazan is taking part in a five-year project with French and German archeologists excavating near the famous painted caves at Lascaux in southwestern France.
“Archeology is part of its time, and in the past there has often been a colonial approach. We’re in an emerging global period of ethnic tensions, and I hope that archeology will be a place where people will think about their past and how to interpret it as a peace-building activity.” As part of that effort, Chazan and colleagues have ambitious plans to create a state-of-the-art research facility at U of T called the Laboratory for Archeological Science and Human Evolution. To see it succeed, says Chazan, his department will be digging up ways to raise the necessary funding.
It may be hard to pin down what our earliest ancestors understood about time, but even as we fast-forward to more recent human history, we find that concepts of time we now take for granted, including millennia, aren’t exactly carved in stone. “Historically, it’s just chance that we’re marking the millennium,” says Roberta Frank, professor of English and medieval studies.
As Frank points out, the Muslim and Jewish calendars aren’t the same as the Christian one from which the millennium springs. In Rome, dating was done by reigns of emperors, popes and consuls, as well as tax periods known as indictions. Dating by Anno Domini (AD) was invented in what we think of as the early sixth century, later standardized by the Anglo-Saxons. But nothing was fixed, calendar-wise, until the 11th century. “Anyone in medieval Europe would have learned to date in multiple fashion,” says Frank, “by Roman consular year, by the 13-month Anglo-Saxon year, by the moon, by kings and by the Roman 15-year indiction period. They would have been juggling always.”
Although the mark of the first millennium was not culturally significant, as the 11th century progressed, “there was a belief among Christian writers that humankind was living in the last era of the world,” says Frank. “In the homilies, there are some wonderful, melancholic poetic lines about how the world is older and approaching its end, how the fruits are smaller and the men punier.”
This idea of millennial apocalypse has survived into our day. And while it may arise from the Christian tradition, “notions of Ages or periods of recurring time cycles are found among Hindus and Jains,” says Willard Oxtoby of Trinity College, professor emeritus of the study of religion and author of The Meaning of Other Faiths. Islam, he adds, contains the hope that within each century, there will be a “renewer of the faith.”
Teaching for the year at the College of William and Mary in Virginia, Oxtoby thinks that religion, rather than blending us into homogeneity, will continue to be a major marker of group identity. And, he predicts, Muslim youth raised in Canada will have a profound effect on Islam in the 21st century. “They’re cosmopolitan. It’s this diaspora that will influence what is important, and shape the future of these traditions.” As for the rise of Christian fundamentalism in some parts of the U.S., powerful enough in 1999 to strike evolution from Kansas state science standards, Oxtoby says: “Conservative literalists have muzzled scientists temporarily in the past, but these attempts to hang on to literal readings [of the Bible] don’t survive in educated circles.”
Our greater connectedness to the world is nowhere more apparent than in the program for international development studies that is headed by economics professor Sue Horton at the University of Toronto at Scarborough. Horton says enrolment has increased from 30 to 90 students in 10 years. Students are no longer just well-meaning Canadians flying off to the developing world; they also come from abroad to be educated here, and return to their home countries armed with knowledge and skills to participate in economic and social development.
Along with the traditional humanitarian side of development, Horton says there’s a need for creating business, trade and export opportunities. For the future, she envisions an innovative strand of programming that would add courses in computer communications and international business management to what has traditionally been a social science-oriented approach. The Internet is allowing local populations to gain access to information and resources at a greatly accelerated pace, says Horton, and it is essential that people trained in international development tap into its potential. “The Internet can give people more say in their own destinies, but without more training and awareness, communities could be left open to control by really scary corporate agglomerations.”
For insights into how Canada might manage its own economy and communities, an excellent source is Meric Gertler, Goldring Professor of Canadian Studies, professor of geography and director of the program in planning. Gertler doesn’t believe we must inevitably follow in the shadow of the United States, where economic development often comes with greater social inequity. In fact, he argues that we can gain from increased ties with Europe. “I think Canada can occupy a unique niche between the European and American models. There are lots of benefits to trading more with Europe. It could free us from dependence on the U.S.”
Gertler also argues that while Canadians may be rightly proud of their legacy of a social safety net, there’s a danger in being smug. Ironically, after years of government restraint, Canada has been surpassed by some U.S. states in spending on social infrastructure, such as public transportation. “The foundations of our past prosperity are crumbling. We haven’t allowed ourselves to become Detroit but we need to reinvest in our infrastructures, in health care and education. People have forgotten that so many of our attributes others admire came from public financing. To go with the rhetoric of the lean, mean global economy right now is crazy.”
Others share Gertler’s concern over the decline in social spending, including Professor George Tolomiczenko of the department of psychiatry, an expert on mental illness and homelessness. “At the beginning of the new millennium we have to ask ‘how much is it a societal and moral choice, a community effort to integrate what’s become a large population of marginalized people?’” Pessimistic about finding solutions in the short term, he does hope for long-term progress.
“Psychiatry has to be more open – the answers won’t come only from drugs. The voice of patients will continue to gather force; it used to be that experts would dictate, but that doesn’t work any more. The social aspects of psychiatry will have to come more to the fore. What we need is a unified approach.” Innovative methods of caring for the homeless can go a long way, he says, such as roving teams of caregivers, including not only doctors and nurses, but patient advocates as well.
The dangers of marginalizing large segments of the population, especially the young, aren’t lost on Morley Gunderson, an expert in economics and industrial relations and new CIBC Chair in Youth Employment. One of his research topics is the possible “scarring” effects of youth unemployment. Gunderson says that making sure young people receive sufficient education and other opportunities is a real concern. “If I were young now, I would triple-emphasize education. I’m afraid there will be polarization, with some people completely bypassed [in the new economy].”
After spending a year at Stanford University in California, Gunderson compares the flurry of entrepreneurial activity in Silicon Valley to the Gold Rush days. Some will strike it rich, says Gunderson, and others, especially the young with low levels of skills and education, will be left in the dust. Unfortunately, he adds, “global and free trade have put a lot of pressure on governments to pay attention to the cost consequences of their policies. Those that play a pure, equity-oriented role in helping the disadvantaged are going to be harder to justify because they cost more.”
To offset this trend, he thinks Canada should invest as much as possible in its young people and bring its labour laws up to date. Designed early this century primarily to protect male workers engaged in the physical labour of heavy industry, legislation must reflect the environment of increasing numbers of workers in knowledge-based industries, who are more likely to be felled by computer-related repetitive strain injuries than by workplace accidents involving heavy machinery.
No one is more firmly planted on the front lines of technological change than genetic researchers. Dr. Janet Rossant, based at the Samuel Lunenfeld Research Institute, Mount Sinai Hospital, is working to understand how mouse genes function, a project that has huge potential to revolutionize medicine in the new millennium, especially in the areas of cancer and degenerative diseases. It’s exciting research happening at the same time as a worldwide effort to map mouse and human genomes. She is specifically interested in trying to understand how an egg develops from a single cell into a complex organism, to determine precisely when, where and how genes might affect development. It’s critical that the research be done on mice, Rossant says, because their genes are similar to ours, and “we can undertake many experiments with a mouse that we can’t with a human.”
Ethical questions are always at play with this kind of research, which tends to conjure up lurid scenarios in the public mind. But Rossant cautions against sensationalism; for now, she points out, we live in a society that would find experimentation using human fetal tissue morally reprehensible. Research in her lab focuses on the treatment of human disease, not wild changes in reproductive technology, and on that score, she and her colleagues are optimistic that the future will see breakthroughs.
For further insights on future scenarios in the science lab, I talked to Scott Mabury, a recent recipient of the Premier’s Research Excellence Award and two teaching awards, one from the Ontario Confederation of University Faculty Associations, the other from the Faculty of Arts and Science.
Mabury researches “chemical architecture” to provide industry and environmental regulators with knowledge about how various substances biodegrade when discarded into the environment. As a teacher, his goal is to “make students better scientists and better citizens.” He’s optimistic about the possibilities of eliminating potential pollutants such as PCBs from the environment, and of safely extending world crop yields through research. The latter is already being done, he points out, and he argues it must be done despite the flurry of controversy surrounding the issue of genetic alteration of food.
When it comes to threats to human health, Mabury is worried about how we’ll deal with communicable diseases in the new millennium, now that many strains of virus and bacteria are resistant to antibiotics. “The tools to control disease will be more expensive and less effective.” He is also concerned about the misuse and lack of regulation of antibiotics in many parts of the world. Just as people are marginalized economically, there may also be increasing inequity when it comes to access to medical treatment and anything beyond a subsistence living. “An educated populace that takes responsibility for its actions is the hope for the 21st century.”
So, what do academics tell us about the future? Some of us, especially the well-educated young, will do very well in the rapidly shifting, technologically focused economies of the future; others will struggle with disproportionately fewer resources and opportunities. We will more successfully battle deadly diseases such as cancer, but we’ll be stymied by new viruses. We will grapple with a multitude of ethical dilemmas involving everything from our use of new medical and scientific technology to the distribution of wealth, and we will do it from a variety of thriving religious and cultural perspectives. And we will continue to find ways of celebrating. So pass the microwave popcorn – you can worry about the implications of its genetically altered contents tomorrow.
Moira Farr (BA 1982 UC) is a freelance writer. Her first book, After Daniel: A Suicide Survivor’s Tale (HarperFlamingo) was published in spring 1999.