The past, present and future of the universe are about to be revealed in unprecedented detail by Britain's biggest academic supercomputer, called the Cosmology Machine, based at the University of Durham.

Trade and Industry (DTI) Secretary Patricia Hewitt launched the “time machine” on its first simulation program today when she switched on the £1.4 million state-of-the-art installation at the University’s Physics Department.
The Cosmology Machine takes data from billions of observations about the behavior of stars, gases, galaxies and the mysterious dark matter throughout the universe and then calculates, at ultra high speed, how galaxies and solar systems formed and evolved.
By testing different theories of cosmic evolution, it can simulate virtual universes to test which ideas come closest to explaining the real universe.
The gigantic new facility -- manufactured by Sun Microsystems and supplied by Esteem Systems plc -- has been installed at Durham with the help of £652,000 from the Joint Research Equipment Initiative (JREI).
JREI was set up by the DTI’s Office of Science and Technology, the Higher Education Funding Council for England (HEFCE) and the research councils -- in this case, the Particle Physics and Astronomy Research Council (PPARC) -- to provide strategic investment in key scientific infrastructure for research of international quality.
The funding forms part of £18 million worth of special strategic investment in Durham science by the DTI and the research and funding councils over the past two years.
The supercomputer is operated by the Institute for Computational Cosmology (ICC), part of the Ogden Centre for Fundamental Physics now being developed at Durham. Its breathtaking capacity for calculation will set new standards in science and could also benefit other areas of research.
The supercomputer:
* Is called the Cosmology Machine. Its engine room is an integrated cluster of 128 UltraSPARC III processors and a 24-processor Sun Fire server. It is the largest computer in academic research in the UK and one of the 10 largest in the UK as a whole.
* Can perform 10 billion arithmetic operations in a second. This number of operations would take a numerate individual about a million years of continuous calculation to complete. Alternatively, if all of Earth’s six billion inhabitants were proficient at arithmetic, it would take them about two hours to carry out the same number of operations that the supercomputer can carry out in a single second. (A back-of-envelope check of these figures appears after this list.)
* Has a total of 112 Gigabytes of RAM and 7 Terabytes of data storage. (A Terabyte is more than a million million bytes.) This is the equivalent of nearly 11,000 CD-ROMs. It could hold the contents of the 10 million books that make up the British Library collection and still have plenty of space left over.
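The two human-scale comparisons above are mutually consistent if one assumes a person completes one multi-digit calculation roughly every 72 minutes. A minimal sketch of the arithmetic (the per-operation human rate is our assumption, chosen to match the release's figures, not a number from the release itself):

```python
# Back-of-envelope check of the performance claims above.
# The human rate (one operation per 72 minutes) is an assumption.

ops = 10e9                      # operations the machine does in one second
seconds_per_human_op = 72 * 60  # assumed time for one careful hand calculation
seconds_per_year = 3.15e7       # approximate seconds in a year

one_person_years = ops * seconds_per_human_op / seconds_per_year
print(f"One person: ~{one_person_years:.2g} years")        # ~1.4 million years

population = 6e9
everyone_hours = ops * seconds_per_human_op / population / 3600
print(f"Six billion people: ~{everyone_hours:.1f} hours")  # ~2 hours
```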
Vice-Chancellor Sir Kenneth Calman said, “This is a fascinating and important branch of physics. I am delighted that my colleagues in Durham have established the expertise and quality to take a lead in advancing the frontiers of knowledge even further.”
Professor Carlos Frenk, Director of the ICC, said, “The new machine will allow us to recreate the entire evolution of the universe, from its hot Big Bang beginning to the present. We are able to instruct the supercomputer on how to make artificial universes which can be compared to astronomical observations. It is truly remarkable that all that is required to emulate the Universe are the same laws of physics, such as gravity, that govern everyday events on Earth.”
Chief Executive of PPARC Professor Ian Halliday said, “This is a stunning resource for astronomical research in Britain. It will enable consortium members in the UK, Germany, Canada and the USA to perform cosmological calculations of unprecedented size and detail. We are poised to confront one of the grandest challenges of science: the understanding of how our universe was created and how it evolved to its present state.”
The Durham Institute is a leading international center for research into the origin and evolution of the universe and is the UK base of the “Virgo consortium for cosmological simulations,” a collaboration of about 30 researchers in the UK, Germany, Canada and the USA.
Research ranges from the formation of the first objects in the universe to the physics of the great clusters of galaxies. Long-term goals are to understand the formation of structures in the universe, to establish the identity and properties of the dark matter that dominates the dynamics of the universe, to determine the parameters of our world model and to relate the Big Bang theory to astronomical observations.
The switching on of the Cosmology Machine by Patricia Hewitt marks the formal launch of the ICC and the beginning of the new Centre. A new building for the Centre is under construction and due for completion in the summer of 2002.
(Editor's Note: A more detailed scientific note by Professor Carlos Frenk is attached. Because of strong UniSci reader interest in cosmology, we reprint it here in its entirety.)
Cosmic Architecture: Building the Universe
By Professor Carlos Frenk
Director of the Institute for Computational Cosmology, Ogden Centre for Fundamental Physics, Department of Physics, University of Durham
Cosmology confronts some of the most fundamental questions in the whole of science. How and when did our universe begin? What is it made of? How did it acquire its current appearance? How will it end?
These are all questions that have preoccupied mankind since the beginning of civilisation, but which only relatively recently have become accessible to established scientific methodology. Recent advances suggest that these and related questions will be answered in the next few years through a combination of astronomical observations and computer simulations.
The enormous progress in cosmological studies over the past decade has been driven by the timely coincidence of new theoretical ideas and technological innovation.
In the past 40 years, astronomers have gathered incontrovertible evidence that our universe began about 10 billion years ago in a hot, dense phase -- the Big Bang -- and that most of its material content today consists of invisible “dark matter,” very likely made up of exotic elementary particles produced in the earliest stages of the Big Bang.
The radiation generated by the primordial fireball is detected today as a background of microwaves, and this provides a direct window to the early universe.
In 1992, the COBE satellite discovered tiny ripples in this radiation, the fossil records of primordial irregularities which, over 10 billion years of cosmic evolution, have been amplified by the gravity of the dark matter to produce the rich variety of structures seen today in large galaxy surveys.
Every day, observatories on the ground and in space peer into the cosmos, collecting huge amounts of astronomical information. Cosmological data are unique because the finite light travel time implies that objects are observed not as they are now but as they were at some time in the distant past, which depends on how far away the object is.
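For nearby objects the relation behind this is simple: ignoring the expansion of space, the lookback time is just the distance divided by the speed of light. A worked example with round numbers (our illustration, not a figure from the note):

```latex
% Lookback time for a nearby object, neglecting cosmic expansion
% (an approximation that breaks down at large distances).
\[
  t_{\mathrm{lookback}} \approx \frac{d}{c}
\]
% Example: a galaxy at d = 100 Mpc, i.e. about 3.26 x 10^8 light-years,
% is seen as it was roughly 330 million years ago.
```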
Using the fundamental laws of physics, computer simulations are able to recreate the evolution of the universe, thus providing the means for connecting objects or “events” observed at widely different cosmic epochs.
On the scales of galaxies and clusters, the evolution is complex and involves not only gravitational interactions, but also gas-dynamical and radiative effects associated with the gas that ultimately ends up in the stars that make up the galaxies.
In spite of this apparent complexity, the problem is much better posed than most computational problems in physics or biology: the initial conditions are known precisely.
Modern computer simulations recreate the major events which have shaped our Universe:
* The formation of the primordial plasma;
* Its irradiation by the earliest quasars and stars;
* The motion of primordial hydrogen gas clouds and their accretion onto spinning dark matter clumps;
* The growth of dark matter halos and the galaxies within them by repeated mergers of substructures;
* The emergence of spiral galaxies like the Milky Way;
* The formation of great aggregates of galaxies like the Coma cluster.
The output of a simulation is a virtual universe over which scientists have control; the input values of fundamental parameters and the underlying assumptions about the nature of the dark matter can be changed at will and new virtual universes created.
A detailed comparison of the virtual universes with the real one reveals the model assumptions and parameter values that best describe our Universe.
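In outline, that comparison is a search over parameter space. The sketch below is purely illustrative -- the parameter names are real cosmological quantities, but the toy "simulation," the grid and the "observed" value are invented here and are not the consortium's actual pipeline:

```python
# Illustrative sketch of comparing virtual universes to data.
# The statistic, grid and "observed" value are invented for illustration.

def simulate_clustering(omega_m, sigma_8):
    """Stand-in for a full simulation: return one summary statistic
    (clustering strength) for these parameters (a toy scaling)."""
    return sigma_8 * omega_m ** 0.6

observed = 0.47                         # pretend measured value
best = None
for omega_m in (0.2, 0.3, 0.4, 1.0):    # matter density parameter
    for sigma_8 in (0.7, 0.9, 1.1):     # amplitude of density fluctuations
        model = simulate_clustering(omega_m, sigma_8)
        misfit = (model - observed) ** 2
        if best is None or misfit < best[0]:
            best = (misfit, omega_m, sigma_8)

print(f"Best match: omega_m={best[1]}, sigma_8={best[2]}")
```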
Cosmological simulations present a formidable computational challenge not only because of the intrinsic complexity of the problem, but also because of the huge range of scales involved.
The processes that lead to the formation of an individual star operate on a length scale at least one hundred million times smaller than the size of the largest galaxy structures seen in the universe.
To overcome these problems, cosmologists have devised clever algorithms to calculate efficiently the evolution of a collisionless N-body system (the dark matter) as well as novel approaches to fluid dynamics. Many of these techniques have applications to a broad range of problems in other disciplines.
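To make the idea concrete, here is a minimal direct-summation N-body integrator in the spirit of such codes (a sketch only: it costs O(N^2) per step, whereas the tree and mesh algorithms alluded to above reduce that cost, and real cosmological codes also include the expansion of space; all values are illustrative):

```python
import numpy as np

# Minimal gravity-only N-body sketch in units with G = 1.

def accelerations(pos, mass, soft=0.05):
    """Softened pairwise gravitational accelerations (direct summation)."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        d = pos - pos[i]                        # vectors to every other particle
        r2 = (d ** 2).sum(axis=1) + soft ** 2   # softening avoids singularities
        r2[i] = np.inf                          # no self-force
        acc[i] = (mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
    return acc

rng = np.random.default_rng(1)
n, dt = 200, 0.01
pos = rng.standard_normal((n, 3))               # random initial positions
vel = np.zeros((n, 3))                          # cold start: gravity does the rest
mass = np.full(n, 1.0 / n)

acc = accelerations(pos, mass)
for _ in range(500):                            # leapfrog (kick-drift-kick)
    vel += 0.5 * dt * acc
    pos += dt * vel
    acc = accelerations(pos, mass)
    vel += 0.5 * dt * acc

print("final half-mass radius:", np.median(np.linalg.norm(pos, axis=1)))
```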
In spite of the spectacular achievements of the past two decades, even the largest supercomputers today are still too small to recreate our universe in the detail required for a full interpretation of astronomical data.
For example, the largest cosmological calculation ever carried out, the “Hubble volume” simulation performed by the Virgo consortium using the 750-processor Cray T3E supercomputer in Munich, used a record-breaking one billion particles to follow the evolution of dark matter over practically the entire visible universe. However, galaxy clusters could only be resolved with 1,000 particles, and individual galaxies not at all.
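That resolution limit follows from simple bookkeeping. With representative round numbers (our illustration, not figures from the simulation itself):

```latex
% Sharing the matter content of the simulated volume (~10^21 solar
% masses, a representative round number) among 10^9 particles gives
\[
  m_{\mathrm{particle}} \sim \frac{10^{21}\,M_{\odot}}{10^{9}}
  = 10^{12}\,M_{\odot}.
\]
% A rich cluster (~10^15 solar masses) is then ~1,000 particles,
% while a single galaxy halo (~10^12 solar masses) is about one particle.
```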
Similarly, the largest simulation of the dark halo of our own galaxy (also carried out by the Virgo consortium) took several months of continuous calculation on a 64-processor Cray T3E at Edinburgh, but did not resolve the crucial inner parts of the galaxy where processes that probe the nature of the dark matter occur.
The new supercomputer at the Institute for Computational Cosmology in Durham will increase the computing power available to the Virgo consortium tenfold. It is the largest supercomputer for academic research in Britain and one of the largest in Europe.
It will enable consortium scientists in the UK, Germany, Canada and the USA to perform cosmological calculations of unprecedented size and detail. In close connection with astronomical data collected with a new generation of giant telescopes and space observatories, these calculations will confront one of the grandest challenges of science: the understanding of how our universe was created and how it evolved to its present state.
[Contact: Professor Carlos S. Frenk]
31-Jul-2001