SDSC Systems Enable 14 Billion-Year Sky Survey With Highest Resolution To Date
- Date:
- June 3, 2005
- Source:
- University Of California, San Diego
- Summary:
- The San Diego Supercomputer Center has announced that the highest spatial- and temporal-resolution simulation of the universe to date was recently run at the center. Conducted by a research team of astrophysicists headed by Mike Norman of UCSD, the simulation was created using Enzo, a parallel, 3D cosmology hydrodynamics code.
The San Diego Supercomputer Center has announced that the highest spatial- and temporal-resolution simulation of the universe to date was recently run at the center. Conducted by a research team of astrophysicists headed by Mike Norman of UCSD, the simulation was created using Enzo, a parallel, 3D cosmology hydrodynamics code. The run spanned a 48-hour period and consumed more than 10,000 CPU hours on the center's TeraGrid system. The calculation will help develop a cosmological visualization for the planetarium competition at DomeFest 2005 in Albuquerque this July and for viewing on the PBS television show NOVA this fall.
"Besides its immediate use for the visualization, the calculation will also benefit those involved in spatial mapping and simulated sky surveys due to its incredibly high resolution of space and time," said Brian O'Shea, member of the UCSD research team. "Without SDSC's TeraGrid system and its responsive staff and support team, such a simulation wouldn't have been possible in such a time frame."
SDSC's General Parallel File System (GPFS), combined with a storage area network (SAN) that SDSC has specially configured to handle vast amounts of data, allowed the research team to write the 26 terabytes to disk through 128 nodes running simultaneously. "Such a computational calculation would take a month or two on other high-end systems," said Patricia Kovatch, high-performance computing team leader at SDSC. "But with our configuration, every node acts as a server, so the process is exponentially faster."
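To see why letting every node write simultaneously matters, here is a minimal back-of-the-envelope sketch in Python. The per-node write bandwidth below is an assumed illustrative figure, not a number reported in the article; under ideal conditions the speedup scales linearly with the number of writers.

```python
# Rough estimate of write time when every node streams to the
# parallel file system at once, versus a single writer.
# NOTE: PER_NODE_MB_S is an ASSUMED illustrative value; the article
# does not report actual GPFS throughput figures.

DATA_TB = 26                        # total output reported in the article
NODES = 128                         # nodes writing simultaneously
PER_NODE_MB_S = 100                 # assumed sustained write rate per node

data_mb = DATA_TB * 1024 * 1024     # terabytes -> megabytes
aggregate_mb_s = NODES * PER_NODE_MB_S

serial_hours = data_mb / PER_NODE_MB_S / 3600
parallel_hours = data_mb / aggregate_mb_s / 3600

print(f"Single writer: ~{serial_hours:,.0f} hours to write {DATA_TB} TB")
print(f"{NODES} writers: ~{parallel_hours:,.1f} hours")
```

With these assumed numbers, one writer would need roughly three days just for I/O, while 128 concurrent writers finish in well under an hour, which is consistent with the article's point that per-node serving is what made the 48-hour run feasible.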
The calculation involved 2000 simulated snapshots of a wide expanse of the universe--approximately 250 million light-years across--each one marking the passage of 6.8 million years, to encompass the nearly 14 billion years from the Big Bang to the present. It generated 26 terabytes of data on disk--nearly enough raw data to fill the entire Library of Congress--which was mirrored to the TeraGrid system of the National Center for Supercomputing Applications (NCSA) in Champaign, Ill. The calculation supplies researchers with key data for studying galaxy formation and a variety of cosmological parameters, including those related to matter, mass and speed.
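The reported figures hang together; a short Python sketch makes the arithmetic explicit. Every input below is taken from the article, and only the derived values are computed:

```python
# Sanity check of the simulation figures reported above.
SNAPSHOTS = 2000
YEARS_PER_SNAPSHOT = 6.8e6          # each snapshot spans 6.8 million years
TOTAL_TB = 26                       # total data written to disk
CPU_HOURS = 10_000                  # reported compute budget
WALL_HOURS = 48                     # reported wall-clock duration

span_gyr = SNAPSHOTS * YEARS_PER_SNAPSHOT / 1e9
gb_per_snapshot = TOTAL_TB * 1024 / SNAPSHOTS
avg_cpus = CPU_HOURS / WALL_HOURS

print(f"Simulated span:      {span_gyr:.1f} billion years")  # ~13.6
print(f"Data per snapshot:   ~{gb_per_snapshot:.0f} GB")     # ~13 GB each
print(f"Average CPUs in use: ~{avg_cpus:.0f}")               # over 48 hours
```

The 2000 snapshots at 6.8 million years apiece come to about 13.6 billion years, matching the "nearly 14 billion" figure, and at roughly 13 GB per snapshot they account for the full 26 terabytes.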
Story Source:
Materials provided by University Of California, San Diego. Note: Content may be edited for style and length.