MrSeb writes: "IBM and ASTRON, the Netherlands Foundation for Research in Astronomy, have announced that they have begun work on building an exascale supercomputer that, come 2024, will collect data from the Square Kilometre Array (SKA), a 3,000km-wide telescope that will have "millions of antennae". The current world's fastest supercomputer, the K, has 700,000 processor cores and a peak performance of 10 petaflops — an exascale (exaflop) computer would be 100 times faster than that. The SKA is anticipated to produce a few exabytes of data per day, which will then be processed by the IBM supercomputer down to between 300 and 1,500 petabytes of stored data per year. To put this into perspective, the web's daily traffic — i.e. two billion people surfing the web — currently adds up to "only" half an exabyte, and the Large Hadron Collider currently produces 15 petabytes of data per year. IBM and ASTRON will be chasing exascale computing through technologies such as phase-change memory, photonics, and chip stacking. The 3,000km-wide telescope will be networked together using more than 80,000km of fiber optics. The telescope itself will be tasked with studying the origins of the universe, performing extreme tests on Einstein's theory of general relativity, investigating dark matter, and more."
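The scale comparisons in the summary can be sanity-checked with some back-of-the-envelope arithmetic. This sketch uses only figures quoted above; "a few exabytes per day" is taken as 2 EB purely for illustration:

```python
# Back-of-the-envelope check of the scale figures in the summary.
PETA = 10**15
EXA = 10**18

# K computer peak (10 petaflops) vs. one exaflop.
k_computer_flops = 10 * PETA
exascale_flops = 1 * EXA
print(exascale_flops // k_computer_flops)   # 100x faster

# SKA raw output ("a few exabytes/day", assumed 2 here) vs. daily web traffic.
ska_daily_bytes = 2 * EXA
web_daily_bytes = EXA // 2                  # "half an exabyte"
print(ska_daily_bytes / web_daily_bytes)    # 4.0x the web's daily traffic

# SKA's stored data per year (low end) vs. the LHC's 15 PB/year.
ska_stored_low = 300 * PETA
lhc_yearly_bytes = 15 * PETA
print(ska_stored_low // lhc_yearly_bytes)   # 20x the LHC's yearly output
```

Even at the low end of the stored-data estimate, the SKA would archive twenty times what the LHC produces in a year — and that is after the exascale machine has already reduced the raw antenna stream by several orders of magnitude.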