Big Data on the Final Frontier


Missions in space may come and go, but the National Aeronautics and Space Administration has always pursued one constant mission: gathering data.

One of its early achievements in this field was sending a spacecraft close enough to Venus to get accurate readings of its surface and atmosphere. On Dec. 14, 1962, the Mariner 2 spacecraft got within 34,762 km (21,600 miles) of the planet. Over a 42-minute period, it collected enough data points to prove that Venus, long thought of as Earth's twin, would be uninhabitable, with a surface temperature of 425°C (797°F) and a toxic atmosphere.
The picture (from NASA's site) of the data gathered in that mission is cropped; the paper roll recording that data was actually much longer, as the uncropped version shows.

Back then, the data covered a roll of paper, but the data NASA handles today takes supercomputing power to process. As Nick Skytland wrote in a NASA blog post in October:
In the time it took you to read this sentence, NASA gathered approximately 1.73 gigabytes of data from our nearly 100 currently active missions! We do this every hour, every day, every year -- and the collection rate is growing exponentially...
In our current missions, data is transferred with radio frequency, which is relatively slow. In the future, NASA will employ technology such as optical (laser) communication to increase the download and mean a 1,000x increase in the volume of data. This is much more than we can handle today and this is what we are starting to prepare for now. We are planning missions today that will easily stream more [than] 24 TB a day. That's roughly 2.4 times the entire Library of Congress -- EVERY DAY. For one mission.
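The comparison in the quote is easy to sanity-check. A minimal back-of-the-envelope sketch, assuming the commonly cited ~10 TB estimate for the Library of Congress's digitized text collection (an assumption, not a figure from the post):

```python
# Sanity check of the figures quoted above.
# Assumption: "the entire Library of Congress" refers to the commonly
# cited ~10 TB estimate of its digitized text holdings.
daily_volume_tb = 24.0            # projected downlink per mission, per day
library_of_congress_tb = 10.0     # assumed LoC size in terabytes

ratio = daily_volume_tb / library_of_congress_tb
print(f"{ratio:.1f}x the Library of Congress per day")  # → 2.4x
```

At that rate, a single mission would stream the equivalent of the entire collection roughly every ten hours.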
Read more at Big Data on the Final Frontier.