NASA Needs More Data Storage Amid Output Surge From New Satellites
NASA’s decision to migrate its data to the cloud was prompted by a need for more storage capacity rather than by security concerns, an agency official said. The space agency has spent billions of dollars addressing its storage shortfall amid a flood of new data from next-generation satellites and overseas ground stations, FedScoop reported Tuesday.
Joe Foster, cloud computing program manager at NASA’s Goddard Space Flight Center, said that beaming space data to the U.S. via the cloud saves money that would otherwise be spent on additional storage infrastructure. Those savings, he said, can be allocated elsewhere, such as to upgrading satellite antennas.
The blueprint for the agency’s move to the commercial cloud is its Data Acquisition Processing and Handling Network Environment (DAPHNE) project. The first mission to benefit from the DAPHNE service will be the Neutron star Interior Composition Explorer, which churns out 41 terabits of science data each day.
Aside from the problem of where to put the flood of fresh data, NASA is also responsible for ensuring that information accumulated over the years is preserved for posterity. The plan is to put all the data amassed through legacy missions on a private network, where it can still be accessed and analyzed in the cloud, Foster said.
Foster also said NASA is seeking a more secure platform for its more sensitive data, such as data related to flight and launch capabilities and human space exploration. As it stands, the agency’s current cloud platform is rated only at the “moderate” level under the Federal Information Security Management Act of 2002, because much of NASA’s scientific data is made available to the general public.