By Matthew Dublin
Amazon Web Services (AWS) launched the AWS GovCloud today (US region only) to help government agencies and contractors store and analyze their data on the cloud.
Given AWS's recent failures, which have fueled concerns about the cloud's robustness and security, this might not seem like a feasible IT solution for big government data. But Amazon claims it has implemented a number of US government-specific regulatory requirements.
As Amazon.com's CTO Werner Vogels writes on his blog, the US Federal Cloud Computing Strategy does compel US agencies to consider the cloud first as the target for their IT operations. Only time will tell whether research agencies will take advantage of this new service, but the announcement does raise a few questions: why would AWS not apply these same security standards across the board for all its customers? Why not just make the cloud as secure as possible for everybody, even those who don't face the same regulatory requirements as government agencies?
To reiterate, just keeping the cloud consistently operational has proven to be a challenge, and maybe that's what AWS is looking to improve upon with this new service. The AWS GovCloud will have substantial redundancy and wide geographic spread, with data centers, or "general purpose regions," on the US west and east coasts, one in Ireland — which was recently downed by a lightning strike — and others in Singapore and Tokyo.
If you're not sure how you feel about having your social security number spread across the globe, let alone your genetic information, nobody's blaming you. But now that AWS is serving this new market, with its stringent security requirements, individual users and research sites will inevitably benefit from the lessons learned as the cloud, one hopes, becomes more secure and consistent for everybody.