Amazon's cloud computing business is all about moving infrastructure to the internet. So why is the company encouraging some developers to move their data to the cloud using the postal system? In cases where very large files are involved, Amazon says snail mail can be faster than a standard internet connection. So the company has rolled out a new service called AWS Import/Export that lets people ship large data sets to Amazon for loading into its S3 storage service.
The new service, which is being launched in limited beta in the U.S., costs $80 per storage device, and $2.49 per data-loading hour. Amazon recommends a variety of external hard drives for storing the data for shipment.
In a blog post, Amazon CTO Werner Vogels sheds more light on the decision to go postal, explaining how the growth of data sets has outpaced the capacity of internet networks:
Gigabyte data sets are considered small, terabyte sets are commonplace, and we see several customers working with petabyte-size datasets.

No matter how much we have improved our network throughput in the past 10 years, our datasets have grown faster, and this is likely to be a pattern that will only accelerate in the coming years. While networks may improve another order of magnitude in throughput, it is certain that datasets will grow two or more orders of magnitude in the same period of time.
Here's a table from Vogels showing how much time it takes to move a terabyte data set at typical network speeds:
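The arithmetic behind such a table is easy to reproduce. Here is a rough sketch in Python; the link speeds listed are common illustrative line rates (not taken from Vogels' table), and it assumes 1 TB = 10^12 bytes with the full line rate available, ignoring protocol overhead, so real transfers would take longer:

```python
# Back-of-the-envelope: time to move 1 TB at common link speeds.
# Speeds below are illustrative line rates, not figures from Vogels' table;
# real-world throughput is usually well under the line rate.
TERABYTE_BITS = 1e12 * 8  # 1 TB expressed in bits

speeds_mbps = {
    "T1 (1.544 Mbps)": 1.544,
    "10 Mbps Ethernet": 10,
    "T3 (44.736 Mbps)": 44.736,
    "100 Mbps Fast Ethernet": 100,
    "1 Gbps": 1000,
}

for name, mbps in speeds_mbps.items():
    seconds = TERABYTE_BITS / (mbps * 1e6)
    print(f"{name}: {seconds / 86400:.1f} days")
```

Even at an idealized 100 Mbps, a terabyte takes close to a day; at T1 speeds it takes about two months, which is why an overnight-shipped hard drive can win.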