Amazon Offers Online Storage
eWEEK posted this article today.

Amazon.com's Amazon Web Services on March 14 announced Amazon S3, an inexpensive, reliable storage service that lets small companies scale instantly as they grow.

Amazon S3 offers software developers a highly scalable, reliable, and low-latency data storage infrastructure at low cost, according to the Seattle-based company. It provides developers with access to the same storage system that Amazon uses to run its own infrastructure, company officials said.

"Amazon S3 is based on the idea that quality Internet-based storage should be taken for granted," said Andy Jassy, vice president of Amazon Web Services, in a statement. "It helps free developers from worrying about where they are going to store data, whether it will be safe and secure, if it will be available when they need it, the costs associated with server maintenance, or whether they have enough storage available. Amazon S3 enables developers to focus on innovating with data, rather than figuring out how to store it."

Jassy said Amazon S3 lets developers pay only for what they consume, with no minimum fee: $0.15 per gigabyte of storage per month and $0.20 per gigabyte of data transferred.
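At those launch rates, a monthly bill is simple arithmetic. A minimal sketch (the function name and the example figures are illustrative, not from the article):

```python
# Launch pricing quoted in the article.
STORAGE_PER_GB_MONTH = 0.15   # dollars per GB stored per month
TRANSFER_PER_GB = 0.20        # dollars per GB transferred

def monthly_cost(stored_gb, transferred_gb):
    """Estimate a month's S3 charge in dollars (no minimum fee)."""
    return stored_gb * STORAGE_PER_GB_MONTH + transferred_gb * TRANSFER_PER_GB

# For example, 100 GB stored plus 50 GB moved in or out:
print(f"${monthly_cost(100, 50):.2f}")  # → $25.00
```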

Amazon S3 provides a Web services interface that can be used to store and retrieve any amount of data, at any time, from anywhere on the Web.

The new Amazon storage service enables developers to write, read and delete objects containing from 1 byte to 5GB of data each. Each object is stored and retrieved via a unique developer-assigned key, and objects can be made private or public, with rights assigned to specific users. Amazon S3 uses standards-based REST (Representational State Transfer) and SOAP (Simple Object Access Protocol) interfaces designed to work with any Internet development tool kit, company officials said.
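In the REST interface the article describes, each developer-assigned key maps onto an HTTP path, and writing and reading become PUT and GET. A rough, unauthenticated sketch of what those requests look like on the wire (the bucket hostname and key here are hypothetical, and real requests also carry a signed Authorization header):

```python
# Illustrative only: compose raw HTTP request text for storing and
# retrieving an object by its developer-assigned key.

def put_request(bucket, key, body):
    """Minimal HTTP PUT that writes `body` under `key` in `bucket`."""
    return (f"PUT /{key} HTTP/1.1\r\n"
            f"Host: {bucket}.s3.amazonaws.com\r\n"
            f"Content-Length: {len(body)}\r\n"
            "\r\n") + body

def get_request(bucket, key):
    """Matching HTTP GET that retrieves the same object by its key."""
    return (f"GET /{key} HTTP/1.1\r\n"
            f"Host: {bucket}.s3.amazonaws.com\r\n"
            "\r\n")

# Request line for storing a (hypothetical) image from the aerogel scans:
print(put_request("example-bucket", "photos/dust.jpg", "...bytes...").splitlines()[0])
# prints: PUT /photos/dust.jpg HTTP/1.1
```

The SOAP interface exposes the same operations as XML-wrapped calls; the REST form shown above is simply the HTTP-native view of the same key/object model.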

Amazon S3 already has some early users, including the University of California, Berkeley's team behind NASA's "Stardust@Home" project. The team is using Amazon S3 to store and deliver the 60,000 images collected from its dust particle aerogel experiment. These images will be delivered to 100,000 volunteers around the world who scan them looking for dust particles from comet Wild 2.

"We quickly ran into challenges when we started the project using our own infrastructure," said Andrew Westphal, project director of Stardust@Home, in a statement. "Using Amazon S3 has allowed us to proceed without having to worry about building out the massive storage infrastructure we realized that we needed to successfully complete the project. The fact that Amazon S3 is an Internet-connected storage service is particularly useful to us as we expect the data examination phase of the project to take only a few months. We can quickly ramp up and back down again without a huge investment."
