The Twelve Days of AWS: S3

12 Days of AWS Day 1 written around snowflakes with a penguin sledding

For a while now I have been thinking about writing a blog post about Amazon Web Services (AWS) tools that have become a part of my toolkit recently. However, there are plenty of wonderful in-depth articles about specific parts of AWS, and I really only wanted to focus on a handful of them, to help demystify the storm of acronyms and code names that can make the AWS environment seem a bit opaque.

Seeing that we are at the end of the year, and in the spirit of the carol ‘The Twelve Days of Christmas’, I have decided to provide a new twist, ‘The Twelve Days of AWS’, covering 12 tools that I have used recently. Like the carol's verses, I'm going to keep each item short and sweet.

The First Day of AWS

On the first day of AWS I want to talk about S3, the first AWS product I ever encountered. Simple Storage Service (S3) is a form of object storage: rather than files in a filesystem, you store objects addressed by keys.

Data is organized into buckets, each of which has a name that is unique not just within your organization but across all of S3. If you delete a bucket and its name is snapped up by someone else, you need to choose a new name.
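Because the namespace is global, a name you want is often already taken. One common workaround is to suffix a base name with a random token; here is a minimal sketch (`make_bucket_name` is a hypothetical helper, and the validation pattern reflects the common S3 naming rules of 3-63 lowercase letters, digits, and hyphens):

```python
import re
import uuid

def make_bucket_name(base: str) -> str:
    """Return a bucket name that is very unlikely to collide globally."""
    suffix = uuid.uuid4().hex[:8]           # random 8-character token
    name = f"{base.lower()}-{suffix}"[:63]  # stay within the 63-char limit
    # Must start and end with a letter or digit; hyphens allowed inside.
    if not re.fullmatch(r"[a-z0-9][a-z0-9-]{1,61}[a-z0-9]", name):
        raise ValueError(f"invalid bucket name: {name}")
    return name

print(make_bucket_name("my-app-assets"))  # e.g. my-app-assets-3f9c1a27
```

You would then pass the generated name to your bucket-creation call and record it, since you cannot predict the suffix.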

There are no directories, although it might look like there are when you are browsing the contents of a bucket. Instead, each object has a key, which can contain what looks like a directory path. If an object's key is just key1, it appears at the root of the bucket. If two objects have the keys some/key2 and some/key3, then in the S3 browser it will look like there is a folder called 'some'. However, that is not the case; it is just how the user interface presents keys that share a common prefix. If you delete what appears to be the 'contents' of 'some' in the UI, the 'some' entry disappears as well, because it only ever represented the grouping of objects with a common key prefix.
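The folder illusion above can be demonstrated in a few lines of plain Python. This sketch mimics how an S3 listing with a `/` delimiter splits keys at one level into objects and "common prefixes" (the pseudo-folders); the keys are the made-up ones from the example:

```python
def list_level(keys, prefix="", delimiter="/"):
    """Group keys at one 'directory' level, the way the S3 console does.

    Returns (objects, common_prefixes): keys with no further delimiter
    after the prefix are objects; the rest collapse into pseudo-folders.
    """
    objects, prefixes = [], set()
    for key in keys:
        if not key.startswith(prefix):
            continue
        rest = key[len(prefix):]
        if delimiter in rest:
            prefixes.add(prefix + rest.split(delimiter, 1)[0] + delimiter)
        else:
            objects.append(key)
    return objects, sorted(prefixes)

keys = ["key1", "some/key2", "some/key3"]
print(list_level(keys))           # (['key1'], ['some/'])
print(list_level(keys, "some/"))  # (['some/key2', 'some/key3'], [])
```

Note that if you drop some/key2 and some/key3 from the list, the 'some/' prefix vanishes too: there was never a folder, only keys.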

  • Objects can be cut and pasted around S3 easily, and because the copy happens server-side within AWS it is very fast.
  • S3 can serve as a scratch area for scripts: for example, lock files, or marker objects whose keys track progress through a multi-step pipeline.
  • Unlike a lot of AWS, the S3 bucket namespace is not region-based. When you view your S3 buckets you are viewing a global list, even though each bucket's data is stored in a specific region. For most other services you select a region such as us-east-1 or ap-northeast-3, some services are only available in certain regions, and the UI can limit what is displayed based on the region you have selected.
  • With Drupal, you can use a module such as s3fs to access S3 buckets from Drupal and to upload files from the Drupal file system to S3.
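The lock-file idea in the list above can be sketched with boto3, the AWS SDK for Python. This is a minimal sketch, not a definitive implementation: it assumes boto3 is installed and AWS credentials are configured, the bucket and key names are made up, and the check-then-put is not atomic, so it only suits low-contention jobs such as keeping a cron script from overlapping with itself.

```python
def acquire_lock(bucket: str, key: str = "locks/job.lock") -> bool:
    """Try to take a simple S3-based lock by creating a marker object.

    Returns True if we created the marker, False if it already exists.
    """
    import boto3  # AWS SDK for Python; assumed installed
    s3 = boto3.client("s3")
    try:
        s3.head_object(Bucket=bucket, Key=key)  # raises if key is absent
        return False                            # lock already held
    except s3.exceptions.ClientError:
        s3.put_object(Bucket=bucket, Key=key, Body=b"locked")
        return True

def release_lock(bucket: str, key: str = "locks/job.lock") -> None:
    """Release the lock by deleting the marker object."""
    import boto3
    boto3.client("s3").delete_object(Bucket=bucket, Key=key)
```

A script would call acquire_lock at the top, exit early on False, and call release_lock in a finally block so the marker is removed even if the job fails.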