
Brian Dunagan

November 21, 2011
Deleting large AWS S3 buckets

2011.12.08 Update: Amazon now supports deleting multiple objects in a single API request, up to 1,000 at a time.
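
For reference, here is a rough sketch of what that batched delete could look like with fog, assuming a version of the gem that exposes the delete_multiple_objects request; the credentials and bucket name are placeholders:

require 'fog'

# Placeholders: swap in real credentials and a real bucket name.
storage = Fog::Storage.new(
  :provider              => 'AWS',
  :aws_access_key_id     => 'ACCESS_KEY',
  :aws_secret_access_key => 'SECRET_KEY'
)
bucket = 'my-bucket'

# List every key (fog pages through the bucket), then delete in batches of 1,000.
keys = []
storage.directories.get(bucket).files.each { |file| keys << file.key }
keys.each_slice(1000) { |batch| storage.delete_multiple_objects(bucket, batch) }

# Once the bucket is empty, it can be deleted.
storage.delete_bucket(bucket)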

Amazon’s S3 is awesome. However, there are times when the API makes certain tasks a bit hard. Today, I wanted to delete a bucket with 100k files in it. S3 does not support deleting a bucket that still has files; you have to delete the files first.

First, I tried the aws/s3 gem. It supports a :force option when deleting buckets, in case they have files in them. That call was taking a long time, so I poked around the code. Unfortunately, the gem wasn’t making some magic API call. It just looped over every object:

# From the aws/s3 gem: each object is deleted with its own API call.
def delete_all
  each do |object|
    object.delete
  end
  self
end
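
For comparison, here is how that :force path is invoked with the aws/s3 gem; this is a sketch, and the credentials and bucket name are placeholders:

require 'aws/s3'

# Placeholders: swap in real credentials and a real bucket name.
AWS::S3::Base.establish_connection!(
  :access_key_id     => 'ACCESS_KEY',
  :secret_access_key => 'SECRET_KEY'
)

# :force empties the bucket (one DELETE per object, via the loop above), then deletes it.
AWS::S3::Bucket.delete('my-bucket', :force => true)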

Next, I looked at how geemus’s brilliant fog gem handles this need. It doesn’t, because the providers themselves don’t offer such a call.

Finally, I gave up on the dream of a single API call and looked at what other people do: pagination and threads. I ended up forking SFEley’s s3nuke script from GitHub. While porting his solution from RightAWS to fog, I discovered that fog handles pagination automatically; there is no need to mess with is_truncated or :marker. Quite nice.

The resulting Ruby script deleted an S3 bucket with 100k files in 15 minutes.
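
The core of that approach looks roughly like the sketch below (a sketch, not the actual fork): one thread lists keys, letting fog paginate, and pushes them onto a queue, while a pool of worker threads pops keys and deletes them. Credentials, bucket name, and thread count are placeholders.

require 'fog'
require 'thread'

# Placeholders: swap in real credentials, a real bucket name, and a sensible thread count.
credentials = {
  :provider              => 'AWS',
  :aws_access_key_id     => 'ACCESS_KEY',
  :aws_secret_access_key => 'SECRET_KEY'
}
bucket       = 'my-bucket'
thread_count = 10
queue        = Queue.new

# Each worker has its own connection and deletes keys as they arrive on the queue.
workers = thread_count.times.map do
  Thread.new do
    connection = Fog::Storage.new(credentials)
    while (key = queue.pop) != :done
      connection.delete_object(bucket, key)
    end
  end
end

# The main thread lists keys; fog pages through the bucket automatically,
# so there is no is_truncated or :marker bookkeeping here.
storage = Fog::Storage.new(credentials)
storage.directories.get(bucket).files.each { |file| queue << file.key }

# Tell every worker to stop, wait for them, then delete the now-empty bucket.
thread_count.times { queue << :done }
workers.each(&:join)
storage.delete_bucket(bucket)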
