Restrict access to S3-hosted files via HTTP


#1

Hello guys, I’m using an S3 bucket to host the images for my latest project. My problem is the following: some websites have been hotlinking these images to republish my project, and last month I got a crazy bill for more than 170 GB of bandwidth traffic. So I was wondering if any of you know how to restrict access to them?

I did some research myself and spent a few days trying to solve this, but no such luck.
I read this and thought that was it, but after I tried those policies it didn’t work; I could still see my images on those other sites.

Thanks for everything.


#2

You may want to reconsider using S3 if you want to keep costs low. There are a lot of cheap cloud servers that offer unlimited bandwidth for $5/mo; DigitalOcean is one, I believe.

You can use the AWS Policy Generator to create the bucket policy:
http://awspolicygen.s3.amazonaws.com/policygen.html

Then you can add the referrer conditions manually. Here’s an example.

{
  "Version": "2012-10-17",
  "Id": "Policy1350503700228",
  "Statement": [
    {
      "Sid": "Stmt1350503699292",
      "Action": [
        "s3:GetObject"
      ],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::files.example.com/*",
      "Condition": {
        "StringLike": {
          "aws:Referer": [
            "http://example.com/*",
            "http://www.example.com/*"
          ]
        }
      },
      "Principal": {
        "AWS": [
          "*"
        ]
      }
    }
  ]
}
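
If you'd rather script it, here's a rough sketch (untested against a real bucket) of applying a policy like that with boto3 and then sanity-checking it with requests. The bucket name files.example.com and the object key images/test.jpg are placeholders for your own values.

import json

import boto3      # AWS SDK for Python
import requests   # plain HTTP client for the hotlink test

BUCKET = "files.example.com"   # placeholder: your bucket name
KEY = "images/test.jpg"        # placeholder: any object in the bucket

# Same idea as the policy above: allow GetObject only when the Referer matches.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowGetWithMatchingReferer",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
            "Condition": {
                "StringLike": {
                    "aws:Referer": [
                        "http://example.com/*",
                        "http://www.example.com/*",
                    ]
                }
            },
        }
    ],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))

# Path-style URL avoids TLS hostname trouble when the bucket name contains dots.
url = f"https://s3.amazonaws.com/{BUCKET}/{KEY}"

allowed = requests.get(url, headers={"Referer": "http://example.com/page"})
hotlink = requests.get(url, headers={"Referer": "http://some-other-site.example/"})
print(allowed.status_code)  # expect 200 once the policy is in place
print(hotlink.status_code)  # expect 403, unless something else grants access

One caveat: an Allow statement like this doesn't deny anything on its own. If the objects are already public through bucket or object ACLs, requests with a different (or missing) Referer will still go through, which would explain a policy like this appearing to do nothing.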

#3

> You may want to reconsider using S3 if you want to keep costs low. There are a lot of cheap cloud servers that offer unlimited bandwidth for $5/mo.

Every cloud/storage service I've used before S3 was just utter garbage.