Question

I'm hosting a static website on S3. To push my site to Amazon I use the s3cmd command line tool. All works fine except setting the Content-Type to text/html;charset=utf-8.

I know I can set the charset in the meta tag in the HTML file, but I would like to avoid it.

Here is the exact command I'm using:

s3cmd --add-header='Content-Encoding':'gzip' \
    --add-header='Content-Type':'text/html;charset=utf-8' \
    put index.html.gz s3://www.example.com/index.html

Here is the error I get:

ERROR: S3 error: 403 (SignatureDoesNotMatch): The request signature we calculated does not match the signature you provided. Check your key and signing method.

If I remove the ;charset=utf-8 part from the above command it works, but the Content-Type gets set to text/html rather than text/html;charset=utf-8.
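This is how I check what Content-Type actually got set, using s3cmd info on the uploaded object (output trimmed; the exact formatting may vary between s3cmd versions):

s3cmd info s3://www.example.com/index.html
#   ...
#   MIME type:   text/html
#   ...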


Solution

This is a two-step process.

(1) Upgrade your installation of s3cmd. Version 1.0.x cannot set the charset. Install from master on GitHub; master includes fixes for two bugs in earlier versions: one where the format of the content-type is not recognized, and one that causes a "called before definition" error.

To install s3cmd from master on OS X, do the following:

git clone https://github.com/s3tools/s3cmd.git
cd s3cmd/
sudo python setup.py install   # sudo is optional, depending on your setup

Make sure the Python bin directory is on your PATH by adding the following to your .profile, .bashrc, or .zshrc (again, depending on your system).

export PATH="/Library/Frameworks/Python.framework/Versions/2.7/bin:$PATH"

If you use Homebrew, this might cause conflicts, so instead just symlink to the executable:

ln -s /Library/Frameworks/Python.framework/Versions/2.7/bin/s3cmd /usr/local/bin/s3cmd

Close your terminal and reopen it.

s3cmd --version 

will still output

s3cmd version 1.5.0-alpha3

but it is the patched version.

(2) Once upgraded, use:

s3cmd --acl-public --no-preserve \
    --add-header="Content-Encoding:gzip" \
    --add-header="Cache-Control:public, max-age=86400" \
    --mime-type="text/html; charset=utf-8" \
    put index.html s3://www.example.com/index.html

The upload should now succeed and set the Content-Type to "text/html; charset=utf-8", but you may see this warning in the process:

WARNING: Module python-magic is not available...

Installing python-magic makes that warning go away, but I prefer to live without python-magic: I find that if you don't explicitly set the mime-type, python-magic often guesses wrong. If you do install it and you gzip your JS locally, be sure to set --mime-type="application/javascript" in s3cmd, or python-magic will guess the file to be "application/x-gzip".
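For example, a gzipped JavaScript file would go up with something like the following (the file names are placeholders; the point is pairing the explicit --mime-type with the Content-Encoding header):

s3cmd --acl-public --no-preserve \
    --add-header="Content-Encoding:gzip" \
    --mime-type="application/javascript" \
    put app.js.gz s3://www.example.com/app.js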

Install python-magic:

sudo pip install python-magic

pip broke with a recent OS X upgrade, so you may need to update pip first:

sudo easy_install -U pip

That will do it. All of this works with s3cmd sync too, not just put. I suggest you wrap s3cmd sync in a thor-type task so you don't forget to set the mime-type on any particular file (if you are using python-magic on gzipped files).

Here is a gist of an example thor task for deploying a static Middleman site to S3. That task lets you rename files locally and use s3cmd sync rather than renaming them one by one with s3cmd put.
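If you don't use thor, a minimal shell sketch of the same idea (gzip into a build directory under the original file names, then sync that directory in one go) could look like this; the directory names and the set of headers are assumptions for illustration, not part of the original gist:

#!/bin/sh
# Sketch: compress HTML into a build dir, keeping the original .html names,
# then sync the whole dir to S3 with the right headers.
SRC=site     # where the plain files live (assumed layout)
OUT=build    # what actually gets synced

rm -rf "$OUT" && mkdir -p "$OUT"

for f in "$SRC"/*.html; do
    # keep the .html name so sync uploads it under the original path
    gzip -9 -c "$f" > "$OUT/$(basename "$f")"
done

s3cmd sync --acl-public --no-preserve \
    --add-header="Content-Encoding:gzip" \
    --add-header="Cache-Control:public, max-age=86400" \
    --mime-type="text/html; charset=utf-8" \
    "$OUT/" s3://www.example.com/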

Licensed under: CC-BY-SA with attribution