
Automate deployment of Django static files (Images, CSS, JavaScript) to Amazon S3


You want to automate the deployment of Django static files (Images, CSS, JavaScript) to Amazon S3


Install the boto and django-storages packages, which include the facilities to upload Django static files to Amazon S3 via the collectstatic command. Set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY variables in settings.py to access AWS services. Create an S3 bucket and declare it in the AWS_STORAGE_BUCKET_NAME variable in settings.py. Set AWS_QUERYSTRING_AUTH = False in settings.py to disable authentication tokens in Django static links. In addition, set the STATIC_URL variable to reflect the URL of the S3 bucket (e.g. https://<bucket_name>.s3.amazonaws.com/). Finally, set STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage' in settings.py to tell Django to upload static files to S3 when the collectstatic command is run.

To enforce authentication of Django static resources in Amazon S3, follow the previous steps, but also set the AWS_DEFAULT_ACL = 'authenticated-read' variable and change AWS_QUERYSTRING_AUTH = True in settings.py. Additional configuration parameters to influence the behavior of S3 via Django are also available (e.g. AWS_S3_FILE_OVERWRITE to set overwriting behavior, AWS_REDUCED_REDUNDANCY to set S3 reduced redundancy, AWS_HEADERS to set S3 HTTP headers).

How it works

The python manage.py collectstatic command organizes all your Django static files (Images, CSS, JavaScript) and places them in the folder defined in STATIC_ROOT in preparation for final deployment. However, this process still requires that you manually upload the static files to Amazon S3. Next, I'll describe how to automate the uploading of static files directly to Amazon S3 when you run the collectstatic command.
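For reference, before any S3 automation a project's settings.py typically pairs STATIC_URL with a local STATIC_ROOT folder. The following is a minimal sketch; the path is an assumption modeled on the output shown in listing 2, not a required value:

```python
# Conventional pre-S3 static file settings (sketch; path is illustrative)
STATIC_URL = '/static/'                      # URL prefix used by {% static %}
STATIC_ROOT = '/www/STORE/coffeestatic/'     # where collectstatic copies files locally
```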

Note: Ensure you know the ABCs of Django static file management

This recipe assumes you already know about Django's static file management. If the STATIC_ROOT concept or the collectstatic command is unfamiliar to you, I strongly recommend you first read the recipe Set up static web page resources -- Images, CSS, JavaScript -- otherwise some of the following techniques may not work as described.

As a first step you need to install two prerequisite packages: boto -- a Python library designed to work with Amazon AWS services -- and django-storages -- a Python library that lets Django work with different storage technologies. Once you install these packages with pip install django-storages boto, you're ready to move on to the configuration.

Listing 1 illustrates the minimum set of variables needed in settings.py to automate the upload process of Django static files to Amazon S3.

Listing 1 - Django minimum variables to automate upload of static files to Amazon S3 via collectstatic

AWS_ACCESS_KEY_ID = '...........'
AWS_SECRET_ACCESS_KEY = '..........'
AWS_STORAGE_BUCKET_NAME = 'coffeehouse'
AWS_QUERYSTRING_AUTH = False

STATIC_URL = 'https://coffeehouse.s3.amazonaws.com/'

STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'

The first pair of variables in listing 1, AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, are related to your AWS account and grant your Django project access to AWS services.

If you go to the 'Security Credentials' option in the top menu of your AWS account -- as illustrated in figure 1 -- you'll be taken to the 'Your Security Credentials' page. From there click on the 'Access Keys' tab to see your account's access keys. You can create a new set of access keys by clicking on the 'Create New Access Key' button, after which you'll see a pop-up window with the new credentials as illustrated in figure 2.
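Because these keys grant access to your entire AWS account, a common precaution -- not required by this recipe -- is to read them from environment variables instead of hardcoding them in settings.py. A minimal sketch (the environment variable names are a convention, not something Django or django-storages mandates):

```python
import os

# Read AWS credentials from the environment instead of hardcoding them in
# settings.py; falls back to an empty string when the variables are unset
AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID', '')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY', '')
```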

AWS console to get access keys
Figure 1.- AWS console to get access keys
AWS console new set of access keys
Figure 2.- AWS console new set of access keys

Once you define the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY values, you then need to define the AWS_STORAGE_BUCKET_NAME variable. The AWS_STORAGE_BUCKET_NAME value points to the S3 "bucket" that will hold the static files. "Bucket" is simply the term used by S3 to refer to a storage location. Figures 3, 4 & 5 illustrate the sequence to create a new S3 bucket.

Next, AWS_QUERYSTRING_AUTH is set to False so Django generates simple (i.e. no authentication) static resource links in Django templates (e.g. with AWS_QUERYSTRING_AUTH = False Django generates links like https://coffeehouse.s3.amazonaws.com/images/logo.png, whereas with AWS_QUERYSTRING_AUTH = True -- the default -- Django generates links like https://coffeehouse.s3.amazonaws.com/images/logo.png?AWSAccessKeyId=...&Expires=...&Signature=...). I'll provide additional details on the use of S3 simple and authenticated links in the next section.

AWS S3 home page
Figure 3.- AWS S3 home page
AWS S3 create bucket pop-up
Figure 4.- AWS S3 create bucket pop-up
AWS S3 bucket page
Figure 5.- AWS S3 bucket page

Figure 3 illustrates the AWS S3 home page, where you can click on the 'Create bucket' button to create a new bucket. Next, a pop-up window appears -- as illustrated in figure 4 -- where you can name the bucket and select the region to create it in. Once you create the bucket, you'll be taken to the bucket main page illustrated in figure 5.

By default, AWS S3 buckets are accessible under the https://<bucket_name>.s3.amazonaws.com domain. This means that if the S3 bucket name is coffeehouse, the public URL for this bucket is https://coffeehouse.s3.amazonaws.com. Taking this a step further, if you place the logo.png image in a sub-folder named images in this same bucket, the public URL for the image would be https://coffeehouse.s3.amazonaws.com/images/logo.png.

Because of this last default convention, the next variable definition in listing 1 is STATIC_URL. By assigning the STATIC_URL variable the public URL of the S3 bucket, Django can then substitute all static file references in templates to point to this public URL (e.g. the template snippet {% load static %} <img src="{% static 'images/logo.png' %}"> gets converted to <img src="https://coffeehouse.s3.amazonaws.com/images/logo.png">).
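In essence, with no authentication tokens in play, the {% static %} tag just prefixes each relative path with STATIC_URL. A simplified model of that substitution (the coffeehouse bucket name is illustrative, and this ignores query-string authentication):

```python
from urllib.parse import urljoin

# Simplified model of what {% static 'images/logo.png' %} resolves to once
# STATIC_URL points at an S3 bucket
STATIC_URL = 'https://coffeehouse.s3.amazonaws.com/'

def static(path):
    # Join the bucket URL with the file's path relative to the bucket root
    return urljoin(STATIC_URL, path)

print(static('images/logo.png'))
# → https://coffeehouse.s3.amazonaws.com/images/logo.png
```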

Finally, the STATICFILES_STORAGE variable in listing 1 is assigned the storages.backends.s3boto.S3BotoStorage value. The storages.backends.s3boto.S3BotoStorage class provides the necessary logic so that when python manage.py collectstatic is run, all Django project static files are uploaded to an S3 bucket -- instead of the default behavior of copying static files to the STATIC_ROOT folder -- and also provides functionality to generate static links (i.e. {% static %}) based on Amazon S3 configuration, namely authentication tokens if required.

Once you place the configuration parameters in listing 1 in your settings.py file, you can run python manage.py collectstatic as illustrated in listing 2 to upload a Django project's static files to an S3 bucket.

Listing 2 - Django collectstatic to upload static files to Amazon S3

[user@coffeehouse ~]$ python manage.py collectstatic

You have requested to collect static files at the destination
location as specified in your settings.

This will overwrite existing files!
Are you sure you want to do this?

Type 'yes' to continue, or 'no' to cancel: yes
Copying '/www/STORE/coffeestatic/website-static-default/sitemap.xml'
Copying '/www/STORE/coffeestatic/website-static-default/robots.txt'
Copying '/www/STORE/coffeestatic/website-static-default/favicon.ico'

Copying '/www/STORE/coffeehouse/about/static/css/custom.css'

732 static files copied

As you can see in listing 2, after you run the collectstatic command you'll see log messages like 'Copying ...' to indicate Django is copying the files to the S3 bucket.

Enforce authentication of Django static resources in Amazon S3 to limit public access

The previous Django static file / Amazon S3 set up is a very basic configuration. One of the characteristics of the previous configuration is that it publishes Django static files to Amazon S3 with full public access, which means not only can your Django application users access the static files, but anyone on the Internet can (e.g. scrapers, non-application users). To limit the access of Django static resources in Amazon S3 to only Django application users you can integrate authentication into S3.

Listing 3 shows another set of configuration variables that enforces S3 authentication.

Listing 3 - Django variables to set authentication of static files on Amazon S3 via collectstatic

AWS_ACCESS_KEY_ID = '...........'
AWS_SECRET_ACCESS_KEY = '..........'
AWS_STORAGE_BUCKET_NAME = 'coffeehouse'
AWS_DEFAULT_ACL = 'authenticated-read'
AWS_QUERYSTRING_AUTH = True # Technically not needed, as it defaults to True

STATIC_URL = 'https://coffeehouse.s3.amazonaws.com/'

STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'

The variables in listing 3 are very similar to those in listing 1, so I'll just touch on the differences. The first difference is the AWS_DEFAULT_ACL = 'authenticated-read' variable, which tells Django that when it uploads files to an S3 bucket, it sets the permissions so only authenticated users can read the files. This effectively blocks everyone from accessing the S3 static files unless they're authenticated. So the next issue becomes: how do you grant Django application users authentication? You do this through the actual Django application links to static files.

In the previous configuration in listing 1, I set the AWS_QUERYSTRING_AUTH variable to False so Django didn't generate links with authentication information. Because by default all static files are uploaded to S3 with public-read permission, authentication information is irrelevant. But now that listing 3 sets the S3 upload permissions to authenticated-read, the authentication information is required once again, so in listing 3 I set the AWS_QUERYSTRING_AUTH variable to True -- or note you can omit it altogether, since AWS_QUERYSTRING_AUTH = True is the default.

Once you set the configuration in listing 3, Django then substitutes all static file references in templates to use URLs with authentication tokens (e.g. the template snippet {% load static %} <img src="{% static 'images/logo.png' %}"> gets converted to <img src="https://coffeehouse.s3.amazonaws.com/images/logo.png?AWSAccessKeyId=...&Expires=...&Signature=...">). In this case, the additional URL parameters for every static file are generated from AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, providing a means to enforce that only users with such tokens can access static files in S3.
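Under the hood, boto builds these tokens with Amazon's query-string (signature version 2) scheme: the HTTP method, an absolute expiry timestamp, and the object path are HMAC-SHA1 signed with your secret key. The following stdlib-only sketch illustrates the idea -- it's for intuition only (it omits URL-encoding of the signature, among other details); real code should let S3BotoStorage/boto generate the URL:

```python
import base64
import hmac
import time
from hashlib import sha1

def signed_s3_url(bucket, key, access_key, secret_key, expires_in=3600):
    # Expiry is an absolute Unix timestamp, expires_in seconds from now
    # (this is what AWS_QUERYSTRING_EXPIRE controls)
    expires = int(time.time()) + expires_in
    # Canonical string for a GET request under the signature v2 scheme
    string_to_sign = "GET\n\n\n%d\n/%s/%s" % (expires, bucket, key)
    digest = hmac.new(secret_key.encode('utf-8'),
                      string_to_sign.encode('utf-8'), sha1).digest()
    signature = base64.b64encode(digest).decode('ascii')
    return ("https://%s.s3.amazonaws.com/%s"
            "?AWSAccessKeyId=%s&Expires=%d&Signature=%s"
            % (bucket, key, access_key, expires, signature))

print(signed_s3_url('coffeehouse', 'images/logo.png',
                    'AKIA-EXAMPLE', 'not-a-real-secret'))
```

Because the signature is derived from the secret key, only your Django application -- not arbitrary visitors -- can mint links that S3 will accept before the Expires timestamp passes.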

Besides the configuration parameters presented in listings 1 and 3, table 1 presents the full list of options available to automate the uploading of Django static files to Amazon S3.

Table 1 - Django static file / Amazon S3 configuration parameters
Variable | Description | Default value
AWS_S3_ACCESS_KEY_ID | Amazon AWS access key | AWS_ACCESS_KEY_ID (if set for the boto package)
AWS_S3_SECRET_ACCESS_KEY | Amazon AWS secret key | AWS_SECRET_ACCESS_KEY (if set for the boto package)
AWS_S3_FILE_OVERWRITE | Indicates if files should be overwritten | True
AWS_HEADERS | Can specify HTTP headers (e.g. Expires, Cache-Control) as a dictionary | {} (empty dictionary)
AWS_STORAGE_BUCKET_NAME | Indicates the name of the S3 bucket | Required
AWS_AUTO_CREATE_BUCKET | Indicates to create the bucket if it doesn't exist | False
AWS_DEFAULT_ACL | Indicates the upload permissions (e.g. public-read, authenticated-read) | public-read
AWS_BUCKET_ACL | Indicates the upload permissions for the bucket | Value from AWS_DEFAULT_ACL
AWS_QUERYSTRING_AUTH | Tells Django to generate static links with authentication tokens | True
AWS_QUERYSTRING_EXPIRE | Sets the expiration time of authenticated links in seconds | 3600 (1 hour)
AWS_REDUCED_REDUNDANCY | Ability to set S3 reduced redundancy, which lowers costs at the expense of redundancy | False
AWS_LOCATION | Can be used to prepend a specific path to files. Otherwise, paths are relative to the root of the bucket | ''
AWS_S3_ENCRYPTION | Sets S3 storage (i.e. server-side) encryption | False
AWS_S3_CUSTOM_DOMAIN | Ability to provide a custom domain vs. the S3 default https://<bucket_name>.s3.amazonaws.com | None
AWS_S3_CALLING_FORMAT | Defines the calling mechanism, see boto.s3.connection for more details and alternatives | SubdomainCallingFormat()
AWS_S3_SECURE_URLS | Tells Django to build URLs with the HTTPS protocol | True
AWS_S3_FILE_NAME_CHARSET | Indicates the charset to use for file names | utf-8
AWS_IS_GZIPPED | Creates gzipped versions of files on S3, used in conjunction with the GZIP_CONTENT_TYPES variable | False
AWS_PRELOAD_METADATA | Provides caching of S3 file metadata | False
GZIP_CONTENT_TYPES | Defines the types of content to gzip; note this is dependent on the AWS_IS_GZIPPED variable | ('text/css', ...)
AWS_S3_URL_PROTOCOL | Defines the protocol to access files | http:
AWS_S3_HOST | Sets the default S3 host, set by the boto package | S3Connection.DefaultHost (s3.amazonaws.com)
AWS_S3_USE_SSL | Indicates to use SSL to communicate with S3 | True
AWS_S3_PORT | Can customize the port used to communicate with S3 | None
AWS_S3_PROXY_HOST | Sets a proxy host to communicate with S3 | None
AWS_S3_PROXY_PORT | Sets a proxy port to communicate with S3 | None
AWS_S3_MAX_MEMORY_SIZE | The maximum amount of memory a returned file can take up before being rolled over into a temporary file on disk | 0 (do not roll over)
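To illustrate how a few of these options fit together, the following settings.py fragment layers some of table 1's parameters on top of listing 1. The values are illustrative assumptions for a sketch, not recommendations:

```python
# Optional behavior tweaks from table 1 (illustrative values)
AWS_S3_FILE_OVERWRITE = False     # keep existing S3 copies instead of overwriting
AWS_REDUCED_REDUNDANCY = True     # cheaper storage class, at the cost of redundancy
AWS_QUERYSTRING_EXPIRE = 300      # authenticated links valid for 5 minutes
AWS_HEADERS = {                   # HTTP headers attached to each uploaded file
    'Cache-Control': 'max-age=86400',
}
```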