Deploying Wagtail In Production

By Damon Jagger and Christopher Shaw

Database Setup

If you have been working straight from the Wagtail tutorials, chances are you are using SQLite and want to move to something more suitable for your production application. We recommend using PostgreSQL, and here is the approach we take to adding a production database.


The first step is to define the DATABASES section in your settings.py. Depending on your environment and desired workflow, this might be as simple as overwriting the settings in base.py, which ensures PostgreSQL is used in both development and production. If you only want PostgreSQL in production, overwrite the settings in production.py instead.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': '####',
        'HOST': '####',
        'PORT': '5432',
    }
}
  • Because we are using PostgreSQL for this example, we have opted for the postgresql_psycopg2 backend engine.
  • The NAME setting specifies the name of the database you wish to use in your project.
  • The HOST setting specifies the location of the target database, defaulting to localhost if unspecified.
  • The PORT setting specifies the port of the target database, in this case the default PostgreSQL port 5432.
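If you would rather keep connection details out of version control, the same block can be built from environment variables. This is a minimal sketch of that idea, not an official Wagtail convention; the variable names (DB_NAME, DB_HOST, DB_PORT) are our own choice:

```python
import os

env = os.environ.copy()

# DB_NAME, DB_HOST and DB_PORT are hypothetical variable names -- pick
# whatever convention suits your deployment. The fallbacks mirror the
# example above so the snippet runs standalone.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': env.get('DB_NAME', 'mysite'),
        'HOST': env.get('DB_HOST', 'localhost'),
        'PORT': env.get('DB_PORT', '5432'),
    }
}
```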

Application Setup

To run your Wagtail application in production mode, there are a few commands that need to be run. We typically include these in a script that executes during deployment:

python manage.py migrate
python manage.py createsuperuser
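If you script these steps, it helps to abort the deployment as soon as one command fails. Here is a minimal sketch of that pattern in Python (a shell script with set -e works equally well); the manage.py calls in the comments assume your project layout matches the tutorial's:

```python
import subprocess
import sys

def run_step(args):
    """Run one deployment command; check=True aborts the deploy on failure."""
    subprocess.run(args, check=True)

# In a real deploy script these would be, for example:
#   run_step([sys.executable, "manage.py", "migrate", "--noinput"])
#   run_step([sys.executable, "manage.py", "createsuperuser", "--noinput"])
# Here we run a harmless stand-in command so the sketch is self-contained.
run_step([sys.executable, "-c", "print('deploy step ok')"])
```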

In addition to these commands, you need to specify that the application should run in production mode. This is done by setting the environment variable DJANGO_SETTINGS_MODULE to ####.settings.production, where the hashes are replaced by your project name. This ensures Django uses the appropriate settings file.

production.py comes pre-built with barely any additional setup beyond the DEBUG = False flag. This flag tells Django to run the application in production mode, but there are additional configuration values needed to prepare your application for production:

import os
env = os.environ.copy()
SECRET_KEY = env['SECRET_KEY']

Django requires a secret key for the cryptographic functions in the application, and it is bad practice and insecure to store this directly in the production.py file. As such, it should be defined at the environment level and referenced instead.
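One pattern we like (our own convention, not something Django mandates) is to fail loudly at startup when a required variable is missing, rather than crashing later with a bare KeyError:

```python
import os

def require_env(name):
    """Return a required environment variable, with a clear error if unset."""
    try:
        return os.environ[name]
    except KeyError:
        raise RuntimeError(
            "Set the %s environment variable before deploying" % name
        )

# setdefault here provides a demonstration value only; in production the
# real secret must already be present in the environment.
os.environ.setdefault('SECRET_KEY', 'demo-only-secret')
SECRET_KEY = require_env('SECRET_KEY')
```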


Finally, there are two commands to be run to make your application's static files available in production.

python manage.py collectstatic
python manage.py compress

These commands collect all of the static files in your application and compress them, ready to be served to the user. At this stage, your application should be deployable. Opening a browser and navigating to your Wagtail application’s admin page should just work. The first issue you will notice is that there are no static assets, i.e. no styling. That’s because we are not yet serving those assets in production.

Serving Static Assets

If you are new to Wagtail or Django, you might be unfamiliar with the way your site content, assets and media are served in production. In short, they aren’t. In development, assets are served by Django’s development server, which is not secure enough for production use. In production, Django, and therefore Wagtail, sticks to what it does well and leaves the serving of assets to third parties.


There are numerous ways of serving this content, but for the purpose of this tutorial we will be using WhiteNoise. WhiteNoise is a Python library that allows Django to serve static assets in production.
pip install whitenoise
pip freeze > requirements.txt

The next step is to configure wsgi.py in your base site folder, at the same level as settings. WSGI is the primary deployment platform for Django, and we configure it to make use of the WhiteNoise library.

import os
from django.core.wsgi import get_wsgi_application
from whitenoise.django import DjangoWhiteNoise

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "####.settings")

application = get_wsgi_application()
application = DjangoWhiteNoise(application)

Where #### is replaced by the name of your Wagtail project.

The next step is to configure our application in production.py to make use of WhiteNoise, activate offline compression, and generate the admin section assets.

STATICFILES_STORAGE = 'whitenoise.django.GzipManifestStaticFilesStorage'

COMPRESS_OFFLINE = True
COMPRESS_CSS_FILTERS = [
    'compressor.filters.css_default.CssAbsoluteFilter',
    'compressor.filters.cssmin.CSSMinFilter',
]
COMPRESS_CSS_HASHING_METHOD = 'content'

With these changes added, you should be able to re-deploy your application and see the majority of your assets served. At this stage, the deployment setup is almost complete. The final step is to ensure your dynamic media assets are served to users. WhiteNoise won’t handle this for you, and the recommended solution we use is to serve our assets using Amazon’s S3 service.

Serving Dynamic Media

If you’ve stuck with us this far, great, you’re almost done! The last step of the deployment is to ensure media assets generated at runtime are served to users, which is quite important in a CMS solution. We do this by utilizing Amazon S3.

Creating An IAM User

First we need to set up an IAM user, which will provide us with a key pair that allows our Wagtail application to access the S3 bucket we will shortly be creating to store our media.

To do this, log into AWS and head over to the IAM section. Create a new user and don't apply any rules or policies to it just yet. Make sure you download the CSV file containing the credentials as you won't be able to retrieve the private key again (if you lose your private key you need to generate new keys for that user).

Make sure that you note down your user's ARN.

Setting Up Your S3 Bucket

Now we need to create our S3 bucket. This is where all of our uploaded media will be stored. Head over to the AWS S3 management console this time and click Create Bucket.

Create your bucket with the bare minimum settings, and don't allow any access to it just yet in the wizard - we're going to create our own policy for that! Your bucket name needs to be unique across all buckets in S3 but make sure it's clear to you what it's for so that you don't get confused if you ever end up with a lot of different buckets for different things.

After creating the bucket, click on it to see the bucket management screen and then click on Permissions > Bucket Policy. This is where you will input our JSON bucket policy that will control who/what has access to your stored files. Copy the snippet below and paste it into your bucket policy - be sure to remember to change the YOUR-BUCKET-NAME and YOUR-IAM-ARN stubs to match your own bucket name and user ARN that you took a note of earlier.

{
    "Version": "2008-10-17",
    "Statement": [
        {
            "Sid": "PublicReadForGetBucketObjects",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::YOUR-BUCKET-NAME/*"
        },
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "YOUR-IAM-ARN"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::YOUR-BUCKET-NAME",
                "arn:aws:s3:::YOUR-BUCKET-NAME/*"
            ]
        }
    ]
}

The first part of this policy gives everyone read access to your media, which is important for users to be able to see your images and download any files you're providing them with.

The second part of the policy gives your newly created IAM user CRUD capabilities in the bucket, that's to say your Wagtail application will be able to create, read, update and/or delete any of the files there. Smashing!
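If you end up managing several buckets, it can be handy to generate this policy rather than hand-edit the JSON each time. A small sketch of that idea (the bucket name and ARN passed in below are placeholders, not real resources):

```python
import json

def bucket_policy(bucket_name, iam_user_arn):
    """Render the two-statement policy above: public read access for
    everyone, plus full s3:* access for one IAM user."""
    return {
        "Version": "2008-10-17",
        "Statement": [
            {
                "Sid": "PublicReadForGetBucketObjects",
                "Effect": "Allow",
                "Principal": {"AWS": "*"},
                "Action": "s3:GetObject",
                "Resource": "arn:aws:s3:::%s/*" % bucket_name,
            },
            {
                "Effect": "Allow",
                "Principal": {"AWS": iam_user_arn},
                "Action": "s3:*",
                "Resource": [
                    "arn:aws:s3:::%s" % bucket_name,
                    "arn:aws:s3:::%s/*" % bucket_name,
                ],
            },
        ],
    }

# Placeholder values for demonstration only.
print(json.dumps(bucket_policy("my-wagtail-media",
                               "arn:aws:iam::123456789012:user/wagtail"),
                 indent=4))
```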

Setting IAM User Permissions

So here's where AWS can be a bit weird. You'd think the bucket permissions clearly gave your IAM user the ability to interact with the bucket and a lot of tutorials stop there. However, I've found that you usually still need to apply a permissions policy to your IAM user itself to make sure it can get as far as trying to access the bucket. To do this, head back to the IAM console, click on Policies in the sidebar and then on Create Policy.

AWS gives you three options for creating policies: 

  1. Copy an AWS Managed Policy
  2. Use the AWS Policy Generator (great for people new to AWS)
  3. Create Your Own Policy using JSON

You can use the Policy Generator if you want, but this is a relatively simple policy, so I'll show you how to do it manually with JSON. After clicking Create Your Own Policy you'll see a screen similar to the one we saw when we input our S3 bucket policy. Enter a name and description for your policy, then copy and paste the snippet below, remembering to swap out YOUR-BUCKET-NAME for your own bucket's name (easy-peasy!).

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": "arn:aws:s3:::YOUR-BUCKET-NAME"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": "arn:aws:s3:::YOUR-BUCKET-NAME/*"
        }
    ]
}

Once you've saved your policy, view it and click Attached Entities > Attach. This is where you select the IAM user we set up earlier and attach the new policy to them. Once you've selected your user and clicked Attach Policy, you're done with the AWS side of things - well done!

Connecting Wagtail With S3

To make use of S3 in Wagtail, you will need to set up a small amount of configuration. The first step is to install two packages.

pip install django-storages
pip install boto3
pip freeze > requirements.txt

You’ll need to add ‘storages’ to your INSTALLED_APPS in base.py.

INSTALLED_APPS = [
    ...
    'storages',
    ...
]

Now your application has the requirements to store dynamic media in S3. The final step is to configure the S3 storage settings in your settings.py. We prefer to do this in our production.py, passing in the variables from env.

AWS_STORAGE_BUCKET_NAME = '####'
AWS_ACCESS_KEY_ID = '####'
AWS_SECRET_ACCESS_KEY = '####'
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME

MEDIA_URL = 'https://%s/' % AWS_S3_CUSTOM_DOMAIN
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'

# Optional

AWS_S3_REGION_NAME = 'eu-west-2'
AWS_S3_SIGNATURE_VERSION = 's3v4'

This setup configures your application to use S3 to store your media assets. Replace the hashes with your Amazon S3 connection credentials and bucket name where appropriate. In our deployment we used a specific region and forced our signature version to v4, which is more secure; the default falls back to v2.
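As with the secret key, in our own deployments we pull these values from the environment rather than hard-coding them. A sketch of that approach, with placeholder fallbacks so the snippet runs standalone (in production the real values must come from environment variables, never from source control):

```python
import os

env = os.environ.copy()

# The fallbacks below are demonstration placeholders only; never commit
# real credentials to your repository.
AWS_STORAGE_BUCKET_NAME = env.get('AWS_STORAGE_BUCKET_NAME', 'your-bucket-name')
AWS_ACCESS_KEY_ID = env.get('AWS_ACCESS_KEY_ID', '')
AWS_SECRET_ACCESS_KEY = env.get('AWS_SECRET_ACCESS_KEY', '')
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME
MEDIA_URL = 'https://%s/' % AWS_S3_CUSTOM_DOMAIN
```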


Your application should now support dynamic media storage in S3. You can confirm this by going to your admin section and uploading an image.

Further Reading

If you have any issues or require further understanding, we have compiled a list of documentation used to create this post. If you’re still stumped, throw us an email.


Django. How to deploy with WSGI.
https://docs.djangoproject.com/en/1.10/howto/deployment/wsgi/


Wagtail. Deploying Wagtail to Heroku, 2017 edition.
https://wagtail.io/blog/wagtail-heroku-2017/


Wagtail. Using Amazon S3 to Store Wagtail Media Files.
https://wagtail.io/blog/amazon-s3-for-media-files/
