Bootstrapping a production-ready Django project

If you have some experience with Django and feel that the django-admin startproject command is too light for you, then cookiecutter-django might be just what you are looking for.

In this guide, we will see how to start a new Django project using a cookiecutter. It is intended for beginner and intermediate-level devs who already have some experience with Django and want a ready-to-use, production-grade setup for their projects. I'm promoting cookiecutter-django here, but it's not the only option in town; a quick search for "django cookiecutter" on the Django Packages website or GitHub will give you a ton of options.

If you are wondering what I mean by production-ready, here is a non-exhaustive list of what I expect:

  • settings oriented towards the highest level of security
  • basic deployment requirements included
  • logging and monitoring set up
  • PostgreSQL requirements included
  • linters and formatters already set up
  • pytest set up
  • etc.

And for convenience, cookiecutter-django makes it easy to set up, via the prompt, common Django packages that solve common issues such as:

  • Task queue manager (Celery)
  • Mail services (Amazon SES, Sendgrid, etc.)
  • CI/CD pipeline (GitHub, GitLab, etc.)

If you are already convinced at this point, just run the two commands below, and you are done 😀.

pip install "cookiecutter>=1.7.0"
cookiecutter gh:cookiecutter/cookiecutter-django

Read on for a closer look at all the available options and why you might want to ignore some of them to slightly simplify the initial generated project.

So what is a cookiecutter anyway?

The cookiecutter project was initiated by Audrey Feldroy, the co-author of the excellent Two Scoops of Django. It is a command-line utility that creates projects from cookiecutters (project templates). There is a multitude of cookiecutters for various frameworks: Django, Flask, etc.

Generate a project using cookiecutter-django

The default values are in brackets; if you see a blank answer, it means I used the default value by pressing the Enter key. Throughout the article, I have included many links to other articles to help explain what I can't go into here, but don't feel obligated to follow each link every time, or you may never get through it 😅. I suggest you read the entire article at least once, then revisit it, following the links that interest you.


pip install "cookiecutter>=1.7.0"
cookiecutter gh:cookiecutter/cookiecutter-django

Project metadata

This section is for basic project information such as name, description, authors, etc.

project_name [My Awesome Project]: Ushopify
project_slug [ushopify]:
description [Behold My Awesome Project!]: My amazing ecommerce platform.
author_name [Daniel Roy Greenfeld]: John Doe
domain_name []:
email []:
version [0.1.0]:
Select open_source_license:
1 - MIT
2 - BSD
3 - GPLv3
4 - Apache Software License 2.0
5 - Not open source
Choose from 1, 2, 3, 4, 5 [1]: 1

A source code license is a legal text that tells people what they may do with the source code: for example, edit it, use it, share it freely with others, etc. More information about software licensing here.

timezone [UTC]: US/Pacific

I chose a completely random location. You can get the full list of timezones, or open your Python shell (ever heard of bpython?) and type this code:

from pytz import all_timezones

for tz in all_timezones:
    print(tz)
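On Python 3.9+, you can get the same list from the standard library's zoneinfo module, without installing pytz:

```python
# Python 3.9+ alternative: list timezones with the stdlib zoneinfo module.
from zoneinfo import available_timezones

for tz in sorted(available_timezones()):
    print(tz)
```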

Editor and OS

I'm on Linux, so n for me.

windows [n]:
use_pycharm [n]: y

If you are using PyCharm, the cookiecutter will automatically create runserver, migrate, and more run configurations for you, neat 😎.


use_docker [n]: n

Docker is an open source platform for developing, shipping, and running applications. Docker enables you to separate your applications from your infrastructure so you can deliver software quickly. Enter y and the cookiecutter will automatically create Docker and docker-compose files to help you run your project locally and even deploy it with less effort. It's a great start to help you deploy your project, but I think you need a minimum of experience with Docker to understand how to use the generated files correctly. If you want more information about Docker, read this.

Select postgresql_version:
1 - 14
2 - 13
3 - 12
4 - 11
5 - 10
Choose from 1, 2, 3, 4, 5 [1]:

Set the version of PostgreSQL to use in the Dockerfile for production. This only matters if you entered y at the use_docker option. Since I don't often use the Docker and docker-compose configuration provided by the cookiecutter (I create my own files), it doesn't really matter to me.

Cloud services

Select cloud_provider:
1 - AWS
2 - GCP
3 - None
Choose from 1, 2, 3 [1]:

This is about storage for static files (HTML, CSS, static images like background images) and media files (files uploaded by you or your users, like profile pictures). If you choose AWS (Amazon Web Services) or GCP (Google Cloud Platform), your project will be configured to have these providers handle static and media files in production. If you choose None, media files will not work by default in production (you will have to set them up yourself), and static files will work only if you choose y on the use_whitenoise option. For a step-by-step guide on configuring static and media file serving yourself using AWS, watch this.
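If you do end up wiring AWS yourself, the usual approach is django-storages with boto3. Here is a minimal sketch of the relevant production settings, assuming django-storages is installed and using django-environ's env() helper; the bucket name and region are placeholders, not values the cookiecutter generates:

```python
# settings/production.py (sketch): serve media files from S3 via
# django-storages. Bucket name and region below are placeholder assumptions.
AWS_ACCESS_KEY_ID = env("DJANGO_AWS_ACCESS_KEY_ID")  # read from the environment
AWS_SECRET_ACCESS_KEY = env("DJANGO_AWS_SECRET_ACCESS_KEY")
AWS_STORAGE_BUCKET_NAME = "ushopify-media"  # hypothetical bucket name
AWS_S3_REGION_NAME = "us-east-1"

# Route Django's default file storage (media files) to S3.
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
```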

Select mail_service:
1 - Mailgun
2 - Amazon SES
3 - Mailjet
4 - Mandrill
5 - Postmark
6 - Sendgrid
7 - SendinBlue
8 - SparkPost
9 - Other SMTP
Choose from 1, 2, 3, 4, 5, 6, 7, 8, 9 [1]: 2

I usually go with Amazon SES because it is the easiest for me to set up, but pick whatever suits your experience. For a simple setup based on SMTP settings, choose 9.
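If you pick 9 (Other SMTP), production email boils down to the classic Django SMTP settings. A sketch, with placeholder host and credentials:

```python
# settings/production.py (sketch): plain SMTP email backend.
# Host, port, and credentials below are placeholders.
EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
EMAIL_HOST = "smtp.example.com"
EMAIL_PORT = 587
EMAIL_USE_TLS = True
EMAIL_HOST_USER = "postmaster@example.com"
EMAIL_HOST_PASSWORD = "change-me"  # read this from the environment in practice
```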

Async django

use_async [n]:

Indicates whether the project should use web sockets with Uvicorn + Gunicorn.

This statement is from the official cookiecutter-django documentation. If you are not sure what it means, there is a high chance you don't need it; skip it. If you want to know more about websockets in Django, read this.

Django REST framework

use_drf [n]:

In case you are building a JSON-based API, choose y. More info on Django REST framework here. DRF is the go-to framework for building JSON-based APIs with Django. What I call a JSON-based API here, most people would call a REST API, but here's why they're wrong.


Select frontend_pipeline:
1 - None
2 - Django Compressor
3 - Gulp
Choose from 1, 2, 3 [1]:

The tools suggested here are for minification, transpilation, etc. of static assets (CSS, JS, and so on). I don't use any of them, so my answer is always the default, None. If you want to know more about these tools, check out their official documentation.


use_celery [n]:

Celery is an open source asynchronous task queue or job queue based on distributed message passing. While it supports scheduling, its focus is on operations in real time. That was the Wikipedia definition 😅. Celery is used to handle asynchronous tasks (background tasks) and scheduled tasks (things that should happen in the future). I don't use Celery that much; I find it a bit too complex for most of my use cases. You can read my article on handling background tasks in Django to see how I handle this kind of stuff in my projects.

Local email server setup

use_mailhog [n]: y

MailHog is an email testing tool for developers. The Django console backend is set up for local development if you choose n, but MailHog gives you a nice graphical user interface (GUI) for testing your email delivery locally.
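MailHog listens on SMTP port 1025 by default, so pointing Django's local settings at it is only a few lines (a sketch, assuming MailHog's default ports):

```python
# settings/local.py (sketch): send local emails to MailHog.
# MailHog's default SMTP port is 1025; its web UI is at http://localhost:8025.
EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
EMAIL_HOST = "localhost"
EMAIL_PORT = 1025
```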

Sentry - production error tracking

use_sentry [n]: y

Sentry is an error monitoring and tracking system. I recommend you always choose y if you know your app is going to production; it will save you the hassle of switching your DEBUG environment variable to see errors when they happen in production. It is very easy to set up: just follow the official guide if you want to configure it manually. If you choose y, the only thing you need is a Sentry DSN key, which you can get by creating a Django app on the official web platform.
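For reference, the manual setup is essentially one call to sentry_sdk.init with your DSN (a sketch, assuming the sentry-sdk package and its Django integration; the DSN below is a placeholder):

```python
# settings/production.py (sketch): minimal Sentry setup with the Django
# integration. Replace the placeholder DSN with your project's real one.
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[DjangoIntegration()],
    traces_sample_rate=0.1,  # sample 10% of transactions for performance data
)
```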


use_whitenoise [n]: y

WhiteNoise serves your static files in production, and locally too if you want it to. I usually choose y because having the same setup in production and local development helps mitigate surprises in your production environment.
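For context, the WhiteNoise setup boils down to a middleware entry right after SecurityMiddleware plus a static files storage backend. A sketch of the relevant settings:

```python
# settings (sketch): WhiteNoise serves static files straight from Django.
MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    "whitenoise.middleware.WhiteNoiseMiddleware",  # right after SecurityMiddleware
    # ... the rest of your middleware ...
]

# Compressed, cache-busting static files storage.
STATICFILES_STORAGE = "whitenoise.storage.CompressedManifestStaticFilesStorage"
```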


use_heroku [n]: y

Heroku is a hosting platform for web applications. The cookiecutter will add the necessary files for your project to work on Heroku:

  • a requirements.txt file with all your project's production requirements
  • a Procfile

If you are using a similar Procfile-based deployment method, like dokku, type y; if not, type n.
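A Procfile is just a list of process types and the commands that run them. A typical one for a Django project looks something like this (the exact flags are illustrative):

```
web: gunicorn config.wsgi:application --log-file -
release: python manage.py migrate
```

The web process serves HTTP traffic, and the release process runs migrations on each deploy.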

Continuous Integration (CI)

Select ci_tool:
1 - None
2 - Travis
3 - Gitlab
4 - Github
Choose from 1, 2, 3, 4 [1]:

Select an option other than the default if you plan to set up a CI/CD pipeline for your project. More info on how to set up CI/CD for a Django project here.

Environment variables

keep_local_envs_in_vcs [y]:

If you typed y at the use_docker or use_heroku option, your project will have a .envs folder with .local and .production subdirectories. If you type n at the current option, both folders will be kept out of your version control system (VCS); if you choose the default value, your .local folder will be tracked by your VCS.

Cookiecutter debug

debug [n]:

This option is for Cookiecutter Django developers only; choose the default value.

If all went well, you will get this message.

 [SUCCESS]: Project initialized, keep up the good work!

Structure of the project

Let's take a quick look at the generated project. I use tree to get a two-level view of the generated project.

cd ushopify
tree -L 2
├── config
│   ├── __init__.py
│   ├── settings
│   ├── urls.py
│   └── wsgi.py
├── docs
│   ├── Makefile
│   ├── __init__.py
│   ├── conf.py
│   ├── howto.rst
│   ├── index.rst
│   ├── make.bat
│   ├── pycharm
│   └── users.rst
├── locale
│   └── README.rst
├── pytest.ini
├── requirements
│   ├── base.txt
│   ├── local.txt
│   └── production.txt
├── setup.cfg
├── ushopify
│   ├── __init__.py
│   ├── conftest.py
│   ├── contrib
│   ├── static
│   ├── templates
│   ├── users
│   └── utils
└── utility
    ├── requirements-bionic.apt
    ├── requirements-bullseye.apt
    ├── requirements-buster.apt
    ├── requirements-focal.apt
    ├── requirements-jessie.apt
    ├── requirements-stretch.apt
    ├── requirements-trusty.apt
    └── requirements-xenial.apt

13 directories, 32 files

At the root of your project, you should have these directories:

  • config: stores all your project settings and configuration. In the settings subdirectory, you have a base.py file for common settings, and a local.py and a production.py file for development- and production-specific settings respectively. At the root of this directory, you have your classic urls.py (your project-level URL configuration) and your wsgi.py file.
  • docs: if you need to write documentation for your project, it is configured to use the sphinx documentation generator.
  • locale: this folder is there to store translations.
  • requirements: this folder contains all your project's requirements; the base.txt file contains all requirements common to your dev and prod environments, local.txt is for your development environment, and production.txt for your production environment.
  • ushopify: this folder contains all your templates and static files; it also contains a users app created by the cookiecutter. This app uses the excellent allauth package to offer your project a full user management system: login, logout, password reset, password change, email change, email verification, and much more. The users app contains a tests directory with test files structured like this: test_{module}.py. Follow this structure when writing your tests, or update the configuration to match your needs. In this folder, you also have a utils subdirectory that contains a context_processors.py file. Read this short article for more information on context processors.
  • utility: this folder contains some bash scripts that help you install system and project requirements, useful only if you plan to deploy on a Linux server and set up the server yourself.
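Context processors are simpler than they sound: a context processor is just a function that takes the request and returns a dict that Django merges into every template's context. A hypothetical sketch (not the exact function the cookiecutter generates):

```python
# A context processor: a plain function that receives the request and
# returns a dict merged into every template's context.
# `settings_context` and its values are hypothetical examples.
def settings_context(request):
    return {"DEBUG": False, "SITE_NAME": "Ushopify"}
```

You would then register it under TEMPLATES["OPTIONS"]["context_processors"] in your settings.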

Besides those directories, you have the classic manage.py file to run your commands, a Procfile if you chose to deploy using Heroku, and a pytest.ini file because the project is configured to use pytest for testing.

Run your project

Running your project works like any other Django project: you create a virtual environment, install the requirements, create a database, then run your migrations. You can find detailed instructions here.
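In short, the classic sequence looks something like this (the database name and DATABASE_URL value are assumptions for a local PostgreSQL install; the project's README has the authoritative steps):

```shell
# create and activate a virtual environment
python -m venv venv
source venv/bin/activate

# install the development requirements
pip install -r requirements/local.txt

# create the database (assumes PostgreSQL is running locally)
createdb ushopify
export DATABASE_URL=postgres://postgres@localhost/ushopify

# run migrations, then start the dev server
python manage.py migrate
python manage.py runserver
```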

I know it was a lot to digest, and the generated project can seem very complex the first time, but you get used to it with time. If you feel like some of the generated defaults don't work for you (you want to use poetry instead of a virtualenv, you don't like the exact folder structure, etc.), I encourage you to build your own cookiecutter or create a simpler version with the Django project template system. If you have any comments, feel free to leave them in the comment section below and subscribe to keep up to date with my findings 😃.