Deploying a Django, PostgreSQL Application on Google Kubernetes Engine (GKE)

I was trying to find a better cloud solution for my Django web application. I have been using Google APIs for years now, including cloud services like Firebase, so it made sense to think of Google Cloud (gcloud) as my very first choice. That would have been easy, except Google offers a lot of cloud services that, I promise, you may not even be aware of. There is App Engine, Compute Engine, Kubernetes Engine, Cloud Run, and Cloud Functions. To be honest, I think there is still another cloud service hiding somewhere in the GCloud ecosystem that I have no idea of :). It's Google, it's huge.

Usually, when deciding on a service within Google Cloud, the factor that matters most for startups is pricing. What these different services cost is something you will probably have to find out for yourself; here is a link to the Google Cloud pricing calculator: https://cloud.google.com/products/calculator/. Because of the money palaver involved in deciding on a cloud service for my app, I decided to try several of these services on Google Cloud. Hence, the knowledge for this tutorial.

How to deploy a Django, PostgreSQL application on Google Kubernetes Engine (GKE) in 2020.

This is a beginner's guide; if you are a pro, you probably know your way around Google Cloud. We have compiled a list of the commands needed to get your application running on GKE. Here are the commands, which we will briefly walk through below:

1. gcloud container clusters create gkeexample \
 --scopes "https://www.googleapis.com/auth/userinfo.email","cloud-platform" \
 --num-nodes 4 --zone "us-central1-a"
2. gcloud container clusters get-credentials gkeexample --zone "us-central1-a"
3. kubectl create secret generic cloudsql-oauth-credentials --from-file=credentials.json=/Users/{system_user}/Documents/dev/django/gkeexampleapp/cloudsql-oauth-credentials.json
4. kubectl create secret generic cloudsql --from-literal=username=[instance user] --from-literal=password=[enter your instance user password]
5. docker pull b.gcr.io/cloudsql-docker/gce-proxy
6. docker build -t gcr.io/gkeexampleapp/gkeexample . # in project directory
# (if the push fails, run: gcloud auth configure-docker)
7. docker push gcr.io/gkeexampleapp/gkeexample
8. kubectl create -f app.yaml
9. kubectl get pods
10. kubectl get services gkeexample  # Get the external IP for your app.

Step 1. Application Requirements & local environment setup:

Setting up your local environment

Assuming the above requirements have been satisfied: 1. Authenticate gcloud on your computer

gcloud auth login

Configure Docker to use gcloud as a credential helper, so that you can push the image to Container Registry:

gcloud auth configure-docker

2. Enable the Cloud SQL Admin API

gcloud services enable sqladmin

3. Set up the local environment for our cloud PostgreSQL by installing the Cloud SQL Proxy. Download the Cloud SQL Proxy:

wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy

Make it executable

chmod +x cloud_sql_proxy

Step 2. Creating a Cloud SQL instance and Initializing your Cloud SQL instance

Now that you have cloud_sql_proxy set up on your computer, head back to the Google Cloud console and:
* create an instance for your application
* create a database on this instance
* create a user for this database besides the default PostgreSQL user postgres
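If you prefer the command line, the same provisioning can be sketched with gcloud. The instance name, Postgres version, tier and region below are examples, not values from this tutorial; adjust them to your needs:

```shell
# Example only: instance name, version, tier and region are assumptions.
gcloud sql instances create gkeexample-instance \
  --database-version=POSTGRES_11 --tier=db-g1-small --region=us-central1

# Create the application database on that instance.
gcloud sql databases create gkeexample-db --instance=gkeexample-instance

# Create a dedicated user besides the default postgres user.
gcloud sql users create [PROXY_USERNAME] \
  --instance=gkeexample-instance --password=[PASSWORD]
```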

This post is getting too long, so we won't go into the details of creating an instance on the Google Cloud console. If you need help with that, leave a comment below.

With that done, let's initialize our Cloud SQL instance:

./cloud_sql_proxy -instances="[YOUR_INSTANCE_CONNECTION_NAME]"=tcp:5432

You can get [YOUR_INSTANCE_CONNECTION_NAME] and other information about your instance with the following command:

gcloud sql instances describe [YOUR_INSTANCE_NAME]

Step 3. Creating a service account

From Google itself: the proxy requires a service account with Editor privileges for your Cloud SQL instance. To create a service account, follow the steps below as outlined by Google:
* Go to the Service accounts page of the Google Cloud Console: https://console.cloud.google.com/projectselector2/iam-admin/serviceaccounts?supportedpurview=project
* Select the project that contains your Cloud SQL instance.
* Click Create service account. In the Create service account dialog, provide a descriptive name for the service account. For Role, select one of the following: Cloud SQL > Cloud SQL Client, Cloud SQL > Cloud SQL Editor, or Cloud SQL > Cloud SQL Admin.
* Change the Service account ID to a unique, easily recognizable value.
* Click Furnish a new private key and confirm that the key type is JSON.
* Download this file and save it somewhere on your local computer. You will need it later to create the secrets that connect our GKE application to our Cloud SQL instance.
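The same steps can also be sketched from the CLI. The service-account name here is hypothetical, and [YOUR_PROJECT_ID] is your own project ID:

```shell
# Hypothetical service-account name; pick your own.
gcloud iam service-accounts create gkeexample-sql \
  --display-name "gkeexample Cloud SQL client"

# Grant it the Cloud SQL Client role on the project.
gcloud projects add-iam-policy-binding [YOUR_PROJECT_ID] \
  --member "serviceAccount:gkeexample-sql@[YOUR_PROJECT_ID].iam.gserviceaccount.com" \
  --role "roles/cloudsql.client"

# Download a JSON key for the account.
gcloud iam service-accounts keys create cloudsql-oauth-credentials.json \
  --iam-account gkeexample-sql@[YOUR_PROJECT_ID].iam.gserviceaccount.com
```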

Step 4. Configuring the database settings to work for both our local environment and the cloud.

Export your database user and password as environment variables:

export DATABASE_USER=
export DATABASE_PASSWORD=

Edit your Django application settings.py file to reflect the following database settings:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'HOST': '127.0.0.1',
        'PORT': '5432',
        'NAME': 'gkeexample-db',
        'USER': os.getenv('DATABASE_USER'),
        'PASSWORD': os.getenv('DATABASE_PASSWORD')
    }
}
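Since HOST, PORT and NAME are also likely to differ between projects, one optional pattern (a sketch, not part of the original tutorial; the extra DATABASE_* variable names are assumptions) is to read every database setting from the environment with local-development defaults:

```python
import os

# Sketch: read all database settings from the environment, falling back to
# the local cloud_sql_proxy defaults used in this tutorial. Only
# DATABASE_USER and DATABASE_PASSWORD come from the original settings;
# the other variable names are assumptions.
def database_config():
    return {
        "ENGINE": "django.db.backends.postgresql",
        "HOST": os.getenv("DATABASE_HOST", "127.0.0.1"),
        "PORT": os.getenv("DATABASE_PORT", "5432"),
        "NAME": os.getenv("DATABASE_NAME", "gkeexample-db"),
        "USER": os.getenv("DATABASE_USER"),
        "PASSWORD": os.getenv("DATABASE_PASSWORD"),
    }

DATABASES = {"default": database_config()}
```

This keeps one settings.py working both with the local proxy and with the proxy sidecar in the GKE pod, since both listen on 127.0.0.1:5432.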

Setting up your GKE configuration (app.yaml)

# [START kubernetes_deployment]
apiVersion: apps/v1
kind: Deployment
metadata:
  name: gkeexample
  labels:
    app: gkeexample
spec:
  replicas: 3
  selector:
    matchLabels:
      app: gkeexample
  template:
    metadata:
      labels:
        app: gkeexample
    spec:
      containers:
      - name: gkeexample-app
        # Replace the project ID (gkeexampleapp) with your own, or use `make template`
        image: gcr.io/gkeexampleapp/gkeexample
        # This setting makes nodes pull the docker image every time before
        # starting the pod. This is useful when debugging, but should be turned
        # off in production.
        imagePullPolicy: Always
        env:
            # [START cloudsql_secrets]
            - name: DATABASE_USER
              valueFrom:
                secretKeyRef:
                  name: cloudsql
                  key: username
            - name: DATABASE_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: cloudsql
                  key: password
            # [END cloudsql_secrets]
        ports:
        - containerPort: 8080

      # [START proxy_container]
      - image: gcr.io/cloudsql-docker/gce-proxy:1.16
        name: cloudsql-proxy
        command: ["/cloud_sql_proxy", "--dir=/cloudsql",
                  "-instances=gkeexampleapp:us-central1:gkeexample-db=tcp:5432",
                  "-credential_file=/secrets/cloudsql/credentials.json"]
        volumeMounts:
          - name: cloudsql-oauth-credentials
            mountPath: /secrets/cloudsql
            readOnly: true
          - name: ssl-certs
            mountPath: /etc/ssl/certs
          - name: cloudsql
            mountPath: /cloudsql
      # [END proxy_container] 
      # [START volumes]
      volumes:
        - name: cloudsql-oauth-credentials
          secret:
            secretName: cloudsql-oauth-credentials
        - name: ssl-certs
          hostPath:
            path: /etc/ssl/certs
        - name: cloudsql
          emptyDir: {}
      # [END volumes]        
# [END kubernetes_deployment]

---

# [START service]
# The gkeexample service provides a load-balancing proxy over the gkeexample app
# pods. By specifying the type as a 'LoadBalancer', Container Engine will
# create an external HTTP load balancer.
# For more information about Services see:
#   https://cloud.google.com/container-engine/docs/services/
# For more information about external HTTP load balancing see:
#   https://cloud.google.com/container-engine/docs/load-balancer
apiVersion: v1
kind: Service
metadata:
  name: gkeexample
  labels:
    app: gkeexample
spec:
  type: LoadBalancer
  ports:
  - port: 80
    targetPort: 8080
  selector:
    app: gkeexample
# [END service]

At this point it's safe to say your application's local environment is all set up and ready. You can go ahead and test your application locally:

python manage.py makemigrations
python manage.py migrate
python manage.py runserver

Your application should be running locally at this point with no errors.

Step 5: Deploying the app to GKE

I know, you are thinking, finally!! Well, don't; there is still much to be done. We still need to upload our static files directory to Cloud Storage, create a cluster for our GKE application, and enable our GKE app to connect to our Cloud SQL instance. The good news is, you only have to do this once. First, we need to collect our static files to be uploaded to a bucket in Cloud Storage. 1. Create a Cloud Storage bucket and make it publicly readable:

gsutil mb gs://[YOUR_GCS_BUCKET]
gsutil defacl set public-read gs://[YOUR_GCS_BUCKET]

2. Gather all the static content locally into one folder:

python manage.py collectstatic

3. Upload the static content to Cloud Storage:

gsutil rsync -R static/ gs://[YOUR_GCS_BUCKET]/static

Remember to edit your application settings.py file to include:

STATIC_URL = "http://storage.googleapis.com/[YOUR_GCS_BUCKET]/static/"
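To keep local development working from the same settings.py, a hedged variant (GCS_BUCKET is an assumed environment variable, not part of the original tutorial) is to only point at the bucket when one is configured:

```python
import os

# Assumption: GCS_BUCKET is set in production (e.g. via app.yaml env) and
# unset during local development, where Django serves /static/ itself.
GCS_BUCKET = os.getenv("GCS_BUCKET")
if GCS_BUCKET:
    STATIC_URL = "http://storage.googleapis.com/{}/static/".format(GCS_BUCKET)
else:
    STATIC_URL = "/static/"
```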

Set up GKE (clusters):

1. To initialize GKE, go to the Clusters page and follow the instructions. 2. Create a GKE cluster:

gcloud container clusters create gkeexample \
  --scopes "https://www.googleapis.com/auth/userinfo.email","cloud-platform" \
  --num-nodes 4 --zone "us-central1-a"
gcloud container clusters get-credentials gkeexample --zone "us-central1-a"

Set up Cloud SQL:

We need secrets to enable our GKE application to connect to our Cloud SQL instance. 1. Create secrets. First, we create the secret for instance-level access:

kubectl create secret generic cloudsql-oauth-credentials --from-file=credentials.json=[PATH_TO_CREDENTIAL_FILE]

Remember that file from our service account? We need it now as [PATH_TO_CREDENTIAL_FILE]. Next, we create secrets for our database access:

kubectl create secret generic cloudsql --from-literal=username=[PROXY_USERNAME] --from-literal=password=[PASSWORD]

2. Retrieve the public Docker image for the Cloud SQL proxy:

docker pull b.gcr.io/cloudsql-docker/gce-proxy

3. Build a Docker image, replacing [YOUR_PROJECT_ID] with your project ID:

docker build -t gcr.io/[YOUR_PROJECT_ID]/gkeexample .

4. Push the Docker image, again replacing [YOUR_PROJECT_ID] with your project ID:

docker push gcr.io/[YOUR_PROJECT_ID]/gkeexample

5. Create the GKE resource:

kubectl create -f app.yaml

Steps 2 through 5 are the ones you will probably redo each time you deploy a new image to GKE. Finally, verify that the app is deployed to GKE:

kubectl get pods

If the status of your pods indicates Running, then voila, you have a working GKE-deployed app. To get the EXTERNAL_IP for your GKE application:

kubectl get services gkeexample

Paste the EXTERNAL_IP address into your browser's URL bar, and there you have your GKE application. Facing any problems? Remember to leave a comment below. References: https://cloud.google.com/python/django/kubernetes-engine#deploying_the_app_to_
