How to deploy multiple newly pushed Cloud Functions using Google Cloud Build and Source Repositories?
I have a project folder containing a separate folder for each Cloud Function, e.g.:
```
Project_Folder
  -Cloud-Function-Folder1
    -main.py
    -requirements.txt
    -cloudbuild.yaml
  -Cloud-Function-Folder2
    -main.py
    -requirements.txt
    -cloudbuild.yaml
  -Cloud-Function-Folder3
    -main.py
    -requirements.txt
    -cloudbuild.yaml
  --------- and so on!
```
Right now, I push the code from each Cloud Function folder to its own Source Repository (a separate repo for each function folder). Each repo has a trigger enabled that runs Cloud Build and then deploys the function. The cloudbuild.yaml file I have looks like this:
```yaml
steps:
  - name: 'python:3.7'
    entrypoint: 'bash'
    args:
      - '-c'
      - |
        pip3 install -r requirements.txt
        pytest
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - functions
      - deploy
      - Function
      - --runtime=python37
      - --source=.
      - --entry-point=function_main
      - --trigger-topic=Function
      - --region=europe-west3
```
What I would like to do instead is keep a single source repo, so that when I change the code of one Cloud Function and push it, only that function gets deployed and the rest stay as they are.
Update
I have also tried the layout below, but it still deploys all the functions at the same time even though I changed only a single function.
```
Project_Folder
  -Cloud-Function-Folder1
    -main.py
    -requirements.txt
  -Cloud-Function-Folder2
    -main.py
    -requirements.txt
  -Cloud-Function-Folder3
    -main.py
    -requirements.txt
  -cloudbuild.yaml
  -requirements.txt
```
The cloudbuild.yaml file looks like this:
```yaml
steps:
  - name: 'python:3.7'
    entrypoint: 'bash'
    args:
      - '-c'
      - |
        pip3 install -r requirements.txt
        pytest
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - functions
      - deploy
      - Function1
      - --runtime=python37
      - --source=./Cloud-Function-Folder1
      - --entry-point=function1_main
      - --trigger-topic=Function1
      - --region=europe-west3
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - functions
      - deploy
      - Function2
      - --runtime=python37
      - --source=./Cloud-Function-Folder2
      - --entry-point=function2_main
      - --trigger-topic=Function2
      - --region=europe-west3
```
Solution 1:[1]
It's more complex, and you have to work within the limits and constraints of Cloud Build. I do this:
- get the directories updated since the previous commit
- loop over these directories and do what I want
Hypothesis 1: all the subfolders are deployed using the same commands

For this, I put a cloudbuild.yaml at the root of my repository, and not in the subfolders:
```yaml
steps:
  - name: 'gcr.io/cloud-builders/git'
    entrypoint: /bin/bash
    args:
      - -c
      - |
        # Cloud Build doesn't fetch the .git directory, so clone the repo to recover it
        git clone --branch $BRANCH_NAME https://github.com/guillaumeblaquiere/cloudbuildtest.git /tmp/repo
        # Move only the .git directory into the workspace
        mv /tmp/repo/.git .
        # Diff this version against the previous one and store the changed
        # top-level directories in a file
        git diff --name-only --diff-filter=AMDR @~..@ | grep "/" | cut -d"/" -f1 | uniq > /workspace/diff
  # Do what you want, by looping over the changed directories
  - name: 'python:3.7'
    entrypoint: /bin/bash
    args:
      - -c
      - |
        for i in $$(cat /workspace/diff); do
          cd $$i
          # No strong isolation between the functions, take care of conflicts!!
          pip3 install -r requirements.txt
          pytest
          cd ..
        done
  - name: 'gcr.io/cloud-builders/gcloud'
    entrypoint: /bin/bash
    args:
      - -c
      - |
        for i in $$(cat /workspace/diff); do
          cd $$i
          gcloud functions deploy .........
          cd ..
        done
```
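As a concrete illustration of the diff step: if the last commit only touched files under Cloud-Function-Folder1, the pipeline would leave a single directory name in /workspace/diff, so only that function gets tested and deployed:

```bash
$ cat /workspace/diff
Cloud-Function-Folder1
```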
Hypothesis 2: the deployment is specific to each subfolder

For this, I put a cloudbuild.yaml at the root of my repository, and another one in each subfolder:
```yaml
steps:
  - name: 'gcr.io/cloud-builders/git'
    entrypoint: /bin/bash
    args:
      - -c
      - |
        # Cloud Build doesn't fetch the .git directory, so clone the repo to recover it
        git clone --branch $BRANCH_NAME https://github.com/guillaumeblaquiere/cloudbuildtest.git /tmp/repo
        # Move only the .git directory into the workspace
        mv /tmp/repo/.git .
        # Diff this version against the previous one and store the changed
        # top-level directories in a file
        git diff --name-only --diff-filter=AMDR @~..@ | grep "/" | cut -d"/" -f1 | uniq > /workspace/diff
  # Loop over the changed directories. Here, launch a Cloud Build in each one
  - name: 'gcr.io/cloud-builders/gcloud'
    entrypoint: /bin/bash
    args:
      - -c
      - |
        for i in $$(cat /workspace/diff); do
          cd $$i
          gcloud builds submit
          cd ..
        done
```
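The subfolder-level cloudbuild.yaml is not shown in the answer; a minimal sketch of what it might contain, reusing the deploy step from the question (the function name, entry point, and topic here are assumptions):

```yaml
# Cloud-Function-Folder1/cloudbuild.yaml (hypothetical example)
steps:
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - functions
      - deploy
      - Function1                     # assumed function name
      - --runtime=python37
      - --source=.                    # the build is submitted from inside the subfolder
      - --entry-point=function1_main  # assumed entry point
      - --trigger-topic=Function1     # assumed Pub/Sub topic
      - --region=europe-west3
```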
Be careful with the timeout here, because you can trigger a lot of Cloud Build jobs, and that takes time.
If you want to run your build manually, don't forget to pass $BRANCH_NAME as a substitution variable:
```bash
gcloud builds submit --substitutions=BRANCH_NAME=master
```
Solution 2:[2]
This is quite straightforward, but you need to control the behavior on the Build Trigger side of things rather than in the cloudbuild.yaml. Conceptually, you want to restrict each Cloud Build trigger so it only fires on certain changes within the repo.
To do so, use the glob include filter on the Build Trigger page: create one trigger per Cloud Function (or Cloud Run service) and set its "Included files filter (glob)" as follows:
```
Cloud-Function1-Trigger
  Project_Folder/Cloud-Function-Folder1/**

Cloud-Function2-Trigger
  Project_Folder/Cloud-Function-Folder2/**

...
```
Assumptions:
- For each trigger, the repo and branch are set such that the root of the repo contains Project_Folder/.
- The repo and branch are set appropriately so that the trigger can locate and access files under Project_Folder/Cloud-Function-Folder1/*.
When I have more than two or three Cloud Functions, I tend to use Terraform to create all the required triggers in an automated fashion.
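If you want to script the trigger creation without the console (or Terraform), the same trigger can also be created with the gcloud CLI; a minimal sketch, where the repo name and branch pattern are assumptions:

```bash
# Hypothetical example: one trigger for Cloud-Function-Folder1,
# firing only when files under that folder change
gcloud beta builds triggers create cloud-source-repositories \
  --repo=my-repo \
  --branch-pattern="^master$" \
  --build-config=cloudbuild.yaml \
  --included-files="Project_Folder/Cloud-Function-Folder1/**" \
  --description="Cloud-Function1-Trigger"
```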
Solution 3:[3]
You can do this by creating a folder for each of the functions, like this:
```
Project_Folder
  -Cloud-Function-Folder1
    -main.py
    -requirements.txt
    -cloudbuild.yaml
  -Cloud-Function-Folder2
    -main.py
    -requirements.txt
    -cloudbuild.yaml
  -Cloud-Function-Folder3
    -main.py
    -requirements.txt
    -cloudbuild.yaml
  --------- and so on!
```
and creating a cloudbuild.yaml in each directory, which would look like this:
```yaml
steps:
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - functions
      - deploy
      - Cloud_Function_1
      - --source=.
      - --trigger-http
      - --runtime=python37
      - --allow-unauthenticated
    dir: 'Cloud-Function-Folder1'
```
In Cloud Build, create a trigger for each function with an included-files filter that matches only files from that function's folder, and manually specify functions-folder-name/cloudbuild.yaml as the build configuration for each trigger; a scripted sketch follows below.
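With many functions, creating the triggers one by one in the console gets tedious; a hedged sketch of scripting it with the gcloud CLI, assuming the repo is named my-repo and each folder contains its own cloudbuild.yaml as above:

```bash
# Hypothetical example: create one trigger per function folder
for folder in Cloud-Function-Folder1 Cloud-Function-Folder2 Cloud-Function-Folder3; do
  gcloud beta builds triggers create cloud-source-repositories \
    --repo=my-repo \
    --branch-pattern="^master$" \
    --build-config="${folder}/cloudbuild.yaml" \
    --included-files="${folder}/**" \
    --description="${folder}-trigger"
done
```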
This blog post by Torbjorn Zetterlund walks through the entire process of deploying multiple Cloud Functions from a single GitHub repo, using an include filter so that only the changed function is deployed.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | guillaume blaquiere |
| Solution 2 | Jeremy Caney |
| Solution 3 | mumboFromAvnotaklu |