Technical Thursday – Jenkins pipeline and AMArETTo
Last week I showed you how you can integrate Git and Jenkins. In that post I did not provide the script part for the Azure-related operations. Today I would like to show it.
In Step 4.4.5 we configured a file which is located in our Git repository (pipeline/Jenkinsfile). This file is the “link” which can call an upload-to-Azure script. I know you ask: how?
First of all, I have good news: AMArETTo supports these operations from v0.0.2.9. AMArETTo is available on Git and on PyPI.
This puts you in the best position to create a cool automation solution at your company.
And now let’s see how we can implement the Azure functionality in our Jenkins pipeline.
Step 1: Install AMArETTo on our Jenkins server.
- This step is quite easy because we merely have to follow the installation steps for AMArETTo.
# Install from bash
sudo pip install amaretto
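If you want to verify the installation before wiring it into the pipeline, a quick sanity check from the Jenkins server’s shell is enough (a minimal sketch, assuming pip installed the package into the same Python that your Jenkins jobs will use):

# Verify that AMArETTo is installed and importable
pip show amaretto
python -c "import amaretto; print('AMArETTo is ready')"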
Step 2: Create a Python script which calls AMArETTo
In this step we will create a small Python script which executes the upload function from AMArETTo.
- Create an uploadtoazure.py file in the pipeline directory under your GitLab project’s root.
pipeline/uploadtoazure.py
- Write a short script which gets some external parameters:
#!/usr/bin/python

# Import amaretto
import amaretto
from amaretto import amarettostorage

# Import some important packages
import sys
import json

# Get arguments
fileVersion = str(sys.argv[1])
storageaccountName = str(sys.argv[2])
sasToken = str(sys.argv[3])
filePath = str(sys.argv[4])
modificationLimitMin = str(sys.argv[5])

print "--- Upload ---"

# Upload every file from filePath to the given storage account
uploadFiles = amaretto.amarettostorage.uploadAllFiles(fileVersion=fileVersion,
                                                      storageaccountName=storageaccountName,
                                                      sasToken=sasToken,
                                                      filePath=filePath,
                                                      modificationLimitMin=modificationLimitMin)

try:
    result = json.loads(uploadFiles)
    print "--- Upload files' result: '{0}' with following message: {1}".format(result["status"], result["result"])
except Exception:
    print "--- Something went wrong during uploading files."

print "-----------------------------"
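Before wiring the script into Jenkins, you can also test it locally with the same five positional arguments the pipeline will pass. This is just a hypothetical test run; the storage account name, SAS token and path below are the placeholder values from the Jenkinsfile example, so replace them with your own:

# Hypothetical local test run of the upload script
python pipeline/uploadtoazure.py "1.0.0.0" "thisismystorage" "?sv=..." "./upload/" "30"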
- Create the Jenkinsfile in the pipeline directory under your GitLab project’s root.
pipeline/Jenkinsfile
- Write a valid and lightweight Jenkinsfile which calls our uploadtoazure.py with the right parameters:
pipeline {
    agent any

    environment {
        FILE_VERSION = "1.0.0.0"
        AZURE_SA_NAME = "thisismystorage"
        AZURE_SA_SAS = "?sv=..."
        FILE_PATH = "./upload/"
        MODIFICATION_LIMIT_IN_MINUTES = "30"
    }

    stages {
        stage('Build') {
            steps {
                withCredentials([azureServicePrincipal('c66gbz87-aabb-4096-8192-55d554565fff')]) {
                    sh '''
                        # Login to Azure with ServicePrincipal
                        az login --service-principal -u $AZURE_CLIENT_ID -p $AZURE_CLIENT_SECRET --tenant $AZURE_TENANT_ID

                        # Set default subscription
                        az account set --subscription $AZURE_SUBSCRIPTION_ID

                        # Execute upload to Azure
                        python pipeline/uploadtoazure.py "$FILE_VERSION" "$AZURE_SA_NAME" "$AZURE_SA_SAS" "$FILE_PATH" "$MODIFICATION_LIMIT_IN_MINUTES"

                        # Logout from Azure
                        az logout --verbose
                    '''
                }
            }
        }
    }
}
Let me explain the Jenkinsfile. As you can see, there is an unfamiliar part above the bash code: withCredentials(). This comes from Jenkins and it contains the Azure Service Principal related data for our Storage Account (this was configured in Step 2 of last week’s post). When you use this credential, you get well-configured variables which contain the related values, such as AZURE_CLIENT_ID, AZURE_CLIENT_SECRET, AZURE_TENANT_ID and AZURE_SUBSCRIPTION_ID. These are fully enough to log in to Azure.
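One prerequisite worth mentioning: the sh block assumes the Azure CLI is installed on the Jenkins agent. A quick check from the agent’s shell (assuming a Linux agent with az on the PATH):

# Check that the Azure CLI is available on the Jenkins agent
which az
az --version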
Step 3: Push files to Git
- Finally we have to push these files to our Git (see the exact commands after this list)
- Then press the Build Now button in Jenkins
- And check the result
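For reference, the push itself can look like this (a sketch, assuming you work directly on the branch your Jenkins job is configured to build, for example master):

# Commit and push the new pipeline files
git add pipeline/uploadtoazure.py pipeline/Jenkinsfile
git commit -m "Add Azure upload step to Jenkins pipeline"
git push origin master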
I hope that, together with the previous post, this helps you improve your own pipeline and provide a cool solution to your management.