Deploying a static website, CI/CD with GitHub Actions, Azure Functions, and Cosmos DB for the Cloud Resume Challenge

I recently tackled the Cloud Resume Challenge (cloudresumechallenge.dev/) as a way to further develop my skills in Azure.

This post outlines how I configured a static website in Azure, a CDN and custom domain with HTTPS, and finally GitHub Actions to automatically deploy updated code.

Broadly speaking, this involved the following steps:

  • create website with HTML, CSS, JS
  • create Azure resources
  • set up CDN and HTTPS
  • add visitor counter
  • CI/CD via GitHub Actions

Website

First off, I created a website with HTML, CSS, and JS. I found a template on html5up and adjusted it a little bit to fit my needs.

Create resources

Once a rudimentary index.html file was set up, I created resources in Azure via the CLI:

resource group:

az group create \ 
   --name storage-resource-group \ 
   --location westus

storage account:

az storage account create \
   --name <storage-account-name> \
   --resource-group storage-resource-group \
   --location westus \
   --sku Standard_RAGRS \
   --kind StorageV2

After static website hosting on the storage account was enabled, I uploaded the files.
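
For reference, both steps can also be done entirely via CLI. A rough sketch, with a placeholder account name (and the 404 document only if your site has one):

az storage blob service-properties update \
   --account-name <storage-account-name> \
   --static-website \
   --index-document index.html \
   --404-document 404.html

az storage blob upload-batch \
   --account-name <storage-account-name> \
   -d '$web' \
   -s .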

CDN profile and CDN endpoint -

I created these following the Microsoft Documentation.
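
The rough CLI equivalent, assuming the Standard Microsoft SKU and placeholder names:

az cdn profile create \
   --resource-group storage-resource-group \
   --name labexample-cdn-profile \
   --sku Standard_Microsoft

az cdn endpoint create \
   --resource-group storage-resource-group \
   --profile-name labexample-cdn-profile \
   --name labexampleendpoint \
   --origin <static-website-host> \
   --origin-host-header <static-website-host>

Here <static-website-host> is the storage account's static website endpoint (shown under "Static website" in the portal), without the https:// prefix.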

Once these were created, I needed to set up a CNAME record pointing my custom domain to the CDN endpoint hostname:

Host name = labexample.grantmcomie.tech
Type = CNAME
TTL = 1 hour
Data = labexampleendpoint.azureedge.net

I was pretty impatient waiting for the DNS records to propagate, but they did after a few minutes.

I then added the custom domain to the CDN endpoint in Azure and set "Custom domain HTTPS" to "On".
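
Sketched via CLI, with the same placeholder names as above:

az cdn custom-domain create \
   --resource-group storage-resource-group \
   --profile-name labexample-cdn-profile \
   --endpoint-name labexampleendpoint \
   --name labexample-domain \
   --hostname labexample.grantmcomie.tech

az cdn custom-domain enable-https \
   --resource-group storage-resource-group \
   --profile-name labexample-cdn-profile \
   --endpoint-name labexampleendpoint \
   --name labexample-domain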

Another important step was to redirect HTTP to HTTPS. To do this I configured a rule under the "Rules engine" of the endpoint:

The logic is effectively "If the request protocol is HTTP, then URL redirect to HTTPS protocol."
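
Assuming a Standard Microsoft CDN profile, the equivalent rule can be sketched via CLI (names are placeholders):

az cdn endpoint rule add \
   --resource-group storage-resource-group \
   --profile-name labexample-cdn-profile \
   --name labexampleendpoint \
   --order 1 \
   --rule-name EnforceHTTPS \
   --match-variable RequestScheme \
   --operator Equal \
   --match-values HTTP \
   --action-name UrlRedirect \
   --redirect-protocol Https \
   --redirect-type Moved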

At this point the static website was up and working in Azure.

Visitor Counter & GitHub Actions -

Now I'll be the first to admit I struggled mightily with the code to display a visitor counter. I haven't been coding long, and while I'm familiar with using Python's requests library for API calls, this was adjacent, if not entirely new, territory, and very much an educational process.

As a temporary measure, I first settled on a simple client-side call to a namespace I created on CountAPI (countapi.xyz).
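
If I remember the CountAPI format correctly, each hit both increments and returns the counter; a quick test from the terminal (namespace and key are placeholders):

curl https://api.countapi.xyz/hit/<namespace>/<key>
# returns JSON along the lines of: {"value": 42}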

I ended up building the CI/CD pipeline with GitHub Actions at this point; the Microsoft Documentation worked like a charm.

main.yml:

name: Blob Storage Website CI

on:
  push:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - uses: azure/login@v1
      with:
        creds: ${{ secrets.AZURE_CREDENTIALS }}

    - name: Upload to blob storage
      uses: azure/CLI@v1
      with:
        inlineScript: |
          az storage blob upload-batch --overwrite true --account-name storagename --auth-mode key -d '$web' -s .
    - name: Purge CDN endpoint
      uses: azure/CLI@v1
      with:
        inlineScript: |
          az cdn endpoint purge --content-paths "/*" --profile-name cloudresume2022CDN --name endpointname --resource-group rgname
    - name: logout
      run: |
        az logout
      if: always()

On a side note, the original main.yml didn't include --overwrite true, and this broke with Azure CLI version 2.34. I noted this in another blog post.

The backend then needed to pull information from a database using an Azure Function. First, I created a Cosmos DB account using serverless capacity mode, then a database and a container to store a single record:

{
    "id": "home",
    "count": 1
}
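
A rough CLI sketch of those resources; the account name resume2022db is inferred from the connection string setting below, and the database and container names match the function bindings:

az cosmosdb create \
   --resource-group storage-resource-group \
   --name resume2022db \
   --capabilities EnableServerless

az cosmosdb sql database create \
   --resource-group storage-resource-group \
   --account-name resume2022db \
   --name visits

az cosmosdb sql container create \
   --resource-group storage-resource-group \
   --account-name resume2022db \
   --database-name visits \
   --name tutorial-container \
   --partition-key-path "/id"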

Then, I created the function and input/output integrations:

index.js -

module.exports = async function (context, req, data) {
    context.log('JavaScript HTTP trigger function processed');
    // "data" is the document array returned by the Cosmos DB input binding
    context.bindings.outputDocument = data[0];
    // increment the count; the output binding writes the document back
    context.bindings.outputDocument.count += 1;
    context.res = {
        body: context.bindings.outputDocument.count
    };
}

function.json -

{
    "bindings": [
      {
        "name": "req",
        "authLevel": "anonymous",
        "methods": [
          "get",
          "post"
        ],
        "direction": "in",
        "type": "httpTrigger"
      },
      {
        "type": "http",
        "direction": "out",
        "name": "res"
      },
      {
        "name": "inputDocument",
        "databaseName": "visits",
        "collectionName": "tutorial-container",
        "connectionStringSetting": "resume2022db_DOCUMENTDB",
        "direction": "in",
        "type": "cosmosDB"
      },
      {
        "name": "outputDocument",
        "direction": "out",
        "type": "cosmosDB",
        "connectionStringSetting": "resume2022db_DOCUMENTDB",
        "databaseName": "visits",
        "collectionName": "tutorial-container"
      }
    ]
}

The Azure Function returns the value of "count", which the site's JavaScript fetches and writes into the 'views' element in index.html:

window.addEventListener('DOMContentLoaded', (event) => {
    getVisitCount();
});

const functionapi = 'https://functiongrant2022.azurewebsites.net/api/HttpTrigger1?code=PUc5pkDcmjQF9Oz6Z2PBal65OqdT0AcwaCxeoibNroY4UM9qql47dQ==';

const getVisitCount = () => {
    fetch(functionapi)
    .then(response => {
        return response.json()
    })
    .then(count => {
        console.log("function api was called.");
        // write the returned count into the 'views' element in index.html
        document.getElementById('views').innerText = count;
    }).catch(function(error) {
        console.log(error);
    });
}

Honestly, it was a challenge to code this. My first attempt at adding the input/output integrations resulted in an error where the ID wasn't defined, and I only determined this after live-monitoring the logs while the JavaScript attempted to call the function.
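
One way to stream those logs from the terminal (this also works for function apps; the resource group is a placeholder):

az webapp log tail \
   --resource-group storage-resource-group \
   --name functiongrant2022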

Additionally, Cross-Origin Resource Sharing (CORS) needed to be enabled on the function app, with the static site's address added as an allowed origin to authorize the call.
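
For example, via CLI, with the function app name taken from the URL above and a placeholder origin:

az functionapp cors add \
   --resource-group storage-resource-group \
   --name functiongrant2022 \
   --allowed-origins https://labexample.grantmcomie.tech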

Conclusion

This was a great challenge, and covered a broad range of skills, code, and technology. I connected with people while troubleshooting, and gained a better sense of how to organize my approach to future projects as well.

There were a couple of instances where I manually configured a resource after creating it; going forward I'll strive to do this at the very least via the Azure CLI, if not via Terraform, an infrastructure-as-code tool. I've been digging into Terraform lately and am excited to deploy more projects with it. I'd additionally want to include monitoring and cost optimization measures.

My resume is online at https://www.grantmcomie.info/

The project can be found on my GitHub.

Let me know your thoughts on this article, feel free to comment or follow me here on Hashnode!

I am also on Twitter and LinkedIn.

-Grant McOmie