Microsoft held its virtual, developer-focused Build 2021 conference from 25 to 27 May. As with most conferences of this style, there was a bunch of service and product updates. One that caught my eye was Azure Arc-enabled Application Services, which allows all of Azure's App Service-based offerings (Azure App Service, Functions, Logic Apps, API Management, and Event Grid) to be deployed to any Kubernetes cluster, whether in the cloud or in a data centre somewhere. I really like the deployment model you get with these services, so the idea of making them available to more infrastructure environments is pretty exciting!
I decided it would be great to get a Python function running outside of Azure, and what better place to try than a Google Kubernetes Engine cluster? This let me try out the features of Arc without actually having any servers in a data centre somewhere. Plus there was just something cool about the idea of running an Azure Function on another public cloud.
If you want to try this out yourself, you'll need both an Azure subscription and a Google Cloud project with permissions to create and configure resources. You'll also need a workstation with the following tools installed:

- the Google Cloud SDK (gcloud)
- kubectl
- the Azure CLI (az)
- PowerShell (to run the generated setup script)
Make sure you're logged in to the right organisation and project with the Google Cloud SDK, then run the following command to stand up a 3-node cluster:
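Something along these lines works; the cluster name, zone, and machine type below are my own example values, so adjust them to taste:

```shell
# Create a 3-node GKE cluster. A machine type with a few vCPUs gives the
# Arc agents and App Service components room to breathe.
# "arc-demo" and the zone are example values.
gcloud container clusters create arc-demo \
  --zone australia-southeast1-a \
  --num-nodes 3 \
  --machine-type e2-standard-4
```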
Note: Don’t go too small with the cluster as Arc needs to add a number of services.
We also need to create a static IP address that will be the endpoint for all services running on the cluster:
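Reserving the address looks something like this (the address name and region are example values matching the cluster above):

```shell
# Reserve a regional static external IP for the cluster's ingress.
gcloud compute addresses create arc-demo-ip \
  --region australia-southeast1
```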
Take note of the IP as we will need it later:
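You can print the reserved address with a describe call (again, "arc-demo-ip" and the region are my example names):

```shell
# Print just the reserved address; we'll need it when configuring the
# App Service extension later.
gcloud compute addresses describe arc-demo-ip \
  --region australia-southeast1 \
  --format="value(address)"
```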
Create a firewall rule to allow ports 80, 443, and 8081 to our cluster from the internet:
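A sketch of the rule, with an example rule name; you may want to narrow the source ranges for anything beyond a throwaway test:

```shell
# Allow inbound traffic on the ports the App Service extension exposes.
gcloud compute firewall-rules create arc-demo-ingress \
  --allow tcp:80,tcp:443,tcp:8081 \
  --source-ranges 0.0.0.0/0
```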
Authenticate kubectl to your new cluster; Azure Arc uses your local kubeconfig to install the required services on your cluster:
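Fetching credentials for the cluster created above (example names again):

```shell
# Fetch credentials for the new cluster and merge them into ~/.kube/config.
gcloud container clusters get-credentials arc-demo \
  --zone australia-southeast1-a

# Sanity check: the three nodes should be listed and Ready.
kubectl get nodes
```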
Now you have a Kubernetes cluster ready to be used with App Services.
Next we can create a resource group for our Arc enabled cluster and then onboard the cluster:
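The two steps look roughly like this; the resource group name, location, and connected cluster name are illustrative:

```shell
# Create a resource group to hold the Arc resources.
az group create --name arc-demo-rg --location westus2

# Onboard the GKE cluster to Azure Arc using the current kubeconfig context.
az connectedk8s connect --name arc-demo-gke --resource-group arc-demo-rg
```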
connectedk8s is an Azure CLI extension that will be installed automatically the first time you call it.
All going well, you should see your GKE cluster in the Azure console now.
GKE cluster as an Azure Arc-connected cluster
Azure now has the ability to apply policies, set up GitOps integrations, and provide monitoring. However, what we are interested in is using it to deploy Azure Functions…
To enable this cluster to host application services, we need to install the new extension. While everything can be done via the Azure CLI, I recommend configuring the extension the first time via the console. At the time of writing, the console doesn't actually do any configuration for you; instead it generates a PowerShell script for you to run.
Click on “Extensions (preview)” and then “Add” and “Application Services (preview)”, click “Create” and you will end up on the following screen:
Two values are important here: Static IP needs to be the external address you created on the GCP side in step 1, and Storage class needs to be set to a valid storage class for your cluster; in this case “standard” works for GKE.
After choosing whether to set up a Log Analytics workspace and adding any tags, you will be presented with a run script. Download it and run it from PowerShell.
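I won't reproduce the generated script here, but at a high level it does something like the following two steps. The resource names, namespace, and configuration settings below are illustrative only; check the script you download for the real values:

```shell
# Install the Application Services extension on the connected cluster,
# passing (among other settings) the static IP and storage class chosen
# in the portal.
az k8s-extension create \
  --resource-group arc-demo-rg \
  --cluster-name arc-demo-gke \
  --cluster-type connectedClusters \
  --name appservice-ext \
  --extension-type 'Microsoft.Web.Appservice' \
  --release-namespace appservice-ns \
  --configuration-settings "loadBalancerIp=<static IP from step 1>" \
  --configuration-settings "buildService.storageClassName=standard"

# Create the custom location that App Service resources will target.
az customlocation create \
  --resource-group arc-demo-rg \
  --name gke-location \
  --namespace appservice-ns \
  --host-resource-id <connected cluster resource ID> \
  --cluster-extension-ids <extension resource ID>
```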
The script takes a while to run as it needs to register components both within Azure and in your Kubernetes cluster.
Now the fun bit: our cluster is ready to start hosting Application Services, so let's create a Python function to test it out. First we need a Function App. The process is exactly the same as creating a normal Function App on Azure, with one change: we select the custom location we created as the region:
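If you'd rather do this from the CLI than the portal, it looks roughly like this; the app, storage account, and custom location names are example values:

```shell
# Create a Python Function App targeting the custom location instead of
# an Azure region.
az functionapp create \
  --resource-group arc-demo-rg \
  --name arc-demo-func \
  --custom-location gke-location \
  --storage-account arcdemostorage \
  --runtime python \
  --functions-version 3
```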
Once the Function App is created, use your favourite tool to publish your function code. Currently it's not possible to create or edit functions deployed to Kubernetes in the web portal. However, you can use the portal (or the Azure CLI) to view all the function's details, such as configuration, keys, and URLs.
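For example, with the Azure Functions Core Tools you can scaffold and publish a simple HTTP-triggered function (the project and function names are placeholders):

```shell
# Scaffold a Python function project locally.
func init arc-demo-func --python
cd arc-demo-func
func new --name HttpExample --template "HTTP trigger"

# Publish the project to the Function App created above.
func azure functionapp publish arc-demo-func
```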
This was the first time I’d played with Azure Arc and it has really demonstrated to me the future of where this service is heading. I can see the potential as a solution for companies where hybrid cloud is the reality.
In terms of where the service is right now, it's very obviously still in preview. I found the steps to get a cluster onboarded and running a Function fairly brittle, which resulted in me re-creating my GKE cluster a couple of times to make sure I was starting clean. The failures were generally caused by simple misconfigurations on my part, but it was not immediately obvious what the issue was, and I debugged most of them by digging into the cluster state with kubectl or the GCP console. One specific example was getting the storage class name wrong (leaving it at its default value, which isn't a valid class on GKE); I was only able to find this in the error logs for the App Service extension.
I love the potential this has to give the same developer experience regardless of the infrastructure; however, it doesn't remove the operational burden of running a cluster. Even with this simple POC I hit resource limits in my cluster and ended up breaking my function deployment because it could not schedule pods. Again, it would be great to see Microsoft improve the feedback on the Azure side when this occurs.
In conclusion, I will certainly be keeping an eye on both Azure Arc and how these extensions continue to evolve, as I think with a bit more time this will become a very compelling solution.
If you want to read more about this update from Microsoft, check out https://techcommunity.microsoft.com/t5/azure-arc/updates-to-azure-arc-enabled-kubernetes/ba-p/2257140