May 31, 2023
5 mins
Microsoft Build 2023 showcased several exciting announcements across the Microsoft ecosystem, with AI taking centre stage as anticipated. Azure, in partnership with OpenAI, has transformed the landscape with AI integration and chatbot advancements. These innovations, powered by Large Language Models (LLMs) such as GPT-3.5 and GPT-4, have found their way into products such as Bing, Windows and Office 365, as well as the Azure OpenAI Service.
In this blog post, we'll explore how Azure AI services and the Azure OpenAI Service, together with Azure Kubernetes Service (AKS), are empowering developers to build intelligent apps using natural language processing. We'll dive into the integration of the Azure OpenAI Service within AKS clusters, enabling communication with LLMs and chatbots via a dedicated microservice. Additionally, we'll touch upon Kubernetes-focused AI tools like AKS Copilot and kubectl-ai, which simplify development and scaling in AKS clusters.
We'll also discuss Microsoft DevBox, a cloud-based developer workstation designed to accelerate onboarding and enhance the developer experience. DevBox provides specialised virtual environments, preconfigured with project-specific tools, and integrates seamlessly with Microsoft Intune for device management. We'll also explore Azure Deployment Environments, allowing developers to quickly spin up test environments using infrastructure-as-code templates with automated cleanup and security controls.
Microsoft Fabric, a comprehensive analytics solution, consolidates Azure Data Factory, Azure Synapse Analytics, and Power BI. Fabric covers everything from data movement to data science, real-time analytics, and business intelligence, offering a unified suite of services, including data lake, data engineering, and data integration, all in one place.
Lastly, we'll touch on some improvements to Cosmos DB, such as hierarchical partition keys, materialised views, an all-versions-and-deletes change feed, and burst capacity, which enhance query efficiency and reliability.
Join us as we delve deeper into these exciting announcements from Microsoft Build 2023, and explore how they empower developers and organisations to succeed.
Microsoft Build this year showcased many capabilities that demonstrate how Azure, in partnership with OpenAI, has changed the paradigm of cloud service offerings.
Over the past year we have seen announcement after announcement about AI being added to Microsoft products and chatbots being integrated into services, all powered by Large Language Models such as GPT-3, GPT-3.5 and, increasingly, GPT-4. These LLMs are already being introduced into Bing, Windows and Office 365.
We will now look at how these capabilities can help build intelligent apps that leverage Azure AI services for natural language processing, the Azure OpenAI Service with Azure Kubernetes Service (AKS), and other Azure application platform services.
To integrate the Azure OpenAI Service into applications running on AKS and let them talk to the Large Language Model and chatbots, we create the chat capability as a separate microservice within the AKS cluster and leverage the Semantic Kernel SDK. This enables us to introduce a Copilot-style chat feature into the application.
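As a rough sketch of what such a microservice might look like, the example below exposes a single chat endpoint that forwards messages to an Azure OpenAI deployment. For brevity it uses FastAPI and the openai Python package (v0.x API) rather than Semantic Kernel; the endpoint, key, deployment name and environment variable names are illustrative placeholders, not values from the announcement.

```python
# Minimal chat microservice sketch for an AKS-hosted app talking to Azure OpenAI.
# All resource names and environment variables below are illustrative assumptions.
import os

import openai
from fastapi import FastAPI
from pydantic import BaseModel

# Azure OpenAI connection details are injected via the pod's environment.
openai.api_type = "azure"
openai.api_base = os.environ["AZURE_OPENAI_ENDPOINT"]  # e.g. https://<resource>.openai.azure.com/
openai.api_key = os.environ["AZURE_OPENAI_KEY"]
openai.api_version = "2023-05-15"

DEPLOYMENT = os.environ.get("AZURE_OPENAI_DEPLOYMENT", "gpt-35-turbo")

app = FastAPI()


class ChatRequest(BaseModel):
    message: str


@app.post("/chat")
def chat(request: ChatRequest) -> dict:
    """Forward a user message to the Azure OpenAI chat deployment and return the reply."""
    response = openai.ChatCompletion.create(
        engine=DEPLOYMENT,  # the Azure OpenAI deployment name
        messages=[
            {"role": "system", "content": "You are a helpful assistant embedded in our app."},
            {"role": "user", "content": request.message},
        ],
    )
    return {"reply": response["choices"][0]["message"]["content"]}
```

The rest of the application then calls this internal endpoint, keeping the LLM credentials and prompt handling isolated in one service.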
To handle increased traffic, the application is integrated with KEDA (in preview) for automatic scaling, with thresholds informed by load-test metrics obtained from the Azure Load Testing service.
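The scaling behaviour itself is declared as a KEDA ScaledObject. The post doesn't specify which trigger is used, so the hedged sketch below applies a simple CPU-utilisation trigger via the official kubernetes Python client; the deployment name, namespace, replica bounds and 60% target are assumptions you would replace with values derived from your load-test results.

```python
# Illustrative only: apply a KEDA ScaledObject targeting the chat microservice deployment.
# The CPU trigger and the 60% target are assumptions; in practice the trigger and
# thresholds would be chosen from the metrics gathered via Azure Load Testing.
from kubernetes import client, config

scaled_object = {
    "apiVersion": "keda.sh/v1alpha1",
    "kind": "ScaledObject",
    "metadata": {"name": "chat-service-scaler", "namespace": "default"},
    "spec": {
        "scaleTargetRef": {"name": "chat-service"},  # the chat microservice Deployment
        "minReplicaCount": 2,
        "maxReplicaCount": 20,
        "triggers": [
            {
                "type": "cpu",
                "metricType": "Utilization",
                "metadata": {"value": "60"},  # scale out above ~60% average CPU
            }
        ],
    },
}

config.load_kube_config()  # or load_incluster_config() when running inside the cluster
client.CustomObjectsApi().create_namespaced_custom_object(
    group="keda.sh",
    version="v1alpha1",
    namespace="default",
    plural="scaledobjects",
    body=scaled_object,
)
```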
A few Kubernetes-focused AI tools
With tools such as AKS Copilot and kubectl-ai, Microsoft is putting the full power of Kubernetes behind a conversational interface, empowering developers to focus on code and business logic. AKS Copilot is free for customers using AKS clusters and includes a chat experience within the Azure portal, enabling the creation, configuration and troubleshooting of AKS cluster resources.
An example use case would be asking AKS Copilot to generate a Kusto query to check CPU usage in an AKS cluster, allowing an engineer to quickly analyse and troubleshoot a cluster issue.
Overall, AI is truly revolutionising the way we think about apps, enabling developers to ship clean, anti-fragile apps.
Back in August 2022, Microsoft DevBox was announced for public preview; today it is racing towards its GA launch, scheduled for July 2023.
DevBox is a self-service, cloud-based developer workstation, aimed at helping organisations onboard developers quickly and securely.
Alongside working with over 50 organisations to refine the product, Microsoft has also rolled out DevBox to over 9,000 of its own engineers across the Azure, Office and Windows teams.
The goal of the service is to allow developers to create a specialised virtual environment with all the tools required for a specific project already in place.
The workstations are enrolled in Microsoft Intune to allow for easy device management and are also connected to virtual networks inside of Azure using project-based configuration.
To save on costs, DevBoxes can hibernate. This allows a workstation to resume quickly, and you only pay for storage while the machine is in hibernation.
With all of these features, the goal is to shorten the onboarding process from days to around 20 mins, ensuring your developers can get an environment and start working faster than ever before.
For more in-depth information visit: https://learn.microsoft.com/en-us/azure/dev-box/overview-what-is-microsoft-dev-box
Azure Deployment Environments is a service that allows developers to self-deploy development environments into Azure. These environments are defined by using infrastructure-as-code templates and deployed using the Microsoft DevBox portal.
This allows developers to quickly spin up a test environment with real infrastructure. To keep costs under control, each environment is set up with a time-to-live that enables automated cleanup, and to follow security best practices, organisational security controls are also applied to these environments.
Infrastructure-as-code templates are created using Azure Resource Manager (ARM) templates, and a public preview of Terraform support is now available.
The service is free and is generally available now.
For more in-depth information visit: https://learn.microsoft.com/en-us/azure/deployment-environments/overview-what-is-azure-deployment-environments
We are thrilled about the launch of Microsoft Fabric. It is truly a revolutionary product that distinguishes itself as a comprehensive analytics solution.
Currently, organisations encounter numerous obstacles when attempting to harness the power of data for a competitive advantage. With the fragmented landscape of data and AI technologies, it is a complex and costly affair to integrate disparate services from multiple vendors across the data analytics lifecycle.
Fabric addresses these pain points and simplifies the analytics process by consolidating technologies like Azure Data Factory, Azure Synapse Analytics, and Power BI into a single product. On top of that, it leverages Azure OpenAI Service at every layer, unlocking the full potential of data through generative AI capabilities.
With features like Copilot, Fabric enables developers to leverage conversational language to create data flows, generate code, build machine learning models, and more. Business users can also benefit from AI-driven analytics and visualisation capabilities provided by Power BI, which seamlessly integrates with Microsoft 365 applications, making data easily accessible and actionable.
Fabric has the potential to pave the way for a new era of streamlined analytics and AI-driven insights, helping organisations unlock the full potential of their data.
While the exciting new OpenAI and DevBox developments are understandably hogging the limelight, there are also a few incremental, quality-of-life improvements being released for Cosmos DB.
Hierarchical Partition Keys - now GA
This allows for more efficient partitioning by supporting a hierarchy of up to three partition keys. We see this as being especially useful in multi-tenancy scenarios. When one or more of the keys are specified in a query, it is efficiently routed to the appropriate subset of partitions, which can significantly reduce query execution costs.
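To make this concrete, here is a hedged sketch using the azure-cosmos Python SDK (assuming a recent version with hierarchical "MultiHash" partition key support): it creates a container with a three-level tenant/user/session key and runs a query that supplies the first two levels, so the request can be routed to the relevant subset of partitions. The endpoint, database and property names are illustrative, not taken from the announcement.

```python
# Sketch: a container with a three-level hierarchical partition key
# (tenant -> user -> session) using the azure-cosmos Python SDK.
# All names and environment variables are placeholder assumptions.
import os

from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(os.environ["COSMOS_ENDPOINT"], credential=os.environ["COSMOS_KEY"])
database = client.get_database_client("appdb")

container = database.create_container_if_not_exists(
    id="events",
    partition_key=PartitionKey(
        path=["/tenantId", "/userId", "/sessionId"],  # hierarchy of up to three keys
        kind="MultiHash",
    ),
)

# A query that supplies the first two levels of the hierarchy can be routed to
# the sub-partitions holding that tenant/user instead of fanning out everywhere.
items = container.query_items(
    query="SELECT * FROM c WHERE c.tenantId = @tenant AND c.userId = @user",
    parameters=[
        {"name": "@tenant", "value": "contoso"},
        {"name": "@user", "value": "alice"},
    ],
    enable_cross_partition_query=True,  # required because the full key isn't supplied
)
for item in items:
    print(item)
```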
Materialised Views for Azure Cosmos DB for NoSQL - now in Preview
We see this as “Applied CQRS as a service”. With CQRS it's common to maintain alternative read-model views of your data to facilitate efficient querying. Maintaining such a setup manually can introduce a lot of overhead to your database writes. With a materialised view, you only need to write to the base table and Cosmos DB takes on the responsibility of keeping the views consistent. Additionally, a materialised view can use a different partition key, which unlocks the use of multiple, independent partition keys for a data set - at the cost of the additional storage space taken by the materialised view.
All Versions and Deletes Change Feed - now in Preview
The existing “Latest Version” change feed only emits events when items are inserted or updated and does not contain deletes. This new mode includes all deletes and transient versions, making it much easier to sync all data changes from Cosmos DB to any other data or event store. Note that you can only read changes that have occurred within the continuous backup retention period configured for the account.
Burst Capacity - now GA
When enabled, Burst Capacity allows each Cosmos DB partition to accumulate up to 5 minutes of idle capacity. When a partition with accumulated capacity experiences a spike in workload, requests that would otherwise have been rate limited get a small “grace period” and are served from burst capacity at up to 3,000 RU/s. If the spike is short-lived, it is handled without returning any HTTP 429 (rate limited) responses to the client. If the high workload continues, the partition will be rate limited once the burst capacity has been consumed. This is a big improvement for overall uptime and reliability.
One of my favourite parts of GitHub is the Advanced Security tooling. It's super easy to get up and running with secret and dependency scanning, which takes care of the security low-hanging fruit in any repository; you can then add powerful static analysis with code scanning powered by the CodeQL language.
However, a lot of organisations have heavy investments in Azure DevOps and are unable to move to GitHub without a large amount of work and potentially changing how they build, test and deploy applications. It's therefore great to see Microsoft recognise this and bring the power of GitHub Advanced Security to Azure DevOps.
GitHub Advanced Security for Azure DevOps will be released in preview soon. I encourage any organisation using Azure DevOps for their source code and CI/CD processes to sign up for the preview and enable it on a repository or two to see the insights these tools can bring with only a few clicks.
We’ve got ambitious plans to be the best Microsoft solutions company in Australia and New Zealand.