Transitioning to Azure cloud authentication – Part 2

Conditional Access

Now that we’ve gone over a couple of basics, I want to go through some of the features built into Azure AD which not only blow on-premises AD away, but also show why a push to utilize it over its predecessor is vital for the modern workplace.

The modern security perimeter now extends beyond an organization’s network to include user and device identity. Organizations can use identity-driven signals as part of their access control decisions.

Conditional Access brings signals together to make decisions and enforce organizational policies. Azure AD Conditional Access is at the heart of the new identity-driven control plane.

Especially post-Covid, with the workforce being more dispersed, the two goals I see most commonly requested are to:

  • Empower users to be productive wherever and whenever
  • Protect the organization’s assets

In my opinion, using Azure AD Conditional Access policies to apply the right access controls when needed is one of the strongest controls available for keeping the organization secure.

Signals that Conditional Access can use when making decisions include:

  • User or group membership
    • Note: Dynamic memberships are especially powerful here, drastically lowering the support overhead of using this control!
  • IP location information
  • Specific device
  • Specific application
  • Real-time and calculated risk detection
    • Note: This one is cool, as integration with Azure AD Identity Protection allows identification of risky sign-in behavior, and you can then force users through some of the options covered in Part 1

Conditional Access is very powerful; however, I would recommend that it initially be implemented in “report only” mode, which logs what each policy would have done without enforcing it. For many obvious reasons, that is thankfully an excellent option.
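As a sketch of the shape such a policy takes, here is a minimal report-only body for the Graph conditionalAccess/policies endpoint, built in Python. The display name and group ID are placeholders of my own, not taken from any real tenant.

```python
import json

def report_only_mfa_policy(group_id: str) -> dict:
    """Build a Conditional Access policy body that would require MFA for
    one pilot group across all apps, but only report what it would do."""
    return {
        "displayName": "CA pilot: require MFA (report-only)",
        "state": "enabledForReportingButNotEnforced",  # report-only mode
        "conditions": {
            "users": {"includeGroups": [group_id]},
            "applications": {"includeApplications": ["All"]},
        },
        "grantControls": {"operator": "OR", "builtInControls": ["mfa"]},
    }

# This body would be POSTed to the Graph conditionalAccess/policies endpoint
body = json.dumps(report_only_mfa_policy("00000000-0000-0000-0000-000000000000"))
```

Once sign-in logs confirm the policy behaves as intended, flipping `state` to `enabled` turns on enforcement.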

Entitlement Management

Continuing in the same theme, either static or automatic assignments of access packages can be created in Entitlement Management, which now includes multi-stage reviews.

Access reviews can be built in sequential stages, each with their own set of reviewers and configurations, making it easy to design more efficient reviews for the resource owners and auditors by reducing the number of decisions each reviewer is accountable for.

Note: In the following sample, I have a third-party application added as “Application”; it shows up because it is an Enterprise App registered in my Azure tenant. One can only imagine the possibilities here!

Up to three stages can be specified; in addition, you can define whether earlier-stage decisions should be revealed to later-stage reviewers.
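As a toy model of those mechanics (this is purely conceptual, not the Graph API for access reviews), the staging behavior can be sketched as:

```python
def run_review(stages, reveal_earlier=False):
    """Walk up to three sequential review stages. Each stage sees earlier
    decisions only when reveal_earlier is True; a later stage's decision
    for a principal supersedes an earlier one."""
    assert len(stages) <= 3, "at most three stages can be specified"
    outcome, history, visible = {}, [], []
    for stage in stages:
        visible.append(list(history) if reveal_earlier else [])
        outcome.update(stage["decisions"])
        history.append(stage["decisions"])
    return outcome, visible

stages = [
    {"reviewers": ["resource owner"], "decisions": {"alice": "Approve"}},
    {"reviewers": ["auditor"],        "decisions": {"alice": "Deny"}},
]
final, seen = run_review(stages, reveal_earlier=True)
# seen[1] is what the auditor could see of the resource owner's stage-1 call
```

The point of the split is exactly what the paragraph above describes: each stage's reviewers only answer for their own slice of the decision.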

Automatic assignment of access policies

Azure AD now adds and removes users’ access across groups, Teams, SharePoint sites, and applications as their attributes change (such as when someone joins, moves between departments, etc.). The inclusion of this policy in an access package simplifies managing at scale; users don’t need to make requests, which not only ensures their access doesn’t remain longer than necessary, but also does so without the need for administrative interaction when someone moves teams.

Here’s a screenshot of an example policy I’ve configured:

In this example, the rule is based on the attributes of the user, in this case department. Azure AD will automatically begin creating resource assignments for users who meet the rule, without any need to request access.
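The rule itself is the same style of attribute expression used for dynamic groups, e.g. user.department -eq "Marketing". As a toy illustration of what gets evaluated (the user records below are invented for illustration, not real accounts):

```python
def matches_rule(user: dict, department: str) -> bool:
    """Toy stand-in for an assignment rule like (user.department -eq "Marketing")."""
    return user.get("department") == department

users = [
    {"upn": "ana@contoso.com", "department": "Marketing"},
    {"upn": "ben@contoso.com", "department": "Finance"},
]
# Matching users get the access package automatically; users who stop
# matching have the assignment removed, with no request or admin action.
assigned = [u["upn"] for u in users if matches_rule(u, "Marketing")]
```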

In addition to what can be done with dynamic groups, we can also use entitlement management with automatic assignment policies for:

  • Managing access across multiple resources, including applications, SharePoint Online sites, existing Azure AD groups and Teams, and groups that are provisioned to on-premises AD.
  • Managing access with a combination of policies, with both rules (for instance, members of a department) and exceptions, so that the exceptions can be regularly reviewed and removed if no longer needed.
  • More automation of tasks across applications through entitlement management’s custom extensions, by running workflows when users receive or lose assignments

Cybersecurity Architect Expert certification

This weekend I successfully passed the SC-100 exam, and with it have now achieved my second Expert-level certification: the Cybersecurity Architect Expert certification from Microsoft.

I learned a ton! Looking at the exam score, I over-studied governance, but that’s what I’m interested in, and yes, it has helped tremendously in furthering my knowledge of Azure security infrastructure and design.

Transitioning to Azure cloud authentication – Part 1

This series of articles documents the steps needed to transition from an on-premises Active Directory footprint and migrate its workloads to Azure AD.

A typical migration has the following stages:

  • Discovery: Find out what is currently in the environment
  • Pilot: Deploy new cloud capabilities to a small subset of users, applications, and devices
  • Scale Out: Expand the pilot to complete the transition
  • Cut-over: Stop using the on-premises authentication

Users and Groups

Microsoft highly recommends a passwordless environment because, as depicted in the following graphic, it is both highly secure and convenient.

In my experience, whether users correctly follow secure practices makes or breaks security initiatives; thus, in my opinion, convenience is crucial.

Industry authentication standards rely on one of the following:

  • Something you Know:
    • Passwords are great, but unless a vault is used, it is common to reuse the same password, or a variation of it, across many personal accounts. This is highly vulnerable in modern times, as environments are often compromised and credentials exposed on public sites: the equivalent of writing credentials down on paper and other people finding them.
  • Something you Have:
    • Removes the problem of forgetting something you know, but is vulnerable to the object being lost or stolen.
  • Something you Are:
    • Much harder to lose a fingerprint than a wallet, however, while this is getting better, historically, biometric sensors can be fairly expensive (cost and support) and have accuracy issues.

Because each authentication method has its own vulnerabilities, a combination of them is much stronger, hence the modern term “Multi-Factor Authentication” (MFA).

Here’s an example of using the Authenticator App as a convenient multi-factor authentication option in addition to a password.
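For context, the rotating six-digit codes an authenticator app can also generate are standard HOTP/TOTP one-time passwords (RFC 4226/6238): an HMAC over a counter, or over the current 30-second window, truncated to six digits. A stdlib-only sketch:

```python
import hashlib, hmac, struct, time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226: HMAC-SHA1 over the counter, dynamically truncated to N digits."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, step: int = 30) -> str:
    """RFC 6238: HOTP with the counter derived from the current time window."""
    return hotp(secret, int(time.time()) // step)

# RFC 4226 test vector: counter 0 with this shared secret yields "755224"
print(hotp(b"12345678901234567890", 0))
```

This is why the code is “something you have”: only a device holding the shared secret can produce the right value for the current window.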

The Authenticator App turns any iOS or Android phone into a strong, passwordless credential. Users can sign in to any platform or browser by getting a notification to their phone, matching a number displayed on the screen to the one on their phone, and then using their biometric (touch or face) or PIN to confirm.

Passwordless authentication using the Authenticator app follows the same basic pattern as Windows Hello for Business. It’s a little more complicated as the user needs to be identified so that Azure AD can find the Authenticator app version being used:

  1. The user enters their username.
  2. Azure AD detects that the user has a strong credential and starts the Strong Credential flow.
  3. A notification is sent to the app via Apple Push Notification Service (APNS) on iOS devices, or via Firebase Cloud Messaging (FCM) on Android devices.
  4. The user receives the push notification and opens the app.
  5. The app calls Azure AD and receives a proof-of-presence challenge and nonce.
  6. The user completes the challenge by entering their biometric or PIN to unlock the private key.
  7. The nonce is signed with the private key and sent back to Azure AD.
  8. Azure AD performs public/private key validation and returns a token.
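Steps 5 through 8 are a classic challenge-response exchange: the nonce is signed with a key that never leaves the phone, and the server verifies the signature before issuing a token. A simplified sketch of that shape, using a shared HMAC key as a stand-in for the real asymmetric key pair held in the phone:

```python
import hashlib, hmac, secrets

def issue_challenge() -> bytes:
    # Step 5: the server issues a fresh nonce (proof-of-presence challenge)
    return secrets.token_bytes(32)

def sign_nonce(device_key: bytes, nonce: bytes) -> bytes:
    # Steps 6-7: after biometric/PIN unlock, the device signs the nonce
    return hmac.new(device_key, nonce, hashlib.sha256).digest()

def validate(device_key: bytes, nonce: bytes, signature: bytes) -> bool:
    # Step 8: the server validates the signature before returning a token
    return hmac.compare_digest(sign_nonce(device_key, nonce), signature)

key = secrets.token_bytes(32)  # in reality: a private key in the phone's secure hardware
nonce = issue_challenge()
assert validate(key, nonce, sign_nonce(key, nonce))
```

Because each nonce is fresh, a captured signature is useless for a later sign-in attempt.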

Password Self-Service

Until an MFA environment is in place, migrating to Azure self-service password reset (SSPR) gives users the ability to manage their own password resets, which not only helps greatly with the “convenience” point I made above, but in most cases tremendously decreases help desk support calls.

The following authentication methods are available for SSPR:

  • Mobile app notification
  • Mobile app code
  • Email
  • Mobile phone
  • Office phone (available only for tenants with paid subscriptions)
  • Security questions

Users can only reset their password if they have registered an authentication method that the administrator has enabled.

On-premises integration

With a hybrid environment, first install and configure the sync agent so it is capable of password writeback; once that is complete, you can configure Azure AD Connect to write password change events from Azure AD back to the on-premises directory.

In addition, the following options are available:

  1. Users can unlock accounts without resetting their password
  2. Password filters for on-premises Active Directory

Specialized role permissions – Locking down standard Azure infrastructure

Whether due to specific governance needs, or perhaps to maintain a specific infrastructure in your cloud environment, you might want to lock out standard build capability for a group of users.

In the following example, I had a request to remove the ability to create new Resource Groups in Azure from most users, regardless of their authorization levels. Here is an example of how to do so with a custom Azure role definition.

I created a custom Azure role that defines what can and can’t be done. Looking at the empty role JSON template, you can see there are “Actions” and “NotActions” sections:

  {
    "Name": "",
    "IsCustom": true,
    "Description": "Base Role file",
    "Actions": [],
    "NotActions": [],
    "AssignableScopes": []
  }

The fields we’re going to pay attention to are the Actions, NotActions, and AssignableScopes.

Step 1: Determine the resource providers that map to Azure services.

With this example, we are targeting Resource Groups, so we will use the Microsoft.Resources resource provider.

Step 2: Find the available permissions. In this case, we want to restrict creating or modifying resource groups, so it makes sense to add the “write” permission, Microsoft.Resources/subscriptions/resourceGroups/write, to the deny (“NotActions”) section.

Step 3: Assign it to the appropriate scope. A Resource Group is created in a subscription, so that is where you define the scope (for example, /subscriptions/<subscription-id>).

Putting it all together, your JSON file will look something like this:

  "Name": "Deny RG Create",
  "Id": null,
  "IsCustom": true,
  "Description": "Disable New Resource Group creation",
  "Actions": [
  "NotActions": [
  "AssignableScopes": [

Once you’ve created the role definition, import it into your Azure subscription and assign the role to the necessary users; they’ll then get the “You do not have permissions to create resource groups under subscription” message when trying to create a new group.
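Before importing the definition (with New-AzRoleDefinition -InputFile or az role definition create --role-definition), it is worth sanity-checking the file. A quick stdlib sketch; the checks are my own, not an official validator:

```python
import json

def check_role(definition: dict) -> list:
    """Return a list of problems found in a custom role definition."""
    problems = []
    for field in ("Name", "IsCustom", "Actions", "NotActions", "AssignableScopes"):
        if field not in definition:
            problems.append(f"missing field: {field}")
    if not definition.get("AssignableScopes"):
        problems.append("AssignableScopes must list at least one scope")
    if "Microsoft.Resources/subscriptions/resourceGroups/write" not in definition.get("NotActions", []):
        problems.append("NotActions does not block resource group writes")
    return problems

role = json.loads("""{
  "Name": "Deny RG Create",
  "IsCustom": true,
  "Actions": ["*"],
  "NotActions": ["Microsoft.Resources/subscriptions/resourceGroups/write"],
  "AssignableScopes": ["/subscriptions/00000000-0000-0000-0000-000000000000"]
}""")
```

An empty result from check_role means the definition has the pieces this article relies on.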

Advanced 365 Mailbox management with MS Graph, Powershell, and JSON

Create Office 365 mailbox folders and advanced mailbox rules with Powershell and MS Graph

Recently, a request came through to set up some email message processing for various Office 365 users. It involved creating some folder structure in the mailboxes of a list of users, and creating a message rule with different criteria based on email message attributes.

The needs were:

  • Create a root folder
  • Create a folder inside that root folder
  • Create a rule:
    • Check the message Header for From address
    • Check for custom Header information
    • Move message to previously created subfolder
  • Finally, make sure no second mailbox rule is created if one already exists.

The New-MailboxFolder PowerShell command works perfectly if the folder is for your own mailbox, but if you want to run it against other mailboxes, there is currently no PowerShell cmdlet, so custom code must be created. While there are some basic examples out there, no one has published a comprehensive script as of yet, so here is one I came up with.

For brevity, I won’t go into detail on the process required to authenticate in order to run scripts against your environment, as there are quite a few resources easily found with your favorite search engine. Instead, I will explain how to figure out what you need to accomplish with PowerShell using MS Graph.

$Mailboxes = @("")
$Folders = @("RootFolder","OtherRootFolder")
$SubFolders = @("SubFolder")
$MailbRule = "RuleForSubFolder"

$AppId = "xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx"
$AppSecret = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
$Scope = ""
$TenantName = "xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx"

$Url = "$TenantName/oauth2/v2.0/token"

Add-Type -AssemblyName System.Web

$Body = @{
    client_id = $AppId
	client_secret = $AppSecret
	scope = $Scope
	grant_type = 'client_credentials'

$PostSplat = @{
    ContentType = 'application/x-www-form-urlencoded'
    Method = 'POST'
    # Create string by joining bodylist with '&'
    Body = $Body
    Uri = $Url

$Request = Invoke-RestMethod @PostSplat

$Header = @{
    Authorization = "$($Request.token_type) $($Request.access_token)"

foreach($Mailbox in $Mailboxes) {

    $Uri = "$Mailbox/mailFolders"
    $Mailboxfolders = Invoke-RestMethod -Uri $Uri -Headers $Header -Method Get -ContentType "application/json"
    $MailboxfoldersList = $Mailboxfolders.value.displayName
    $NextPage = $Mailboxfolders.'@Odata.NextLink'
    While($null -ne $NextPage) {
        $Mailboxfolders = Invoke-RestMethod -Uri $NextPage -Headers $Header -Method Get -ContentType "application/json"
        $MailboxfoldersList += $Mailboxfolders.value.displayName
        $NextPage = $Mailboxfolders.'@Odata.NextLink'

    foreach($Folder in $Folders) {
        $Body = @"
            "displayName": "$Folder"
        Write-Host "Mailbox: $Mailbox`nMailboxfolders: $($MailboxfoldersList)`nFolder wanted: $Folder"

        if($($MailboxfoldersList) -contains $Folder) {
            Write-Host "$Folder folder already found at mailbox $Mailbox, creating subfolder.`n"
            $UriParent = "$Mailbox/mailFolders/?`$filter=displayname eq '$Folder'"
            $ParentFolder = Invoke-RestMethod -Uri $UriParent -Headers $Header -Method Get -ContentType "application/json"
            $UriSub = "$Mailbox/mailFolders/$($"

        else {
            $ParentFolder = Invoke-RestMethod -Uri $Uri -Headers $Header -Method Post -Body $Body -ContentType "application/json"
            Write-Host "Created new folder: $($ParentFolder.displayName) to mailbox $Mailbox!`n"
            $UriSub = "$Mailbox/mailFolders/$($"
    $MailboxSubfolders = Invoke-RestMethod -Uri $UriSub -Headers $Header -Method Get -ContentType "application/json"
    $MailboxSubfoldersList = $MailboxSubfolders.value.displayName
    foreach($SubFolder in $SubFolders) {
        $Body2 = @"
            "displayName": "$SubFolder"
        if($($MailboxSubfoldersList) -contains $Subfolder) {
            Write-Host "$Subfolder folder already found at mailbox $Mailbox.`n"
            $UriGetSub = "$Mailbox/mailFolders/$($`$filter=displayname eq '$Subfolder'"
            $SubId = Invoke-RestMethod -Uri $UriGetSub -Headers $Header -Method Get -ContentType "application/json"
            $UriGetRules = "$Mailbox/mailFolders/inbox/messageRules"

            $MailboxRules = Invoke-RestMethod -Uri $UriGetRules -Headers $Header -Method Get -ContentType "application/json"
            Write-Host "The rules are: $($MailboxRules.value.displayName)"
            $MailboxRulesList = $MailboxRules.value.displayName

            if($($MailboxRulesList) -contains "$MailbRule") {
                Write-Host "The mailbox rule $MailbRule already found at mailbox $Mailbox.`n"
            else {

                ## For syntax:
                $RuleBody = @"
                    "displayName": "$MailbRule",
                    "sequence": 2,
                    "isEnabled": true,
                    "conditions": {
                        "headerContains": [
                            "X-SomeCompany-tag: customTag"
                    "actions": {
                        "moveToFolder": "$($",
                        "stopProcessingRules": true
                    "exceptions": {
                        "headerContains": [
                            "X-SomeCompany-Spam-Reason: eusafe",
                $RuleUri = "$Mailbox/mailFolders/inbox/messageRules"
                $NewRule = Invoke-RestMethod -Uri $RuleUri -Headers $Header -Method Post -Body $RuleBody -ContentType "application/json"
                Write-Host "Created new Rule: $MailbRule in mailbox $Mailbox!`n"
        else {
            $NewSubfolder = Invoke-RestMethod -Uri $UriSub -Headers $Header -Method Post -Body $Body2 -ContentType "application/json"
            Write-Host "Created new subfolder: $($NewSubfolder.displayName) in $Folder to mailbox $Mailbox!`n"

            $RuleBody = @"
                "displayName": "$MailbRule",
                "sequence": 2,
                "isEnabled": true,
                "conditions": {
                    "headerContains": [
                        "X-SomeCompany-tag: customTag"
                "actions": {
                    "moveToFolder": "$($",
                    "stopProcessingRules": true
                "exceptions": {
                    "headerContains": [
                        "X-SomeCompany-Spam-Reason: eusafe",
            $RuleUri = "$Mailbox/mailFolders/inbox/messageRules"

            $NewRule = Invoke-RestMethod -Uri $RuleUri -Headers $Header -Method Post -Body $RuleBody -ContentType "application/json"
            Write-Host "Created new Rule: $MailbRule in mailbox $Mailbox!`n"

The key point of this article is not to show how fancy a script I can write (disclaimer: the fancy spacing is from Visual Studio Code, use it!), but rather how to get at the MS Graph API and the syntax required to use the tremendous number of capabilities it has access to. By throwing up one script that does quite a few things that previously required running several different scripts one after another (and hoping nothing broke), I hope to show how easily several different tasks can be done with PowerShell against the MS Graph API.

To see what JSON I needed, I experimented extensively to find the fields to use, and looked up the REST API documentation to see which properties are required in the request body (for the message rule, for example, that is the messageRule resource reference in the Microsoft Graph documentation).
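One detail from the script worth isolating: Graph returns large collections in pages, so the folder listing keeps following @odata.nextLink until it disappears. The same loop as a testable helper, with a canned two-page response standing in for the authenticated Invoke-RestMethod calls:

```python
def get_all_pages(fetch, url):
    """Follow @odata.nextLink until the collection is exhausted,
    accumulating every page's `value` entries."""
    items = []
    while url:
        page = fetch(url)                  # stand-in for an authenticated GET
        items.extend(page.get("value", []))
        url = page.get("@odata.nextLink")  # absent on the last page
    return items

# Fake two-page response showing the shape Graph returns
pages = {
    "/mailFolders": {"value": [{"displayName": "Inbox"}],
                     "@odata.nextLink": "/mailFolders?page=2"},
    "/mailFolders?page=2": {"value": [{"displayName": "Archive"}]},
}
folders = get_all_pages(pages.get, "/mailFolders")
```

Skipping this loop is the classic bug: everything works in testing, then a mailbox with more folders than one page holds silently loses entries.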

Logon failure with Azure AD DS based services

Fix a maddening “Invalid password” error when trying to use an Azure service that uses Azure AD DS as its authentication source

I hope this helps someone fix a maddening “Invalid password” error when trying to use an Azure service that relies on Azure AD DS for authentication with an account synced from on-premises AD.

During a recent implementation of Windows Virtual Desktop, an interesting failure occurred with a set of users that had been synchronized to Azure AD from an on-premises AD environment.

The environment was one where there was no longer an Azure AD Connect configuration in place; in fact, the on-premises AD environment was no longer available. All users were using Office 365 services without any issues, and Azure AD Domain Services had been implemented for new Azure services, one of them being WVD.

All WVD services were tested with the admin account that created the resources, and some test users created in Azure AD, however, when the group that needed to use the service tried to use it with their accounts, around half the necessary users could not log into the VM image.

After verifying all permissions were correctly assigned, I checked for any relevant differences between the accounts that were able to log on and the ones that were not. All accounts were able to log on to the web URL; however, after the initial logon to the service, the originally synced accounts failed with an “invalid password” error, whereas the ones created directly in Azure succeeded. Aha! This pointed me to the accounts having some sort of Azure AD DS failure, since 365 services were not dependent on it.

Quite a few articles were read all over the place, with none being any help, so I went back to basics and worked through the Azure AD to Azure AD DS synchronization guide much more methodically than I previously had.

I’ll cut to the chase: in the middle of that guide, the following statement is made: “When a user is created in Azure AD, they’re not synchronized to Azure AD DS until they change their password in Azure AD.” In other words, the failing accounts had never changed their password in Azure AD, so their credential hashes had never made it into Azure AD DS.

WVD – RDAgentBootLoader – Object reference not set to an instance of an object

This article is just a reminder to read event log errors carefully, as they tremendously help in troubleshooting undocumented errors (which is what this blog is all about).

A Windows Virtual Desktop implementation kept having its session host VMs randomly go into an Unavailable status, and after going through the full plethora of troubleshooting articles, I decided to take a closer look at the event ID 3389 error:

Unable to retrieve DefaultAgent from registry: System.NullReferenceException: Object reference not set to an instance of an object.
   at RDAgentBootLoader.BootLoaderSettings.get_DefaultAgentPath() in S:\src\RDAgent\src\RDAgentBootloader\BootLoaderSettings.cs:line 82

The above error doesn’t say much, and is pretty cryptic, but when we look at the detail pane, we get some better information:

Log Name:      Application
Source:        RDAgentBootLoader
Date:          12/11/2020 10:21:05 AM
Event ID:      3389
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Unable to retrieve DefaultAgent from registry: System.NullReferenceException: Object reference not set to an instance of an object.
   at RDAgentBootLoader.BootLoaderSettings.get_DefaultAgentPath() in S:\src\RDAgent\src\RDAgentBootloader\BootLoaderSettings.cs:line 82
Event Xml:
<Event xmlns="">
    <Provider Name="RDAgentBootLoader" />
    <EventID Qualifiers="0">3389</EventID>
    <TimeCreated SystemTime="2020-12-11T15:21:05.2207218Z" />
    <Correlation />
    <Execution ProcessID="0" ThreadID="0" />
    <Security />
    <Data>Unable to retrieve DefaultAgent from registry: System.NullReferenceException: Object reference not set to an instance of an object.
   at RDAgentBootLoader.BootLoaderSettings.get_DefaultAgentPath() in S:\src\RDAgent\src\RDAgentBootloader\BootLoaderSettings.cs:line 82</Data>

The first thing that stood out to me was “Unable to retrieve DefaultAgent from registry”, and after a quick search for where that registry key lives, I found it at HKLM\SOFTWARE\Microsoft\RDAgentBootLoader

Looking at my settings, I saw that “DefaultAgent” points to a version entry that is not in my registry:

Hmm… if it’s pointing to a version for which there is no key, let me check whether the binaries are on the machine. Yes, they were! So let me work out the missing key’s contents by checking what is in the existing key:

OK, let’s create the missing key with the same syntax as the existing one:

After a restart, the session hosts are now Available and stable!

Azure FQDN support for site-to-site VPN

As of November 20, 2020, Azure supports FQDN configurations for its VPN connections!

This is perfect for customer branches or locations without static public IP addresses (private homes behind a cable modem, etc.) to connect to the Azure VPN gateways. Dynamic DNS services can be leveraged to use the Fully Qualified Domain Name (FQDN) instead of IP addresses!

Hybrid and Multicloud strategies for financial services organizations

Lucia Stanham wrote an amazing article on the Azure blog covering some observations I had before I was able to write my own, so instead of rehashing the same thing, I’m posting her article instead:

A need for hybrid and multicloud strategies for financial services

The financial services industry is a dynamic space that is constantly testing and pushing novel use cases of information technology. Many of its members must balance immense demands—from the pressures to unlock continuous innovation in a landscape with cloud-native entrants, to responding to unexpected surges in demand and extend services to new regions—all while managing risk and combatting financial crime.

At the same time, financial regulations are also constantly evolving. In the face of the current pandemic, we (at Microsoft) have seen our customers accelerate in their adoption of new technologies, including public cloud services, to keep up with evolving regulations and industry demands. Hand in hand with growing cloud adoption, we’ve also seen growing regulatory concerns over concentration risk (check out our recent whitepaper on this), which have resulted in new recommendations for customers to increase their overall operational resiliency, address vendor lock-in risks and require effective exit plans.

Further complicating matters, many financial services firms oversee portfolios of services that include legacy apps that have been in use for many years. These apps often cannot support the implementation of newer capabilities that accommodate mobile application support, business intelligence, and other new service capabilities, and they suffer from shortcomings that adversely affect their resiliency, such as outdated and manual processes for governance, updates, and security. These legacy applications also have high vendor lock-in because they lack modern interoperability and portability. Furthermore, the blunt force approach of leveraging legacy technology as a means for protecting against financial crime is an unsustainable strategy with diminishing returns—with big banks spending over $1 billion per year maintaining legacy infrastructure and seeing a rise in false positive rates as financial crime evolves in sophistication.

As a means to address the demands of modernization, competition, and compliance, financial services organizations have turned to public cloud, hybrid cloud and multi-cloud strategies. A hybrid model enables existing applications—which originally exist on-premises—to be extended by connecting to the public cloud. This infrastructure framework unleashes the benefits of the public cloud—such as scale, speed, and elastic compute, without requiring organizations to rearchitect entire applications. This approach provides organizations the flexibility to decide what parts of an application should reside in an existing datacenter versus in the public cloud, as such providing them with a consistent and flexible approach to developing a modernization strategy.

Additional benefits of successful hybrid cloud strategies include:

  • A unified, consistent approach for infrastructure management: Consistently manage, secure and govern IT resources across on-premises, multicloud and the edge, delivering a consistent experience across locations.
  • Extending geographic reach and opening new markets: Meet growing global demand and extend into new markets by extending the capabilities of datacenters to new locations, while also meeting data localization requirements in local markets
  • Managing security and increasing regulatory compliance: Hybrid and multicloud are great alternatives for strictly on-premises strategies due to cloud benefits around service security, availability, resiliency, data protection and data portability. These strategies are often referenced as a preferred way of reducing risk and addressing regulatory compliance challenges.
  • Increasing Elasticity: Customers can respond with agility to surges in demand or transactions by provisioning and de-provisioning capacity as needed. A hybrid strategy allows organizations to seamlessly scale their capacity beyond their datacenter during high-compute scenarios, such as risk computations and complex risk modeling, without overtaxing servers or slowing down customer interactions.
  • Reducing CapEx: The cloud makes a large capital outlay for managing on-premises infrastructure unnecessary. Through the benefits of elastic capacity in hybrid scenarios, companies can avoid the costs of unused digital capacity, paying only for the resources that are consumed.
  • Accelerating time to market: A hybrid strategy provides a bridge that connects on-premises data to new cloud-based capabilities across AI and advanced analytics, allowing customers to modernize their services and unlock innovation. With virtualized environments, they can accelerate testing and evaluation cycles and enable seamless deployment across different locations.

A multicloud strategy enables customers to leverage services that span different cloud platforms, enabling them to select the services best suited to the workloads or apps they are managing.

Commonly cited benefits of a multicloud strategy include:

  • Flexibility: Customers wish to have the flexibility to optimize their architectures leveraging the cloud services best suited to their specific needs, including the flexibility to select services based on features or costs
  • Avoiding vendor lock-in: A common requirement; customers often seek to design multicloud deployments to achieve short-term flexibility and long-term agility by designing systems across multiple clouds.

Microsoft hybrid and multicloud edge for financial services organizations

Azure hybrid capabilities uniquely address some of the main barriers customers face around hybrid and multicloud strategies. Managing multiple environments is an endeavor that introduces inherent complexity and risk for firms, faced with an expanding data estate that spans diverse on-premises, public cloud(s), and edge environments. Optimizing for productivity without sacrificing security and compliance can be daunting. Azure provides a seamless environment for developing, deploying and managing data and applications across all distributed locations.

For one, Azure uniquely supports the full range of hybrid capabilities across DevOps, Identity, Security, Management, and Data. Given that customer IT estates involve much more than containers, many of our cloud benefits are also available to server-based workloads. Azure enables customers to manage both Windows and Linux servers across their data estate, and customers can also manage access and user authentication with hybrid identity services. The Azure Stack portfolio extends Azure services and capabilities to your environment of choice—from the datacenter to edge locations and remote offices and disconnected environments. Customers can run machine learning models on the edge, in order to get quick results before data is sent to the cloud. Furthermore, with capabilities such as Azure Stack Hub, our portfolio enables organizations to operate in offline environments that block data from being sent to the public cloud, especially if required for regulatory compliance.

Second, Azure simplifies the experience of managing a complex data estate by providing a unified, consistent approach for managing and monitoring hybrid or multicloud environments. With capabilities such as Azure Arc, customers can manage their data estate with a single management plane—including the capability to monitor non-Microsoft clouds. Customers can also take a similarly simplified approach to managing security across their estate with services such as Azure Sentinel, which provides a consistent threat detection and security analytics view across on-premises, cloud, and edge devices. In combination with services such as Azure Security Center, Azure Policy, and Azure Advisor, customers can also design, deploy, and oversee the security and compliance of their deployments across their hybrid and multicloud environments.

Azure leadership in hybrid and multicloud offerings is also rooted in our extensive collaborations with hardware partners (OEMs), with whom we have partnered and co-engineered solutions to deliver a well-defined variety of supporting devices. Partner solutions have been designed with the aim of increasing resiliency and expanding the reach of virtual datacenters. With the new rugged series of Azure Stack Edge, for instance, we provide cloud capabilities in the harshest environmental conditions, supporting scenarios such as tactical edge, humanitarian, and emergency response efforts.

The Azure commitment to financial services customers stems from Microsoft's industry-leading work with regulators around the world. Our customers require their cloud partners to support transparency, a regulatory right to audit, and self-reporting. To enable this, we have a dedicated and comprehensive FSI compliance program available to customers, and we help customers manage their compliance by enabling choices around data location, providing transparency and notification of subcontractors, making commitments on exit planning (see our recent blog here), and offering tools to aid in risk assessments.

Azure enables financial services organizations to operate hybrid environments seamlessly. Customers can manage their full multicloud or hybrid estate in a single control plane with Azure Arc. They can also bring Azure services to any infrastructure (such as AWS, GCP, or VMware), modernize data centers with Azure Stack, and extend insights to the edge with Azure IoT.
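To make the "any infrastructure" point concrete, bringing an existing server—whether it runs on-premises, in AWS, in GCP, or on VMware—under Azure management is done by installing the Azure Connected Machine agent and connecting it to Azure Arc. The following is a sketch of the flow on a Linux machine; the resource group name and region are placeholders, and tenant and subscription IDs must be supplied:

```shell
# Download and install the Azure Connected Machine agent (Linux)
wget https://aka.ms/azcmagent -O ~/install_linux_azcmagent.sh
bash ~/install_linux_azcmagent.sh

# Connect this machine to Azure Arc; once connected, it appears
# alongside native Azure VMs in the portal, the CLI, and the
# scope of Azure Policy assignments.
azcmagent connect \
  --resource-group "rg-hybrid-demo" \
  --tenant-id "<tenant-id>" \
  --location "eastus" \
  --subscription-id "<subscription-id>"
```

After onboarding, the machine is represented as a `Microsoft.HybridCompute/machines` resource, which is what allows governance, inventory, and security tooling to treat it like any other Azure resource.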

Customer spotlights

We’ve seen many of our financial services customers already begin to realize the benefits of hybrid and multicloud strategies. In a recent Total Economic Impact study conducted by Forrester on the impact of shifting from on-premises to Azure IaaS (including to hybrid environments), organizations avoided 90 percent of their on-premises infrastructure costs over a three-year period (valued at over $7 million), as well as associated employee costs. Organizations were able to reallocate their IT staff to higher-level business initiatives, including expansion into new markets, which generated altogether new streams of income for the companies.

One example of a company that took a hybrid approach is Banco de Crédito e Inversiones (BCI). Its portfolio supported 20 million transactions a month and required a hybrid approach to keep certain apps and resources on-premises for regulatory and performance reasons. With Azure Stack Hub, BCI was able to improve the performance and reliability of its systems, roll out new products more quickly, and switch from outsourced IT management to in-house management.

“We’ve found the whole Azure platform to be very reliable and stable, and it gets better with each release. In fact, we have statistics showing that when we enabled Azure Stack Hub, customer satisfaction went up. It’s very clear. We’re delivering a better experience for our customers through the reliability and performance of Azure Stack Hub and the new functionality our team is building on top of it.”—German Matosas, Head of Architecture and Cloud Platform, BCI

Another example is Volkswagen Financial Services, a branch of VW that manages approximately 80 web apps across ten countries—a complex IT estate by any measure. They needed to modernize their apps and development approach, and leveraged Azure Stack Hub to bring cloud speed and scale to their DevOps practices. This strategy also allowed them to keep components of their highly customized apps on-premises (such as core databases and SAP systems) to meet privacy and compliance requirements, and to add new services without needing to rework their existing applications.

What about full or single cloud?

While the focus of this blog post has been hybrid and multicloud strategies, it is also worth briefly touching on the value of partnering with a single cloud provider for end-to-end solutions. This is referred to as a “full cloud” or “single cloud” strategy and serves the long-term objective of shutting down all on-premises data centers and moving all workloads to a single cloud provider. This strategy has its own merits and may in fact offer benefits over both hybrid and multicloud solutions, such as simplified management, less complexity, and lower total cost of ownership (TCO). Partnering with a highly resilient CSP such as Microsoft for a full cloud strategy has been the solution of choice for several financial institutions. The unique benefits of a full cloud strategy need to be weighed against potential downsides, but in principle this approach is allowed by regulators in most global jurisdictions.

Deciding on a hybrid or multicloud strategy

Many organizations commence their journey from a fully on-premises baseline. We’ve seen that as they start to consume public cloud services, questions arise around what the most appropriate deployment strategy could be—whether they should take a full cloud, hybrid cloud, or multicloud approach.

If you answer yes to one or more of the questions below, your organization is likely well positioned to adopt a hybrid or multicloud strategy:

  1. Does your organization’s digital strategy enable you to easily adopt new and emerging technologies and deploy them alongside on-premises or legacy apps?
  2. Does your organization have a digital strategy that welcomes innovation but is not ready to fully commit to a public cloud?
  3. Do you find it challenging to meet capacity demands in your IT infrastructure and meet unexpected surges in demand or maintain performance levels?
  4. Does your IT department struggle to manage different technologies from different providers and keep oversight across multiple environments?
  5. Does your organization face pressure from regulators or risk departments to maintain certain processes on-premises, or within specific geographic regions (data residency)?
  6. Is your organization considering expanding into new geographies or new markets?