
Microsoft acquires CloudKnox Security to offer unified privileged access and cloud entitlement management


At Microsoft, we are committed to supporting organizations in their digital transformation and helping them to deliver secure and seamless experiences. Since IT modernization often spans multiple clouds, cloud security and identity are top of mind for most of our customers. Modern identity security needs to protect all users and resources consistently across multi-cloud and hybrid cloud environments. Today, Microsoft is taking a significant step toward this goal with the acquisition of CloudKnox Security, a leader in Cloud Infrastructure Entitlement Management (CIEM). CloudKnox offers complete visibility into privileged access. It helps organizations right-size permissions and consistently enforce least-privilege principles to reduce risk, and it employs continuous analytics to help prevent security breaches and ensure compliance. This strengthens our comprehensive approach to cloud security.


As organizations adapt to hybrid work and more and more cloud services are deployed, new service entities that collaborate and exchange data without human interaction, such as virtual machines and containers, are proliferating. The growth of these service accounts and identities and their increasing volumes of permissions, privileges and entitlements exposes organizations to new attack vectors. Left in blind spots or uncontrolled, these permissions leave business-critical systems open to infiltration and disruption. High-profile breaches demonstrate how quickly bad actors can move laterally by exploiting misappropriated privileged credentials.

While organizations are reaping the benefits of cloud adoption, they still struggle to assess, prevent, enforce and govern privileged access across hybrid and multi-cloud environments. Even if they piece multiple siloed systems together, they still get an incomplete view of privileged access. Traditional Privileged Access Management and Identity Governance and Administration solutions are well suited for on-premises environments; however, they fall short of providing the necessary end-to-end visibility for multi-cloud entitlements and permissions. Neither do they provide consistent identity lifecycle management or governance in multi-cloud and cloud-native environments.

In January, when I shared the five identity priorities for 2021, I stressed the importance of a Zero Trust security approach that verifies explicitly, grants least privileged access and always assumes breach — with identity as your first line of defense. As the corporate network perimeter disappears, it’s crucial to establish a strong cloud identity foundation, so you can enforce least privileged access to protect business-critical systems while improving business agility. We’re committed to making it easier to enforce least privileged access for all user and workload identities.

The acquisition of CloudKnox further provides Microsoft Azure Active Directory customers with granular visibility, continuous monitoring and automated remediation for hybrid and multi-cloud permissions. We are committed to providing our customers with unified privileged access management, identity governance and entitlement management, including:

  • Automated and simplified access policy enforcement in one integrated multi-cloud platform for all human and workload identities.
  • The widest breadth of signal, enabling high-precision, machine learning-based anomaly detections.
  • Seamless integration with other Microsoft cloud security services, including Microsoft 365 Defender, Azure Defender and Azure Sentinel.

Our acquisition of CloudKnox, like our recent acquisition announcements on RiskIQ and ReFirm Labs, shows our focus and execution in acquiring, integrating and expanding the strongest defenses for our customers — from chip to cloud — backed by more than 3,500 defenders at Microsoft and the more than 8 trillion security signals we process every day. Microsoft is uniquely positioned to help empower and defend the future of hybrid work and multi-cloud environments, providing essential visibility, control and monitoring Zero Trust demands.

We’re excited to bring the CloudKnox team and technology to Microsoft and our joint customers and look forward to their contributions. We’ll share more information as we integrate CloudKnox with Microsoft’s identity, security and compliance solutions.

Source Microsoft


Windows 365 Announced and Azure Virtual Desktop Gains New Features


Microsoft announced Windows 365 – launching on August 2nd – which is (arguably) Microsoft’s big entry into Desktop as a Service (DaaS). Windows 365 allows you to provide users with persistent virtual desktops without managing the supporting infrastructure.

The core idea behind it from Microsoft is that for a set monthly fee, you can assign a Windows 365 license to a user after picking the appropriate Cloud PC size. These range from 1 vCPU with 2GB RAM and 64GB HDD space, up to 8 vCPUs, 32GB RAM and 512GB of storage space. The Cloud PC is then provisioned from Microsoft Endpoint Manager, which avoids the need to build out either a traditional VDI environment on-premises, or build out and manage Azure Virtual Desktop.

Figure 1: Managing Windows 365 Cloud PCs within Microsoft Endpoint Manager.

The timing of this Windows 365 release is important, as it was announced at Microsoft’s global partner conference, Inspire. If you work for a Microsoft Partner, the main message is that if you are configuring Microsoft 365 services like Intune today, and also helping roll out Windows 10 desktops, then it should be straightforward for you to help customers with Windows 365.

In episode 20 of the Practical 365 podcast, we briefly discussed announcements for Azure Virtual Desktop as part of the rename from Windows Virtual Desktop, including the news that a Public Preview of Azure AD Join and Intune management would arrive soon. This week, at the same time as Windows 365, Microsoft announced that these features are now in Public Preview.

What’s the Difference Between Windows 365 and Azure Virtual Desktop?

If you haven’t examined either offering in detail, it’s easy to look at both and think of them as basically the same thing. And the confusing part is there’s a lot of overlap, so you aren’t wrong for making that assumption. Fundamentally speaking, with Windows 365 you are avoiding the core infrastructure & platform piece, leaving that to Microsoft to worry about. Therefore, it is most definitely a Software-as-a-Service type solution.

With Azure Virtual Desktop, you’re required to manage a supporting Azure subscription, and to configure and implement the platform services that allow a thin client or Remote Desktop client to connect in. Each connection is performed over HTTPS into the environment, must be authenticated and allocated to the correct machine via a session broker, and may even need a VM provisioned or started up, and so on. Plus, you need to make the architectural decisions around that, such as how you will configure redundancy, backup and resilience on top of Azure.

The difference between AVD and simply building out a VDI platform on Azure VMs (Infrastructure-as-a-Service) is that AVD is a Platform-as-a-Service offering, built primarily of modules you can configure, alongside infrastructure you deploy to it.

Almost all of the platform configuration melts away with Windows 365, where the key decision for an Enterprise SKU is how you will (today) connect an Azure vNet to your on-premises environment. While you can upload your own custom images for Windows, you can also choose ‘vanilla’ images – i.e. those pre-configured to work, with add-ins for Teams VDI pre-installed and ready to go.

Microsoft provides a table that explains when and why you’ll choose each:

Figure 2: Microsoft’s view on which to choose – AVD or Windows 365.

Windows 365 Will Launch with Limitations

There are two versions of Windows 365 that will be available upon launch – a small business version and an enterprise version.

The small business version is intended for those lacking an IT team and is self-service for someone who runs their own business and chooses to buy a Cloud PC to use with their Microsoft 365 Business subscription. It doesn’t include any management capabilities but does allow sign-ins via Azure AD accounts.

The enterprise version (if you’re reading this, this is probably the version you need) will at launch require Hybrid Azure AD Join of each machine. That should make you pause and reflect a little, because it means there are prerequisites that must be fulfilled prior to using Windows 365, including:

  • A local Active Directory on-premises, or as IaaS in Azure (Azure AD Directory Services isn’t supported)
  • Synchronization of your local AD to Azure AD
  • Network connectivity to an Azure vNet, so that Windows 365 Cloud PCs can join Active Directory as traditional workstations, with Hybrid Azure AD Join configured and supporting connectors installed on-premises.
  • A plan and configuration for how network access and internet access will function – you can of course connect to the public Internet from Azure as well as route traffic from Azure to on-premises.

(For more info, feel free to read through Microsoft’s step-by-step example guide for setup – which admittedly glosses over a few aspects but is still comprehensive – to get a better idea of what’s involved.)

While not exactly unexpected, this could simply reflect the timing of when Azure Virtual Desktop features are released to General Availability (see below), as Microsoft has already said that this isn’t a permanent limitation, but rather one tied to the underlying platform, Azure Virtual Desktop today.

Therefore, in several months’ time, pure Azure AD joined and Intune/MEM managed Windows 365 Cloud PCs should be easy to get started with.

Important Considerations

However, a crucial technical point to expose is that, at least today, all features and functionality available in Windows 365 are tied to features and functionality available in Azure Virtual Desktop. If you’ve already deployed Azure Virtual Desktop and wondered whether functionality such as Teams VDI is improved, then the answer is no – there’s no special sauce for Windows 365 such as a different model for local device redirection, or an Xbox Cloud Gaming-style set of improvements to the Remote Desktop user experience beyond what is currently offered in Azure Virtual Desktop. The client you’ll use on mobile or on desktop is the same as you use or deploy for AVD – the Remote Desktop app from the respective app stores.

A key thing to remember is that Windows 365 is focused on making everything simpler than AVD, rather than better. While the HAADJ requirement today might be disappointing, it isn’t forever. If you aren’t looking simply for the lowest cost but for simplicity, then details like Windows 365 providing completely persistent, single-user desktops without the need for FSLogix are a big benefit.

Azure AD Join and Intune/MEM management arrive in Azure Virtual Desktop

The success of AVD over the last 12 months means that there are lots of organizations who will look at Windows 365, consider the amount of work they have done to get AVD up and running, and remain confident with their choice. AVD is likely to be cheaper to run on an ongoing basis than Windows 365, but it will require specialist skills to successfully implement and manage.

Savings are gained because charges are based upon the compute, network and storage consumed for AVD whereas with Windows 365, you pay per-machine, per-user, per-month (plus Azure network egress charges). With both solutions, you still need licensing for Windows and the applications installed, but you can save money with AVD by using multi-user images and auto-scaling based on usage.

One blocker for urgent AVD requirements has been the requirement for Active Directory – strikingly similar to Windows 365 when it launches on August 2nd. A key ask, especially for organizations deploying laptops that are Azure AD joined and Intune managed, is for AVD to support Azure AD Join and Intune management, so there isn’t a requirement to extend networking from on-premises to Azure to get line-of-sight access to AD, or to enable Azure AD DS – not something you want to do even if you have an on-premises AD.

This functionality is now in preview for both Azure AD Join and Intune management, greatly simplifying the ability to deploy VDI in Azure using AVD for the same user types who would also be able to use a pure Azure AD Joined & Intune managed device. Once in GA, this means that a rapid enablement of AVD will be one of the key deciding factors when choosing between that and Windows 365.

Source Practical 365


Upgrading PowerShell Scripts with Azure AD Cmdlets to Use Graph API Calls


Updating the Distribution List Count Script for the Graph

In June, I described how I upgraded a PowerShell script written for Exchange on-premises to report the count of members in distribution lists to work with Exchange Online. Or more precisely, with Azure AD rather than Active Directory. While everything works just fine, the problem is that Microsoft is moving away from the Azure AD Graph API that the Azure AD cmdlets use. After June 30, 2022, Microsoft will no longer provide security updates for the Azure AD Graph, nor will they support its use. This doesn’t mean that you can’t continue to use the Azure AD PowerShell module; it just means that Microsoft won’t deliver support for any problems you encounter. The cmdlets in the module work well, so I don’t anticipate many problems, but the writing is on the wall.

Microsoft’s path forward is the Graph PowerShell SDK, a wrapper around many different Graph API calls. The problem is that the SDK is still under development and doesn’t work as well as it will (eventually). And the nagging doubt is that perhaps it might just be best to use the underlying Graph API calls instead.

Using Graph API Calls

With that in mind, let’s explore what it takes to convert a script using Azure AD cmdlets to Graph API calls. Our starting point is the script to report distribution list counts. The end result is the Graph version (both are available on GitHub).

PowerShell scripts can invoke Graph API calls using the Invoke-RestMethod or Invoke-WebRequest cmdlets. Either works, but the Invoke-RestMethod cmdlet is more aligned with the REST-based Graph APIs than the more general-purpose Invoke-WebRequest cmdlet is. Before you can use these cmdlets to call Graph APIs in a script, some housekeeping is necessary.

First, you need to register an app in Azure AD to use as an entry point to Graph API calls. To allow access to data via the Graph, the app receives consent from an administrator to use a set of permissions. In this case, the app needs the following Graph application permissions: Directory.Read.All, Group.Read.All, and User.Read.All.

Second, you need to authenticate. In this case, the script is intended to run interactively (other arrangements are necessary to run scripts non-interactively). Authentication is performed by asking Azure AD for an access token. To prove that the app is authorized, it passes an app secret, which you generate for the app from the Azure AD admin center. A combination of the tenant identifier, the app identifier (also generated by Azure AD), and the app secret is sufficient to allow the app to authenticate.
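
As a minimal sketch of this housekeeping (the three identifier values are placeholders for the ones you generate during app registration), requesting a token with Invoke-RestMethod looks something like this:
$TenantId = "your-tenant-guid"       # placeholder: tenant identifier
$AppId = "your-app-guid"             # placeholder: app (client) identifier
$AppSecret = "your-app-secret"       # placeholder: app secret from the Azure AD admin center
$Uri = "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token"
$Body = @{
    client_id     = $AppId
    scope         = "https://graph.microsoft.com/.default"
    client_secret = $AppSecret
    grant_type    = "client_credentials"
}
$TokenRequest = Invoke-RestMethod -Method Post -Uri $Uri -Body $Body
# Every subsequent Graph call passes the access token in an Authorization header
$Headers = @{ Authorization = "Bearer $($TokenRequest.access_token)" }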

If you examine the Graph version of the script, you’ll see that it has more lines of code than the original PowerShell version. Housekeeping needs some code to get done, but once you’ve figured out the code that works for you (taking your own coding style, error handling, and so on into account), you’ll probably have code that can be lifted from one script to another to fulfil the task of getting ready to communicate with the Graph. For instance, because many Graph API calls use pagination to return a limited set of data per call, I have a function to continue fetching data until no more is available.
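
A sketch of what such a pagination helper can look like (the author’s actual Get-GraphData function ships with the GitHub version of the script; this is only the general shape), following the @odata.nextLink property until the Graph stops returning one:
function Get-GraphData {
    param (
        [string]$Uri,
        [hashtable]$Headers
    )
    $Results = @()
    do {
        $Response = Invoke-RestMethod -Method Get -Uri $Uri -Headers $Headers
        $Results += $Response.value
        # The Graph includes @odata.nextLink in the response while more pages remain
        $Uri = $Response.'@odata.nextLink'
    } while ($Uri)
    return $Results
}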

Counting Distribution Lists

The big challenge when counting the number of members in a distribution list is how to deal with nested groups. The PowerShell version uses a function to expand the membership of any nested groups to come up with a full set of members, just like the way the Exchange Online transport service expands a distribution list to create copies of messages during its fan-out process.

The Graph API for Groups includes the transitiveMembers call to return the members of a group (including distribution lists). This means that we can replace the PowerShell function with a single graph call. In this code, we ask the Graph to return the membership for a distribution list identified by its Azure AD object identifier (stored in the list’s ExternalDirectoryObjectId property).
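
A sketch of that call, assuming $DL holds a distribution list returned by Get-DistributionGroup and $Headers holds the authorization header built during the housekeeping step:
# transitiveMembers expands nested groups in a single call
$Uri = "https://graph.microsoft.com/v1.0/groups/$($DL.ExternalDirectoryObjectId)/transitiveMembers"
[array]$Members = Get-GraphData -Uri $Uri -Headers $Headers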

Get-GraphData is the function to handle pagination referred to above.

After a successful call, the $Members array contains the full set of members. To analyze the membership and create a count of the different types of objects (mailboxes, groups, contacts, etc.), it’s a matter of looping through the array. The values for member type returned by the Graph take a little interpretation, but that’s not hard. Here’s the relevant code:
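(The script’s actual code is in GitHub; this simplified reconstruction keys off the @odata.type value the Graph returns for each member.)
$Mailboxes = 0; $Groups = 0; $Contacts = 0
ForEach ($Member in $Members) {
    Switch ($Member.'@odata.type') {
        "#microsoft.graph.user"       { $Mailboxes++ }   # user accounts (mailboxes)
        "#microsoft.graph.group"      { $Groups++ }      # nested groups
        "#microsoft.graph.orgContact" { $Contacts++ }    # mail contacts
    }
}
Write-Host ("{0} members: {1} mailboxes, {2} groups, {3} contacts" -f $Members.Count, $Mailboxes, $Groups, $Contacts)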

And that’s about it. A single Graph call replaces the function, and some slightly different processing counts the member types.

Simple Once You Know How

As it turns out, the task of converting the script was straightforward. Admittedly, I make the assertion from the view of someone who has worked with the Graph API for a while, can set up a registered app without breaking a sweat, and has some code to paste into a new script to do the housekeeping. Still, I hold to my point. Like anything else to do with Office 365, it takes some time to become accustomed to how best to get things done. In five years, we’ll look back at all the Graph-enabled PowerShell we’ll write between now and then and wonder why anyone was concerned. At least, I hope so.

Source Practical 365


Using PowerShell to Manage Conditional Access (CA) Policies


Microsoft provides many methods to manage a tenant’s data and users. PowerShell is a powerful tool to manage resources, including Conditional Access Policies, using a set of cmdlets in the AzureAD module. In this article, we review the eight PowerShell cmdlets and how to use them.

**Note that both the AzureAD and AzureADPreview PowerShell modules contain these cmdlets.

PowerShell and CA Policies

First, connect to Azure Active Directory using either the AzureAD or AzureADPreview module:
Connect-AzureAD

After connecting, we can get a list of available PowerShell cmdlets by using these two one-liners:
Get-Command *conditional*
Get-Command *named*

Combined, we get a total of eight cmdlets dealing with Conditional Access Policies and Named Location Policies:
Get-AzureADMSConditionalAccessPolicy
New-AzureADMSConditionalAccessPolicy
Remove-AzureADMSConditionalAccessPolicy
Set-AzureADMSConditionalAccessPolicy

Get-AzureADMSNamedLocationPolicy
New-AzureADMSNamedLocationPolicy
Remove-AzureADMSNamedLocationPolicy
Set-AzureADMSNamedLocationPolicy

Conditional Access Policies determine the conditions under which users receive access to apps. These conditions can consist of locations, one or more users, applications, platforms, and more.

Figure 1: Properties of a new Conditional Access Policy

In the following examples, we examine these conditions to see what we can configure with PowerShell.

Creating a New Conditional Access Policy

A greenfield, or new tenant, has no Conditional Access Policies.  To utilize Conditional Access, we need to build its conditions.  If Named Locations are required, we need to create the Named Location first.  Let us walk through this process using an example scenario.

Named Locations

Conditional Access Policies can contain Named Locations that correspond to physical locations in an organization’s environment. Named Locations can be physical locations with their corresponding IP subnet/range, or a single country/a set of countries. We can use Named Locations to provide a condition that controls where users are prompted (or not prompted) for MFA or other optional actions. As we saw, the Azure AD module includes four PowerShell cmdlets for managing Named Locations, which run the typical gamut of the Get, New, Remove and Set PowerShell verbs.

In a new or greenfield Azure AD, there are no Named Locations that can be used by Conditional Access, and we need to create these either in Azure AD or with PowerShell. To create a new Named Location policy, we need to use the New-AzureADMSNamedLocationPolicy cmdlet.

When creating a new Named location, we need to keep a couple of things in mind:

  • Display Name is required
  • We need to choose either an IP range or a list of countries, as this determines the type of Named Location we are creating.

For the first Named Location, we can use some base criteria – Chicago Office, IP Range of 10.25.0.0 with a subnet mask of 16 bits and we will mark this location as a trusted location. A second location can also be created for a New York Office, with an IP range of 10.26.0.0 and the same subnet mask of 16 bits. PowerShell is required to create the Named Location.  Notice that there is an MS Graph object for the IP subnet of the office in the example below.

Example:
IT wants a Conditional Access Policy to force multi-factor authentication (MFA) for all cloud apps unless users access apps from two locations.  The locations are both physical offices in Chicago and New York, with subnets of 10.25.0.0/16 and 10.26.0.0/16, respectively.  We will first create the two Named Locations using New-AzureADMSNamedLocationPolicy and then create a new CA Policy using New-AzureADMSConditionalAccessPolicy to reference the locations:
$CHISubnet = New-Object -TypeName Microsoft.Open.MSGraph.Model.IpRange
$CHISubnet.cidrAddress = '10.25.0.0/16'
New-AzureADMSNamedLocationPolicy -OdataType "#microsoft.graph.ipNamedLocation" -DisplayName 'Chicago Office' -IsTrusted $True -IpRanges $CHISubnet
$NYSubnet = New-Object -TypeName Microsoft.Open.MSGraph.Model.IpRange
$NYSubnet.cidrAddress = '10.26.0.0/16'
New-AzureADMSNamedLocationPolicy -OdataType "#microsoft.graph.ipNamedLocation" -DisplayName 'New York Office' -IsTrusted $True -IpRanges $NYSubnet

We can validate the properties of the locations with the Get-AzureADMSNamedLocationPolicy cmdlet, which we should do before proceeding:

Figure 2: New Named Locations and the configured settings in the red rectangles.

** To be fair, the same output is also generated when creating a Named Location, but the above illustrates what can be seen with the Get-* portion of the Named Location cmdlets.

With the Named Locations created, we can now use them in a CA Policy (keep in mind there are a lot of settings to a CA Policy). First, we need an object to hold the Conditions for the CA Policy (Applications, Users, and Locations). A breakdown of the available conditions in ConditionalAccessConditionSet is defined here.
$CAConditions = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessConditionSet
$CAConditions.Applications = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessApplicationCondition
$CAConditions.Applications.IncludeApplications = 'All'

In this section, we define the users to apply the Conditional Access Policy to (Users.IncludeUsers) as well as any users we wish to exclude (Users.ExcludeUsers). In this case, the excluded user is a Break Glass account that is excluded from any policies we define and is shown below as the Object GUID for the user in Azure AD. The GUIDs for the locations are in the ID field as seen in Figure 2.
$CAConditions.Users = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessUserCondition
$CAConditions.Users.IncludeUsers = 'All'
$CAConditions.Users.ExcludeUsers = '22561a78-a72e-4d39-898d-cd7c57c84ca6'
$CAConditions.Locations = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessLocationCondition
$CAConditions.Locations.IncludeLocations = '0743ff81-cea0-40d2-b0f0-4028cdc1c61a','0f0c7a7f-4863-480e-9b71-d8f9eddb37e4'

** Be careful with the 'All' Applications AND 'All' Users settings, as this affects the Azure AD Portal as well, which means you could get locked out of the portal and unable to change your CA Policies. Make sure to have a Break Glass Account created and excluded as shown here [Users.ExcludeUsers]. For more information on Break Glass Accounts, refer to this blog post.

Next, we need to configure Grant Controls for the MFA requirement.  Like the Conditions above we also need a Graph object and provide an operator (‘Or’ / ‘And’) as well as the control, in our case MFA:
$CAControls = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessGrantControls
$CAControls._Operator = "OR"
$CAControls.BuiltInControls = "Mfa"

** A list of available BuiltInControls can be found here.

Summary of Policy Settings:

  • Applies to all Cloud Apps
  • Applied to all users in Azure AD
  • Excluded from applying to one account
  • Enforces MFA for these users and these apps
  • Two Named Locations included, where MFA will not be enforced

We now have our $CAConditions object and $CAControls object populated with the required settings and can use this to create a CA Policy per our initial requirements:
New-AzureADMSConditionalAccessPolicy -DisplayName "CloudApps-MFA-CorpWide" -State "Enabled" -Conditions $CAConditions -GrantControls $CAControls

Unfortunately, all the settings put in place for this CA Policy are obfuscated by ‘System.Collections’ as the properties contain complex data sets. To validate settings for the CA Policy we must pick apart each condition to see what is present. Applications, Users and Locations are all sub-properties of the Conditions property on a CA Policy; thus we can break each of these sections down like so:
((Get-AzureADMSConditionalAccessPolicy).Conditions).Applications

((Get-AzureADMSConditionalAccessPolicy).Conditions).Users

((Get-AzureADMSConditionalAccessPolicy).Conditions).Locations

In all cases, Applications, Users and Locations are listed in the Conditional Access Policies as ObjectIds.  If a report were generated for management, additional PowerShell queries would be needed to convert these values to names.  Converting the ObjectIds to names would also help with verification or documenting/auditing CA Policies.
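
As a minimal sketch of such a conversion (reusing the policy created above via its display name), resolving those ObjectIds back to display names could look like this:
$Policy = Get-AzureADMSConditionalAccessPolicy | Where {$_.DisplayName -eq 'CloudApps-MFA-CorpWide'}
# Resolve excluded user ObjectIds to display names
ForEach ($UserId in $Policy.Conditions.Users.ExcludeUsers) {
    (Get-AzureADUser -ObjectId $UserId).DisplayName
}
# Resolve included Named Location IDs to display names
ForEach ($LocationId in $Policy.Conditions.Locations.IncludeLocations) {
    (Get-AzureADMSNamedLocationPolicy -PolicyId $LocationId).DisplayName
}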

Altering an Existing Conditional Access Policy

Now that we have a CA Policy in place, we can use PowerShell to update Applications, Users, Locations and Controls as well as other unconfigured items like Platforms and Sign-In Risk.

Changing the Enforcement of a CA Policy
One change that could be implemented is changing the state of the CA Policy.  CA Policies have three states: EnabledForReportingButNotEnforced, Enabled and Disabled.  For example, if a new CA Policy is in place and users are prompted for MFA when inside a corporate location, then the policy could potentially be disabled or set to report only until troubleshooting is performed:
Set-AzureADMSConditionalAccessPolicy -PolicyId 21bdd8c1-bb5c-40d1-a4de-926888c69163 -State Disabled
Set-AzureADMSConditionalAccessPolicy -PolicyId 21bdd8c1-bb5c-40d1-a4de-926888c69163 -State EnabledForReportingButNotEnforced

With the policy effectively disabled, IT can now make changes to the policy, validate them, and then re-enable the policy:
Set-AzureADMSConditionalAccessPolicy -PolicyId 21bdd8c1-bb5c-40d1-a4de-926888c69163 -State Enabled

Changing CA Policy Controls
In the previous example, there is a requirement for an MFA prompt when using a cloud app and connecting from outside the two Named Locations. These controls can be adjusted if additional security requirements are needed, or if alternatives are possible (Azure AD Joined / Compliance). Controls on CA Policies can also be combined (AND) versus the single control (OR) used in the previous example.

For example, if we want to require that when a user accesses a cloud app while external, they must do so from a Compliant Device AND be prompted for MFA, we can do so by creating a new Control object and then applying it to the existing CA Policy. A Compliant Device is a device that meets a predefined list of criteria, like OS version level, not being jailbroken, etc. First the Controls:
$CAControls = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessGrantControls
$CAControls._Operator = "AND"
$CAControls.BuiltInControls = "Mfa","CompliantDevice"

Notice that we have an AND operator, which declares that both Controls need to be true. Also of note is that the configured Controls are MFA and a Compliant Device. These controls are then applied to an existing CA Policy, overwriting any existing settings:
Set-AzureADMSConditionalAccessPolicy -PolicyId 21bdd8c1-bb5c-40d1-a4de-926888c69163 -GrantControls $CAControls

Figures: the CA Policy’s GrantControls, before and after applying the new Controls object.

Removing a CA Policy
When a CA Policy is no longer required, removing it is a one-liner, like so:
Remove-AzureADMSConditionalAccessPolicy -PolicyId 7ac2fcee-d5bd-4003-ad10-5dfa838417da

No prompt is given, so be careful with the removal of CA Policies.

Named Locations

As mentioned previously, Conditional Access Policies use the concept of Named Locations to correspond to physical locations in an organization’s environment. We can use the Get-AzureADMSNamedLocationPolicy cmdlet to compare what is shown in PowerShell and what is displayed in the Azure AD portal for Named Locations. This is useful for validating a configuration as well as understanding any differences between the two displays:

Figure 3: Breakdown of a Named location in PowerShell and in the Conditional Access section of AzureAD.

Modify Existing Named Location

Organizations change over time which might mean that changes are needed for named locations. For example, you might need to rename locations, add subnets, remove subnets, mark them as trusted, or remove the trust. All these changes can be performed with the Set-AzureADMSNamedLocationPolicy cmdlet. Let us see what it takes to make each of these changes:

Change Named Location ‘Name’
Set-AzureADMSNamedLocationPolicy -PolicyId 0743ff81-cea0-40d2-b0f0-4028cdc1c61a -DisplayName 'Chicago Loop Office'
Change IP Range
$ipRanges = New-Object -TypeName Microsoft.Open.MSGraph.Model.IpRange
$ipRanges.cidrAddress = '10.25.0.0/16'
Set-AzureADMSNamedLocationPolicy -PolicyId 0743ff81-cea0-40d2-b0f0-4028cdc1c61a -IpRanges $ipRanges -OdataType "#microsoft.graph.ipNamedLocation"
Trusted
Set-AzureADMSNamedLocationPolicy -PolicyId 0743ff81-cea0-40d2-b0f0-4028cdc1c61a -IsTrusted $False
Set-AzureADMSNamedLocationPolicy -PolicyId 0743ff81-cea0-40d2-b0f0-4028cdc1c61a -IsTrusted $True

Removing a Named Location

When offices close or are no longer needed for a Conditional Access Policy, the Named Locations can be removed if there is a desire to do so. Leaving the Named Locations behind does not create a security risk. We first need to determine if there are any existing Conditional Access Policies that contain the location. We can pick these out from the Conditions property of the CA Policy and look at the Locations sub-property, which stores the location as a System.Collections.Generic.List:

 

We can reveal the GUID:
((Get-AzureADMSConditionalAccessPolicy).Conditions).Locations

Let us assume we need to remove our Chicago Office, which is a Named Location:

 

The ‘Id’ value is what we can use to match to the Locations.IncludeLocations value in the CA Policy. To find any CA Policy with this Named Location, we need to match the value like so:
$OldNamedLocationId = (Get-AzureADMSNamedLocationPolicy | Where {$_.DisplayName -eq 'Chicago office'}).Id
$CAPolicies = Get-AzureADMSConditionalAccessPolicy
Foreach ($CAPolicy in $CAPolicies) {
    If ($OldNamedLocationId -eq ((($CAPolicy).Conditions).Locations).IncludeLocations) {
        Write-Host "CA Policy $($CAPolicy.Id) contains the Named Location."
    }
}

We may have only one location, or we may have multiple locations, but all matching entries will be displayed:
CA Policy 4e7dac6d-d471-483f-9677-46407acaa3f5 contains the Named Location.
We can decide to either remove the policy or just remove the Named Location. If the Named Location is the only one, then we would need to replace it to keep the CA Policy. For this example, we will replace the location with another Named Location. First, find any non-Chicago Office Named Locations:
Get-AzureADMSNamedLocationPolicy | Where {$_.DisplayName -ne 'Chicago office'} | ft DisplayName,Id

We then use the ID from this location to replace the one we need to remove:
$OldNamedLocationId = (Get-AzureADMSNamedLocationPolicy | Where {$_.DisplayName -eq 'Chicago office'}).Id
$NewNamedLocationId = (Get-AzureADMSNamedLocationPolicy | Where {$_.DisplayName -eq 'LA Office'}).Id
$conditions = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessConditionSet
$conditions.Locations = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessLocationCondition
$conditions.Locations.IncludeLocations = "$NewNamedLocationId"
Foreach ($CAPolicy in $CAPolicies) {
    If ($OldNamedLocationId -eq ((($CAPolicy).Conditions).Locations).IncludeLocations) {
        Set-AzureADMSConditionalAccessPolicy -PolicyId $($CAPolicy.ID) -Conditions $Conditions
    }
}

Now we re-run the same code block we ran to find the CA Policies with the Named Location:
$NamedLocationId = (Get-AzureADMSNamedLocationPolicy | Where {$_.DisplayName -eq 'Chicago office'}).Id
$CAPolicies = Get-AzureADMSConditionalAccessPolicy
Foreach ($CAPolicy in $CAPolicies) {
    If ($NamedLocationId -eq ((($CAPolicy).Conditions).Locations).IncludeLocations) {
        Write-Host "CA Policy $($CAPolicy.ID) contains the Named Location."
    }
}

Now no results are returned, and we are safe to remove the Named Location:
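(Assuming $OldNamedLocationId still holds the Id we captured earlier, the removal itself is a one-liner:)
Remove-AzureADMSNamedLocationPolicy -PolicyId $OldNamedLocationId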

**Tip: Make sure to remove a Named Location from any Conditional Access Policy before deleting it
If we try to remove a Named Location that is still defined in a Conditional Access Policy, an error is generated. This is because the Named Location is attached to the Conditional Access policy and needs to be removed from the policy first.

Figure 4: Attempting to remove a Named Location before removing it from a Conditional Access policy.

Final Thoughts

From the article, we can see that Microsoft provides a set of PowerShell cmdlets that offer a convenient access point to Conditional Access Policies. These CA Policies can be created, modified, and even removed with these cmdlets. We also have cmdlets to create Named Locations, which can, in turn, be used by the Conditional Access Policies. Potential uses for these cmdlets span from creation and management of CA Policies to documentation and auditing of Policies for security personnel/consultants needing to verify the security stance of an organization. Make sure to read the documentation links provided in this article as well as the Get-Help for the cmdlets for guidance on values to be used for the various moving parts of a policy – Locations, Platforms, Grant Controls, Session Controls, and more.

Source Practical365


How to Define Custom Sensitive Information Types for Use in DLP Policies


New Interface and New Capabilities Make It Easier to Manage Sensitive Information Types

Sensitive information types are used by Microsoft 365 components like DLP policies and auto-label (retention) policies to locate data in messages and documents. Earlier in January, Microsoft released a set of new sensitive information types to make it easier to detect country-specific sensitive data like identity cards and driving licenses. Message center notification MC233488 (updated January 23) brings news of two additional changes rolled out in late January and early February.

Confidence Levels for Matching

First, the descriptions of the confidence levels used for levels of match accuracy are changing from being expressed as a percentage (65%, 75%, 85%) to low, medium, and high. Existing policies pick up the new descriptors automatically. This is Microsoft 365 roadmap item 68915.

The change is intended to make it easier for people to better understand the accuracy of matches when using sensitive information types. DLP detects matches in messages and documents by looking for patterns and other forms of evidence. The primary element (like a regular expression) defined in the pattern for a sensitive information type defines how basic matching is performed for that type. Secondary elements (like keyword lists) add evidence to increase the likelihood that a match exists. The more evidence is gathered, the higher the match accuracy and the confidence level increases from low to medium to high. Policy rules use the confidence level to decide what action is necessary when matches are found. For instance, a rule might block documents and email when a high confidence exists that sensitive data is present while just warning (with a policy tip) for lower levels of confidence.

Copying Sensitive Information Types

The second change is that you can copy sensitive information types, including the set of sensitive information types distributed by Microsoft. For instance, let’s assume that you don’t think that the standard definition for a credit card number works as well as it could. You can go to the Data Classification section of the Compliance Center, select the credit card number sensitive information type, and copy it to create a new type (Figure 1). The copy becomes a custom sensitive information type under your control to tweak as you see fit.

Figure 1: Copying the sensitive information type for a credit card number

Improved User Interface

The third change is the most important. Figure 1 is an example of a new user interface to manage sensitive information types (Microsoft 365 roadmap item 68916). The new interface is crisper and exposes more information about how information types work. For instance, in Figure 1, we can see that the primary element for credit card number detection is a function to detect a valid credit card number. Further evidence (supporting elements) comes from the presence of keywords like a credit card name (for example, Visa or MasterCard) and an expiration date near a detected number.

Few are likely to have the desire to tweak the standard sensitive information types. However, being able to examine how Microsoft puts these objects together is instructive and helpful when the time comes to create custom sensitive information types to meet business requirements.

Detecting Azure AD Passwords

For instance, MVP James Cussen points out that Azure AD passwords are not included in the list of sensitive information types. While some people need to send passwords around in email and Teams messages, it’s not the most secure way of transmitting credentials. In this post, he uses the old interface to define a sensitive information type to detect passwords. To test the new interface, I used his definition as the basis for a new custom sensitive information type.

The primary element to match passwords is a regular expression:
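(James’s actual expression is in his linked post. As a purely illustrative stand-in, a pattern along these lines matches an 8+ character, space-free string mixing lowercase, uppercase, and numeric characters; as noted next, many such patterns fail Microsoft’s validation rules:)
(?=\S*[a-z])(?=\S*[A-Z])(?=\S*\d)\S{8,}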

A bunch of suggested expressions to detect passwords can be found on the internet. Most fail when entered for use with a sensitive information type because they break Microsoft’s rules for detecting illegal or inefficient expressions. Not being a Regex expert, I tried several (all worked when tested against https://regex101.com/), and all failed except the one created by James.

A keyword list is a useful secondary element to add evidence that a password is found. The list contains a comma-separated set of common words that you might expect to find close to a password. For instance:

“Here’s your new password: 515AbcSven!”

“Use this pwd to get into the site ExpertsAreUs33@”

In multilingual tenants, the ideal situation is to include relevant words in different languages in the keyword list. For instance, if the tenant has Dutch and Swedish users, you could include wachtwoord (Dutch) and lösenord (Swedish). To accommodate the reality that people don’t always spell words correctly, consider adding some misspelt variations of keywords. In this instance, we could add keywords like passwrod or pword.
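
Drawing only on the terms mentioned above, such a keyword list might read:
password, pwd, wachtwoord, lösenord, passwrod, pword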

James’s definition allows keywords to be in a window of 300 characters anchored on the detected password (see this Microsoft article to understand how the window works). I think this window is too large and might result in many false positives. The keyword is likely to be close to the password, so I reduced the window to 80 characters.

Figure 2 shows the result after inputting the regular expression, keyword list, confidence level (medium), and character proximity. It’s a less complex definition than for Microsoft’s sensitive information types. The big question is: does it work?

Figure 2: Definition for the Azure Active Directory password custom sensitive information type

Testing

The Test option allows you to upload a file containing sample text to run against the definition to see if it works. As you can see in Figure 3, the test was successful.

Figure 3: Testing a custom sensitive information type

Using the Custom Sensitive Information Type in a Teams DLP policy

Testing gives some confidence that the custom sensitive information type will work properly when deployed in a DLP policy. After quickly creating a DLP policy for Teams, we can confirm its effectiveness (Figure 4) in chats and channel conversations.

Figure 4: Passwords blocked in a Teams chat

I deliberately chose Teams as the DLP target because organizations probably don’t want their users swapping passwords in chats and channel conversations. Before rushing to extend the DLP policy to cover email, consider the fact that it’s common to send passwords in email. For instance, I changed the policy to cover email and Teams and discovered that the policy blocks any invitation to Zoom meetings because these invitations include the word “pwd” as in:
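(A hypothetical invitation snippet; real invitations carry an encoded passcode in the pwd parameter:)
Join Zoom Meeting: https://zoom.us/j/1234567890?pwd=aBcDeFgHiJkLmNoP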

Although it might be an attractive idea to block Zoom to force people to use Teams online meetings instead, it’s not a realistic option. The simple solution is not to use this DLP policy for email.

False Positives and Policy Overrides

The downside of matching text in messages against keywords defined in a policy is that some false positives can happen. For instance, I have a Flow to import tweets about Office 365 into a team channel. As Figure 5 shows, some tweets are picked up as potential password violations because a keyword appears near a string which could be a valid password.

Figure 5: Tweets posted in Teams are blocked because they match the password definition

Adjusting the definition for the sensitive information type to reduce the character proximity count (from 80 to 60) reduced the number of false positives. Testing and observation will tell how effective changes like this are when exposed to real-life data.

Apart from adjusting character proximity, two other potential solutions exist. First, amend the DLP policy to allow users to override the block and send the message with a justification reported to the administrator. If the message is important, users will override the policy. The administrator will be notified when overrides occur and can tweak the policy (if possible) to avoid further occurrences.

The second option is to exclude accounts (individually or the members of a distribution list) which have a business need to send passwords from the DLP policy. DLP will then ignore messages sent by these individuals.

Creating Custom Sensitive Information Types Is a Nice to Have

Given the broad range of standard types created by Microsoft, the need to define custom sensitive information types isn’t likely to be a priority for most tenants. However, for those who need this feature for business reasons, the recent changes are both welcome and useful.

Source Practical365


Azure AD Suffers Another Big Authentication Outage


Failure to Authenticate Causes Users to Lose Access to Apps

Microsoft 365 users around the world were unimpressed on Monday, March 15, when failing Azure AD authentication requests blocked their access to apps like Teams and Exchange Online. Downdetector.com started to report problems with users being unable to sign into Teams and other Office 365 apps around 19:00 UTC. The number of reports grew rapidly (Figure 1) for incident MO244568.

Figure 1: Downdetector.com reports a problem with Teams

The Azure status page said “Starting at approximately 19:15 UTC on 15 Mar 2021, a subset of customers may experience issues authenticating into Microsoft services, including Microsoft Teams, Office and/or Dynamics, Xbox Live, and the Azure Portal.” The difference between the time users detected issues and Microsoft declaring an incident isn’t unusual, as Microsoft engineers need time to figure out if reported problems are due to a transient hiccup or something deeper.

Why Some Users Could Continue to Work

Based on my own experience, it seemed as if apps continued to work if they didn’t need to make an authentication request to Azure AD. In my case, I was connected to the Microsoft tenant in Teams and couldn’t switch back to my home tenant because the Teams client wasn’t able to authenticate with that tenant. As anyone who has ever looked at the Teams activity in the Office 365 audit log can testify, Teams desktop clients authenticate hourly when their access token expires. This might be the reason why Microsoft highlighted the effect on Teams in their communications for the Microsoft 365 health status page (Figure 2).

Figure 2: Microsoft 365 service health status page

The Microsoft 365 Service health dashboard wasn’t much help (Figure 3) because it needs to authenticate to the Office 365 Services Communications endpoint (here’s an example of how to retrieve the same incident data with PowerShell). Because authentication failed, the Service health dashboard couldn’t retrieve incident details. No doubt some ISVs will make the point that relying on Microsoft APIs during a major outage might be a case of putting all your eggs in the proverbial basket.

Figure 3: No joy in the Microsoft 365 service health dashboard

If an app didn’t need to authenticate, it continued to work quite happily. I used Outlook desktop and the browser interfaces to SharePoint Online, OneDrive for Business, OWA, and Planner during the outage. On the other hand, while writing this article, Word couldn’t upload the document to SharePoint Online or autosave changes because of an authentication failure. Another interesting problem was that messages sent to a Teams channel email address failed because the connector used to deliver the email to Teams couldn’t authenticate.

Similar Authentication Woes in September 2020

At first glance, this problem seems like that of the September 28/29 Azure AD outage last year. Microsoft said then that “A latent code defect in the Azure AD backend service Safe Deployment Process (SDP) system caused this to deploy directly into our production environment.” In other words, a code change containing a bug made its way into production, which seems very like the reason cited by the Microsoft 365 status twitter account (@msft365status) at 20:10 UTC that the problem was due to “a recent change to an authentication system.”

Unfortunately, as I wrote in February 2019, Azure AD is the Achilles Heel of Office 365. When everything works, it’s great. When Azure AD falls over, everything comes to a crashing halt across Microsoft 365. This is despite Microsoft’s December 2020 announcement of a 99.99% SLA for Azure AD authentication, which comes into effect on April 1, 2021. That is, if you have Azure AD Premium licenses.

Quite reasonably, people asked why Microsoft had deployed code changes on the first day of the working week. That’s a good question. Although people use Office 365 every day, the weekend demand on the service is much lower than Monday-Friday, so it’s fair to look for Microsoft to make changes which might impact users then. Or even better, test their software before allowing code to make it through to production services.

Current Status

As of 21:15 UTC, Microsoft said “We’ve identified the underlying cause of the problem and are taking steps to mitigate impact. We’ll provide an updated ETA on resolution as soon as one is available.” At 21:25 UTC the status became a little more optimistic “We are currently rolling out a mitigation worldwide. Customers should begin seeing recovery at this time, and we anticipate full remediation within 60 minutes.”

After the mitigation rolled out, the problem appears to be resolved (22:10 UTC). Microsoft says “The update has finished its deployment to all impacted regions. Microsoft 365 services continue the process of recovery and are showing decreasing error rates in telemetry. We’ll continue to monitor service health as availability is restored.” Normal service appears to have resumed for Teams and other apps, at least in the tenants I access.

On March 16, Microsoft noted that some organizations might still see some Intune failures. They anticipate that remediation actions taken by 21:00 UTC will address these lingering problems.

Problem sorted for now, but it will be interesting to see what Microsoft’s Post Incident Report says.

Source Practical 365


How to Decide Between Azure AD Connect and Azure AD Connect Cloud Sync


Microsoft recently announced that Azure AD Connect Cloud Sync had reached GA (general availability), adding another option for directory synchronization with Microsoft 365. This article provides a background on directory synchronization and why it is fundamental for your journey to the cloud. Then we will discuss the solutions and give you the information you need to pick the right solution. Let’s begin with some basics.

What is Azure AD Sync, and Why Do I Need It?

Most organizations run Active Directory on-premises. This directory is usually the source of authority for all users, groups, and computers in a Windows domain. The domain provides a way to centrally manage accounts, passwords, policies, and permissions on-premises.

When an on-premises organization decides to use Microsoft 365, it needs a way to bring those on-premises accounts into Azure AD to use the new cloud services like Exchange Online, Teams, SharePoint Online, etc. Most organizations want to use their existing on-premises accounts rather than create new accounts and manage different passwords. That is where Azure AD Connect comes in. Both Azure AD Connect and Azure AD Connect Cloud Sync synchronize and link objects from AD to Azure AD and synchronize password hashes (not passwords) to maintain a single sign-on experience.

Azure AD Connect

Azure AD Connect has a long and storied past. It is based on Microsoft Identity Manager (MIM), which is used to bridge multiple on-premises authoritative systems and authentication stores. MIM is the sixth generation of Microsoft identity management solutions, dating back to Microsoft’s acquisition of two similar technologies in 1997 and 1999.

While MIM can be expensive and is designed to bridge multiple authoritative directories, Azure AD Connect is free and purpose-built to bridge Active Directory with Azure Active Directory. This arrangement is known as hybrid identity.

Azure AD Connect is installed on an on-premises domain-joined server; installing it on a domain controller is even supported. It only requires an outbound HTTPS connection to Microsoft 365 servers.

Capabilities

Since its humble beginnings of syncing a single AD to a single Azure AD tenant, Azure AD Connect’s capabilities have expanded significantly. Currently, this includes:

  • Synchronization between
    • Single forest, single Azure AD tenant.
    • Multiple forests, single Azure AD tenant.
    • Single or multiple forests, multiple Azure AD tenants (requires that each object is only represented once in all tenants).
    • LDAPv3-compatible identity stores.
  • Password Hash Synchronization (PHS) – use Azure AD as your organization’s identity provider by synchronizing password hashes to Azure AD.
  • Pass-Through Authentication (PTA) – use your organization’s Domain Controllers as your identity provider without having to deploy a full-blown AD FS configuration.
  • Federation integration with Active Directory Federation Services (AD FS).
  • Health monitoring of both Active Directory and the synchronization process.
  • Accommodating up to 10GB of database space (up to 100,000 objects) using LocalDB. If your organization exceeds this limit, use a full SQL Server.
  • Organizational Unit, group, or attribute filtering.
  • Exchange hybrid writeback capabilities for organizations with Exchange Server.
  • Exchange Public Folder address synchronization for directory-based edge blocking.
  • Password writeback capabilities to support self-service password reset (SSPR).
  • Office 365 Group writeback to prevent email address overlaps.
  • Directory extension attribute synchronization to extend the schema in Azure AD to include specific attributes consumed by LOB apps and Microsoft Graph Explorer.
  • Robust synchronization rule editing capabilities.
  • Seamless single sign-on (SSSO) capabilities that allow domain-joined users and computers to access Microsoft 365 workloads without being prompted to sign in every time.
  • Hybrid Azure AD Join capabilities.
  • Device writeback capabilities that allow organizations to use on-premises conditional access and Windows Hello.
  • Synchronizing directory changes every 30 minutes and password changes almost immediately when using password hash sync.
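
On the Azure AD Connect server itself, the scheduler behind that 30-minute cycle can be inspected and run on demand with the ADSync PowerShell module installed alongside the product. A minimal sketch:

Import-Module ADSync    # ships with Azure AD Connect

# Show the scheduler state, including the effective sync interval
# and whether a sync cycle is currently running.
Get-ADSyncScheduler

# Kick off an on-demand delta sync instead of waiting for the next cycle.
Start-ADSyncSyncCycle -PolicyType Delta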

Read more about Azure AD Connect: How it works and best practices for synchronizing your data

Azure AD Connect Cloud Sync

Microsoft realizes that it is unfortunate that your organization’s journey to the cloud first requires installing more software on-premises. Azure AD Connect Cloud Sync is a cloud-service alternative to the Azure AD Connect software: the organization deploys one or more lightweight agents in its on-premises environment to bridge AD and Azure AD, and the configuration is done in the cloud.

The service provides some of the features and capabilities that Azure AD Connect provides, making it useful for some merger and acquisition scenarios. It is important to note that Azure AD Connect Cloud Sync does not support Exchange hybrid, which reduces the number of useful scenarios.

Capabilities

Azure AD Connect Cloud Sync has many of the same features and capabilities as Azure AD Connect with the following differences:

  • Lightweight agent installation model.
  • Adds high availability using multiple agents.
  • Allows connectivity to multiple disconnected on-premises AD forests.
  • Synchronizes directory changes more frequently than Azure AD Connect.
  • Can be used in addition to Azure AD Connect.
  • Does not support Exchange hybrid writeback.
  • Does not support LDAPv3-compatible identity stores.
  • Does not support device objects.
    • No hybrid Azure AD join.
    • No support for Windows Hello.
  • Does not support directory extension attribute synchronization.
  • Does not support Pass-Through Authentication (PTA).
  • Does not support synchronization rule editing capabilities.
  • Does not support writeback for passwords, devices, or groups.
  • Does not support cross-domain references.

As you can see, there are several gaps in functionality that limit where Azure AD Connect Cloud Sync can be used. Microsoft may close these gaps with future updates; because this is a cloud-based service, it can iterate rather quickly. I would not expect Exchange hybrid support anytime soon, though.

Appropriate Use Cases for Each

Choosing which directory synchronization solution to use requires a full understanding of your organization’s needs.

Azure AD Connect has the most features and compatibility. Almost all customers I encounter use Exchange Server or Exchange Online. The lack of Exchange hybrid support with Azure AD Connect Cloud Sync limits the use of that solution.

If you don’t need Exchange hybrid support or any of the other unsupported features, Azure AD Connect Cloud Sync can be a quick and easy way to configure AD directory synchronization with Azure AD. Examples include mergers and acquisitions where the organization being acquired has limited IT experience. By installing a simple, lightweight agent on a domain server, the acquiring organization can configure and manage directory synchronization from their tenant.

The marketing slides and videos introducing Azure AD Connect Cloud Sync often talk about the “heavy infrastructure investment” required for Azure AD Connect. A LocalDB database is installed with Azure AD Connect and has a 10GB limit (about 100,000 objects). Unless your organization’s Active Directory exceeds this, there is no requirement for additional infrastructure at all. Azure AD Connect can be installed on any existing domain-joined server running Windows Server 2012 or later or directly on a domain controller. It only requires an outbound HTTPS connection to the Internet.

Organizations with over 100,000 objects would likely save money with Azure AD Connect Cloud Sync since it does not require a full SQL Server deployment. Still, organizations this size are usually running Exchange.

A scenario where Azure AD Connect Cloud Sync might be useful is one where an organization has AD on-premises but uses Google Workspace for email. This organization can sync their directory to Azure AD and then begin migrating Google mail to Exchange Online.

Azure AD Connect Cloud Sync is also the appropriate choice when connecting to multiple disconnected on-premises AD forests, since Azure AD Connect requires line-of-sight connectivity between the forests it synchronizes. This can be useful in some merger and acquisition scenarios.

Ultimately, you should deploy Azure AD Connect Cloud Sync if it provides the features and compatibility your organization needs. Otherwise, you will need to use the more fully-featured Azure AD Connect.

Security Considerations for Protecting Access to Azure AD Connect and Azure AD Connect Cloud Sync

Organizations should treat any server running Azure AD Connect or the Azure AD Connect Cloud Sync agent as a tier-0 asset – the same as a domain controller – since it is responsible for directory synchronization with Azure AD. Organizations should restrict administrative access to the Azure AD Connect server to only domain administrators or other tightly controlled security groups.

Azure AD Connect installation and configuration must be run with an Enterprise Admin account in AD and requires a Global Administrator account in the tenant.

Azure AD Connect Cloud Sync must be installed with an AD account that has local admin permissions on the server (or Domain Admin permissions if installed on a domain controller) and requires a tenant account with the Hybrid Identity Administrator or Global Administrator role.

For Azure AD Connect, the user account used to install it is automatically added to the local ADSyncAdmins security group. The best practice is to add Domain Admins to this group so more than one account can manage directory synchronization. Remove the individual user account that was used to install Azure AD Connect from this group.
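
As an illustration, on a member server these membership changes can be made with the built-in local accounts cmdlets. This is a sketch only; CONTOSO and jsmith are hypothetical names, and on a domain controller ADSyncAdmins is created as a domain group that you would manage with the Active Directory cmdlets instead:

# Let Domain Admins manage directory synchronization.
Add-LocalGroupMember -Group "ADSyncAdmins" -Member "CONTOSO\Domain Admins"

# Remove the individual account that performed the installation.
Remove-LocalGroupMember -Group "ADSyncAdmins" -Member "CONTOSO\jsmith"

# Verify the resulting membership.
Get-LocalGroupMember -Group "ADSyncAdmins"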

The account used for configuration requires specific rights and is only used for installation or configuration. Directory synchronization will not be impacted if the account is disabled or deleted.

Both synchronization solutions use the highest TLS version available in Windows Server. To ensure that Azure AD Connect and Azure AD Connect Cloud Sync use TLS 1.2, set the following registry keys and then restart the server:

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Client]
"DisabledByDefault"=dword:00000000
"Enabled"=dword:00000001

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Server]
"DisabledByDefault"=dword:00000000
"Enabled"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319]
"SchUseStrongCrypto"=dword:00000001
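
If you prefer to script the change, here is a sketch of the same settings applied with PowerShell (run elevated; the values match the registry keys above, and a restart is still required):

$base = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2'
foreach ($side in 'Client', 'Server') {
    $key = Join-Path $base $side
    New-Item -Path $key -Force | Out-Null    # create the key if it does not exist
    Set-ItemProperty -Path $key -Name 'DisabledByDefault' -Value 0 -Type DWord
    Set-ItemProperty -Path $key -Name 'Enabled' -Value 1 -Type DWord
}

# Make the .NET Framework (which the sync engine uses) opt in to strong crypto.
Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\.NETFramework\v4.0.30319' `
    -Name 'SchUseStrongCrypto' -Value 1 -Type DWord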

Summary

Both Azure AD Connect and Azure AD Connect Cloud Sync provide ways for organizations to synchronize AD with Azure AD. Both solutions are easy to deploy and offer the features organizations need to deliver a unified sign-in experience to Microsoft 365.

Understand your organization’s requirements first. Azure AD Connect Cloud Sync is the preferred way to synchronize on-premises AD to Azure AD, assuming you can get by with its limitations. Otherwise, Azure AD Connect provides the most feature-rich synchronization capabilities, including Exchange hybrid support.

From a security perspective, treat your organization’s Azure AD Connect server or agent the same as a domain controller and other Tier 0 resources.

Source Practical 365


For companies who migrated to Azure before the pandemic, the cloud was the silver lining


In challenging times, innovative companies don’t panic, they pivot.

So when the coronavirus traveled the globe, disrupting everything from grocery shopping to tax preparation, resilient companies around the world responded with ingenious solutions that ensured the safety of their employees, met the needs of their customers and kept their IT operations humming.

For some, it meant quickly enabling employees to work from home instead of at the office, while others rushed to re-engineer their supply chains, so their customers could continue to buy the products they needed to stay safe and healthy. Their solutions were custom but the step that allowed them to react so rapidly was common: They had migrated their operations to Microsoft Azure.

Even before COVID-19, companies were hungry to enable secure remote work, achieve operational efficiency, address IT budget constraints and accelerate innovation, and migrating to the cloud was the critical first step to unlocking those accomplishments.

“Azure has always helped our diverse customers adapt to new ways of doing business,” says Jeremy Winter, director, Azure management and migration.

“While some are moving beyond crisis mode and others look to the cloud to stay resilient, we’re seeing an acceleration in business’ cloud migrations. We’ve spent a lot of time with customers who are both considering and executing migrations and understand the challenges they are facing. We also recognize that our customers are counting on us more than ever to recover and ultimately transform.”

Here’s how a handful of those Microsoft customers discovered during the health crisis that the cloud didn’t just have a silver lining – it was the silver lining.

A technician from Actavo arrives at a customer's home.

Actavo

Dublin, Ireland-based Actavo is a multi-disciplinary, global engineering services business that designs, builds and maintains the vital infrastructure we rely on every day.

Among their many services, they design, construct and maintain networks for the telecommunications and utilities industries; design, construct and install modular buildings; provide scaffold and rope access solutions along with the installation of insulation and removal of asbestos in the industrial sector; and provide hundreds of in-home technicians to companies who provide residential services like gas boiler repair, satellite installation or broadband internet installation to a half-million homes every year.

With so many essential industries relying on their round-the-clock support, even before the pandemic Actavo sought to increase operational efficiency and improve agility by migrating to Azure.

“Moving to the cloud gave us the versatility to be able to scale up and scale down in any country where we needed to operate, but it also gave us the robustness from a disaster recovery perspective,” says Willie Ryan, global Environmental Health and Safety and IT director.

A novel coronavirus may not have been the disaster that Actavo had in mind, but when it arrived, the decision to migrate to the cloud proved even more valuable. With the new, versatile infrastructure in place, Actavo was able to quickly pivot to remote work for its employees, ensuring that they could continue to support their clients around the world.

“We’ve got a very solid platform, and that was quite evident as we came through COVID and the pandemic, in terms of, we moved from an on-prem operation to everybody working from home, seamlessly, quickly, easily, without any major headaches,” Ryan says.

“Within the space of a week we were able to tick a box that said, ‘If lockdown comes, we’re ready to go.’ The lockdown announcement came at five o’clock on a Friday afternoon. We actually had everybody that needed to be working at home, working at home on Monday morning.”

Silvan Schriber, head of corporate development for Additiv

Additiv

Additiv partners with the world’s leading financial institutions to help them digitalize their wealth management activities. Their ability to advocate for change and innovation in an industry that is highly regulated is dependent on building trust with their clients by adhering to the highest standards of security and compliance.

So when the company moved its enterprise software offering for its clients to the cloud, they worked with Microsoft to ensure a seamless transition that would strictly adhere to regulations across multiple locations – a prerequisite for a global company that is headquartered in Switzerland and has offices and clients on three continents.

Now, Additiv can offer its clients innovative, flexible software-as-a-service solutions that can scale easily and that adapt to their specific needs, no matter where they’re located. “Digital collaboration means breaking down these virtual borders, and Azure allowed us to integrate better so that our clients can provide a holistic solution to their customers out of the cloud,” says Silvan Schriber, head of corporate development.

“The benefits of Azure for Additiv are 100 percent trust that the solutions adhere in each location to regulations,” he adds. “So it helps us in centrally managing our solutions and placing them in each country of our clients, with virtually the click of a button.”

That consistency and efficiency will be critical as Additiv’s financial services customers increasingly embrace digital strategies and omni-channel distribution to reimagine their businesses post-pandemic.

“The pace of change pre-pandemic was rapid, but there’s no reason to think these trends won’t accelerate post-pandemic,” the company writes in a report published earlier this year. “… the companies that are faring best are those that are either digitally native or have invested the most in digital.”

An Albertsons storefront.
Azure migration helps Albertsons Companies deliver a better shopping experience. (Photo courtesy of Albertsons Companies)

Albertsons Companies

Long before this past spring, people were changing the way they shop for groceries, and Albertsons Companies – one of the largest food and drug retailers in the United States with some 2,200 stores operating under banners including Albertsons, Safeway and Vons – was evolving along with them.

“I believe that every customer – from millennials to baby boomers – is transforming how they shop,” says Ramiya Iyer, senior vice president of Data, Digital & Merchandising at Albertsons Companies. “The retail industry is undergoing its own transformational journey to meet customers where they are and be relevant to them. It’s imperative that we do that to succeed and grow as a business.”

To stay competitive and give customers the modern, convenient shopping experience they had come to expect, Albertsons migrated from an aging on-premises server farm to Microsoft Azure.

The move allowed them to take advantage of artificial intelligence (AI) and cognitive services, and empowered an environment where developers can innovate and efficiently test new ideas, propelling Albertsons to introduce apps that make shopping faster and more convenient.

“With Azure, we can bring new ideas to market faster and deliver releases on shorter time cycles,” says Param Deivreddy, vice president, IT architecture. “We think Azure also encourages exploration and innovation, because you don’t have to spend a huge amount on infrastructure to quickly test a new idea.”

The company’s easy-to-use apps offer the kinds of personalized and streamlined experiences digitally savvy shoppers demand, allowing them to create shopping lists, easily find the products they want, skip long check-out lines, even save time at the fuel pump by claiming rewards, activating the pump and paying for gas with a tap of the phone.

“Grocery shopping shouldn’t be a chore,” says Iyer. “We want to provide our customers with a totally frictionless experience that lets them enjoy the food they’re putting on the table with a minimum of fuss and stress.”

An H&R Block storefront.
Cloud computing allows H&R Block to ramp up during tax season and enables employees to serve customers from home. (Photo courtesy of H&R Block)

H&R Block

Few things in life are certain, but as the adage goes, taxes are one of them. Even during a health crisis, Americans are responsible for filing their annual return, and millions of them lean on Kansas City, Missouri-based H&R Block to navigate the process.

To streamline its operations and improve its ability to ramp up during tax season, H&R Block migrated its computing workload to Microsoft Azure, a move that would “help the company better process millions of tax returns annually while allowing the firm to build financial software products with greater speed, quality and security,” writes CIO’s Clinton Boulton in a recent interview on cloud migration with H&R Block CIO Alan Lowden.

“It’s what we need to do to enable our strategic vision of putting customers at the center,” Lowden tells CIO. “We have to give them a convenient experience of serving them any way they want to be served.”

So this past tax season, H&R Block was able to spring into action to support their customers while protecting the safety of their employees and the communities they serve. To keep their workforce safe during their busiest season – which was even longer this year to give taxpayers more time to file – they shifted to a work-from-home model that allowed 80,000 tax professionals to serve customers virtually.

That quick pivot wouldn’t have been possible if the company hadn’t migrated to Azure. Because they didn’t need to buy new hardware, or set up or configure additional infrastructure, they were able to make the critical change to remote workstations in less than two weeks. The company’s tax pros were able to provide the expertise their customers needed, and the even more critical refunds they were counting on.

When the demands of tax season return to normal, H&R Block will continue to inspire confidence in their customers through a seamless experience – wherever they like, however they like.

“Now we can present tax tips and offerings to our clients that are most relevant to them,” says Aditya Thadani, vice president of architecture and information management at H&R Block. “Migrating our platforms to Azure has really allowed us to serve our clients better.”

Source Microsoft


Microsoft Sustainability Calculator Aims to Help Cloud Customers Manage their Carbon Footprint


Microsoft has announced a new tool that will help cloud customers better understand their carbon emissions. Called the Microsoft Sustainability Calculator, the product has landed in private preview ahead of a future launch.

According to Redmond, the aim of the Microsoft Sustainability Calculator is to give customers more oversight of their carbon output and, more specifically, more transparency about their environmental impact.

“It’s challenging to make and meet meaningful carbon reduction goals without the ability to measure carbon emissions,” Microsoft says.

By leveraging AI technology, the calculator provides accurate accounting of carbon usage, highlighting the impact of Microsoft cloud services across an organization’s footprint. Armed with this information, companies can make informed decisions about their environmental impact.

For example, the Microsoft Sustainability Calculator measures the impact of moving existing applications to the cloud and how this can reduce a company’s carbon output.

Carbon Negative

It looks like the tool goes hand-in-hand with Microsoft’s own push to be carbon negative by 2030, a pledge Redmond made earlier this year. The decision followed a 2017 commitment to cut 75% of its carbon emissions by the same date and builds on a 2019 pledge of 70% renewable energy by 2023.

“While the world will need to reach net zero, those of us who can afford to move faster and go further should do so,” Microsoft President Brad Smith said of the new commitment. “That’s why today we are announcing an ambitious goal and a new plan to reduce and ultimately remove Microsoft’s carbon footprint.”

Source Winbuzzer
