Using PowerShell to Manage Conditional Access (CA) Policies


Microsoft provides many methods to manage a tenant’s data and users.  PowerShell is a powerful tool to manage resources, including Conditional Access Policies using a set of cmdlets in the AzureAD module.  In this article, we review the eight PowerShell cmdlets and how to use them.

**Note that both the AzureAD and AzureADPreview PowerShell modules contain these cmdlets.

PowerShell and CA Policies

First, connect to Azure Active Directory using either the AzureAD or AzureADPreview module:
Connect-AzureAD

After connecting, we can get a list of available PowerShell cmdlets by using these two one-liners:
Get-Command *conditional*
Get-Command *named*

Combined, we get a total of eight cmdlets dealing with Conditional Access Policies and Named Location Policies:
Get-AzureADMSConditionalAccessPolicy
New-AzureADMSConditionalAccessPolicy
Remove-AzureADMSConditionalAccessPolicy
Set-AzureADMSConditionalAccessPolicy

Get-AzureADMSNamedLocationPolicy
New-AzureADMSNamedLocationPolicy
Remove-AzureADMSNamedLocationPolicy
Set-AzureADMSNamedLocationPolicy

Conditional Access Policies determine the conditions under which users receive access to apps.  These conditions can include locations, one or more users, applications, platforms, and more.

Figure 1: Properties of a new Conditional Access Policy

In the following examples, we examine these conditions to see what we can configure with PowerShell.

Creating a New Conditional Access Policy

A greenfield, or new tenant, has no Conditional Access Policies.  To utilize Conditional Access, we need to build its conditions.  If Named Locations are required, we need to create the Named Location first.  Let us walk through this process using an example scenario.

Named Locations

Conditional Access Policies can contain Named Locations that correspond to physical locations in an organization’s environment. A Named Location can be a physical location with its corresponding IP subnet/range, or a single country or set of countries.  We can use Named Locations as a condition that controls where users are prompted (or not prompted) for MFA or other optional actions.  As we saw above, the AzureAD module includes four PowerShell cmdlets for managing Named Locations, which run the typical gamut of Get, New, Remove, and Set PowerShell verbs.

In a new, or greenfield, Azure AD tenant there are no Named Locations that can be used by Conditional Access, so we need to create them either in the Azure AD portal or with PowerShell.  To create a new Named Location policy, we use the New-AzureADMSNamedLocationPolicy cmdlet.

When creating a new Named location, we need to keep a couple of things in mind:

  • Display Name is required
  • We need to choose either an IP range or a list of countries, as this determines the type of Named Location we are creating.

For the first Named Location, we can use some base criteria – Chicago Office, IP Range of 10.25.0.0 with a subnet mask of 16 bits and we will mark this location as a trusted location. A second location can also be created for a New York Office, with an IP range of 10.26.0.0 and the same subnet mask of 16 bits. PowerShell is required to create the Named Location.  Notice that there is an MS Graph object for the IP subnet of the office in the example below.

Example:
IT wants a Conditional Access Policy to force multi-factor authentication (MFA) for all cloud apps unless users access apps from two locations.  The locations are both physical offices in Chicago and New York, with subnets of 10.25.0.0/16 and 10.26.0.0/16, respectively.  We will first create the two Named Locations using New-AzureADMSNamedLocationPolicy and then create a new CA Policy using New-AzureADMSConditionalAccessPolicy to reference the locations:
$CHISubnet = New-Object -TypeName Microsoft.Open.MSGraph.Model.IpRange
$CHISubnet.cidrAddress = '10.25.0.0/16'
New-AzureADMSNamedLocationPolicy -OdataType "#microsoft.graph.ipNamedLocation" -DisplayName 'Chicago Office' -IsTrusted $True -IpRanges $CHISubnet
$NYSubnet = New-Object -TypeName Microsoft.Open.MSGraph.Model.IpRange
$NYSubnet.cidrAddress = '10.26.0.0/16'
New-AzureADMSNamedLocationPolicy -OdataType "#microsoft.graph.ipNamedLocation" -DisplayName 'New York Office' -IsTrusted $True -IpRanges $NYSubnet

We can validate the properties of the locations with the Get-AzureADMSNamedLocationPolicy cmdlet, which we should do before proceeding:

Figure 2: New Named Locations and the configured settings in the red rectangles.

** To be fair, the same output is also generated when creating a Named Location, but the above illustrates what can be seen with the Get-* portion of the Named Location cmdlets.
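As a quick sketch of that validation (assuming the Named Locations created above and an active Connect-AzureAD session), we can pull one location by display name and expand its IP ranges to confirm the CIDR value:

```powershell
# Retrieve the Chicago Named Location and inspect its key properties.
# Sketch only; assumes the 'Chicago Office' location created earlier exists.
$chicago = Get-AzureADMSNamedLocationPolicy | Where {$_.DisplayName -eq 'Chicago Office'}
$chicago | Select Id, DisplayName, IsTrusted
$chicago.IpRanges | Select CidrAddress
```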

With the Named Locations created, we can now use them in a CA Policy (keep in mind there are a lot of settings in a CA Policy).  First, we need an object to hold the Conditions for the CA Policy (Applications, Users, and Locations). A breakdown of the available conditions in ConditionalAccessConditionSet is defined here.
$CAConditions = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessConditionSet
$CAConditions.Applications = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessApplicationCondition
$CAConditions.Applications.IncludeApplications = 'All'

In this section, we define the users to apply the Conditional Access Policy to (Users.IncludeUsers) as well as any users we wish to exclude (Users.ExcludeUsers).  In this case, the excluded user is a Break Glass account that is excluded from any policies we define and is shown below as the Object GUID for the user in Azure AD.  The GUIDs for the locations are in the Id field, as seen in Figure 2.
$CAConditions.Users = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessUserCondition
$CAConditions.Users.IncludeUsers = 'All'
$CAConditions.Users.ExcludeUsers = '22561a78-a72e-4d39-898d-cd7c57c84ca6'
$CAConditions.Locations = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessLocationCondition
$CAConditions.Locations.IncludeLocations = '0743ff81-cea0-40d2-b0f0-4028cdc1c61a','0f0c7a7f-4863-480e-9b71-d8f9eddb37e4'

** Be careful with the ‘All’ Applications AND ‘All’ Users settings, as this also affects the Azure AD portal, which means you could get locked out of the portal and be unable to change your CA Policies.  Make sure to have a Break Glass account created and excluded as shown here [Users.ExcludeUsers].  For more information on Break Glass accounts, refer to this blog post.

Next, we need to configure Grant Controls for the MFA requirement.  As with the Conditions above, we need a Graph object and must provide an operator (‘Or’ / ‘And’) as well as the control, in our case MFA:
$CAControls = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessGrantControls
$CAControls._Operator = "OR"
$CAControls.BuiltInControls = "Mfa"

** A list of available BuiltInControls can be found here.

Summary of Policy Settings:

  • Applies to all Cloud Apps
  • Applied to all users in Azure AD
  • Excluded from applying to one account
  • Enforces MFA for these users and these apps
  • Two Named Locations included, where MFA will not be enforced

We now have our $CAConditions object and $CAControls object populated with the required settings and can use this to create a CA Policy per our initial requirements:
New-AzureADMSConditionalAccessPolicy -DisplayName "CloudApps-MFA-CorpWide" -State "Enabled" -Conditions $CAConditions -GrantControls $CAControls

Unfortunately, all the settings put in place for this CA Policy are obfuscated by ‘System.Collections’ as the properties contain complex data sets. To validate settings for the CA Policy we must pick apart each condition to see what is present. Applications, Users and Locations are all sub-properties of the Conditions property on a CA Policy; thus we can break each of these sections down like so:
((Get-AzureADMSConditionalAccessPolicy).Conditions).Applications

((Get-AzureADMSConditionalAccessPolicy).Conditions).Users

((Get-AzureADMSConditionalAccessPolicy).Conditions).Locations

In all cases, Applications, Users and Locations are listed in the Conditional Access Policies as ObjectIds.  If a report were generated for management, additional PowerShell queries would be needed to convert these values to names.  Converting the ObjectIds to names would also help with verification or documenting/auditing CA Policies.
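A minimal sketch of that conversion, assuming the cmdlets used earlier in this article and a single CA Policy, might look like this:

```powershell
# Resolve the GUIDs stored in a CA Policy back to display names.
$policy = Get-AzureADMSConditionalAccessPolicy | Select -First 1
# Excluded users resolve through Get-AzureADUser.
Foreach ($userId in $policy.Conditions.Users.ExcludeUsers) {
    (Get-AzureADUser -ObjectId $userId).DisplayName
}
# Included locations resolve through Get-AzureADMSNamedLocationPolicy.
Foreach ($locId in $policy.Conditions.Locations.IncludeLocations) {
    (Get-AzureADMSNamedLocationPolicy -PolicyId $locId).DisplayName
}
```

Note that IncludeUsers and IncludeLocations can also hold sentinel values such as ‘All’ or ‘AllTrusted’, which would need to be skipped before performing a lookup.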

Altering an Existing Conditional Access Policy

Now that we have a CA Policy in place, we can use PowerShell to update Applications, Users, Locations and Controls as well as other unconfigured items like Platforms and Sign-In Risk.

Changing the Enforcement of a CA Policy
One change that could be implemented is changing the state of the CA Policy.  CA Policies have three states: EnabledForReportingButNotEnforced, Enabled and Disabled.  For example, if a new CA Policy is in place and users are prompted for MFA when inside a corporate location, then the policy could potentially be disabled or set to report only until troubleshooting is performed:
Set-AzureADMSConditionalAccessPolicy -PolicyId 21bdd8c1-bb5c-40d1-a4de-926888c69163 -State Disabled
Set-AzureADMSConditionalAccessPolicy -PolicyId 21bdd8c1-bb5c-40d1-a4de-926888c69163 -State EnabledForReportingButNotEnforced

With the policy effectively disabled, IT can now make changes to the policy, validate them, and then re-enable the policy:
Set-AzureADMSConditionalAccessPolicy -PolicyId 21bdd8c1-bb5c-40d1-a4de-926888c69163 -State Enabled

Changing CA Policy Controls
In the previous example, there is a requirement for an MFA prompt when using a cloud app and connecting from outside the two Named Locations.  These controls can be adjusted if additional security requirements arise or if alternatives are available (Azure AD Joined / Compliant Device).  Controls on CA Policies can also be combined (AND), rather than requiring just one control (OR) as in the previous example.

For example, if we want to require that when a user accesses a cloud app while external, they must do so from a Compliant Device AND be prompted for MFA, we can create a new Controls object and then apply it to the existing CA Policy.  A Compliant Device is a device that meets a predefined list of criteria, such as OS version level, not being jailbroken, etc. First the Controls:
$CAControls = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessGrantControls
$CAControls._Operator = "AND"
$CAControls.BuiltInControls = "Mfa","CompliantDevice"

Notice that we have an AND operator, which declares that both Controls need to be true.  Also of note is that the configured Controls are MFA and a Compliant Device.  These controls are then applied to an existing CA Policy, overwriting any existing settings:
Set-AzureADMSConditionalAccessPolicy -PolicyId 21bdd8c1-bb5c-40d1-a4de-926888c69163 -GrantControls $CAControls

Before:

After:

 

Removing a Conditional Access Policy

When a CA Policy is no longer needed, removing it is a one-liner:
Remove-AzureADMSConditionalAccessPolicy -PolicyId 7ac2fcee-d5bd-4003-ad10-5dfa838417da

No prompt is given, so be careful with the removal of CA Policies.
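Because of that, a small wrapper that shows which policy is about to be deleted before calling the cmdlet is a reasonable safeguard (a sketch only, using the display name created earlier in this article):

```powershell
# Confirm the target policy by display name before removing it.
$policy = Get-AzureADMSConditionalAccessPolicy | Where {$_.DisplayName -eq 'CloudApps-MFA-CorpWide'}
If ($policy) {
    Write-Host "Removing CA Policy '$($policy.DisplayName)' ($($policy.Id))"
    Remove-AzureADMSConditionalAccessPolicy -PolicyId $policy.Id
}
```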

Named Locations

As mentioned previously, Conditional Access Policies use Named Locations to correspond to physical locations in an organization’s environment. We can use the Get-AzureADMSNamedLocationPolicy cmdlet to compare what is shown in PowerShell with what is displayed in the Azure AD portal for Named Locations. This is useful for validating a configuration as well as understanding any differences between the two displays:

Figure 3: Breakdown of a Named location in PowerShell and in the Conditional Access section of AzureAD.

Modify Existing Named Location

Organizations change over time which might mean that changes are needed for named locations. For example, you might need to rename locations, add subnets, remove subnets, mark them as trusted, or remove the trust. All these changes can be performed with the Set-AzureADMSNamedLocationPolicy cmdlet. Let us see what it takes to make each of these changes:

Change Named Location ‘Name’
Set-AzureADMSNamedLocationPolicy -PolicyId 0743ff81-cea0-40d2-b0f0-4028cdc1c61a -DisplayName 'Chicago Loop Office'
Change IP Range
$ipRanges = New-Object -TypeName Microsoft.Open.MSGraph.Model.IpRange
$ipRanges.cidrAddress = '10.25.0.0/16'
Set-AzureADMSNamedLocationPolicy -PolicyId 0743ff81-cea0-40d2-b0f0-4028cdc1c61a -IpRanges $ipRanges -OdataType "#microsoft.graph.ipNamedLocation"
Trusted
Set-AzureADMSNamedLocationPolicy -PolicyId 0743ff81-cea0-40d2-b0f0-4028cdc1c61a -IsTrusted $False
Set-AzureADMSNamedLocationPolicy -PolicyId 0743ff81-cea0-40d2-b0f0-4028cdc1c61a -IsTrusted $True

Removing a Named Location

When offices close or are no longer needed for a Conditional Access Policy, the Named Locations can be removed if there is a desire to do so. Leaving the Named Locations behind does not create a security risk. We first need to determine if there are any existing Conditional Access Policies that contain the location. We can pick these out from the Conditions property of the CA Policy and look at the Locations sub-property, which stores the location as a System.Collections.Generic.List:

 

We can reveal the GUID:
((Get-AzureADMSConditionalAccessPolicy).Conditions).Locations

Let us assume we need to remove our Chicago Office, which is a Named Location:

 

The ‘Id’ value is what we can use to match the Locations.IncludeLocations value in the CA Policy. To find any CA Policy with this Named Location, we need to match the value like so:
$OldNamedLocationId = (Get-AzureADMSNamedLocationPolicy | Where {$_.DisplayName -eq 'Chicago office'}).Id
$CAPolicies = Get-AzureADMSConditionalAccessPolicy
Foreach ($CAPolicy in $CAPolicies) {
    If ($OldNamedLocationId -eq ((($CAPolicy).Conditions).Locations).IncludeLocations) {
        Write-Host "CA Policy $($CAPolicy.Id) contains the Named Location."
    }
}

We may have only one location, or we may have multiple locations, but all matching entries will be displayed:
CA Policy 4e7dac6d-d471-483f-9677-46407acaa3f5 contains the Named Location.

We can decide to either remove the policy or just remove the Named Location. If the Named Location is the only one, then we would need to replace it to keep the CA Policy. For this example, we will replace the location with another Named Location. First, find any non-Chicago Office Named Locations:
Get-AzureADMSNamedLocationPolicy | Where {$_.DisplayName -ne 'Chicago office'} | ft DisplayName,Id

We then use the ID from this location to replace the one we need to remove:
$OldNamedLocationId = (Get-AzureADMSNamedLocationPolicy | Where {$_.DisplayName -eq 'Chicago office'}).Id
$NewNamedLocationId = (Get-AzureADMSNamedLocationPolicy | Where {$_.DisplayName -eq 'LA Office'}).Id
$conditions = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessConditionSet
$conditions.Locations = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessLocationCondition
$conditions.Locations.IncludeLocations = "$NewNamedLocationId"
Foreach ($CAPolicy in $CAPolicies) {
    If ($OldNamedLocationId -eq ((($CAPolicy).Conditions).Locations).IncludeLocations) {
        Set-AzureADMSConditionalAccessPolicy -PolicyId $($CAPolicy.Id) -Conditions $Conditions
    }
}

Now we re-run the same code block we ran to find the CA Policies with the Named Location:
$NamedLocationId = (Get-AzureADMSNamedLocationPolicy | Where {$_.DisplayName -eq 'Chicago office'}).Id
$CAPolicies = Get-AzureADMSConditionalAccessPolicy
Foreach ($CAPolicy in $CAPolicies) {
    If ($NamedLocationId -eq ((($CAPolicy).Conditions).Locations).IncludeLocations) {
        Write-Host "CA Policy $($CAPolicy.Id) contains the Named Location."
    }
}

Now no results are returned, and we are safe to remove the Named Location:
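With no CA Policy referencing the location, the removal itself is a single Remove-AzureADMSNamedLocationPolicy call (using the same Id lookup as before):

```powershell
# Look up the Named Location by display name and remove it.
$OldNamedLocationId = (Get-AzureADMSNamedLocationPolicy | Where {$_.DisplayName -eq 'Chicago office'}).Id
Remove-AzureADMSNamedLocationPolicy -PolicyId $OldNamedLocationId
```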

**Tip: Make sure to remove a Named Location from any Conditional Access Policy before deleting it
If we try to remove a Named Location that is still referenced by a Conditional Access Policy, an error is generated. This is because the Named Location is attached to the Conditional Access Policy and needs to be removed from the policy first.

Figure 4: Attempting to remove a Named Location before removing it from a Conditional Access policy.

Final Thoughts

From the article, we can see that Microsoft provides a set of PowerShell cmdlets that offer a convenient access point to Conditional Access Policies. These CA Policies can be created, modified, and even removed with these cmdlets. We also have cmdlets to create Named Locations, which can, in turn, be used by the Conditional Access Policies. Potential uses for these cmdlets span from creation and management of CA Policies to documentation and auditing of policies for security personnel/consultants needing to verify the security stance of an organization. Make sure to read the documentation links provided in this article as well as the Get-Help output for the cmdlets for guidance on the values to use for the various moving parts of a policy – Locations, Platforms, Grant Controls, Session Controls, and more.

Source Practical365


How to Define Custom Sensitive Information Types for Use in DLP Policies


New Interface and New Capabilities Make It Easier to Manage Sensitive Information Types

Sensitive information types are used by Microsoft 365 components like DLP policies and auto-label (retention) policies to locate data in messages and documents. Earlier in January, Microsoft released a set of new sensitive information types to make it easier to detect country-specific sensitive data like identity cards and driving licenses. Message center notification MC233488 (updated January 23) brings news of two additional changes rolled out in late January and early February.

Confidence Levels for Matching

First, the confidence levels used to describe match accuracy are changing from percentages (65%, 75%, 85%) to low, medium, and high. Existing policies pick up the new descriptors automatically. This is Microsoft 365 roadmap item 68915.

The change is intended to make it easier for people to better understand the accuracy of matches when using sensitive information types. DLP detects matches in messages and documents by looking for patterns and other forms of evidence. The primary element (like a regular expression) defined in the pattern for a sensitive information type defines how basic matching is performed for that type. Secondary elements (like keyword lists) add evidence to increase the likelihood that a match exists. The more evidence is gathered, the higher the match accuracy and the confidence level increases from low to medium to high. Policy rules use the confidence level to decide what action is necessary when matches are found. For instance, a rule might block documents and email when a high confidence exists that sensitive data is present while just warning (with a policy tip) for lower levels of confidence.

Copying Sensitive Information Types

The second change is that you can copy sensitive information types, including the set of sensitive information types distributed by Microsoft. For instance, let’s assume that you don’t think that the standard definition for a credit card number works as well as it could. You can go to the Data Classification section of the Compliance Center, select the credit card number sensitive information type, and copy it to create a new type (Figure 1). The copy becomes a custom sensitive information type under your control to tweak as you see fit.

Copying the sensitive information type for a credit card number
Figure 1: Copying the sensitive information type for a credit card number

Improved User Interface

The third change is the most important. Figure 1 is an example of a new user interface to manage sensitive information types (Microsoft 365 roadmap item 68916). The new interface is crisper and exposes more information about how information types work. For instance, in Figure 1, we can see that the primary element for credit card number detection is a function to detect a valid credit card number. Further evidence (supporting elements) come from the presence of keywords like a credit card name (for example, Visa or MasterCard) and expiration date near a detected number.

Few are likely to have the desire to tweak the standard sensitive information types. However, being able to examine how Microsoft puts these objects together is instructive and helpful when the time comes to create custom sensitive information types to meet business requirements.

Detecting Azure AD Passwords

For instance, MVP James Cussen points out that Azure AD passwords are not included in the list of sensitive information types. While some people need to send passwords around in email and Teams messages, it’s not the most secure way of transmitting credentials. In this post, he uses the old interface to define a sensitive information type to detect passwords. To test the new interface, I used his definition as the basis for a new custom sensitive information type.

The primary element to match passwords is a regular expression:

A bunch of suggested expressions to detect passwords can be found on the internet. Most fail when submitted for use with a sensitive information type because they break Microsoft’s rules for detecting invalid or inefficient expressions. Not being a Regex expert, I tried several (all worked when tested against https://regex101.com/), and all failed except the one created by James.
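To illustrate the general shape of such a pattern (this is a hypothetical example, not James’s actual expression, which is available in his linked post), a lookahead-based regular expression can be tried locally in PowerShell before submitting it to the Compliance Center:

```powershell
# Hypothetical password pattern: 8-16 non-space characters containing at least
# one digit and one special character. Illustration only - not the expression
# from James's post, and not validated against Microsoft's efficiency rules.
$pattern = '(?=\S*\d)(?=\S*[!@#$%^&*])\S{8,16}'
"Here's your new password: 515AbcSven!" -match $pattern
'no secrets in this sentence' -match $pattern
```

An expression like this still needs to pass Microsoft’s validation when saved, which is exactly where many internet-sourced patterns fail.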

A keyword list is a useful secondary element to add evidence that a password is found. The list contains a comma-separated set of common words that you might expect to find close to a password. For instance:

“Here’s your new password: 515AbcSven!”

“Use this pwd to get into the site ExpertsAreUs33@”

In multilingual tenants, the ideal situation is to include relevant words in different languages in the keyword list. For instance, if the tenant has Dutch and Swedish users, you could include wachtwoord (Dutch) and lösenord (Swedish). To accommodate the reality that people don’t always spell words correctly, consider adding some misspelt variations of keywords. In this instance, we could add keywords like passwrod or pword.

James’s definition allows keywords to be in a window of 300 characters anchored on the detected password (see this Microsoft article to understand how the window works). I think this window is too large and might result in many false positives. The keyword is likely to be close to the password, so I reduced the window to 80 characters.

Figure 2 shows the result after inputting the regular expression, keyword list, confidence level (medium), and character proximity. It’s a less complex definition than Microsoft’s sensitive information types. The big question is: does it work?

Definition for the Azure Active Directory password custom sensitive information type
Figure 2: Definition for the Azure Active Directory password custom sensitive information type

Testing

The Test option allows you to upload a file containing sample text to run against the definition to see if it works. As you can see in Figure 3, the test was successful.

Testing a custom sensitive information type
Figure 3: Testing a custom sensitive information type

Using the Custom Sensitive Information Type in a Teams DLP policy

Testing gives some confidence that the custom sensitive information type will work properly when deployed in a DLP policy. After quickly creating a DLP policy for Teams, we can confirm its effectiveness (Figure 4) in chats and channel conversations.

Passwords blocked in a Teams chat
Figure 4: Passwords blocked in a Teams chat

I deliberately chose Teams as the DLP target because organizations probably don’t want their users swapping passwords in chats and channel conversations. Before rushing to extend the DLP policy to cover email, consider the fact that it’s common to send passwords in email. For instance, I changed the policy to cover email and Teams and discovered that the policy blocks any invitation to Zoom meetings because these invitations include the word “pwd” as in:

Although it might be an attractive idea to block Zoom to force people to use Teams online meetings instead, it’s not a realistic option. The simple solution is not to use this DLP policy for email.

False Positives and Policy Overrides

The downside of matching text in messages against keywords defined in a policy is that some false positives can happen. For instance, I have a Flow to import tweets about Office 365 into a team channel. As Figure 5 shows, some tweets are picked up as potential password violations because a keyword appears near a string which could be a valid password.

Tweets posted in Teams are blocked because they match the password definition
Figure 5: Tweets posted in Teams are blocked because they match the password definition

Adjusting the definition for the sensitive information type to reduce the character proximity count (from 80 to 60) reduced the number of false positives. Testing and observation will tell how effective changes like this are when exposed to real-life data.

Apart from adjusting character proximity, two other potential solutions exist. First, amend the DLP policy to allow users to override the block and send the message with a justification reported to the administrator. If the message is important, users will override the policy. The administrator will be notified when overrides occur and tweak the policy (if possible) to avoid further occurrences.

The second option is to exclude accounts (individually or the members of a distribution list) which have a business need to send passwords from the DLP policy. DLP will then ignore messages sent by these individuals.

Creating Custom Sensitive Information Types Is a Nice to Have

Given the broad range of standard types created by Microsoft, the need to define custom sensitive information types isn’t likely to be a priority for most tenants. However, for those who need this feature for business reasons, the recent changes are both welcome and useful.

Source Practical365


Azure AD Suffers Another Big Authentication Outage


Failure to Authenticate Causes Users to Lose Access to Apps

Microsoft 365 users around the world were unimpressed on Monday, March 15, when failing Azure AD authentication requests blocked their access to apps like Teams and Exchange Online. Downdetector.com started to report problems with users being unable to sign into Teams and other Office 365 apps around 19:00 UTC. The number of reports grew rapidly (Figure 1) for incident MO244568.

Downdetector.com reports a problem with Teams
Figure 1: Downdetector.com reports a problem with Teams

The Azure status page said “Starting at approximately 19:15 UTC on 15 Mar 2021, a subset of customers may experience issues authenticating into Microsoft services, including Microsoft Teams, Office and/or Dynamics, Xbox Live, and the Azure Portal.” The difference between the time users detected issues and Microsoft declaring an incident isn’t unusual, as Microsoft engineers need time to figure out if reported problems are due to a transient hiccup or something deeper.

Why Some Users Could Continue to Work

Based on my own experience, it seemed as if apps continued to work if they didn’t need to make an authentication request to Azure AD. In my case, I was connected to the Microsoft tenant in Teams and couldn’t switch back to my home tenant because the Teams client wasn’t able to authenticate with that tenant. As anyone who has ever looked at the Teams activity in the Office 365 audit log can testify, Teams desktop clients authenticate hourly when their access token expires. This might be the reason why Microsoft highlighted the effect on Teams in their communications for the Microsoft 365 health status page (Figure 2).

Microsoft 365 service health status page
Figure 2: Microsoft 365 service health status page

The Microsoft 365 Service health dashboard wasn’t much help (Figure 3) because it needs to authenticate to the Office 365 Services Communications endpoint (here’s an example of how to retrieve the same incident data with PowerShell). Because authentication failed, the Service health dashboard couldn’t retrieve incident details. No doubt some ISVs will make the point that relying on Microsoft APIs during a major outage might be a case of putting all your eggs in one proverbial basket.

No joy in the Microsoft 365 service health dashboard
Figure 3: No joy in the Microsoft 365 service health dashboard

If an app didn’t need to authenticate, it continued to work quite happily. I used Outlook desktop and the browser interfaces to SharePoint Online, OneDrive for Business, OWA, and Planner during the outage. On the other hand, while writing this article, Word couldn’t upload the document to SharePoint Online or autosave changes because of an authentication failure. Another interesting problem was that messages sent to a Teams channel email address failed because the connector used to deliver the email to Teams couldn’t authenticate.

Similar Authentication Woes in September 2020

At first glance, this problem seems similar to the September 28/29 Azure AD outage last year. Microsoft said then that “A latent code defect in the Azure AD backend service Safe Deployment Process (SDP) system caused this to deploy directly into our production environment.” In other words, a code change containing a bug made its way into production, which seems very much like the reason cited by the Microsoft 365 status Twitter account (@msft365status) at 20:10 UTC that the problem was due to “a recent change to an authentication system.”

Unfortunately, as I wrote in February 2019, Azure AD is the Achilles Heel of Office 365. When everything works, it’s great. When Azure AD falls over, everything comes to a crashing halt across Microsoft 365. This is despite Microsoft’s December 2020 announcement of a 99.99% SLA for Azure AD authentication, which comes into effect on April 1, 2021. That is, if you have Azure AD Premium licenses.

Quite reasonably, people asked why Microsoft had deployed code changes on the first day of the working week. That’s a good question. Although people use Office 365 every day, weekend demand on the service is much lower than Monday-Friday, so it’s fair to expect Microsoft to make changes which might impact users then. Or, even better, to test their software before allowing code to make it through to production services.

Current Status

As of 21:15 UTC, Microsoft said “We’ve identified the underlying cause of the problem and are taking steps to mitigate impact. We’ll provide an updated ETA on resolution as soon as one is available.” At 21:25 UTC the status became a little more optimistic “We are currently rolling out a mitigation worldwide. Customers should begin seeing recovery at this time, and we anticipate full remediation within 60 minutes.”

After the mitigation rolled out, the problem appears to be resolved (22:10 UTC). Microsoft says “The update has finished its deployment to all impacted regions. Microsoft 365 services continue the process of recovery and are showing decreasing error rates in telemetry. We’ll continue to monitor service health as availability is restored.” Normal service appears to have resumed for Teams and other apps, at least in the tenants I access.

On March 16, Microsoft noted that some organizations might still see some Intune failures. They anticipate that remediation actions taken by 21:00 UTC will address these lingering problems.

Problem sorted for now, but it will be interesting to see what Microsoft’s Post Incident Report says.

Source Practical 365


How to Decide Between Azure AD Connect and Azure AD Connect Cloud Sync


Microsoft recently announced that Azure AD Connect Cloud Sync had reached GA (general availability), adding another option for directory synchronization with Microsoft 365. This article provides a background on directory synchronization and why it is fundamental for your journey to the cloud. Then we will discuss the solutions and give you the information you need to pick the right solution. Let’s begin with some basics.

What is Azure AD Sync, and Why Do I Need It?

Most organizations run Active Directory on-premises. This directory is usually the source of authority for all users, groups, and computers in a Windows domain. The domain provides a way to centrally manage accounts, passwords, policies, and permissions on-premises.

When an on-premises organization decides to use Microsoft 365, it needs a way to bring those on-premises accounts into Azure AD to use the new cloud services like Exchange Online, Teams, SharePoint Online, etc. Most organizations want to use their existing on-premises accounts rather than create new accounts and manage different passwords. That is where Azure AD Connect comes in. Both Azure AD Connect and Azure AD Connect Cloud Sync synchronize and link objects from AD to Azure AD and synchronize password hashes (not passwords) to maintain a single sign-on experience.

Azure AD Connect

Azure AD Connect has a long and storied past. It is based on Microsoft Identity Manager (MIM), which is used to bridge multiple on-premises authoritative systems and authentication stores. MIM is the sixth generation of Microsoft identity management solutions since the company acquired two similar technologies in 1997 and 1999.

While MIM can be expensive and bridges multiple authoritative directories, Azure AD Connect is free and purpose-built to bridge Active Directory with Azure Active Directory. This is known as hybrid identity.

Azure AD Connect is installed on an on-premises domain-joined server and is even supported on a domain controller. It requires only an outbound HTTPS connection to Microsoft 365 servers.

Capabilities

Since its humble beginnings of syncing a single AD to a single Azure AD tenant, Azure AD Connect’s capabilities have expanded significantly. Currently, this includes:

  • Synchronization between
    • Single forest, single Azure AD tenant.
    • Multiple forests, single Azure AD tenant.
    • Single or multiple forests, multiple Azure AD tenants (requires that each object is only represented once in all tenants).
    • LDAPv3-compatible identity stores.
  • Password Hash Synchronization (PHS) – use Azure AD as your organization’s identity provider by synchronizing password hashes to Azure AD.
  • Pass-Through Authentication (PTA) – use your organization’s Domain Controllers as your identity provider without having to deploy a full-blown AD FS configuration.
  • Federation integration with Active Directory Federation Services (AD FS).
  • Health monitoring of both Active Directory and the synchronization process.
  • Accommodating up to 10GB of database space (up to 100,000 objects) using LocalDB. If your organization exceeds this limit, use a full SQL Server.
  • Organizational Unit, group, or attribute filtering.
  • Exchange hybrid writeback capabilities for organizations with Exchange Server.
  • Exchange Public Folder address synchronization for directory-based edge blocking.
  • Password writeback capabilities to support self-service password reset (SSPR).
  • Office 365 Group writeback to prevent email address overlaps.
  • Directory extension attribute synchronization to extend the schema in Azure AD to include specific attributes consumed by LOB apps and Microsoft Graph Explorer.
  • Robust synchronization rule editing capabilities.
  • Seamless single sign-on (SSSO) capabilities that allow domain-joined users and computers to access Microsoft 365 workloads without being prompted to sign in every time.
  • Hybrid Azure AD Join capabilities.
  • Device writeback capabilities that allow organizations to use on-premises conditional access and Windows Hello.
  • Synchronizing directory changes every 30 minutes and password changes almost immediately when using password hash sync.
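The scheduling behavior described in the last bullet can be inspected and controlled with the ADSync PowerShell module that installs alongside Azure AD Connect. As a sketch (run on the Azure AD Connect server itself; cmdlet availability depends on your installed version):

```powershell
# The ADSync module is installed with Azure AD Connect on the sync server.
Import-Module ADSync

# Inspect the scheduler; SyncCycleInterval reflects the default 30-minute cadence
Get-ADSyncScheduler

# Trigger an immediate delta sync rather than waiting for the next cycle
Start-ADSyncSyncCycle -PolicyType Delta

# A full (initial) sync is normally only needed after configuration changes:
# Start-ADSyncSyncCycle -PolicyType Initial
```

A delta sync picks up only changes since the last cycle, so it is the right choice for day-to-day use.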

Read more about Azure AD Connect: How it works and best practices for synchronizing your data

Azure AD Connect Cloud Sync

Microsoft recognizes the irony that your organization's journey to the cloud first requires installing more software on-premises. Azure AD Connect Cloud Sync is a cloud service alternative to the Azure AD Connect software. The organization deploys one or more lightweight agents in its on-premises environment to bridge AD and Azure AD, and all configuration is done in the cloud.

The service provides some of the features and capabilities that Azure AD Connect provides, making it useful for some merger and acquisition scenarios. It is important to note that Azure AD Connect Cloud Sync does not support Exchange hybrid, which reduces the number of useful scenarios.

Capabilities

Azure AD Connect Cloud Sync has many of the same features and capabilities as Azure AD Connect with the following differences:

  • Lightweight agent installation model.
  • Adds high availability using multiple agents.
  • Allows connectivity to multiple disconnected on-premises AD forests.
  • Synchronizes directory changes more frequently than Azure AD Connect.
  • Can be used in addition to Azure AD Connect.
  • Does not support Exchange hybrid writeback.
  • Does not support LDAPv3-compatible identity stores.
  • Does not support device objects.
    • No hybrid Azure AD join.
    • No support for Windows Hello.
  • Does not support directory attribute synchronization.
  • Does not support Pass-Through Authentication (PTA).
  • Does not support synchronization rule editing capabilities.
  • Does not support writeback for passwords, devices, or groups.
  • Does not support cross-domain references.

As you can see, there are several gaps in functionality that limit the use of Azure AD Connect Cloud Sync. Microsoft may close these gaps with future updates; because this is a cloud-based service, they can iterate rather quickly. I would not expect Exchange hybrid support anytime soon, though.

Appropriate Use Cases for Each

Choosing which directory synchronization solution to use requires a full understanding of what your organization’s needs are.

Azure AD Connect has the most features and compatibility. Almost all customers I encounter use Exchange Server or Exchange Online. The lack of Exchange hybrid support with Azure AD Connect Cloud Sync limits the use of that solution.

If you don’t need Exchange hybrid support or any of the other unsupported features, Azure AD Connect Cloud Sync can be a quick and easy way to configure AD directory synchronization with Azure AD. Examples include mergers and acquisitions where the organization being acquired has limited IT experience. By installing a simple, lightweight agent on a domain server, the acquiring organization can configure and manage directory synchronization from their tenant.

The marketing slides and videos introducing Azure AD Connect Cloud Sync often talk about the “heavy infrastructure investment” required for Azure AD Connect. A LocalDB database is installed with Azure AD Connect and has a 10GB limit (about 100,000 objects). Unless your organization’s Active Directory exceeds this, there is no requirement for additional infrastructure at all. Azure AD Connect can be installed on any existing domain-joined server running Windows Server 2012 or later or directly on a domain controller. It only requires an outbound HTTPS connection to the Internet.

Organizations with over 100,000 objects would likely save money with Azure AD Connect Cloud Sync since it does not require a full SQL server deployment. Still, organizations this size are usually running Exchange.

A scenario where Azure AD Connect Cloud Sync might be useful is one where an organization has AD on-premises but uses Google Workspace for email. This organization can sync their directory to Azure AD and then begin migrating Google mail to Exchange Online.

Azure AD Connect Cloud Sync is also the appropriate choice when connecting to multiple disconnected on-premises AD forests, since Azure AD Connect requires line-of-sight connectivity between the forests it synchronizes. This can be useful in some merger and acquisition scenarios.

Ultimately, you should deploy Azure AD Connect Cloud Sync if it provides the features and compatibility your organization needs. Otherwise, you will need the more fully featured Azure AD Connect.

Security Considerations for Protecting Access to Azure AD Connect and Azure AD Connect Cloud Sync

Organizations should treat any server running Azure AD Connect or the Azure AD Connect Cloud Sync agent as a tier-0 asset – the same as a domain controller – since it is responsible for directory synchronization with Azure AD. Organizations should restrict administrative access to the Azure AD Connect server to only domain administrators or other tightly controlled security groups.

Azure AD Connect installation and configuration must be run with an Enterprise Admin account in AD and requires a Global Administrator account in the tenant.

Azure AD Connect Cloud Sync must be installed with an AD account with local admin permission on the server or Domain Admin permissions on a domain controller and requires a tenant account with Hybrid Identity Administrator or Global Administrator roles in the tenant.

For Azure AD Connect, the user account used to install it is automatically added to the local ADSyncAdmins security group. The best practice is to add Domain Admins to this group so more than one account can manage directory synchronization. Remove the individual user account that was used to install Azure AD Connect from this group.

The account used for configuration requires specific rights and is only used for installation or configuration. Directory synchronization will not be impacted if the account is disabled or deleted.
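The ADSyncAdmins housekeeping described above can be scripted with the built-in local-accounts cmdlets. A minimal sketch, run elevated on the Azure AD Connect server (the `CONTOSO` domain and `jsmith` installer account are placeholder names for illustration):

```powershell
# Add Domain Admins so more than one account can manage directory synchronization
Add-LocalGroupMember -Group "ADSyncAdmins" -Member "CONTOSO\Domain Admins"

# Remove the individual account that performed the installation (placeholder name)
Remove-LocalGroupMember -Group "ADSyncAdmins" -Member "CONTOSO\jsmith"

# Confirm the resulting membership
Get-LocalGroupMember -Group "ADSyncAdmins"
```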

Both synchronization solutions use the highest TLS version available in Windows Server. To ensure that Azure AD Connect and Azure AD Connect Cloud Sync use TLS 1.2, set the following registry keys, then restart the server:

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Client]
“DisabledByDefault”=dword:00000000
“Enabled”=dword:00000001
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Server]
“DisabledByDefault”=dword:00000000
“Enabled”=dword:00000001
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319]
"SchUseStrongCrypto"=dword:00000001
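The same registry values can be applied with PowerShell instead of a .reg file. A sketch, run elevated (restart the server afterwards for the change to take effect):

```powershell
# SCHANNEL client and server keys for TLS 1.2
$keys = @(
    'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Client',
    'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Server'
)
foreach ($key in $keys) {
    New-Item -Path $key -Force | Out-Null
    New-ItemProperty -Path $key -Name 'DisabledByDefault' -Value 0 -PropertyType DWord -Force | Out-Null
    New-ItemProperty -Path $key -Name 'Enabled' -Value 1 -PropertyType DWord -Force | Out-Null
}

# Require .NET Framework to use strong crypto (TLS 1.2)
New-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\.NETFramework\v4.0.30319' `
    -Name 'SchUseStrongCrypto' -Value 1 -PropertyType DWord -Force | Out-Null
```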

Summary

Both Azure AD Connect and Azure AD Connect Cloud Sync provide ways for organizations to synchronize AD with Azure AD. Both solutions are easy to deploy and provide the features that organizations need to provide a unified sign-in experience to Microsoft 365.

Understand your organization’s requirements. Azure AD Connect Cloud Sync is the preferred way to synchronize on-premises AD to Azure AD, assuming you can get by with its limitations. Azure AD Connect provides the most feature-rich synchronization capabilities, including Exchange hybrid support.

From a security perspective, treat your organization’s Azure AD Connect server or agent the same as a domain controller and other Tier 0 resources.

Source Practical 365


For companies who migrated to Azure before the pandemic, the cloud was the silver lining


In challenging times, innovative companies don’t panic, they pivot.

So when the coronavirus traveled the globe, disrupting everything from grocery shopping to tax preparation, resilient companies around the world responded with ingenious solutions that ensured the safety of their employees, met the needs of their customers and kept their IT operations humming.

For some, it meant quickly enabling employees to work from home instead of at the office, while others rushed to re-engineer their supply chains, so their customers could continue to buy the products they needed to stay safe and healthy. Their solutions were custom but the step that allowed them to react so rapidly was common: They had migrated their operations to Microsoft Azure.

Even before COVID-19, companies were hungry to enable secure remote work, achieve operational efficiency, address IT budget constraints and accelerate innovation, and migrating to the cloud was the critical first step to unlocking those accomplishments.

“Azure has always helped our diverse customers adapt to new ways of doing business,” says Jeremy Winter, director, Azure management and migration.

“While some are moving beyond crisis mode and others look to the cloud to stay resilient, we’re seeing an acceleration in business’ cloud migrations. We’ve spent a lot of time with customers who are both considering and executing migrations and understand the challenges they are facing. We also recognize that our customers are counting on us more than ever to recover and ultimately transform.”

Here’s how a handful of those Microsoft customers discovered during the health crisis that the cloud didn’t just have a silver lining – it was the silver lining.

A technician from Actavo arrives at a customer's home.

Actavo

Dublin, Ireland-based Actavo is a multi-disciplinary, global engineering services business that designs, builds and maintains the vital infrastructure we rely on every day.

Among their many services, they design, construct and maintain networks for the telecommunications and utilities industries; design, construct and install modular buildings; provide scaffold and rope access solutions along with the installation of insulation and removal of asbestos in the industrial sector; and provide hundreds of in-home technicians to companies who provide residential services like gas boiler repair, satellite installation or broadband internet installation to a half-million homes every year.

With so many essential industries relying on their round-the-clock support, even before the pandemic Actavo sought to increase operational efficiency and improve agility by migrating to Azure.

“Moving to the cloud gave us the versatility to be able to scale up and scale down in any country where we needed to operate, but it also gave us the robustness from a disaster recovery perspective,” says Willie Ryan, global Environmental Health and Safety and IT director.

A novel coronavirus may not have been the disaster that Actavo had in mind, but when it arrived, the decision to migrate to the cloud proved even more valuable. With the new, versatile infrastructure in place, Actavo was able to quickly pivot to remote work for its employees, ensuring that they could continue to support their clients around the world.

“We’ve got a very solid platform, and that was quite evident as we came through COVID and the pandemic, in terms of, we moved from an on-prem operation to everybody working from home, seamlessly, quickly, easily, without any major headaches,” Ryan says.

“Within the space of a week we were able to tick a box that said, ‘If lockdown comes, we’re ready to go.’ The lockdown announcement came at five o’clock on a Friday afternoon. We actually had everybody that needed to be working at home, working at home on Monday morning.”

Silvan Schriber, head of corporate development for Additiv

Additiv

Additiv partners with the world’s leading financial institutions to help them digitalize their wealth management activities. Their ability to advocate for change and innovation in an industry that is highly regulated is dependent on building trust with their clients by adhering to the highest standards of security and compliance.

So when the company moved its enterprise software offering for its clients to the cloud, they worked with Microsoft to ensure a seamless transition that would strictly adhere to regulations across multiple locations – a prerequisite for a global company that is headquartered in Switzerland and has offices and clients on three continents.

Now, Additiv can offer its clients innovative, flexible software-as-a-service solutions that can scale easily and that adapt to their specific needs, no matter where they’re located. “Digital collaboration means breaking down these virtual borders, and Azure allowed us to integrate better so that our clients can provide a holistic solution to their customers out of the cloud,” says Silvan Schriber, head of corporate development.

“The benefits of Azure for Additiv are 100 percent trust that the solutions adhere in each location to regulations,” he adds. “So it helps us in centrally managing our solutions and placing them in each country of our clients, with virtually the click of a button.”

That consistency and efficiency will be critical as Additiv’s financial services customers increasingly embrace digital strategies and omni-channel distribution to reimagine their businesses post-pandemic.

“The pace of change pre-pandemic was rapid, but there’s no reason to think these trends won’t accelerate post-pandemic,” the company writes in a report published earlier this year. “… the companies that are faring best are those that are either digitally native or have invested the most in digital.”

An Albertsons storefront.
Azure migration helps Albertsons Companies deliver a better shopping experience. (Photo courtesy of Albertsons Companies)

Albertsons Companies

Long before this past spring, people were changing the way they shop for groceries, and Albertsons Companies – one of the largest food and drug retailers in the United States with some 2,200 stores operating under banners including Albertsons, Safeway and Vons – was evolving along with them.

“I believe that every customer – from millennials to baby boomers – is transforming how they shop,” says Ramiya Iyer, senior vice president of Data, Digital & Merchandising at Albertsons Companies. “The retail industry is undergoing its own transformational journey to meet customers where they are and be relevant to them. It’s imperative that we do that to succeed and grow as a business.”

To stay competitive and give customers the modern, convenient shopping experience they had come to expect, Albertsons migrated from an aging on-premises server farm to Microsoft Azure.

The move allowed them to take advantage of artificial intelligence (AI) and cognitive services, and empowered an environment where developers can innovate and efficiently test new ideas, propelling Albertsons to introduce apps that make shopping faster and more convenient.

“With Azure, we can bring new ideas to market faster and deliver releases on shorter time cycles,” says Param Deivreddy, vice president, IT architecture. “We think Azure also encourages exploration and innovation, because you don’t have to spend a huge amount on infrastructure to quickly test a new idea.”

The company’s easy-to-use apps offer the kinds of personalized and streamlined experiences digitally savvy shoppers demand, allowing them to create shopping lists, easily find the products they want, skip long check-out lines, even save time at the fuel pump by claiming rewards, activating the pump and paying for gas with a tap of the phone.

“Grocery shopping shouldn’t be a chore,” says Iyer. “We want to provide our customers with a totally frictionless experience that lets them enjoy the food they’re putting on the table with a minimum of fuss and stress.”

An H&R Block storefront.
Cloud computing allows H&R Block to ramp up during tax season and enables employees to serve customers from home. (Photo courtesy of H&R Block)

H&R Block

Few things in life are certain, but as the adage goes, taxes are one of them. Even during a health crisis, Americans are responsible for filing their annual return, and millions of them lean on Kansas City, Missouri-based H&R Block to navigate the process.

To streamline its operations and improve its ability to ramp up during tax season, H&R Block migrated its computing workload to Microsoft Azure, a move that would "help the company better process millions of tax returns annually while allowing the firm to build financial software products with greater speed, quality and security," writes CIO's Clinton Boulton in a recent interview on cloud migration with H&R Block CIO Alan Lowden.

“It’s what we need to do to enable our strategic vision of putting customers at the center,” Lowden tells CIO. “We have to give them a convenient experience of serving them any way they want to be served.”

So this past tax season, H&R Block was able to spring into action to support their customers while protecting the safety of their employees and the communities they serve. To keep their workforce safe during their busiest season – which was even longer this year to give taxpayers more time to file – they shifted to a work-from-home model that allowed 80,000 tax professionals to serve customers virtually.

That quick pivot wouldn’t have been possible if the company hadn’t migrated to Azure. Because they didn’t need to buy new hardware, or set up or configure additional infrastructure, they were able to make the critical change to remote workstations in less than two weeks. The company’s tax pros were able to provide the expertise their customers needed, and the even more critical refunds they were counting on.

When the demands of tax season return to normal, H&R Block will continue to inspire confidence in their customers through a seamless experience – wherever they like, however they like.

“Now we can present tax tips and offerings to our clients that are most relevant to them,” says Aditya Thadani, vice president of architecture and information management at H&R Block. “Migrating our platforms to Azure has really allowed us to serve our clients better.”

Source Microsoft


Microsoft Sustainability Calculator Aims to Help Cloud Customers Manage their Carbon Footprint



Microsoft has announced a new tool that will help cloud customers better understand their carbon emissions. Called the Microsoft Sustainability Calculator, the product has landed in private preview ahead of a future launch.

According to Redmond, the aim of the Microsoft Sustainability Calculator is to give customers more oversight across their carbon output. More specifically, to become more transparent about what their environmental impact is.

“It’s challenging to make and meet meaningful carbon reduction goals without the ability to measure carbon emissions,” Microsoft says.

By leveraging AI technology, the calculator provides accurate accounting of carbon usage. It highlights the impact of Microsoft cloud services across an organization's footprint. Armed with this information, companies can make informed decisions about their environmental impact.

For example, the Microsoft Sustainability Calculator measures the impact of moving on-premises applications to the cloud and how this can reduce a company's carbon output.

Carbon Negative

It looks like the tool goes hand-in-hand with Microsoft’s own push to be carbon negative by 2030. Redmond made that pledge earlier this year. The decision followed a 2017 commitment to cut 75% of its carbon emissions by the same date and builds on 2019 revisions of 70% renewable energy by 2023.

“While the world will need to reach net zero, those of us who can afford to move faster and go further should do so,” Microsoft President Brad Smith said of the new commitment. “That’s why today we are announcing an ambitious goal and a new plan to reduce and ultimately remove Microsoft’s carbon footprint.”

Source Winbuzzer


Microsoft and Hitachi Announce Partnership Based on Azure Cloud Integration


Microsoft has announced an expansion of its partnership with Japanese multinational Hitachi. Specifically, the companies are entering a new era of deeper collaboration as part of a multi-year strategic alliance.

Leading the goals of the partnership is to drive transformation in logistics and manufacturing across North America, Southeast Asia, and Japan.

On its end, Hitachi will integrate its solutions into Microsoft's cloud services. For example, the HX Series IoT industrial controllers and the Lumada platform will now integrate with Microsoft Azure, Dynamics 365, and Microsoft 365.

“We are delighted to expand our partnership with Microsoft and combine our OT, IT and products excellence to provide manufacturing and logistics companies with digital solutions. We use Lumada to provide total seamless solutions to solve challenges by connecting cyberspaces with physical spaces. Through this collaboration with Microsoft, we will be able to accelerate our customers’ digital transformation and continue to deliver social, environmental and economic value,” said Jun Abe, Vice President and Executive Officer, CEO of Industry & Distribution Business Unit, Hitachi Ltd.

Partnership

Both companies will also work to combine Lumada and Azure into a new industry data platform. In a blog post to announce the alliance, Microsoft says Hitachi will bring the following benefits to its Microsoft Azure platform:

  • “Increase manufacturing productivity: Using Hitachi Digital Supply Chain as well as Azure IoT to analyze 4M data collected from manufacturing sites for the visualization and analysis of production processes to optimize factory operations and increase productivity.
  • Optimize logistics with data analytics: Increasing the logistics efficiency and reducing operational costs by analyzing traffic congestion, storage locations and delivery locations, and enabling smart routing to save miles and deliver faster through advanced digital technologies such as Azure Maps and Hitachi Digital Solution for Logistics/Delivery Optimization Service.
  • Predictive maintenance and remote assist: Enabling predictive maintenance, real-time remote assistance and remote training scenarios for first-line workers, leveraging HoloLens 2 and Dynamics 365 Remote Assist as well as other smart devices.”

Those solutions will make their debut next month in Thailand and will arrive in Japan and North America later in the year.

“Building resilient and flexible digital supply chains is critical to grow a business and meet customer needs in today’s fast-changing environments. By expanding our collaboration with Hitachi, we’ll unlock new opportunities for manufacturing and logistics companies as they strive to lead in their industries and pioneer with a data-driven mindset and digital capabilities,” said Çağlayan Arkan, Vice President Manufacturing at Microsoft.

Source Winbuzzer


How to quickly install and configure Azure AD Connect


One of the fundamental components of setting up Office 365 is installing Azure AD Connect. This tool is used to connect your on-premises Active Directory to Azure AD.

It works by synchronizing a copy of objects in the directory, such as users, groups, contacts and devices from Active Directory to Azure AD every 30 minutes. When you use Azure AD Connect, your local Active Directory remains the master copy and only selected attributes, such as those needed to support Exchange Hybrid, are written back.

Azure AD Connect supports many topologies, including a single Active Directory, multiple Active Directories and even multiple Office 365 tenants.

Before we begin, it's worth outlining that there is a variety of configuration options available, and these should be considered if you have more than basic requirements. For example, if you cannot synchronize hashes of passwords to the cloud, then you may wish to consider options like Pass-Through Authentication. If you still run older clients or do not plan to use Hybrid Azure AD Join to provide single sign-on to PCs, then you might wish to configure Seamless Single Sign-On.

In this guide, we’re not going to cover every option for installation of Azure AD Connect, as there’s a variety of ways to configure it.

This guide is aimed at those installing and configuring Azure AD Connect to support a tool like Microsoft Teams, with plans to update the configuration to support wider needs in the future. We'll show you how to install it in a similar way to the Express option, which uses the most common deployment settings.

Prerequisites for installing Azure AD Connect

Before you begin, ensure you meet the pre-requisites for installation. As a minimum, you need Windows Server 2012 or later, on a domain-joined server (or domain controller) with .NET Framework 4.5.1 and PowerShell, with at least 4GB RAM and a 70GB HDD. The server will need access to the internet, in particular access to the Azure AD Connect service. IP ranges are listed here.

It is always recommended to analyze your Active Directory for issues that will prevent accounts from synchronizing to Azure AD. The best way to do this is to use the IDFix tool. The process to run it is documented here. In particular, it is common to ensure that the User Principal Name matches users' email addresses for easy sign-in and to meet the requirement of matching one of your Office 365 tenant's custom domains.
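Alongside IDFix, a quick PowerShell sweep can surface accounts whose UPN and primary email address disagree. A sketch using the RSAT ActiveDirectory module (this complements IDFix rather than replacing it):

```powershell
# Requires the RSAT ActiveDirectory module; run against your on-premises AD
Import-Module ActiveDirectory

# List enabled users whose UPN does not match their email address -
# candidates to fix before synchronizing to Azure AD
Get-ADUser -Filter * -Properties mail |
    Where-Object { $_.Enabled -and $_.mail -and ($_.UserPrincipalName -ne $_.mail) } |
    Select-Object SamAccountName, UserPrincipalName, mail
```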

And, if you haven’t already, follow our guide on adding a custom domain to match (at a minimum) your primary SMTP domains (the ones you’ll use for sign-in) and if you will be migrating mailboxes to Exchange Online, all domains you use to receive email.

Learn more: What is Azure AD Connect Cloud Provisioning and should you plan to use it?

In this guide we’ll show you how to use Alternate Login ID for rapid identity provisioning to Office 365 applications like Microsoft Teams. If you are planning to use Exchange Hybrid, however, then you should plan to move to matching against the User Principal Name at a later date. This can be achieved fairly simply by staging the switch of User Principal Names to match users’ primary SMTP addresses, then, when complete, updating Azure AD Connect to use the User Principal Name instead.

Steps for installation

Once pre-requisites are in place we’ll begin. First, download a copy of Azure AD Connect. You’ll find this at the Microsoft Download site.

Next, run the installation tool on the server where you’ll install Azure AD Connect. When given the opportunity, choose the Customize option, unless you want to install with the Express Settings, which synchronize all accounts:

Azure AD Connect Express Settings

Next, we’ll choose sign-in options. As mentioned above, we’ve got a variety of options available, and you need to select the right one for your organization.

However, most organizations should choose Password Hash Synchronization, then choose Next:

Azure AD Connect User Sign-In

Next, enter your Office 365 Global Administrator credentials. These are used to create a synchronization account inside the Office 365 tenant and are not stored by the server:

Connect to Azure AD

We’ll then need to add our local Active Directory. In this guide, we’re assuming you are using a single AD to synchronize to Office 365. Choose Add Directory:

Azure Add Directory

At this point, use enterprise administrator credentials to add an Active Directory connection. As with Office 365, the wizard will create a synchronization account with the prerequisite permissions. Choose OK, then Next.

AD Forest Account

On the next page of the wizard, we’ll see the sign-in configuration already chosen and select the on-premises attribute to use as the Azure AD username.

If you plan to use Exchange Hybrid immediately, then you should select the default, userPrincipalName as described earlier in the article – and ensure you have run IDFix and aligned your primary SMTP address values to the UPN.

In the example below, you’ll see a warning that will show if you have a UPN suffix in your environment that is not routable, or not registered in your Office 365 tenant.

We see the warning below because the UPN suffix lab01.local is not routable and therefore cannot be verified in the tenant. However, we do have the lab01.allabout365.com domain registered (which is also our Primary SMTP address domain), so we can select Continue without matching all UPN suffixes to verified domains.

Azure AD Sign-In Configuration

If you cannot match your User Principal Name value to your custom domains right now and don’t plan to enable Exchange Hybrid immediately – for example, if you are planning on just enabling Microsoft Teams, or other services like SharePoint or OneDrive, then you can workaround this using Alternate Login ID.

This applies when your scenario matches the example user below. On the left, the email value uses the correct custom domain; however, the User logon name on the right has not yet been updated to match the custom domain value.

New properties

To use Alternate Login ID as a temporary measure, use the User Principal Name drop-down to select the mail attribute:
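Before relying on the mail attribute for sign-in, it’s worth confirming that every in-scope user actually has it populated. A quick check (reusing the same hypothetical People OU from the earlier sketch) might look like:

```powershell
# Find in-scope users whose mail attribute is empty - with Alternate Login ID
# these accounts would have no usable Azure AD sign-in name.
Import-Module ActiveDirectory

Get-ADUser -SearchBase 'OU=People,DC=lab01,DC=local' `
    -Filter 'mail -notlike "*"' |
    Select-Object Name, SamAccountName
```

Any accounts this returns should have their mail attribute populated (or be excluded from sync) before the first synchronization run.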

Azure AD Connect Mail Sign-In

Next, we’ll select the Domain and OU filtering options. You may not wish to synchronize every object in your Active Directory to Azure AD and Office 365.

For mail routing and Exchange Hybrid in particular you will want to synchronize (at a minimum) every recipient including mailboxes, mail users, mail contacts and distribution groups.

However, you may not wish to synchronize leavers’ accounts, service accounts and built-in accounts and non-mail enabled security groups. Therefore, it’s useful to use the Domain and OU filtering to select the OUs that contain valid recipients.

In the example below, we know that all recipients are within the People OU, so we’ll select the Sync selected domains and OUs radio button, select the checkbox for the OU, then choose Next:

Domain and OU Filtering

After completing the configuration and selecting any optional features to enable, we are ready to configure and perform our initial sync.

Ready to configure Azure AD Connect

After the sync completes, navigate to the Microsoft 365 admin center to validate the synchronization task is completing successfully. You should see within Users and Groups copies of your Active Directory objects.
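If you’d rather check from the server itself, the ADSync module installed alongside Azure AD Connect exposes the sync scheduler and lets you trigger a cycle manually rather than waiting for the next scheduled run:

```powershell
# Run on the Azure AD Connect server (the ADSync module ships with it).
Import-Module ADSync

# Show the scheduler state, including the next and most recent sync cycle
Get-ADSyncScheduler

# Trigger an incremental (delta) sync immediately
Start-ADSyncSyncCycle -PolicyType Delta
```

Use `-PolicyType Initial` instead if you’ve changed the configuration (for example, OU filtering) and need a full re-import.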

Active users

If things don’t look right, navigate to the Azure AD Connect Health portal, which lets you monitor sync errors the service is experiencing. A healthy result shows no sync errors. You’ll resolve sync errors by following the recommendations from IDFix.

Sync Errors

Source Winbuzzer


AMD Gains Ground on Intel in Steam CPU Usage

AMD-Threadripper-Official-696×392

AMD continues to become an increasingly potent rival to fellow chipmaker Intel. Since launching its Ryzen series of processors several years ago, AMD has made consistent gains against Intel’s market dominance. In the latest step forward for the company, a recent survey shows AMD gaining ground on Intel in the gaming realm.

Specifically, AMD processors are gaining CPU market share on Steam. Each month, Steam publishes its Hardware and Software Survey. This shows usage statistics across the platform, including operating system and other metrics… including processors.

Last month, Intel machines continued to be the most used on Steam, at 77.54 percent of users. However, AMD enjoyed a 0.74 percentage point boost and now holds 22.45 percent of the usage market. Since January, the company’s share has increased by over one percentage point.

Of course, Intel still has a lead of roughly 55 percentage points, so don’t expect AMD to take over. However, it is clear the gaming chip market is becoming more competitive. It’s a trend we have also seen in other areas beyond gaming.

That said, it is worth remembering this survey only looks at a single platform.

GPU Data

AMD also makes GPUs, but its success in this area is more limited. Steam says Nvidia remains the leading GPU maker amongst its users. In fact, Nvidia GPUs occupy the top nine spots in terms of GPU usage. AMD is tenth, and the rest of the list holding at least one percent of usage consists of Nvidia products.

Source Winbuzzer


What are Azure AD Security Defaults, and should you use them?

23-03-2020-505-Archive-Shuttle-image-LOW

Azure AD Security Defaults arrived recently and make it easier to implement some of the most common security settings in your Azure AD directory and Office 365 environment. They aren’t appropriate for everyone, but if you haven’t enabled multi-factor authentication yet, or haven’t disabled legacy authentication, then this might be something you want to consider.

Every Office 365 environment should be secure: not susceptible to known vulnerabilities, patched and up to date, and regularly tested. But the default settings for an Office 365 tenant have been aimed at the lowest common denominator – organizations with legacy clients – with an expectation that organizations will buy add-on security features, like EM+S, if they truly want security. This means that many, many Office 365 tenants are vulnerable to a number of attack vectors, including password spray attacks: an attacker can repeatedly attempt to log in to an Office 365 tenant using basic scripting, and if they successfully authenticate with a username and password, there is no MFA mechanism in place to stop them.

Security Defaults replace Baseline Conditional Access policies, which do a similar job, and are offered free to all Office 365 subscriptions, whether or not you’ve paid for Azure AD Premium licensing. This is a change, as although per-user MFA could be enabled in Office 365, it didn’t include the Authenticator app, nor the straightforward enablement mechanism enjoyed by Conditional Access or service-wide Azure MFA.

Security Defaults enforces these settings:

  • Multi-Factor authentication for administrators and end-users, required within 14 days of the next sign-in after enablement
  • Legacy authentication will be blocked, restricting access from older clients, like Office 2010, IMAP, POP3, SMTP, ActiveSync clients that don’t support Modern Auth, and traditional methods of managing Exchange Online using Remote PowerShell.
  • Immediate MFA protection for “privileged” Azure AD actions via the Azure Resource Manager API (such as Azure Portal Access, Azure PowerShell and the Azure CLI).
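Because the legacy authentication block is the most likely of these to break something, it’s worth checking whether anyone still signs in with legacy protocols before enabling Security Defaults. A rough sketch using the AzureADPreview module (sign-in log access also requires Azure AD Premium; the list of client app names below is an assumption based on the values Azure AD commonly reports, so verify against your own logs):

```powershell
# Sketch: surface recent sign-ins that used legacy (non-modern) protocols.
# Requires the AzureADPreview module and Azure AD Premium for sign-in logs.
Connect-AzureAD

# Client app names Azure AD typically reports for legacy authentication
$legacyApps = 'Exchange ActiveSync', 'IMAP4', 'POP3',
              'Authenticated SMTP', 'Other clients'

Get-AzureADAuditSignInLogs -Top 1000 |
    Where-Object { $_.ClientAppUsed -in $legacyApps } |
    Select-Object UserPrincipalName, ClientAppUsed, CreatedDateTime |
    Sort-Object CreatedDateTime -Descending
```

Any users this surfaces will need their clients reconfigured or upgraded before the block takes effect.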

These defaults are more secure than the baseline policies. Baseline protection policies were (and are) provided using the Conditional Access portal settings, and allowed selective enablement of MFA for administrators, MFA protection for (what Microsoft determine as) risky sign-ins for end users, blocks for legacy authentication and MFA for service management. Baseline policies were not only hidden away, but also never left preview – so many people won’t be using them in production.

Microsoft plans to enable Security Defaults for all new Azure AD tenants within the “next few months” – which should mean that by the end of January 2020, a new Office 365 subscription will come with MFA enforced out of the box and legacy authentication blocked. That’s important to know, as it’s a big change.

How this might affect new Office 365 migrations

If you sign-up for an Office 365 subscription over the next few months and Security Defaults are enabled then this might be a surprise – even if you don’t have older clients like Office 2010 or use IMAP and POP3 clients.

One example of this is Outlook clients. When you migrate a mailbox, the expected behaviour is that Outlook will automatically reconfigure and connect to Exchange Online. Even with the latest Office 365 Pro Plus, signed in using Modern Authentication to Office 365 for licensing, you could still see an issue with Security Defaults enabled.

This is because when a mailbox is migrated, Outlook continues to use the legacy authentication process as it follows the Autodiscover breadcrumb trail to Exchange Online, and then fails when attempting to sign in. This can be solved either by switching off Security Defaults during your migration, or, if you have control over your Outlook clients, by deploying the registry key in this article.

In general though, you shouldn’t expect many technical issues at all if you are using up-to-date Office 365 Pro Plus clients and the Office apps on mobile. You’ll also find ActiveSync clients on iOS devices, the Gmail app and Samsung Mail apps all support Modern Authentication too (however, you’ll need to reconfigure those clients).

The biggest factor, though, may be the user impact of enforcing MFA from every location.

No replacement for Azure AD Conditional Access

The downside of Multi-Factor Authentication is the inconvenience to users. This is, of course, necessary from unknown devices in unknown locations, but most organizations invest significant time and expense in ensuring that their devices and locations are secure, and as such trust their devices.

This is where Azure AD Conditional Access remains important. Conditional Access allows different levels of security for different people, apps, managed and unmanaged devices, and locations. You can choose where and when to enable MFA, or even block access.

For example, you might choose to remove the need for a user to confirm it’s them via the Microsoft Authenticator app when they are signing in from their domain-joined Windows 10 PC, using Office 365 Pro Plus at their computer in the office.

You might also avoid regular prompts for MFA on their company issued mobile, too – and instead manage the device properly using tools like Intune. However you might require them to sign-in every so often if they are working from home on their company issued laptops. You might even block sign-in altogether to services containing sensitive business data from random web browsers on unmanaged devices.

There’s a lot more to Conditional Access than the above snippet – a lot more – but those are some of the core scenarios that most companies look to achieve and Security Defaults doesn’t cover.

The greatest pity about Security Defaults is that the most basic functionality organizations need – to define locations where MFA isn’t necessary – isn’t included.

This will be especially frustrating for organizations with simple use cases, like shop-floor workers in manufacturing where employees are not allowed mobile devices and the uplift to Azure AD Premium for each user doesn’t provide enough value to justify it for those types of workers. In those cases, we’ll probably see those organizations left unprotected.

Perhaps Microsoft will do as they did with the Authenticator app, and bring a basic trusted IP address range option for skipping MFA to all customers.

Enabling (and Disabling) Security Defaults

You can enable Security Defaults if you aren’t using Conditional Access today. If you do have CA policies, then you won’t be able to enable Security Defaults.

Before you do anything, make sure you perform user communications so people know the change is coming – and ensure you won’t prevent access from legacy clients or applications. It’s an all-or-nothing switch, so you can’t put in place exceptions for users or legacy applications like you can with Conditional Access.

To enable Security Defaults, sign-in as a Global Administrator to the Azure AD Portal and navigate to Azure Active Directory and scroll down to Properties. From there, select Manage Security Defaults:

You’ll then see the option to enable Security Defaults. It’s an all or nothing switch – it’s either enabled or disabled:
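The same switch can also be flipped programmatically via the Microsoft Graph identitySecurityDefaultsEnforcementPolicy endpoint, which exposes a single isEnabled flag. As a sketch (token acquisition is out of scope here; `$token` is assumed to hold a valid Graph access token with sufficient policy permissions, such as Policy.ReadWrite.ConditionalAccess):

```powershell
# Sketch: read and toggle Security Defaults via Microsoft Graph.
# $token is a hypothetical, pre-acquired Graph access token.
$headers = @{ Authorization = "Bearer $token" }
$uri = 'https://graph.microsoft.com/v1.0/policies/identitySecurityDefaultsEnforcementPolicy'

# Read the current state of Security Defaults
Invoke-RestMethod -Uri $uri -Headers $headers -Method Get

# Enable Security Defaults (set isEnabled to $false to disable)
$body = @{ isEnabled = $true } | ConvertTo-Json
Invoke-RestMethod -Uri $uri -Headers $headers -Method Patch `
    -Body $body -ContentType 'application/json'
```

This is useful if you manage multiple tenants and want the same baseline applied consistently, rather than clicking through each portal.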

The good news with Security Defaults is that should you decide you need Conditional Access policies, you don’t need to switch off Security Defaults before you define your CA policies. When you configure your CA policies, you’ll be informed you can continue to create them – once you’ve created your policies you can switch off Security Defaults and enable the CA policies you’ve defined.

Summary

Security Defaults are a good addition to Azure AD, and therefore Office 365, and will ensure many more organizations are secured by default. It’s a pity they don’t include all of the basic functionality most organizations need – but they are a great start by Microsoft in helping all customers, not just those with Azure AD Premium licensing, to secure their identities.

Source Practical365
