


Office 365 Supervision Improvements


The Supervision Review feature has been available in Office 365 for a while; even its “V2” version is two years old by now. Recently, Microsoft announced a refresh of the feature, introducing multiple improvements to both the setup and review processes. Some of these improvements have already rolled out to tenants, so in this article I will talk you through some of the updates. But first, here’s a quick refresher.

A quick refresher on Supervision

In a nutshell, the Supervision Review feature gives you the tools to monitor some, or all, of your employees’ communications. The reasons why you would want to do this vary from internal regulations to strict compliance requirements such as the FINRA rule 3110.

To configure Supervision in your tenant, you must create at least one Supervision policy, specifying the types of communication you want to monitor, which users or groups will fall under the scope of the policy and who will be responsible for conducting the review. Policy configuration is performed via the Security and Compliance Center (SCC), under the Supervision tab. Behind the scenes, a policy object and a rule associated with it are created and then pushed to the corresponding workloads. Once the policy has been successfully published, copies of the communication items will be stored in a special mailbox.
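If you prefer PowerShell, you can also inspect those underlying objects from a Security and Compliance Center PowerShell session. Here’s a minimal sketch, assuming the Get-SupervisoryReviewPolicyV2 and Get-SupervisoryReviewRule cmdlets are available to your account:

[ps]# Minimal sketch: inspect Supervision policies and their associated rules from SCC PowerShell
# (assumes you are already connected to Security and Compliance Center PowerShell)
Get-SupervisoryReviewPolicyV2 | Format-List Name,Reviewers
Get-SupervisoryReviewRule | Format-List Name,Policy,Condition[/ps]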

To check whether your employees are communicating in accordance with the regulatory requirements, the reviewers access the data stored in the supervision mailbox and upon inspecting each item, take an appropriate action. For additional information on configuring and using Supervision refer to the official documentation here.

New Supervision Review Policies

Now that we’re up to speed on Supervision Review, let’s find out what’s new in the process of configuring Supervision policies. One major improvement is the ability to configure more granular conditions and exceptions, both in terms of the users covered by the policy and the actual items captured. The UI now features no fewer than 16 different conditions, in addition to the “directionality” (inbound, outbound or internal) controls, as shown in the screenshot below:

Microsoft listened to customer feedback prior to this release and have introduced some new, useful capabilities. For example, the ability to scope supervision based on the domain, the retention label applied to an item, or pre-defined and custom sensitive information types. This makes it much easier for reviewers to focus on the items that matter most. Notably, however, the UI could use some improvements when it comes to using custom sensitive information types.

Another new improvement, currently only available in private preview, targets scenarios with offensive language use and seeks to solve any profanity issues by leveraging AI and ML models. The feature is called Intelligent filters and is far more advanced than the policies you can configure by leveraging the “Message contains any of these words” condition.

If you’re interested in participating in this preview, you can do so by sending an email to supervisionolpreview@service.microsoft.com describing the scenario you’re trying to solve.

Lastly, and probably most importantly, Supervision policies now support messages exchanged in Microsoft Teams channels and private chats. This feature is enabled by default, however you do have the option to turn it on or off on a per-user basis, as demonstrated in the screenshot below.

The local copy of Teams messages from the Team Chats folder in the user’s mailbox is used to capture those items. This is the reason why you can see a “msg” file type, and why they all have “IM” as the subject. An example item is shown in the next section. It is important to note that some latency can be expected: unlike email communications, which are captured in near real-time, Teams items are processed once every 24 hours.

Improved reviewer experience

The other area Microsoft has focused on is the reviewer experience. Until recently, reviewers were only able to examine and act upon captured items via the Supervisory review add-in in Outlook or OWA. Apart from the fact that the new OWA version doesn’t support the Supervisory Review add-in yet, the OWA experience was spot on: the Supervisory Review mailbox is added automatically, and all relevant content and actions are present.

The experience in Outlook still has a lot of room for improvement. The main issues are the special recipient type used by the Supervisory Review mailboxes (SupervisoryReviewPolicyMailbox) and the fact that these mailboxes are hidden from the GAL by default. Because of this, Outlook is not able to display them. This, combined with the fact that only folder-level permissions are granted to the reviewer and that Microsoft’s initial guidance was incorrect, makes this a frustrating experience in Outlook for admins. Microsoft have since corrected the erroneous instructions, however the process remains tedious because it requires admin intervention to get the identity of the Supervisory mailbox and to update its properties and permissions.

To address the issues with the review experience, Microsoft have now released an Integrated review dashboard. This allows you to review items directly from the SCC UI (or the new Microsoft 365 Compliance Center). To access the new experience, select any of the Supervision policies you’ve created and press Open. You will then be taken to the Supervision dashboard, where the Home page gives you a quick summary for the given policy. On each panel, you can see the number of items pending review or already resolved, the list of users and groups under the scope of the policy, which workloads the policy covers, and who the corresponding reviewers are:

Under the Review tab, you will see a list of all items that are still pending review. The items can be filtered by tag: pending, compliant, non-compliant, and questionable, but you’ll notice that no other filtering options are exposed here. This part of the experience therefore appears to be inferior to the Supervisory review add-in for OWA or Outlook, both of which allow you to perform searches against items stored in the corresponding Supervisory Review mailbox.

You can sort items by Type, Subject, Sender and Date. Selecting an item allows you to preview its content, including metadata. And, if additional details are needed, such as full header information, you can use the Download button to get a full copy of the item. You can use the navigation buttons to switch to the next or previous items, and of course this can also be done by selecting the corresponding item from the item list.

In the Review pane, you can tag an item and add a comment. After pressing Save, the item will be moved to the corresponding Compliant, Non-Compliant or Questionable filter view. Alternatively, you can directly Resolve the item by pressing the corresponding button on top of the Review UI. It’s important to note that there is a change in behavior compared to the experience in OWA and Outlook – once an item has been marked as resolved here, no actions can be performed on it.

The biggest improvement in the review experience is the ability to perform bulk actions by using the checkboxes to select some or all items, then pressing the corresponding button. In contrast, the experience in OWA or Outlook requires you to first select an item, press the add-in button to perform an action, then rinse and repeat for each item individually. Available bulk actions in the new experience include: tagging items, adding comments on items, resolving items.

The last part of the new review experience is the Resolved items tab, where you can see a list of all the items marked as resolved. You are again presented with the option to preview the message content, including metadata, and download a full copy if needed. In addition, a history of any actions performed against the item is also shown, which can also be found under the review tab.

As previously mentioned, the Resolve action is final in the new UI, meaning once an item is marked as resolved it cannot be tagged or commented on. These actions are still available when using the add-in-based review experience in OWA or Outlook. Another difference with the new experience is that any “view” actions are not being displayed in the History pane.

Auditing and reporting

The last set of improvements is focused on reporting and auditing of the Supervisory review functionality. Like other sections in the SCC, basic insights are displayed in small widgets on top of the Supervision page. However, in my experience the information provided isn’t always reliable: fresh insights appear to take a few days to populate, and for some reason the widgets ignore the 2-year-old policy I have in my tenant. Since you can get those details via the Review experience we covered above, this is not a major issue.

The more important data is exposed via the Supervision report, which you can access by clicking the Supervision widget or directly using this link. The report gives you a daily view of the number of items at each stage of the review process, and unlike most of the reports we can generate in Office 365, it allows you to export a few years’ worth of data. If you’re not a fan of the UI, you can also use PowerShell to get some of the data via the corresponding Get-Supervisory*Report cmdlets.
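As a rough example (a sketch only; verify the exact cmdlet and parameter names with Get-Command Get-Supervisory*Report in your session):

[ps]# Sketch: export the Supervision report data for the last 90 days
Get-SupervisoryReviewPolicyReport -StartDate (Get-Date).AddDays(-90) -EndDate (Get-Date) |
    Export-Csv -Path .\SupervisionReport.csv -NoTypeInformation[/ps]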

On the auditing side of things, actions related to the management of Supervision policies now flow to the Unified Audit Log in Office 365. Below is an example of how to query the Unified Audit Log for such actions via PowerShell (a minimal sketch; the operation names passed to -Operations are an assumption on my part, so verify them against the entries in your own log):
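[ps]# Sketch: query the Unified Audit Log for Supervision policy management actions
# NOTE: the operation names below are an assumption - verify them against your own audit log entries
Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-30) -EndDate (Get-Date) `
    -Operations "SupervisionPolicyCreated","SupervisionPolicyUpdated","SupervisionPolicyDeleted" |
    Select-Object CreationDate,UserIds,Operations,AuditData[/ps]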

Actions performed by reviewers do not flow into the Unified Audit Log, but they can be examined via the reports or via the Get-SupervisoryReviewActivity cmdlet.

Conclusion

In this article, we did a quick review of some new and exciting additions to the Supervision feature in Office 365. The most significant of these additions is arguably the added support for capturing Teams communications, although numerous other improvements have been made to management of Supervision policies, reporting and auditing. On the reviewer side of things, some of the pains of the add-in-based review experience have been addressed by introducing a new UI in the SCC, allowing you to perform the review directly from the browser. That is, if you have the necessary permissions to access the corresponding section of the SCC.


How to create help bots in a different way using Azure QnA Maker


There are multiple ways to program bots to interact with users, but it is essential that they have some kind of “intelligent” algorithm behind them to work effectively. The Microsoft Azure QnA Maker is a cloud-based API service which enables you to integrate a conversational, question-and-answer layer. This, in turn, powers an FAQ service built from semi-structured content like documents, URLs, and product manuals. Working with questions asked in natural language, the QnA Maker service answers them by matching each one with the best possible response. Essentially, what the QnA Maker allows us to do is create a bot using pre-existing information derived from FAQs, without needing to develop cumbersome and expensive AI algorithms. The QnA Maker can also be easily integrated with other systems such as SharePoint or Microsoft Teams.

Like most search engines, the Azure QnA Maker engine crawls for information across various sources; this information is then used to train an algorithm that draws links between specific questions and answers. The number of sources allowed depends on the pricing tier you have selected; you can find more information here. Below are some examples of the different sources the engine will collect information from:

  • Public FAQ sites (URLs) which have questions or answers on their pages, these must be publicly reachable without any kind of authorization process
  • Physical documents containing FAQs such as: product manuals, brochures, flyers, support guides, etc. The QnA Maker engine accepts documents in the format of: Word, Excel, pdf, txt and tsv (Tab Separated Values)
  • Questions or answers can be entered manually too

How to create a QnA System

There are a few ways to create a QnA system:

  1. Use the web-based user interface provided by Microsoft, where you can define the sources and other parameters, test the results and publish the knowledge base to make it public
  2. Use the management API, which allows knowledge bases to be created in a more automated fashion
  3. Use the runtime REST API, which sends questions to the engine and receives the matching answers

Creation of the Azure QnA Maker Service

Before you get started you need to create the QnA Maker service first:

  1. From the Azure portal, sign in with your administrator account and create a new Resource Group or reuse an existing one
  2. In the Group add a QnA Maker resource, giving it a name, Azure Subscription, Location (“West US” is the only available currently), and pricing tier

Because QnA Maker uses a standard Azure Search instance to index the answers, you also need to define its location and pricing tier. An Azure App Service and App Service Plan are created automatically to handle the REST calls.


Figure 1. Creation of the QnA Service in Azure

Creating the QnA

Now that you’ve created the Azure QnA Maker Service, you need to create the QnA itself using the WYSIWYG interface. Below are the steps you need to execute this correctly:

  1. Open the site and log in with the same credentials used to create the Azure Service.
  2. Then select Create a knowledge base. If you haven’t followed the instructions to create the Service in Azure, the first step in the page allows you to do it using the button “Create a QnA service”.
  3. Then, follow the sequence in “Step 2” in the page to select the Azure Tenant, Subscription and QnA Service.
  4. After you’ve completed that step, select a name for the QnA; after that you can populate the database with the questions and answers.
  5. As mentioned earlier in this article, you can define one or more URLs with the questions and answers and/or add files containing the information. Step 4 on the page allows you to use both methods.
  6. If you prefer to add the questions and answers manually, you can skip this step and go ahead using the button Create your knowledge base.

Figure 2. Creation of the QnA using the Azure Admin interface.

After you’ve completed this process, the QnA engine starts working by extracting the questions and answers from the sources and using this information to train the AI algorithms. This can take several minutes, depending on how much information you’ve provided. When the process is complete, you’ll be redirected to the Knowledge base page. If the engine finds errors in the extraction, it will either ask you to correct them or skip the data source.

The Knowledge base page shows the list of extracted QnAs, together with a Test button where you can fire queries and review the answers. If you test a question and aren’t satisfied with the answer, use the Inspect button, which will open a new window where you can manually refine the answer or, if necessary, create a totally new one. The Knowledge base page also has an Add QnA pair button that you can use to create customized questions and answers.

Figure 3. Reviewing, testing and training the QnA

When you are pleased with the results, use the Save and train button to rerun the learning algorithms. Then, you can make the QnA public by using the Publish button.

Using the QnA from SharePoint

Sending a question and receiving the answer(s) is relatively simple using the QnA REST API. This is ideal for creating an SPFx (SharePoint Framework) WebPart and for integrating the QnA into SharePoint Online. The SharePoint Framework is the recommended way to create customized functionality in SharePoint Online, and SPFx components can be used in Microsoft Teams as well. It uses JavaScript as its language, or TypeScript if you prefer to work with a friendlier language. SPFx components can be developed entirely using open source tools such as Gulp, Yeoman and Visual Studio Code; full information about how to create a development environment and the WebParts themselves is provided by Microsoft here.
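Before building the WebPart, it’s worth testing the published endpoint directly. Here is a minimal PowerShell sketch of the generateAnswer REST call; the host URL, knowledge base ID and endpoint key are placeholders you need to replace with the values from your own QnA Maker resource:

[ps]# Minimal sketch of the QnA Maker generateAnswer REST call
# (all three values below are placeholders from your own QnA Maker resource)
$hostUrl     = "https://YOUR-QNA-APP.azurewebsites.net"
$kbId        = "YOUR-KNOWLEDGE-BASE-ID"
$endpointKey = "YOUR-ENDPOINT-KEY"

$uri     = "$hostUrl/qnamaker/knowledgebases/$kbId/generateAnswer"
$headers = @{ Authorization = "EndpointKey $endpointKey" }
$body    = @{ question = "How do I reset my password?" } | ConvertTo-Json

$response = Invoke-RestMethod -Uri $uri -Method Post -Headers $headers -Body $body -ContentType "application/json"
$response.answers | Select-Object answer, score, source[/ps]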

To build the WebPart, create a new SPFx project and add the following code fragment and routine (TypeScript):

Note: the complete SPFx source code for the WebPart can be found in the GitHub repository.

As you may have noticed in the code, we need to fill three variables with data coming from the Azure Service Resource Group used:

  • The hostUrl can be found at the “URL” in the Overview window of the App Service
  • The indexId is visible in the Search Service – Indexes, under “Name”;
  •  You can find the endpointKey in the QnA portal (where you defined the data sources) under “Settings” – “Authorization: Endpoint”

The getAnswers routine in the code creates the URL for the REST call, adds the authorization to the HTTP headers and the question to the body, analyzes the JSON response, and displays the answer, its accuracy score and the answer source in the interface.

Now, you need to compile and deploy the WebPart to SharePoint, following the instructions given by Microsoft. The WebPart allows you to send a question to the Azure QnA Service using REST and responds with the related answer. The answer and related metadata are then shown in the user interface below:

Figure 4. SPFx WebPart working inside SharePoint to send questions and receive answers

To conclude, we can say that the Azure QnA Maker Service is a really easy way to create an intelligent FAQ system that can be straightforwardly integrated with SharePoint or Microsoft Teams. The QnA system is inexpensive and doesn’t require customized Artificial Intelligence algorithms to be created and trained separately, which makes it cost-effective to implement.


How to inventory membership of Exchange Groups, recursively


To this day, one of the most common questions I run into on technical communities is “how do I generate a list of all members of all groups in my organization”. Even though there are dozens of script samples and tools available on the internet for this task, it seems that they are either hard to find or not ticking all the boxes, therefore people are still trying to find a better solution. For this reason, as well as some recent advancements in the Microsoft Graph APIs, I decided that it’s worth publishing another article on this topic. Plus, it helps us keep the blog a truly practical resource.

Handling recursive output via the directory tools

One of the problems people face when inventorying group membership is making sure membership of nested groups is expanded, that is, the output should include any direct and indirect members of the group. It can go the other way too, by listing all of the groups a given user is a member of, including “parent” ones.

In the AD world, this is a relatively easy task, thanks to the so-called matching rule object identifiers, and more specifically the LDAP_MATCHING_RULE_IN_CHAIN OID. Designed to “traverse” the hierarchy, these constructs can be used to cycle over each parent (or child) object and match them against a filter. Although this type of filter only works against the object’s DistinguishedName value and the syntax can look scary, it gets the job done, and fast.

For example, if you want to list all AD groups a given user is a member of, including nested groups, you can use the first cmdlet below. The second one can be used to list all users that are a member of a given group, or any group nested under it. The third one generalizes this example to include any object types, not just users.

[ps]#List all AD groups a given user is a member of
Get-ADGroup -LDAPFilter "(member:1.2.840.113556.1.4.1941:=CN=vasil,CN=Users,DC=michev,DC=info)"

#List all USERS that are members of a given group
Get-ADUser -LDAPFilter "(memberof:1.2.840.113556.1.4.1941:=CN=DG,CN=Users,DC=michev,DC=info)"

#List all OBJECTS that are members of a given group
Get-ADObject -LDAPFilter "(memberof:1.2.840.113556.1.4.1941:=CN=DG,CN=Users,DC=michev,DC=info)"[/ps]

Use of these filters is not limited to the AD PowerShell cmdlets; in fact, you can run the exact same queries via dsquery or similar tools. The AD PowerShell module does add one additional helpful method for the scenarios we examine, namely the –Recursive parameter for the Get-ADGroupMember cmdlet, which “flattens” the membership list by returning only objects that don’t contain child entries. The syntax is of course much simpler compared to the filters we examined above, but on the downside, the output will only include user objects when the –Recursive parameter is used. An example is shown below:

[ps]Get-ADGroupMember DG -Recursive[/ps]

In Office 365 and the underlying Azure AD, the methods outlined above are not available. The good news is that we just recently got support for the so-called transitive membership queries, which practically function the same. For example, the below query will return all direct and indirect members of a given group, including users, contacts, groups and so on:

[ps]https://graph.microsoft.com/beta/groups/c91cd116-a8a5-443b-9ae1-e1f0bade4a23/transitiveMembers[/ps]

This method is currently only available when querying the Graph API directly, and only when using the /beta endpoint, but hopefully, it will be exposed as a parameter for cmdlets in the Azure AD module. Unfortunately, as with the AD methods, it only covers objects which the underlying directory recognizes, meaning it’s not applicable to all group types.
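For reference, here is a minimal sketch of calling that endpoint from PowerShell; it assumes you have already obtained a valid OAuth access token for Microsoft Graph in the $token variable:

[ps]# Sketch: query transitive (direct + nested) group membership via the Graph /beta endpoint
# Assumes $token already holds a valid access token for Microsoft Graph
$groupId = "c91cd116-a8a5-443b-9ae1-e1f0bade4a23"
$uri = "https://graph.microsoft.com/beta/groups/$groupId/transitiveMembers"
$members = Invoke-RestMethod -Uri $uri -Headers @{ Authorization = "Bearer $token" }
$members.value | Select-Object "@odata.type", displayName[/ps]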

Handling Exchange recipient types

Which brings us to the next common issue, the fact that most solutions out there don’t cover objects such as Office 365 Groups, Dynamic Distribution Groups, mail-enabled Public folders and so on. Some of these object types exist only in the Exchange directory, others span multiple workloads and handling them requires special treatment, and some are simply more “exotic” and usually neglected.

This in turn means that if we want a proper inventory of all recipient types recognized by Exchange, we cannot use the methods outlined above. The first logical action then is to look at the Exchange tools and use any methods exposed therein. As it’s often the case, the Get-Recipient cmdlet can offer a potential solution. Indeed, you can use the following filter to get all the valid Exchange recipients that are member of a given group:

[ps]Get-Recipient -Filter "MemberOfGroup -eq 'CN=MESG,CN=Users,DC=michev,DC=info'"[/ps]

Unfortunately, this method does not expand the membership of any nested groups. In turn, this means that if you want to collect a comprehensive inventory of all your Exchange (Online) group objects and their members, you will have to iterate against each group, expand its membership, then rinse and repeat for any nested groups. The logic to do this in code is not very complex, and we’ve had PowerShell script samples that cover this for years. The main problem is the amount of resources consumed and the time it will take to complete the script.

With that in mind, I’ve decided to put together a script that follows some of the best practices for running against Exchange Remote PowerShell. We will utilize my preferred method of using implicit remoting and minimizing the amount of data returned by selecting just the properties we need via Invoke-Command. Using server-side filtering where possible is also a very good idea. You will find a link to the script at the end of the article, so if you aren’t interested in the details, then skip the next sections.
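To illustrate the pattern (a simplified sketch, not the actual script), reusing an existing session and returning only the properties we need looks something like this:

[ps]# Simplified illustration: reuse an existing Exchange Remote PowerShell session
# and return only the properties we need via Invoke-Command
$session = Get-PSSession | Where-Object { $_.ConfigurationName -eq "Microsoft.Exchange" -and $_.State -eq "Opened" } | Select-Object -First 1
$groups = Invoke-Command -Session $session -ScriptBlock {
    Get-DistributionGroup -ResultSize Unlimited | Select-Object Name, PrimarySmtpAddress, RecipientTypeDetails
} -HideComputerName[/ps]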

One additional limitation of the Get-Recipient cmdlet is that it does not return any objects of type User or ExchangeSecurityGroup, that is, objects which are not mail-enabled and are synchronized from Azure AD. Although in general you can just ignore these, other cmdlets such as Get-DistributionGroupMember might return them in the list of members.

Group types recognized by Exchange (Online)

When using long-running scripts, it’s always a good idea to exclude any objects you’re not interested in. With that in mind, the script attached to this article is designed to accept several parameters, designating the different types of Exchange groups for which you want to generate the membership inventory. Those include:

  • “Traditional” Distribution groups, which are included by default if you don’t specify any parameters, or use the –IncludeDGs switch
  • Mail-enabled Security groups, for which the above logic applies
  • Office 365 (or Modern) Groups, included when you specify the –IncludeOffice365Groups switch
  • Dynamic Distribution groups, included when you specify the –IncludeDynamicDGs switch
  • All of the above, which is the behavior used when you specify the –IncludeAll switch

One particular group type I have excluded is RoomLists, which in my experience people simply don’t want listed in these reports. If you do want to include them, feel free to make the relevant changes in the code (lines 111 and 225). If you are running the script against an on-premises Exchange installation, you might want to remove any references to GroupMailbox objects as well. Although the script runs just fine in Exchange 2019 EMS, I haven’t checked older versions, and not all of them will recognize these object types.

Handling Office 365 Groups

Office 365 Groups, also known as Modern Groups, are often neglected when generating a membership inventory. As Office 365 Groups do not support nesting, they are relatively simple to handle. However, a different cmdlet needs to be used to list their membership, namely the Get-UnifiedGroupLinks cmdlet. Here’s an example:

[ps]Get-UnifiedGroupLinks firstgroup -LinkType member[/ps]

If you specify the –IncludeOffice365Groups switch, the script will ensure that all Office 365 Groups in your organization are enumerated and their membership included in the output. In addition, the script will also include these types of objects in the output whenever it finds an Office 365 Group nested inside another group, and will expand their membership if you specify the corresponding switch parameters. But, I’ll speak more on that later.
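Conceptually, the enumeration the script performs boils down to something like this (a simplified sketch, not the actual script):

[ps]# Simplified sketch: enumerate all Office 365 Groups and list their members
foreach ($group in (Get-UnifiedGroup -ResultSize Unlimited)) {
    Get-UnifiedGroupLinks -Identity $group.PrimarySmtpAddress -LinkType Members |
        Select-Object @{n="Group";e={$group.PrimarySmtpAddress}}, PrimarySmtpAddress
}[/ps]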

Handling Dynamic Distribution Groups

Exchange Dynamic Distribution Groups are a special case, as they don’t have preset membership. Instead of “listing” their members, we can “preview” the current list of recipients under the scope of the DDG filter, by means of using the Get-Recipient cmdlet. Here’s an example:

[ps]Get-Recipient -RecipientPreviewFilter (Get-DynamicDistributionGroup DDG).RecipientFilter[/ps]

While using cmdlets such as the above one isn’t anything particularly complicated, it’s not uncommon for DDGs to have filters that include the entire organization or large parts of it. As any valid Exchange recipient is included by default, sans some system objects, it’s more than likely that a DDG can have multiple other group objects “nested”, including other DDGs. And, in some cases the initial DDG can be included as a member of some of these groups. Of course, this scenario is not limited to just DDGs, it’s simply more common with them because of the membership model used.

Handling nested groups

In order to handle nested groups, we need a solution that can detect recursion and break processing as needed. To help with that, I’ve broken down the script into several smaller functions, with the “master” one keeping track of whether a “child” group was already processed or links back to the “parent”. As I am not a programmer by trade, my solution is hardly the best in terms of code practices, but it seems to do the job just fine, at least for the scenarios I could think of. If you have groups nested 10 levels deep with recursion on every level, the script might still loop indefinitely.
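To give you an idea of the general approach, here is a heavily simplified sketch of such recursion handling (not the actual script; nested Office 365 Groups and DDGs need their own cmdlets, as discussed above):

[ps]# Simplified sketch: expand group membership while tracking already-processed groups to avoid loops
function Get-MembershipRecursive {
    param([string]$Group, [System.Collections.Generic.HashSet[string]]$Processed)

    if (-not $Processed) { $Processed = [System.Collections.Generic.HashSet[string]]::new() }
    if (-not $Processed.Add($Group)) { return @() }   # group already processed, break the loop

    $members = @()
    foreach ($m in (Get-DistributionGroupMember -Identity $Group -ResultSize Unlimited)) {
        if ($m.RecipientTypeDetails -like "*Group*") {
            # nested group: recurse, passing along the list of processed groups
            $members += Get-MembershipRecursive -Group $m.Identity.ToString() -Processed $Processed
        }
        else { $members += $m.PrimarySmtpAddress }
    }
    return $members
}[/ps]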

Assuming the code part is handled correctly, one must also decide how to handle the output. Some people will be fine just knowing that a given group has nested groups in its membership, and simply treat them as another “regular” member. Others will want to get a “flattened” list of members, with the membership of any nested groups expanded and added to the list of members of the parent group. This is the behavior when you invoke the script with the –RecursiveOutput switch. Lastly, if you want to get both the flattened membership and the email address of any nested groups, use the –RecursiveOutputListGroups switch together with the –RecursiveOutput one. Examples of the output in the different scenarios can be found in the screenshot below:

Inventorying membership of Exchange groups

In all three examples, the script was run against a single distribution group, “DG”. The top example lists just the direct members of the group; the middle one includes any members of the nested “empty” group as well, since the –RecursiveOutput switch was used. Lastly, the bottom example was run with both the –RecursiveOutput and –RecursiveOutputListGroups switches, and thus includes the membership of any nested groups, as well as an entry that lists the address (or identifier) of the actual nested group.

Additional notes

Most of the building blocks of the script were explained in the previous sections, however there are a few additional things to mention. First of all, the script doesn’t handle connectivity to Exchange; this part is up to you. It will invoke the Check-Connectivity helper function to detect and reuse any existing Exchange Remote PowerShell sessions, including EMS ones. Failing that, it will try to establish a session to ExO using basic auth, but that’s all. If you are connecting to any of the “sovereign” clouds or your admin credentials are protected by MFA, do the connect part manually, then invoke the script.

By default, the script will export the results to a CSV file in the current directory and will also store it in the $varGroupMembership global variable so that you can reuse it directly in the current session if you need to sort or filter it further. If you want to generate a separate CSV file for each group, uncomment line 86. Be warned though, if the script ends up in an infinite loop because of recursively nested groups, this will have quite an unpleasant impact on the filesystem!

An alternative approach might be to dot-source the script or simply reuse the function in your own scripts. If you do this, be aware that the Get-Membership function should not be called directly, as it relies on other functions for error handling and expects a properly formatted object. For the same reasons, no help is provided for the function, but I have put detailed comments around the important parts. One scenario where you will want to edit this function is when you want to use different identifier for the group member, in which case you will have to update the script block between lines 106-113.

Speaking of identifiers, all the group objects are represented by their PrimarySmtpAddress. However, as the group members won’t necessarily have an email address, a different identifier might be used for them. For example, Mail Contacts or Guest Mail Users might be represented by their ExternalEmailAddress attribute instead. Other properties that might be used as the identifier include UPN, WindowsLiveID, ExchangeGUID or ExternalDirectoryObjectId. In all cases though, the member should be represented by a unique-valued property which you can use to identify the corresponding Azure AD object, if such exists.

While I’ve tried to optimize the script as much as possible, in a large environment it will still end up issuing thousands of cmdlets and you will most likely be throttled. Adding some artificial delay to the script is a simple way to combat this, so every time the script processes a new “top level” group, a 300ms delay is added as part of the connectivity check (line 21). This seems to be sufficient to run the script against a medium-sized tenant (10k+ objects, 800+ groups), and it resulted in no throttling during my tests. A more comprehensive solution will require you to monitor the throttling balance, as detailed for example in this article.

While a large number of groups can cause issues with throttling, a different type of issue might arise if you have groups with a large number of members. Since the output CSV file contains the list of all the group members in the “Members” column, if you are opening the file with Excel you might run into the single-cell size limit, which can mess up the display as well. In my tests, groups with over 1,500 members (4,000+ with the nested group membership flattened) caused Excel to misbehave. Your mileage may vary, but you can always rely on other text editors or even PowerShell to work with the full member list in such scenarios.

Lastly, if the script fails or doesn’t return the results you expect, you might try running it with the –Verbose switch. Or you can also drop a comment here, over at the T


Multi-factor Authentication by Default for Administrators in Azure AD and Office 365


Microsoft is rolling out a new baseline security policy for Azure Active Directory and Office 365 that requires multi-factor authentication for privileged accounts. The policy is in public preview right now, meaning it is visible in tenants but not yet enabled.

The baseline security policy will require multi-factor authentication for accounts that are members of one of the following privileged roles:

  • Global administrator
  • SharePoint administrator
  • Exchange administrator
  • Conditional access administrator
  • Security administrator

You can view the policy in the Azure AD portal by navigating to the Conditional access section.

Although the baseline security policy is implemented as a conditional access policy there is no customization available except for excluding users and groups. Conditional access rules that you can fully customize require Azure AD Premium licenses, whereas the baseline security policy is available to all customers. You can use the exclusion option to exclude at least one global administrator account from all conditional access policies. Microsoft recommends doing so to ensure that you still have a way to log in if you inadvertently lock yourself out of all admin portals. Think of it as a “break glass in case of emergency” account. The account should have a strong password that is stored in a secure location, and is not regularly used for day to day administration tasks.

The new baseline security policy has been reported elsewhere as “mandatory” or as Microsoft “forcing” multi-factor authentication on customers’ administrative accounts. This is not true, of course. You can opt out of the policy before it goes live by choosing Do not use policy, and you can set exclusions as mentioned above. Aside from the emergency access account, you should aim to minimize the exclusions that you add to the policy. Microsoft recommends switching, if possible, to Managed Service Identity (MSI) or service principals with certificates.

The nature of the policy also ensures that accounts that are temporarily elevated to a privileged role (either manually or via privileged identity management) have MFA enforced on them, reducing the risk of compromise during the period of time they hold privileged access. This is similar to another recent addition to conditional access allowing policies to be targeted at directory roles. That capability extends to a wider range of directory roles than the five that are targeted by the baseline security policy.

Overall this is a good move for Microsoft to make, strongly pushing customers towards securing privileged accounts. When I surveyed readers last year, 55% of respondents were not using MFA at all (even for admin accounts). That’s despite MFA for admin accounts being one of the recommended first steps for new Office 365 tenants, being flagged by Office 365 Secure Score, and being one of the general account security recommendations from Microsoft.

What do you think? Will you be enabling the new baseline security policy in your tenant?


Microsoft Graph 101: Build Intelligence with Microsoft Graph


One API to access all of your data in the Microsoft Cloud? We’re remarkably close to that reality. Here’s what you need to know to start building applications with the Microsoft Graph API.

The Microsoft Graph is evolving into a service that provides direct API access to user information, documents, business intelligence (BI) and machine learning insights based on data from the company’s cloud-based applications and data services. While only 18 months old, Graph benefits from years of incremental work by Microsoft on Office-based data interchange features and a broad, mature platform of cloud services on which to build.

Graph’s primary benefits, however, grow from two important features: significant sources of cloud-based data and a consistent, flexible API that’s rapidly gaining new services.

First, Graph enables you to access, combine, and build workflows and applications with data across a broad range of Microsoft Cloud services, including Azure Active Directory (Azure AD), Intune, Office 365, OneDrive, Power BI, SharePoint and more. The data may be provided by employees using Microsoft Cloud services within your organization, it could be customer information generated through business processes and applications, or it could even be provided by third-party application services.

Second, Graph provides a broad, straightforward, and consistent set of API endpoints for accessing your cloud-based data and machine learning insights. Previous data interchange efforts provided access to limited data (such as just SharePoint or Exchange data, or just data for a particular business service, such as search, scheduling, or advertising). API access was often provided through language-specific libraries or complicated data-access interfaces. Refreshingly, Graph uses simple HTTP-based API endpoints that you can access through the language or application framework of your choosing.

There’s been a lot of talk about the high-level collaboration and BI scenarios that Microsoft can, potentially, enable through the Graph API. However, there’s been significantly less explanation of the actual interfaces presented for your team to develop against and the resources Microsoft has made available for you to employ these interfaces in your own applications or daily workflows.

I’m going to walk you through the specifics of what Graph enables developers and IT professionals to do today, along with the tools and resources Microsoft has released so far for building Graph-based applications.

But first, let’s step back for a minute and briefly look at where Graph came from and what it proposes as a solution today.

What Is Microsoft Graph?
Microsoft has a long history of ambitious data interchange concepts that it iterates forward generation by generation. The original Microsoft Graph (also known as Microsoft Chart) was a 1990s-era Object Linking and Embedding (OLE) technology for Office apps, specifically Access and Excel. And it’s worth pointing out that OLE itself was an evolutionary step from the Dynamic Data Exchange (DDE) interprocess communication mechanism introduced in the late 1980s.

Jumping forward a decade or so, Microsoft continued to ship interesting cross-application collaboration tools such as Groove Networks, the company founded by Ray Ozzie that Microsoft acquired in 2005. The Groove technology evolved into SharePoint Workspace (and later OneDrive for Business), and in March 2014 Microsoft launched the Office Graph, consisting of two social networking technologies and a new search and discovery app, code-named “Oslo,” plus a new “Groups” capability, which Microsoft started extending across Office. In November 2015, the Office Graph became Microsoft Graph, a sweeping effort to bring BI beyond Office to its connected frameworks. Microsoft started out by rolling much of the Azure AD feature set into Graph.

Here’s where the technology has finally caught up with Microsoft’s ambitions: If you’re using Microsoft’s cloud services and applications, the Graph API lets you access all of that information, create service workflows, and operate on user information, documents, and machine learning insights from that data. Microsoft is closer than ever to offering a single API to build data-interchange and machine learning applications on all of the data from all of its cloud-based applications. Maybe not all just yet, but more than we’ve seen so far.

In this case, “graph” refers not to the concept of a graph database — it may in fact use a graph database behind the scenes, though Microsoft has not discussed the underlying data structures used — but more important, I think, it refers to the concept of a social graph for your documents, data and contacts within the Microsoft Cloud.

To turn this around and look at it from a user’s perspective, people in your organization are using a variety of Microsoft applications and services, including Azure AD, Office 365, OneDrive, OneNote, SharePoint, Planner, Intune and more. Graph gives you the opportunity to access and analyze all of that data through a single, consistent API layer that is straightforward to address and simple to use.

The Graph API enables some fairly interesting cross-application scenarios right out of the box. Show me the documents used by the people I work with most often. Tell me when people are added to AD and automatically kick off employee onboarding workflows. Users, groups, e-mail, contacts, and tasks are all available directly through the Graph API endpoints, along with files, notes, and chat conversations (see Figure 1).

Figure 1. Microsoft Graph provides a single, simple API access strategy with many SDK options for how you access those APIs.

Using the Microsoft Graph APIs
Using the Graph API to access data is straightforward and remarkably transparent for developers who have any experience using Web service APIs. Microsoft chose to build Graph by employing mainstream, easy-to-use technologies: OpenID Connect and OAuth 2.0 for authentication to the service (using Azure AD as your identity provider), HTTP-based REST API endpoints, and standard JSON-formatted response objects.

Common data types and tasks are organized through individual API endpoints, which you’ll recognize as HTTP URLs. For example, the https://graph.microsoft.com/v1.0/me endpoint provides information about the authenticated user. Further information about that user is provided by child API endpoints: my Outlook e-mail (/me/messages), my calendar (/me/calendar), my OneDrive files (/me/drive/root/children).

Say I wanted to know about my provisioned software. The API endpoint for this Microsoft Graph data is: https://graph.microsoft.com/v1.0/me/provisionedPlans

Along with some system information, the response includes an array of objects that specifies which software has been provisioned for my use, as shown in Figure 2.
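If you just want to poke at these endpoints before writing an application, a couple of lines of PowerShell will do. This sketch assumes $token already contains a valid access token for Microsoft Graph obtained via Azure AD:

[ps]# Sketch: call the Graph API directly (assumes $token holds a valid OAuth 2.0 access token)
$headers = @{ Authorization = "Bearer $token" }
Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/me" -Headers $headers
$plans = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/me/provisionedPlans" -Headers $headers
$plans.value | Select-Object service, capabilityStatus, provisioningStatus[/ps]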


Microsoft Extends Azure Active Directory Conditional Access Policies


The Azure Active Directory identity and access management service now supports conditional access policies when used with Microsoft Teams, as well as the Azure Portal, Microsoft announced today.

Conditional access policies refer to conditions that must be true before access to network resources is permitted. For instance, a device might need to have the latest software updates in place before gaining access to those resources. Other conditions might be the user’s location or the user’s sign-in risk, which might be factors set under conditional access policies.

Microsoft explained that, until today, IT pros using the Azure AD service didn’t have the ability to set conditional access policies for either Microsoft Teams or the Azure Portal. Now, that capability is available.

IT pros can set conditional access policies for the Azure Portal using that portal. However, Microsoft cautioned that such changes will affect other management solutions as well, such as “classic Azure portal, Azure portal, Azure Resource Manager provider, classic Service Management APIs, as well as PowerShell.”

In addition, IT pros making such changes via the Azure Portal could wind up locking themselves out.

“While configuring a policy for Azure portal, be cautious! A bad configuration might lead to you locking yourself out,” Microsoft cautioned.

On the Microsoft Teams side, IT pros can use the Microsoft Teams “cloud app for IT admins” to set conditional access policies. (Possibly, Microsoft is referring to the “Office 365 Admin Center,” a browser-based administrative portal, as described in this support document.) Microsoft Teams is the company’s newest collaboration solution for Office 365 users.

The announcement added that if other conditional access policies have been set for applications other than Microsoft Teams, then they’ll take effect, too.

“It’s important to note that Conditional Access policies created for Exchange Online and SharePoint Online cloud apps also affect Microsoft Teams as the Teams clients rely heavily on these services for core productivity scenarios such as meetings, calendars and files,” the announcement explained.


Google Targets Businesses with Chrome Enterprise Offering and VMware Management Partnership


Google this week announced a new Chrome Enterprise license, its latest effort to bring the Chrome OS operating system and Chromebook laptops into the business world.

The company also announced a partnership expansion with VMware to facilitate the lifecycle management of Chromebook devices using VMware’s Workspace One technology. The expanded effort also encompasses mobile device management capabilities using VMware’s AirWatch solution, plus the ability to access Windows applications on Chromebooks using VMware’s Horizon virtual desktop infrastructure technology.

In its corresponding announcement, VMware claimed to be “the first unified endpoint management (UEM) provider with full Chrome device management capabilities.” The integration effort permits access to “full Windows desktops” by organizations, as well as “access to all enterprise applications — cloud, web, native Android, virtual Windows — from a single app catalog,” according to VMware’s announcement.

Active Directory Support
Google’s Chrome Enterprise efforts were outlined in a Web presentation today by Rajen Sheth, a senior director of product management for Google Chrome and Google Cloud. In general, Sheth depicted the Chrome Enterprise license as offering a single management approach for organizations that would permit anywhere access to applications by end users, including Chrome OS apps and Android apps from the Google Play store, as well as Windows apps through VMware’s Horizon technology.

One notable point in Sheth’s talk was the ability of organizations to create domain-joined devices using Microsoft’s Active Directory identity and access management solution with the Chrome Enterprise technology.

The Chrome Enterprise license isn’t Google’s first effort to address the business market with Chrome OS, a browser-based operating system that’s tied to the Google Play store. The new license is replacing the currently existing Chrome Management license that’s used by organizations. Those Chrome Management licensees will be “automatically upgraded to Chrome Enterprise,” a Google spokesperson indicated, during the Q&A portion of the talk.

Google also is describing the Chrome Enterprise license as having more robust management capabilities than the current Chrome Management license.

“Chrome Management license has outgrown its name and capabilities and is being replaced by Chrome Enterprise license,” a Google spokesperson indicated during the Q&A portion of Sheth’s talk. “In fact, there are many new features that are now included in Chrome Enterprise including Microsoft Active Directory Integration, third-party EMM [enterprise mobility management] integration with AirWatch as the first EMM to support and native Print management for cloud and local printing.”

Sheth made the argument that there are 3.2 billion workers but “only 750 million PCs out there” and that many people don’t have access to PCs on a daily basis. Companies are looking at using shared devices, kiosks and meeting devices for videoconferencing, but computers should be easier to manage at a lower cost. He claimed that Chrome OS devices currently are the “leading” devices used in U.S. education markets and that the trend is catching on for businesses, too, particularly as they move toward using cloud services.

Organizations are moving toward using mobile devices, but they want control over them. With Chrome Enterprise, Chrome OS can be integrated with Microsoft Active Directory, and there’s “native” integration with printers, too, Sheth argued.

“We’re including now integration with Microsoft Active Directory, and so customers can be able to sign in to Active Directory as well as join a domain with Active Directory,” Sheth said. He added that “you’re able to actually push a set of policies from Active Directory to the Chromebook, and you can authenticate directly to Active Directory, and what we do is that when you do that, we give you the power of that login, as well as a Google login, to give you access to all of your applications.”

Here’s Google’s slide on the Active Directory capabilities of Chrome Enterprise licensing:

Figure 1. Active Directory support with Chrome Enterprise. Source: Google presentation on Aug. 23.

Other Management Capabilities
With regard to the native printing capabilities of the Chrome Enterprise license, Sheth said that “customers don’t have to change anything to be able to print with a Chromebook, which was a pain point previously for enterprises.”

The overall management capabilities of Chrome Enterprise licensing are summed up in this slide:

Figure 2. Chrome Enterprise management controls. Source: Google presentation on Aug. 23.

Chrome Enterprise licensing offers a few other capabilities over the consumer Chrome OS model. As illustrated on the Chrome Enterprise landing page, there’s a “managed Google Play Store” (currently at the beta stage) for distributing applications to end users. There are management capabilities for Chrome extensions and Chrome browsers. Single sign-on access is supported, and there’s device theft prevention. Chrome Enterprise licensing also offers controls for “managed networks and proxies,” as well as “managed OS updates.” Lastly, Google claims there’s “24/7 enterprise support.”

It’s also possible to manage virtual private networks using the solution, according to this Google support page.

Google is touting the security protections of Chrome OS devices. They are required to have hardware security modules installed. The firmware of the devices will check for any unwanted configuration changes with a “verified boot” feature. Other Chrome OS security features include “privilege separation, process sandboxing, full disk encryption and automatic updates,” according to a slide in the presentation.

The cost of Chrome Enterprise licensing is listed as “starting at $50 per device, annually.” The VMware management capabilities are separate costs, though, per Google’s Q&A.


Microsoft Previews Azure Confidential Computing and Managed Service Identity Security Protections


Microsoft this week announced previews of two new Microsoft Azure security measures that possibly add assurances for organizations wary of trusting their data and code on outside infrastructure.

One of them is called Azure “confidential computing,” which provides protections for data when it gets processed “in the clear” from Microsoft’s datacenters, according to an announcement by Mark Russinovich, chief technology officer for Microsoft Azure. Microsoft already provides encryption to protect data when it’s stored “at rest” on Azure infrastructure.

The other security measure announced at preview is Azure Active Directory Managed Service Identity, a free resource for developers so that they don’t have to deal with code credentials when tapping Azure services.

Confidential Computing Preview
Azure confidential computing protects Azure data against the following possible threats, according to Microsoft’s announcement:

  • Malicious insiders with administrative privilege or direct access to hardware on which it is being processed
  • Hackers and malware that exploit bugs in the operating system, application, or hypervisor
  • Third parties accessing it without their consent

Typically, Azure datacenters already have internal physical security for the data that’s housed there, but the confidential computing element uses a so-called Trusted Execution Environment (TEE) to prevent outside parties from viewing the data stored on Azure, “even with a debugger,” Microsoft’s announcement claimed. The TEE, which Microsoft also refers to as an “enclave,” will check code trying to access the data and will disable operations “if the code is altered or tampered.”

Microsoft currently has two TEE options for the confidential computing scheme. There’s a pure software version known as “Virtual Secure Mode” that uses Hyper-V in Windows 10 and Windows Server 2016. The other TEE option is the hardware-based Intel Software Guard Extensions (SGX) solution, which leverages the CPU. Microsoft is working with other parties as well to develop other TEEs.

The TEE or enclave technology is already being used as part of Microsoft’s Coco Framework for blockchain electronic ledgers, and that same technology protects “Azure SQL Database and SQL Server,” too. It’s an “enhancement of our Always Encrypted capability,” Russinovich explained. For those who like diagrams, Russinovich explained the Coco Framework in this Microsoft Channel 9 video.

Confidential computing is currently available only to organizations that are part of Microsoft’s “Early Access” program, so it’s still at the test level. They have to fill out a survey here to join the program.

Managed Service Identity Preview
The preview of Azure AD Managed Service Identity is designed as an aid for developers such that they won’t have to manage security credentials when using code with various Microsoft Azure services. It creates a so-called “bootstrap identity.” Using it, developers don’t have to directly access the credentials stored in the Azure Key Vault or put credentials in code, Microsoft’s announcement explained.

Microsoft currently offers Managed Service Identity previews for different Azure services, including Azure Virtual Machines (both Linux and Windows), as well as the Azure App Service and Azure Functions. The previews are rolling out gradually worldwide, so they may not be immediately available, a Microsoft document noted.

Microsoft’s announcement promised that the Azure AD Managed Service Identity is being groomed to be part of the free version of Azure AD subscriptions, so there’ll be no cost for using it.


Step by step O365 configuration for Single Sign On with ADFS 2016


 

Estimated Reading Time: 10 minutes

In my last post, on August 31st 2017, I showed you how to install the ADFS server role in Windows Server 2016. Today I will write about one of the most discussed and somewhat complex topics in the O365 world: single sign-on with the ADFS server. Although some people may think it is complex, if you configure it step by step following this article, I don’t think you will find it very complex. If you want to know more about Azure hybrid identities, please check out my blog on Azure hybrid identities here, where I have clearly explained why ADFS is one of the best solutions for single sign-on.

If you don’t know much about ADFS and are new to the ADFS world, here is some information about it.

What is ADFS?

Microsoft Active Directory Federation Services (AD FS) is intended to provide a platform for handling single sign-on with cloud applications outside of the firewall. AD FS simplifies access to systems and applications using a claims-based access (CBA) authorization mechanism to maintain application security. AD FS supports web single sign-on (SSO) technologies that help information technology (IT) organizations collaborate across organizational boundaries. AD FS is a role service in Windows Server 2012 R2 and Windows Server 2016, available as a free solution.

In simple words it is a software component developed by Microsoft that can be installed on Windows Server operating systems to provide users with single sign-on access to systems and applications located across organizational boundaries. It uses a claims-based access control authorization model to maintain application security and implement federated identity.

What is claim based authentication?

Claims-based authentication is the process of authenticating a user based on a set of claims about their identity contained in a trusted token.

ADFS Features

ADFS provides single sign-on (SSO), which allows a user to log in once and access all connected systems without being prompted to log in again. Some of the features of ADFS 3.0 are as follows:

  • Automatic authentication: This feature lets users access corporate applications and resources with a single username and password.

  • Flexible authentication options: ADFS works with the SAML and WS-Federation protocols.

Configure your O365 tenant with an on-premises ADFS server

This section outlines in detail the project steps I performed to implement the proposed solution:

  • Prepare the Architecture Diagram.
  • Open the required Ports between POC infrastructure and O365.
  • Activate the WhyAzure.in account for Office 365 and get the Office 365 administration account credentials.
  • Prepare and Deploy the Active Directory Federation server role in Windows 2016 Server.
  • Verify the domain ownership in GoDaddy Portal.
  • Add New Users in O365 with custom domain UPN.
  • Assign O365 License to the new users.
  • Add DNS records in the GoDaddy portal.
  • Install Azure AD Connect.
  • Configure Azure AD Connect.
  • Start the Directory Sync.
  • Configure AD Connect with the ADFS Server.
  • Configure SSL Certificate in Azure AD Connect.
  • Install Microsoft Azure Active Directory Module for Windows PowerShell.
  • Set the ADFSContext with the help of PowerShell.
  • Start the AdSyncCycle.
  • Test the ADFS based single sign on.

Proposed Architecture Diagram:

An enterprise-ready architecture for O365 SSO with an ADFS server is shown below; this is the standard solution used in most enterprises.

Fig: Whyazure Production Infrastructure.

However, in this post I have not used the ADFS proxy server role, because in most environments this role is no longer used; nowadays people use Azure WAP (Azure Web Application Proxy) for the reverse proxy services.

More details on the Azure AD WAP can be found here.

Used Architecture Diagram:

Below you can find the Architecture for this POC

Fig: The Architecture for this POC

Components required:

Subscription:

  • O365 Subscription

(Please note that I have tested this configuration with my Office 365 Enterprise E3 Developer license.)


Prerequisites:

  • ADFS server role already configured on-premises.
  • ADFS server should have a public certificate.
  • Azure Active Directory PowerShell module.

Accounts:

  • Service account: Whyazure\administrator
  • O365 Global Admin account
  • GoDaddy admin account

Deployment Steps:

As the first step to set up SSO with the ADFS server, I added my custom domain to the Office 365 domain list.

Once it's added, the next step is to click on the Start setup button in O365.

Please note that this is a wizard which will walk me through the rest of the steps to configure O365 with ADFS.

Once I click on Start setup, Microsoft will ask me to prove my identity and provide some information; since my public DNS registrar is GoDaddy, I need to log in to the GoDaddy portal.

Once I run through this wizard, Microsoft will add DNS records in the GoDaddy portal.

Now in this wizard the next step is to add a few users in O365, as you can see below. Please note that they are cloud users and are NOT migrated from the on-premises AD.

Since we have added the users, the next step is to assign the licenses.

I have assigned licenses to the two users, as you can see below.

In the screenshot below, I can see that emails were sent to the respective users' mailboxes.
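For those who prefer scripting over the wizard, the same cloud users and licenses can be created with the MSOnline PowerShell module. This is only a sketch with assumed values: the UPN, display name, usage location, and the E3 SKU string are illustrative.

```powershell
# Connect to Office 365 with a global admin account
Connect-MsolService

# Create a cloud-only user with a custom-domain UPN (values are illustrative)
New-MsolUser -UserPrincipalName "testuser1@whyazure.in" `
             -DisplayName "Test User 1" `
             -UsageLocation "IN"

# List the license SKUs available in the tenant, then assign one to the user
Get-MsolAccountSku
Set-MsolUserLicense -UserPrincipalName "testuser1@whyazure.in" `
                    -AddLicenses "whyazure:ENTERPRISEPACK"
```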

The next step is to install the Office 365 apps; however, I skipped this step and clicked Next, since I don't need to install Office on our test workstations.

The next step is to migrate email from the existing email service provider to O365; however, I decided not to configure that, since this is a new lab environment and that step is not needed.

The next step is the DNS setup in GoDaddy; I can see some of the DNS records added in my GoDaddy DNS zone.

Now this is the end of the wizard, and I see the screen below.

Now I can check the details of the user I have just created, as you can see in the screenshot below.

Now log in to portal.azure.com and confirm that your custom domain is verified.

Once I have verified that, the next step is to install the Azure AD Connect application on the ADFS server or any other server in the on-premises environment.

I logged into my ADFS server and downloaded Azure AD Connect. Once it was downloaded, I checked the system configuration and made sure that I have enough resources on the server to install and configure this application.

Here is the link to download Azure AD connect.

The next step is to install Azure AD Connect; you can find the steps below.

I accepted the license agreement and moved to the next step.

Next I clicked on Use express settings, which starts the express installation.

I can see that the system is installing the required components.

The next step is to connect to my Azure Active Directory with the global admin user ID.

Now in the next step I need to connect to my Active Directory Domain Services.

Once I connect to my on-premises Active Directory, I am ready to synchronize it with Azure.

I clicked on Install, which starts the synchronization of the on-premises AD with Azure AD.

After some time, the configuration completed.

There is a warning that the Active Directory Recycle Bin has not been set up, so I found an article on Google which describes how to fix it. Please find the link below for your reference.

https://go.microsoft.com/fwlink/?linkid=844185

Since this is not a showstopper, I can jump to the next step.
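For reference, the Recycle Bin warning can also be cleared from PowerShell by enabling the feature on the on-premises forest; this is a minimal sketch assuming the ActiveDirectory module and the whyazure.in forest name.

```powershell
# Enable the AD Recycle Bin for the forest (this cannot be disabled afterwards)
Import-Module ActiveDirectory
Enable-ADOptionalFeature -Identity "Recycle Bin Feature" `
                         -Scope ForestOrConfigurationSet `
                         -Target "whyazure.in"
```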

Now I log in to the Azure portal and find that all the on-premises users are now synced to Azure Active Directory, as you can see below.

The next step is to perform some post-sync tasks, such as setting the MsolADFSContext, as you can see in the screenshot below. WAI-DC001.whyazure.in is the FQDN of the server where the ADFS role has been installed.

The next steps are some additional tasks, as you can see below. You need to click Next to complete all of them.

After some time, I can see the list of synchronized directories.

In the next step, I connect to the AD FS server in my environment.

Now I can see the list of AD FS servers in my POC environment.

In this step I have to import the SSL certificate which is assigned to my ADFS server.

This is the public certificate of the ADFS server, and I have imported it into the AD Connect application.

In the next step, it shows the list of servers.

Once I click Next after this, I can see that we are ready to configure AD Connect.

After I click the Configure button, it shows the screen below.

In the next step I need to verify the AD FS login.

I got an error, as shown below. I am familiar with this error: it means that the Microsoft Azure Active Directory Module for Windows PowerShell has not been installed on this computer. This is one of the prerequisites, and it is missing on this server.

So I decided to download the Microsoft Azure Active Directory Module for Windows PowerShell from the URL below.

http://connect.microsoft.com/site1164/Downloads/DownloadDetails.aspx?DownloadID=59185

Once I downloaded the module, I ran the setup.

Once I click Next, I see the screen below.

It's a very straightforward installation; you just need to click Next.

In this step it's ready to install, and you click the Install button as shown below.

Once the installation is finished, you can click the Finish button.

Now in this step I need to connect to the O365 MSOnline service and set the ADFSContext with the help of PowerShell.

Then I need to start the AdSyncCycle. A sketch of these PowerShell commands follows below.
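Here is a rough sketch of those commands. It assumes the MSOnline and ADSync modules are installed on the AD Connect/ADFS server and uses the ADFS server FQDN mentioned earlier; run it in an elevated PowerShell session.

```powershell
# Connect to the Office 365 MSOnline service with the global admin account
Connect-MsolService

# Point the MSOnline session at the on-premises ADFS server
Set-MsolADFSContext -Computer "WAI-DC001.whyazure.in"

# Trigger a delta directory synchronization cycle (ADSync module, on the AD Connect server)
Import-Module ADSync
Start-ADSyncSyncCycle -PolicyType Delta
```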

The last step is to verify the ADFS login.

And it works. I was able to successfully configure the ADFS server with O365. Now I will test logging in with a user account from the internal network and from the external network.
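Before testing in the browser, the federation setup can also be checked quickly from PowerShell; a short sketch, assuming the same MSOnline session and that the custom domain is whyazure.in:

```powershell
# The custom domain should now report Federated authentication
Get-MsolDomain

# Compare the federation configuration in Office 365 with the ADFS server settings
Get-MsolFederationProperty -DomainName "whyazure.in"
```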

Testing the user login.

Go to Chrome or IE and open https://portal.office.com.

I tried to log in with one of the test user accounts which I created in the on-premises AD; it has now been synced to Azure AD, and I have assigned an Office 365 license to it. Here is the screenshot.

When I click Next, it shows the message "Taking you to your organization's sign-in page".

And it redirected to https://adfs.whyazure.in (which is the ADFS server).

When I signed in with this user account, I was able to see all the Office apps, as you can see in the screenshot below.

That means the user has successfully signed in to Office 365 with his on-premises AD account, and SSO is working.

All is well in the end. I hope you liked this post; please provide your ratings. I will bring more on O365 and Azure in upcoming posts. Thanks for reading.


Step by step guide to create and configure Analysis Services in Azure (PaaS)


Estimated Reading Time: 4 minutes

This post is targeted at BI developers and system admins who are interested in configuring and working with SQL Server Analysis Services in Azure (called Azure Analysis Services).

What is SQL Server Analysis Services?

SQL Server Analysis Services is an analytical data engine that supports business analytics and helps in business decision making. It provides enterprise-grade semantic data models for business reports. You can view the reports in the following client applications:

  1. MS Excel
  2. MS Power BI
  3. Tableau and other data visualization tools

How are the data models developed?

Data model development is generally carried out in SSDT (SQL Server Data Tools for Visual Studio), which is available as a Visual Studio add-on. Developers typically build a tabular or multidimensional data model project in Visual Studio, deploy the model as a database to a server instance, set up recurring data processing, and assign permissions to allow data access by end users. When it's ready to go, the semantic data model can be accessed by client applications that support Analysis Services as a data source.

The following tools are also used (a PowerShell example follows the list):

  1. SSMS (SQL Server Management Studio)
  2. PowerShell
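As an illustration of the PowerShell option, recurring processing of a deployed model can be scripted. This is only a sketch under assumptions: it uses the SqlServer module's Analysis Services cmdlets, and the server URI and database name are hypothetical.

```powershell
# Process (refresh) a deployed tabular model on an Analysis Services server.
# Server URI and database name are illustrative; the SqlServer module is assumed.
Import-Module SqlServer
Invoke-ProcessASDatabase -Server "asazure://southeastasia.asazure.windows.net/whyazureas" `
                         -DatabaseName "AdventureWorksTabular" `
                         -RefreshType "Full"
```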

What is Azure Analysis Services?

Azure Analysis Services provides enterprise-grade data modeling in the cloud. It is a fully managed platform as a service (PaaS), integrated with Azure data platform services.

What is the advantage of choosing Azure Analysis Services instead of SQL Server Analysis Services?

Azure Analysis Services has many advantages. As per Microsoft, it integrates with many Azure services, enabling you to build sophisticated analytics solutions. Integration with Azure Active Directory provides secure, role-based access to your critical data. You can integrate with Azure Data Factory pipelines by including an activity that loads data into the model. Azure Automation and Azure Functions can be used for lightweight orchestration of models using custom code. It is also a complete PaaS solution offered by Microsoft, so it is easy to deploy and can scale out and scale in.

Is my data secure with Azure Analysis Services?

As per Microsoft, Azure Analysis Services uses Azure Blob storage to persist storage and metadata for Analysis Services databases. Data files within Blob storage are encrypted using Azure Blob server-side encryption (SSE). When using DirectQuery mode, only metadata is stored; the actual data is accessed from the data source at query time.

Let's get our hands dirty and see how to configure Azure Analysis Services.

If you go to All services and type "analysis", it will show Analysis Services, as you can see below.

Once you click on Analysis Services, you will see the following screen.

You can click on the + Add button above to configure the Analysis Services server.

The next step is to click on Create.

Please remember that if you don't want to use an existing storage account, you can create a new one and add a container in it for backups.

As you can see, I have named the blob storage container that I created "backup".
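If you prefer scripting over the portal, the same server can be created with the AzureRM PowerShell module; a sketch with illustrative names (the resource group, server name, location, SKU, and admin UPN are all assumptions):

```powershell
# Create an Azure Analysis Services server (all names/values are illustrative)
Login-AzureRmAccount
New-AzureRmAnalysisServicesServer -ResourceGroupName "rg-analysis" `
                                  -Name "whyazureas" `
                                  -Location "Southeast Asia" `
                                  -Sku "D1" `
                                  -Administrator "admin@whyazure.in"
```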

Once the Analysis Services server is ready, you can view the following screen.

The next step is to view what has been created.

The above screen shows the Analysis Services server which has been created. The server name is the one required to connect to this Analysis Services instance from VS SSDT or Power BI.
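For reference, the server name uses the asazure:// URI format; the region and server name below are illustrative examples of what you would paste into SSDT, SSMS, or Power BI as the server:

```powershell
# Azure Analysis Services server URI format (region and server name are illustrative)
# asazure://<region>.asazure.windows.net/<servername>
$serverName = "asazure://southeastasia.asazure.windows.net/whyazureas"
```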

To connect from SSDT, you need to download and install SSDT from the internet. Here is the download URL for SSDT. In my next post I will show how to connect to this server from SSDT and import the data. In that post we will also need to work around an error related to connecting to a SQL Server data source located on an IaaS VM in Azure, which will lead to the installation of the unified gateway. I will also cover the step-by-step installation of the gateway in my next post.
