Using the Service Communications API to Report Service Update Messages


Informing Tenants About Feature Updates

I recently wrote about the transition of the Office 365 Service Communications API to become a Microsoft Graph API and how to use the API to fetch details of service incidents. As I pointed out then, the API includes the ability to retrieve information about the service update messages Microsoft posts to inform tenants about the delivery or retirement of features. These messages show up in the message center in the Microsoft 365 admin center (Figure 1) and are a great source of information about future change.

Figure 1: Service update messages in the Microsoft 365 admin center

Microsoft has done a lot of work over the last few years to improve communication with tenants. They’ve:

  • Built an integration between the Message Center and Planner to synchronize updates to Planner. Tenants can then use the tasks created in Planner to assign responsibility for managing the introduction of new features or phasing out of old features. We recommend that all tenants consider using this integration to help manage change.
  • Added extra information to the messages to highlight the affected services (like Exchange Online, SharePoint Online, and Teams).
  • Introduced better filtering capability in the Message Center.

Even so, challenges still exist in dealing with the volume of updates Microsoft introduces annually. It’s not just a matter of reading about changes as they appear to understand how a new feature will affect users and the business, or how to manage something like the retirement of Skype for Business Online on July 31, 2021. Not everyone has the time or opportunity to keep tabs on new posts in the message center, and when they do, it can be challenging to understand some of the text created by Microsoft development groups to describe what they’re doing (intelligent people aren’t necessarily great writers). Another problem is tracking the frequent slippage in the dates when Microsoft predicts features will be available. While Teams is notable for its high percentage of missed dates, no workload hits all its commitments.

Custom Message Processing

Good as the Message Center is, it’s always helpful to be able to do things your own way, and that’s where the Office 365 Service Communications API comes in. My last article covers the basics of connecting to the API and fetching data. Here we focus on the messages API and how to extract and manipulate service update messages with PowerShell.

I like to think of practical examples to illustrate how something works. In this case, my example is a report of the service update messages flagged for tenants to act by a certain date. For instance, Teams ceased support for IE11 after November 30, 2020. That date is long gone now but a message to remind administrators of the fact remains. You could argue that this is an example of something Microsoft should clean up; equally, you could say that it’s a prompt for tenants to move off IE11 to Edge, which is why Microsoft might have left the message in place. In any case, it’s a message with an act-by date. Looking at the message center as I write, of the 256 messages, 31 have act-by dates.

I discovered this fact by running a simple Graph query using a registered app with consent to use the ServiceMessage.Read.All permission:
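The original code snippet didn’t survive syndication to this page, so here is a minimal sketch of such a query. It assumes an app-only access token is already in $Token (acquired as described in the previous article) and uses the Graph serviceAnnouncement messages endpoint with a filter on the actionRequiredByDateTime property:

```powershell
# Sketch only: assumes $Token holds an app-only access token for an app
# granted the ServiceMessage.Read.All application permission
$CheckDate = (Get-Date).AddDays(180).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ssZ")
$Uri = "https://graph.microsoft.com/v1.0/admin/serviceAnnouncement/messages?`$filter=actionRequiredByDateTime le $CheckDate"
$Headers = @{ Authorization = "Bearer $Token" }
# Run the query and extract the message data from the response
[array]$Messages = (Invoke-RestMethod -Uri $Uri -Headers $Headers -Method Get).Value
Write-Host ("Found {0} messages with act-by dates" -f $Messages.Count)
```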

This code sets a date range to check service update messages against (I chose 180 days in the future) and builds a query to find messages with an action-required date earlier than that date. The code then runs the query and extracts the message data from the information the API returns. An individual message looks like this:
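The sample output also didn’t survive syndication. The property names below come from the Graph serviceUpdateMessage resource; the values are purely illustrative:

```json
{
  "id": "MC123456",
  "category": "planForChange",
  "severity": "normal",
  "title": "Example: feature retirement requiring admin action",
  "services": ["Microsoft Teams"],
  "actionRequiredByDateTime": "2021-11-30T00:00:00Z",
  "lastModifiedDateTime": "2021-07-01T10:00:00Z",
  "isMajorChange": true,
  "body": { "contentType": "html", "content": "<p>Details of the change...</p>" }
}
```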

So far, so good. We have some data, and the nice thing about having some data to play with is that we can decide how to slice and dice the information to make it more digestible for the target audience.

Let’s assume that we need to convince managers of the need to do some up-front preparatory work before Microsoft delivers new software to the tenant. Asking managers to go to the Microsoft 365 admin center isn’t feasible. In my experience, busy managers are more likely to review information if given a spreadsheet or report.

The next task is therefore to create code to loop through the message data retrieved from the Graph and generate suitable outputs. Apart from removing all the HTML formatting instructions from the descriptive text for a message, there’s no great challenge in this code. To make things interesting, I computed the time remaining between the current time and the action by date and flagged overdue items. You can download the complete script from GitHub. Figure 2 shows the HTML version of the report. The script also generates a CSV file.
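The script on GitHub is the canonical version; the core of the loop it describes can be sketched like this (assuming $Messages holds the results of the earlier query):

```powershell
$Report = foreach ($M in $Messages) {
    # Strip HTML tags from the message body and collapse runs of whitespace
    $PlainText = ($M.Body.Content -replace '<[^>]+>', ' ') -replace '\s{2,}', ' '
    # Compute the time remaining until the act-by date and flag overdue items
    $DaysLeft = (New-TimeSpan -Start (Get-Date) -End ([datetime]$M.ActionRequiredByDateTime)).Days
    $Status = if ($DaysLeft -lt 0) { 'Overdue' } else { 'Pending' }
    [pscustomobject]@{
        MessageId   = $M.Id
        Title       = $M.Title
        ActionBy    = $M.ActionRequiredByDateTime
        DaysLeft    = $DaysLeft
        Status      = $Status
        Description = $PlainText
    }
}
# Generate both output formats, as the report does
$Report | Export-Csv -NoTypeInformation -Path .\ServiceUpdateMessages.csv
$Report | ConvertTo-Html | Out-File .\ServiceUpdateMessages.html
```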

Figure 2: Service update message data reported in HTML file

Generating a Word Document

Given the flexibility of PowerShell, you could even create Word documents from message data in an approved format. Here’s some code to generate a Word document containing details of a message center notification.
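The snippet itself was lost in syndication; a sketch using Word COM automation might look like the following. It requires Word installed locally, and $M is assumed to be one message object returned by the earlier Graph query:

```powershell
# Sketch: build a Word document from one service update message ($M)
$Word = New-Object -ComObject Word.Application
$Word.Visible = $false
$Doc = $Word.Documents.Add()
$Sel = $Word.Selection
$Sel.Style = 'Title'
$Sel.TypeText($M.Title)
$Sel.TypeParagraph()
$Sel.Style = 'Normal'
$Sel.TypeText("Message ID: $($M.Id)   Act by: $($M.ActionRequiredByDateTime)")
$Sel.TypeParagraph()
# Insert the message body with HTML tags stripped
$Sel.TypeText(($M.Body.Content -replace '<[^>]+>', ' '))
$Path = Join-Path $env:TEMP "$($M.Id).docx"
$Doc.SaveAs([ref]$Path)
$Doc.Close()
$Word.Quit()
```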

Figure 3 shows an example of a Word document generated using the code.

Figure 3: A Word document generated by PowerShell using service message data

Access Drives Innovation

The nice thing about having access to data is that innovative people will do interesting things with the data. Being able to process Microsoft 365 service update messages to extract whatever value you see in the information is goodness. The only question is how best to make use of the opportunity…

Source: Practical 365


Microsoft and OpenAI Launch GitHub Copilot Programming System


Over recent years, we have seen several examples of Microsoft leveraging its partnership with OpenAI, including exclusively licensing the artificial intelligence company’s GPT-3 (Generative Pre-trained Transformer 3). In the latest collaboration, the companies are debuting an AI pair-programming platform called GitHub Copilot.

As the name suggests, Copilot includes work from Microsoft-owned GitHub and gives programmers tools to write code more efficiently and quickly. Microsoft is already rolling out GitHub Copilot as a preview on Visual Studio Code (VS Code).

The system runs on a new AI platform developed by OpenAI known as Codex.

“If the technical preview is successful, our plan is to build a commercial version of GitHub Copilot in the future. We want to use the preview to learn how people use GitHub Copilot and what it takes to operate it at scale,” GitHub officials point out in a FAQ document published with the platform.

Cross-Language Programming

Copilot is designed to help programmers across a wide range of languages. That includes popular languages like JavaScript, Ruby, Go, Python, and TypeScript, but also many others.

As for OpenAI Codex, it is a new AI model trained on billions of lines of open source code. It was also trained on natural language text, meaning it understands both human language and programming code.

“GitHub Copilot understands significantly more context than most code assistants. So, whether it’s in a docstring, comment, function name, or the code itself, GitHub Copilot uses the context you’ve provided and synthesizes code to match. Together with OpenAI, we’re designing GitHub Copilot to get smarter at producing safe and effective code as developers use it.”

Back in May, Microsoft and OpenAI launched a new assistive AI for Power Apps that helps convert natural language into workable code. It works exclusively with the Power Fx language in Power Apps and gives users the ability to implement AI tools without any knowledge of high-level code.

Tip of the day: Thanks to the Windows Subsystem for Linux (WSL) you can run complete Linux distributions within Windows 10. In our tutorial, we show you how to install Ubuntu or other Linux packages and how to activate the bash shell.

Source: Winbuzzer


Microsoft Azure Cloud Services Model Reaches General Availability


Cloud Services (extended support) is a new Microsoft solution that has been in public preview since January. After tweaking the solution over the interim months, Microsoft is now rolling out Cloud Services to everyone. The company says the offering is now generally available.

Based on the company’s Azure Resource Manager (ARM), the service replaces the previous Cloud Services offering, which was based on Azure Service Manager (ASM). Alongside the wide release, Microsoft is also deploying a tool that helps users migrate from the old version to the new model. For the time being, this tool remains in preview.

More than just switching the base of the solution, Microsoft has also brought some changes to Cloud Services (extended support). For example, Azure Key Vault is now baked in, allowing deeper certificate management.

Microsoft says the underlying function of Cloud Services, such as upgrades and rollbacks, will remain the same. Equally, the Azure GuestOS will now be aligned with Cloud Services.

Highlights

Here are the key highlights of the new service:

  • Cloud Services (extended support) also supports two types of roles, web and worker. There are no changes to the design, architecture, or components of web and worker roles.
  • No changes are required to runtime code, as the data plane is the same as in Cloud Services (classic).
  • Azure GuestOS releases and associated updates are aligned with Cloud Services (classic).
  • The underlying update process (update domains, how upgrades proceed, rollbacks, and allowed service changes during an update) will not change.
  • Customers must use Azure Key Vault to manage certificates in Cloud Services (extended support). Azure Key Vault lets you securely store and manage application credentials such as secrets, keys, and certificates in a central and secure cloud repository.
  • All resources deployed through the Azure Resource Manager must be inside a virtual network.
  • Each Cloud Service (extended support) is a single independent deployment. VIP Swap capability may be used to swap between two Cloud Services (extended support).

Tip of the day:

Tired of Windows 10’s default notification and other system sounds? In our tutorial we show you how to change Windows sounds or turn off system sounds entirely.

Source: Winbuzzer


Microsoft Using Special Liquid to Cool Azure Data Servers


One of the problems companies face when handling massive computational loads is cooling. A company like Microsoft, with massive server banks in its datacenters, is constantly fighting against overheating. The company sees the end of the road for traditional air-cooling methods, such as fans. Instead, Microsoft is experimenting with liquid cooling by submerging servers in special tanks.

In a blog post, Microsoft explains how it uses a “two-phase immersion cooling” method by dipping servers into a liquid that does not damage electronics. The liquid carries heat away from components and then boils. A cooled condenser lid on top of the tank turns the rising vapor back into liquid, which falls back into the tank to be redistributed.

This is a closed loop cooling system, according to Christian Belady, vice president of Microsoft datacenter advanced development. Speaking to The Verge, he explained how the system works:

“It’s essentially a bath tub. The rack will lie down inside that bath tub, and what you’ll see is boiling just like you’d see boiling in your pot. The boiling in your pot is at 100 degrees Celsius, and in this case it’s at 50 degrees Celsius.”

In its official blog, the company points out it is not the first to explore this technology. For example, cryptocurrency miners have been using immersion cooling. Still, for a major leader with a massive cloud infrastructure, this is a first.

On a simpler level, the company has already explored the idea of cooling its servers by submersion. Microsoft is already dropping datacenters into the ocean to keep them cool.

Deep Sea Experiment

Microsoft’s effort to develop underwater datacenters was launched in 2014. In 2017, Project Natick was selected among the 190 finalists of the first World Changing Ideas Awards. By 2018, the project was ready and deployed underwater off the coast of Scotland’s Orkney Islands.

Last year, Microsoft raised the datacenter capsule from the ocean and the results were positive. Microsoft says its predictions about the benefits of underwater datacenters have been upheld.

“The consistently cool subsurface seas also allow for energy-efficient data centre designs. For example, they can leverage heat-exchange plumbing such as that found on submarines,” the blog post said at the time.

Tip of the day:

Do you sometimes face issues with Windows 10 search where it doesn’t find files or return results? Check our tutorial to see how to fix Windows 10 search via various methods.

Source: Winbuzzer


Microsoft Solves Global Azure Active Directory Outage


Microsoft says there was an Azure Active Directory problem that caused authentication issues for some customers. According to the company, the issue was sporadic but affected users globally. It also manifested across services such as Dynamics 365, Microsoft Teams, Microsoft Office, Xbox Live, and Azure.

First reports of the problem started on Monday and stretched into this morning (March 16). Microsoft has now updated its Azure Status Twitter to confirm the issue has been mitigated.

“Engineers have confirmed the issue impacting Azure Active Directory has been mitigated.”

When complaints first came in, Microsoft issued the following statement regarding Azure Active Directory:

“CURRENT STATUS: Engineering teams have identified a potential underlying cause and are exploring mitigation options. The next update will be provided in 60 minutes or as events warrant.”

 

Cause

Microsoft says its analysis of the issue points to an error in the rotation of keys Azure AD uses with OpenID:

“As part of standard security hygiene, an automated system, on a time-based schedule, removes keys that are no longer in use. Over the last few weeks, a particular key was marked as “retain” for longer than normal to support a complex cross-cloud migration. This exposed a bug where the automation incorrectly ignored that “retain” state, leading it to remove that particular key.

“Metadata about the signing keys is published by Azure AD to a global location in line with Internet Identity standard protocols. Once the public metadata was changed at 19:00 UTC, applications using these protocols with Azure AD began to pick up the new metadata and stopped trusting tokens/assertions signed with the key that was removed. At that point, end users were no longer able to access those applications.”

If Azure AD has an uptime of less than 99.9% per month, users receive a 25% service credit. If that number falls below 99%, they are entitled to 50%, and 100% if it falls below 95%. You can work out your uptime with the formula: (User Minutes - Downtime) / User Minutes * 100.
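As a worked example of that formula, using hypothetical numbers:

```powershell
$UserMinutes = 43200   # total user-minutes in the month (hypothetical)
$Downtime    = 90      # minutes of recorded downtime (hypothetical)
# Apply the SLA formula: (User Minutes - Downtime) / User Minutes * 100
$Uptime = ($UserMinutes - $Downtime) / $UserMinutes * 100
# Map the uptime percentage to the service credit tiers
$Credit = 0
if ($Uptime -lt 99.9) { $Credit = 25 }
if ($Uptime -lt 99)   { $Credit = 50 }
if ($Uptime -lt 95)   { $Credit = 100 }
"{0:N2}% uptime entitles you to a {1}% service credit" -f $Uptime, $Credit
```

Here 90 minutes of downtime gives roughly 99.79% uptime, which falls in the 25% credit tier.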

Tip of the day:

Do you often experience PC freezes or crashes with Blue Screens of Death (BSOD)? Then you should use Windows Memory Diagnostic to test your computer’s RAM for any problems that might be caused by damaged memory modules. This is a tool built into Windows 10 which can be launched at startup to run various memory checks.


Microsoft AccountGuard Security Features Coming to 31 Democracies


Microsoft AccountGuard is evolving this week as the company brings the cybersecurity identity and access management features to 31 new democracies around the world. According to the company, “enterprise-grade” tools are now coming to other nations:

“The addition of new features to AccountGuard provides new ways to protect online accounts for political parties, candidates and their staff, health care workers, human rights defenders, journalists and certain other customers who are at greatest risk from nation-state hackers.”

AccountGuard was launched in August 2018. Available with Office 365, the service helps Microsoft account holders running election campaigns, working in political committees, or serving on politicians’ staff. The tool provides more threat monitoring capability by regularly monitoring accounts for security breaches. Journalists, human rights workers, and others have been using AccountGuard successfully.

Those industries can use the tool following Microsoft’s expansion of the service in April 2020.

During its checks, AccountGuard scans for malware in attachments, phishing attempts, and failed login attempts. If something is found, a notification is sent to the account holder. If a genuine cyber threat is uncovered, Microsoft provides remediation and ongoing support to stop the threat.

Rolling Out Now

The new features coming to the 31 democracies were first used during the 2020 U.S. Presidential Election. Microsoft says customers enjoyed an 18% improvement in their Identity Protection Security Score thanks to AccountGuard. This score is an automatic review of an organization’s ability to hold off security attacks.

“These identity protection offerings help ensure only authorized people can log on to an organization’s systems and make it more difficult for hackers to impersonate legitimate staff.”

Among the countries receiving the features are the United Kingdom, France, Australia, Germany, Denmark, and Canada. You can check out the full list of supported nations here.

Tip of the day:

Did you know that as a Windows 10 admin you can restrict user accounts by disabling settings or the control panel? Our tutorial shows how to disable and enable them via Group Policy and the registry.

Source: Winbuzzer


Microsoft Azure Space Partners with HPE for Spaceborne Computer-2 Launch


Microsoft is teaming with Hewlett Packard Enterprise (HPE) to link the Azure cloud platform with HPE’s Spaceborne Computer-2. Under the partnership, the two companies will create compute and machine learning solutions for the supercomputer.

If you’re unfamiliar with HPE’s Spaceborne Computer-2, it is a collaboration between HPE and NASA. It is a commercial supercomputer that functions in space. Specifically, it is an edge computing device that brings computation to data-intensive applications during space flights.

NASA will launch the Spaceborne Computer-2 into space on February 20 as part of the 15th Northrop Grumman Resupply Mission to Space Station (NG-15).

One of the benefits for customers is the ability to gain new data insights and research developments. For example, the information could advance fields such as weather modelling, medical imaging, plant analytics, and more.

Expanding Azure Space

With Microsoft on board, the Spaceborne Computer-2 will sync into the Azure Space initiative. Announced in October 2020, Azure Space is a bundle of cloud products combining with partnerships to make Microsoft Cloud a major player in the growing space tech area.

“HPE and Microsoft are collaborating to further accelerate space exploration by delivering state-of-the art technologies to tackle a range of data processing needs while in orbit. By bringing together HPE’s Spaceborne Computer-2, which is based on the HPE Edgeline Converged Edge system for advanced edge computing and AI capabilities, with Microsoft Azure to connect to the cloud, we are enabling space explorers to seamlessly transmit large data sets to and from Earth and benefit from an edge-to-cloud experience.

“We look forward to collaborating with Microsoft on their Azure Space efforts, which share our vision to accelerate discovery and help make breakthroughs to support life and sustainability in future, extended human missions to space.” —Dr. Mark Fernandez, Solutions Architect of Converged Edge Systems at HPE and Principal Investigator for Spaceborne Computer-2

Tip of the day:

When using your Windows 10 laptop or convertible with a mobile hotspot, you might want to limit the Internet bandwidth your PC uses. In our tutorial we show you how to set up a metered connection in Windows 10 and how to turn it off again, if needed.

Source: Winbuzzer


Malwarebytes Confirms SolarWinds-Related Attack Through Microsoft 365 and Azure


Major security and antivirus firm Malwarebytes says it was a victim of the recent SolarWinds breach through the Solorigate malware. Since last year, the state-backed breach has targeted users of the SolarWinds app Orion, including Nvidia, Microsoft, and government organizations.

In an official blog post, Malwarebytes points out it is not a user of SolarWinds apps. However, the company was breached through another vector: already compromised apps with access to Microsoft 365 and Azure services, both of which Malwarebytes does use.

Attackers were able to access “a limited subset of internal company emails” but not any production systems.

Malwarebytes worked directly with the Microsoft Detection and Response Team (DART) to find the attack, says CEO Marcin Kleczynski:

“Together, we performed an extensive investigation of both our cloud and on-premises environments for any activity related to the API calls that triggered the initial alert. The investigation indicates the attackers leveraged a dormant email protection product within our Office 365 tenant that allowed access to a limited subset of internal company emails.”

Moving forward, Malwarebytes says it is working with other security firms to share information. The hope is that this will make it easier to mitigate Solorigate attacks and find responses that work to stop breaches.

Attacks

Earlier this month, the U.S. Department of Justice confirmed a Microsoft 365 breach related to the SolarWinds attack. According to the government agency, the breach left 3% of its mailboxes vulnerable. However, no classified information was stolen during the attack.

While the Solorigate malware can be delivered through Microsoft services, it is not caused by them. Russia-backed threat actors used the avsvmcloud.com website to host a server for the Solorigate malware. The infection was sent to 18,000 SolarWinds Orion customers. Many of those users are major organizations and government departments.

Last month, Microsoft President Brad Smith said the attack creates “serious technological vulnerability for the United States and the world”.

Also in December, the Cybersecurity and Infrastructure Security Agency (CISA) debuted a PowerShell tool to help Microsoft 365 customers mitigate Solorigate. Microsoft had recently confirmed stolen Azure/Microsoft 365 credentials and access tokens were part of the breach.

Tip of the day:

Did you know that a virtual drive on Windows 10 can help you with disk management for various reasons? A virtual drive is simulated by the platform as a separate drive, while the holding file might be stored anywhere on your system.

The data in the drive is available in files or folders, which are represented by software in the operating system as a drive. In our tutorial we show you different ways to set up and use such virtual drives.

Source: Winbuzzer


Tips & Tricks for Azure File Shares


As with any technology, when you first get started there are sometimes bumps in getting the setup or installation correct: those “DOH” moments. No matter how many times you read the administrator’s installation guide, there may be items missing from the documentation or a particular scenario that wasn’t thought of. Azure file shares are no different, and there are some common hiccups that can be avoided with good planning. This blog post gives you some tips and tricks to get started with Azure file shares and help eliminate some of those bumps you may run into.

Intros please

First off, I want to give a quick intro as to what Azure file shares are, for those that are hearing about this for the first time. The official description of Azure file shares is:

 

“Fully managed file shares in the cloud that are accessible via Server Message Block (SMB) protocol  (also known as Common Internet File System or CIFS). Azure File shares can be mounted concurrently by cloud or on-premises deployments of Windows, Linux, and macOS.”

 

The short version is:

A file share that is in the cloud.

 

Azure file shares can be used to completely replace or supplement traditional on-premises file servers or NAS devices. They can be used by Windows, macOS, and Linux, which can directly mount Azure file shares wherever they are in the world. If you are thinking of using this to replace your local file shares, you will need to use Azure File Sync and be running Windows Server 2016.

 

You can use Azure Files for more than supplementing your local file server. For example, Azure Files can be used in a lift-and-shift migration into Azure. This also works well for a hybrid scenario, where the application data is moved to Azure Files while the application continues to run on-premises. An Azure file share is also a good place for cloud applications to write their logs, metrics, and crash dumps.

 

With security on everyone’s mind, you’re probably asking how this is secured. Azure Files access control is maintained with several methods. Announced at Microsoft Ignite 2018, Azure Files supports identity-based authentication and access control with Azure Active Directory (Azure AD) (Preview). As part of the preview, Azure Files supports preserving, inheriting, and enforcing NTFS DACLs in a file share. When data is copied from a file share to Azure Files, or vice versa, you can specify that NTFS DACLs are maintained. Please note this is in preview, so I would not recommend it in production. The stopgap until Azure AD authentication for Azure file shares is generally available is to use Azure File Sync. When using Azure File Sync on your Windows file server, it preserves and replicates all discretionary ACLs (DACLs), whether Active Directory-based or local, to all endpoints that it syncs to in Azure.

 

 

Tips and Tricks

Below is a list of tips and tricks to help smooth out any bumps you may have with Azure file shares.

Plan, Plan, Plan

I can’t say this enough: plan the why and the what before you jump in head first; otherwise you risk failure.

  • Develop a clear plan. Identify what you’re moving to Azure file shares and why.
  • Understanding the objectives helps you determine whether this is the right path or whether a different solution fits better.
  • Once those are identified, gather all the stakeholders and develop an implementation plan.

Use SMB 3.0

The preferred SMB client version is 3.0.

  • You should be using SMB 3.0. You can access Azure file shares with SMB 2.1, but keep in mind that SMB 2.1 clients can only reach the share from within the same Azure region, and the connection is not encrypted. SMB 1.0 clients won’t work at all.
  • If you are mounting from an on-premises server, or from outside the share’s Azure region, only SMB 3.0 is supported.
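As a sketch of what an SMB 3.0 mount looks like from a Linux client (the storage account name, share name, and key below are placeholders you would replace with your own):

```shell
# Hypothetical names - substitute your own storage account, share, and key.
ACCOUNT="mystorageacct"
SHARE="myshare"
KEY="<storage-account-key>"

# Build the share path. vers=3.0 forces SMB 3.0, which is required when
# mounting from on-premises or from outside the share's Azure region.
SHARE_PATH="//${ACCOUNT}.file.core.windows.net/${SHARE}"
echo "mount -t cifs ${SHARE_PATH} /mnt/${SHARE} -o vers=3.0,username=${ACCOUNT},password=${KEY},serverino"

# To actually mount (requires root and the cifs-utils package):
# sudo mkdir -p "/mnt/${SHARE}"
# sudo mount -t cifs "${SHARE_PATH}" "/mnt/${SHARE}" \
#     -o "vers=3.0,username=${ACCOUNT},password=${KEY},serverino"
```

On Windows, the equivalent is a `net use` mapping, covered below under persistent connections.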

Open Port 445

A common cause of connection issues is port 445 being blocked, anywhere from your local datacenter up to your ISP. Microsoft publishes a summary of which ISPs allow or disallow access on port 445.

  • Troubleshoot connection issues with Fiddler or PortQry:
    • You can use PortQry to query the TCP 445 endpoint. If the endpoint is reported as FILTERED, the TCP port is blocked.


If TCP port 445 is blocked by a rule along the network path, the query reports the endpoint as FILTERED.
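A quick sketch of how to probe port 445 yourself (the hostname is a placeholder; substitute your storage account endpoint):

```shell
# Hypothetical endpoint - replace with <storage-account>.file.core.windows.net.
HOST="127.0.0.1"

# On Windows, the PortQry check looks like:
#   portqry.exe -n <storage-account>.file.core.windows.net -p TCP -e 445
# A result of FILTERED means port 445 is blocked along the path.

# A quick probe from Linux using bash's built-in /dev/tcp:
if timeout 3 bash -c "cat < /dev/null > /dev/tcp/${HOST}/445" 2>/dev/null; then
    RESULT="port 445 reachable on ${HOST}"
else
    RESULT="port 445 blocked or unreachable on ${HOST}"
fi
echo "${RESULT}"
```

If the probe fails from your office but succeeds from an Azure VM in the same region, the block is somewhere on your network path or at your ISP.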

  • Double-check that your antivirus and firewall software policies allow port 445. Local system policies can also block this port.

Persistent Connections

Don’t you hate it when you map a drive and then it disappears when you reboot? That can happen with Azure file share connections if you don’t make them persistent. To make connections persistent, you can use the following:

  • Use CMDKEY or Credential Manager to store the Azure storage account credentials.


  • You can also add /persistent:yes to the net use command.

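Putting both tips together, a persistent mapping looks roughly like this (account and share names are placeholders; the Windows commands are shown as comments since they need a real storage account):

```shell
# Hypothetical names - substitute your storage account, share, and key.
ACCOUNT="mystorageacct"
SHARE="myshare"

# 1) Store the storage account credentials once with cmdkey (cmd.exe):
#    cmdkey /add:mystorageacct.file.core.windows.net /user:AZURE\mystorageacct /pass:<storage-account-key>
#
# 2) Map the drive with /persistent:yes so it survives reboots:
#    net use Z: \\mystorageacct.file.core.windows.net\myshare /persistent:yes

# The UNC path the commands above refer to:
UNC="\\\\${ACCOUNT}.file.core.windows.net\\${SHARE}"
echo "net use Z: ${UNC} /persistent:yes"
```

With the credentials in Credential Manager, the net use command no longer needs the key on the command line.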

 

Install KB3114025

If you’re still running Windows Server 2012 R2, you may experience slowness when copying files to Azure file shares. This is a known issue, and it can be corrected by installing KB3114025.

  • Install it on Windows 8.1 or Windows Server 2012 R2.
  • It also increases performance on I/O-intensive workloads.

 

Access issues with an application or service account

If your application or service runs under a different user account than the one the drive was mounted with, the application or service account may not be able to access the Azure file share. Some workarounds:

  • Mount the drive from the same user account that runs the application, using a tool such as PsExec.
  • Pass the storage account name and key in the user name and password parameters of the net use command.
  • Use the cmdkey command to add the credentials to Credential Manager. Run this from a command line under the service account context, either through an interactive logon or by using runas.

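A rough sketch of those workarounds (the service account, storage account, and share names are all placeholders; the Windows commands are comments since they require a real environment):

```shell
# Hypothetical names - substitute your storage account, share, and key.
ACCOUNT="mystorageacct"
SHARE="myshare"

# Add the credential under the service account's context so it lands in
# that account's Credential Manager (cmd.exe):
#   runas /user:CONTOSO\svc-app "cmdkey /add:mystorageacct.file.core.windows.net /user:AZURE\mystorageacct /pass:<storage-account-key>"
#
# Or pass the account name and key directly on the net use command:
#   net use Z: \\mystorageacct.file.core.windows.net\myshare /user:AZURE\mystorageacct <storage-account-key>

CMD="net use Z: \\\\${ACCOUNT}.file.core.windows.net\\${SHARE} /user:AZURE\\${ACCOUNT}"
echo "${CMD} <storage-account-key>"
```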

 

  • Map the share directly without using a mapped drive letter. Some applications may not reconnect to a drive letter properly, so using the full UNC path may be more reliable.

 

Network and Security Policies for outside the company network or VPN

When implementing Azure file shares, keep in mind that, if not configured correctly, they can be accessed from anywhere with an internet connection. If this violates any of your company’s policies on data access, you will need to do some extra work. By default, storage accounts accept connections from clients on any network; you can restrict access to Azure file shares by configuring the associated storage account’s default network access rule.

 

  • Please note that changing network rules can impact your ability to connect to Azure Storage. Be sure to grant access to any allowed networks using network rules before you change the default rule to deny access.
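As a sketch of that order of operations with the Azure CLI (the resource group, account name, and IP range below are placeholders; the live commands are commented out since they modify a real storage account):

```shell
# Hypothetical resource names - substitute your own resource group and account.
RG="my-resource-group"
ACCOUNT="mystorageacct"

# 1) First, allow the networks that should keep access, e.g. a client IP range:
#    az storage account network-rule add \
#        --resource-group my-resource-group \
#        --account-name mystorageacct \
#        --ip-address 203.0.113.0/24
#
# 2) Only then flip the default rule to deny everything else:
CMD="az storage account update --resource-group ${RG} --name ${ACCOUNT} --default-action Deny"
echo "${CMD}"
```

Flipping the default to Deny before adding any allow rules will lock out every client, including you.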

Source: Microsoft
