Azure App Service

Microsoft Once Again Wins JEDI Contract Following DoD Investigation


Microsoft’s position as winner of the Pentagon’s JEDI cloud defense contract has been given a seal of approval this week. Following an investigation, the US Department of Defense (DoD) once again said Microsoft has won the $10 billion contract. Whether the reaffirmation will end Amazon Web Services’ protests over the process is another matter.

However, Amazon is doubling down on its position that Microsoft won the JEDI contract unfairly. In response to once again missing out, AWS hit out at the investigation and President Donald Trump.

Still, it will be Microsoft celebrating winning the JEDI deal for a second time. The company’s Azure cloud services were upheld as the best choice for the project, according to the DoD:

“The Department has completed its comprehensive re-evaluation of the JEDI Cloud proposals and determined that Microsoft’s proposal continues to represent the best value to the Government,” the DoD points out in a statement. “The JEDI Cloud contract is a firm-fixed-price, indefinite-delivery/indefinite-quantity contract that will make a full range of cloud computing services available to the DoD.”

If you’re unfamiliar with JEDI, it is the Joint Enterprise Defense Infrastructure project. Microsoft’s cloud services will underpin an overhaul of the DoD’s computing infrastructure. Microsoft was originally awarded the contract a year ago.

Amazon did not take the situation lying down and immediately started legal proceedings to stop Microsoft. Amazon’s argument has always rested on the idea that the process was unfair. While Amazon did not push for the multi-cloud approach some rivals did, AWS believed it was not given a fair chance.

Not Backing Down

Much of Amazon’s issue rests on a belief that Donald Trump influenced the outcome of the project because of bias against the company. In fact, AWS wanted Trump to testify during the investigation.

The company previously highlighted Trump’s words on the campaign trail, where he vowed that Amazon would have problems under his presidency. He also referred to Jeff Bezos as “Jeff Bozo”, and has taken issue with his newspaper, the Washington Post.

Following yesterday’s confirmation that Microsoft will go ahead as the JEDI contract winner, Amazon is doubling down on its position. The company once again attacked the process and called out Trump.

Amazon describes the process as “flawed, biased, and politically corrupted” and goes on to say that the award “creates a dangerous precedent that threatens the integrity of the federal procurement system”.

“There is a recurring pattern to the way President Trump behaves when he’s called out for doing something egregious: first he denies doing it, then he looks for ways to push it off to the side, to distract attention from it and delay efforts to investigate it (so people get bored and forget about it). And then he ends up doubling down on the egregious act anyway.”

Amazon’s court proceedings included an injunction that prevents Microsoft from beginning work on the project. However, the DoD says work will start once the injunction is lifted.

“While contract performance will not begin immediately due to the Preliminary Injunction Order issued by the Court of Federal Claims on February 13, 2020, DoD is eager to begin delivering this capability to our men and women in uniform.”

Source Winbuzzer


Microsoft Sunsets Visual Studio Codespaces


Earlier this year, Microsoft sent out the debut preview of Visual Studio Codespaces, which was essentially a rebranding of Visual Studio Online. VS Online was introduced for the first time at Ignite 2019, bringing components of Visual Studio to web browsers. However, Microsoft has already decided to send Visual Studio Codespaces into the sunset.

According to the company, it is killing off the online version of Visual Studio before it has even left preview.

With Visual Studio Online (Codespaces), users have a web-based companion to the full VS experience. They can access code and edit it from any device, including smartphones. With Codespaces, Microsoft added the ability to work with extensions, access a command line, edit, run, and debug apps, and see Git repos.

So, why is Microsoft killing off what is for all intents and purposes a useful addition to the Visual Studio family? Well, it’s not as bad as it first seems because Microsoft says it is folding all Visual Studio Codespaces features into GitHub Codespaces.

If you’re unfamiliar with GitHub Codespaces, it too was launched earlier this year. Users can tap into GitHub Codespaces through a browser-based variant of the Visual Studio Code editor. In this editor, users can work with terminal access, extensions, and other tools. Furthermore, devs can use the Codespaces feature directly from their IDE.


Microsoft clearly does not want two separate services with the Codespaces name. To remedy this, the company is simply combining the two. All VS Codespaces features will be transitioning to GitHub Codespaces, so there should be no gap in functionality. It is worth noting GitHub Codespaces is also in public beta preview.

“After the GitHub-native experience was released, we started hearing that the two distinct experiences were causing confusion amongst our users… We believe that by consolidating the current Codespaces experiences into one, we can eliminate confusion, simplify the experience for everyone, and make more rapid progress to address customer feedback.”

Microsoft has the following timeline in place to end the VS Codespaces preview and move to GitHub Codespaces:

  • September 4, 2020 – Current users can begin transitioning to the GitHub private beta.
  • November 20, 2020 – Creation of new plans and codespaces will be disabled, although existing codespaces may continue to be used. New users will only be able to sign up for Codespaces on GitHub.
  • February 17, 2021 – The Visual Studio Codespaces portal will be retired. All plans and codespaces remaining in the service will be deleted.

From February 17, the VS Codespaces services will be shuttered permanently and all related portals deleted.

Source Winbuzzer


Microsoft Edge August Updates Include New Tools for Immersive Reader and Collections


Microsoft this week debuted a new blog post series that will focus on providing information on new updates for its various web tools. In this first post, the company detailed what has been delivered to the Microsoft Edge browser in August.

It was a solid month for the new browser, which runs on Google’s Chromium base these days. First up is a new feature for the Collections section of the browser.

Collections was brought to the Microsoft Edge preview last year before reaching all users in April. Collections helps users organize, share, and keep track of content more easily. Microsoft says the feature leverages “Cloud featured intelligence and an intuitive interface to help you collect, organize and share content” as you browse the web.

This month, the Edge Stable channel allows users to send Collections straight to OneNote on mobile and desktop. This “Send to OneNote” option expands upon similar options already available for Excel, Pinterest, and Word.

More Updates

Microsoft also made some changes to the Immersive Reader in Microsoft Edge. This is a tool available across several Microsoft services, including the old Edge, OneNote, and even Minecraft. Developers can also access the feature to embed comprehension and text reading tools into their applications.

In August, Immersive Reader received a picture dictionary. Users can now select a word on a website and the browser will provide an image representative of the word. Microsoft says it is “ideal for those learning another language or for students learning on their own.”

Incidentally, this week, Microsoft confirmed Immersive Reader is now generally available for all Azure Cognitive Services customers.

Last up in the Microsoft Edge August updates, the built-in PDF reader gained screen reader support alongside a new highlighter tool.

Source Winbuzzer



How to manage on-premises infrastructure using Azure Automation Hybrid Worker


We’ve seen a proliferation in cloud adoption as many organizations are hastily moving their workloads and resources to the cloud as users become more mobile and remote. IT admins are facing challenges with the migration of their data and apps, and also managing their hybrid environments. Consequently, IT teams are having to upskill in order to complete their daily administrative tasks.

In this article, we’ll explore the options for managing hybrid environments and automating tasks that involve both on-premises and cloud resources. It’s important for IT and Security Operations teams to have access to a centralized management tool for accessing and managing both environments to cope with the shift to remote working.

Azure Automation

Read more: Office 365 Global Admin Best Practices by Joshua Bines

Due to its modern capabilities, we’ll be using Azure Automation, a cloud-based management service, for automating your processes securely and efficiently. This Azure service allows administrators to manage cloud services such as Office 365, Azure, SharePoint Online, Azure Web Apps, and more. However, if you run an Azure Automation runbook against your on-premises environment, you’ll get an Access Denied error. Currently, the only workaround is to expose your internal network to the internet via APIs. In this article, we’ll also focus on overcoming this obstruction when managing on-premises servers with Azure Automation.

To securely run an Azure Automation runbook (a PowerShell or Python 2 script) against the on-premises environment, we’ll have to configure a Hybrid Runbook Worker server. This is a secure delivery mechanism that brings the cloud-hosted script to the local on-premises environment.

The Hybrid Worker server will be going out via port 443 on the firewall to connect to Azure Automation to run Runbooks against on-premises servers such as Active Directory, Exchange, SQL Server, etc.

Azure Automation diagram

To achieve our goal, these are steps we’ll be taking:

  • Configuring Log Analytics workspace
  • Link Azure Automation account with Log Analytics workspace
  • Configuring Azure Hybrid Worker Server and Hybrid Worker Group

Configuring an Azure Log Analytics workspace

As a prerequisite, we need to set up an Azure Log Analytics workspace for the Hybrid Runbook Worker to function. A Log Analytics workspace is a unique environment for Azure Monitor log data; each workspace consists of a repository, data sources, and solutions that store their data there.

To set up your Log Analytics Workspace, follow the steps below:

  1. Log in to the Azure tenant
  2. In All Services, search for Log Analytics workspace and click Add to create one
  3. Complete all required fields and click Create
  4. Make sure to note the Resource Group where you added the Log Analytics workspace
  5. I named the Log Analytics workspace “Practical365Workspace”

Log Analytics Workspace for Azure Automation
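If you prefer scripting over the portal, the same workspace can be provisioned with the Az PowerShell module. This is a sketch only; the resource group name and location below are assumptions based on the article’s examples:

```powershell
# Sketch: create the resource group and Log Analytics workspace from PowerShell.
# Names and location are illustrative; match them to your own environment.
Connect-AzAccount

New-AzResourceGroup -Name "practical365resourcegroup" -Location "EastUS"

New-AzOperationalInsightsWorkspace `
    -ResourceGroupName "practical365resourcegroup" `
    -Name "Practical365Workspace" `
    -Location "EastUS"
```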

After the Log Analytics workspace is created, let’s add the Azure Automation account.

Setting up your Azure Automation account

You can create an Azure Automation account from multiple places in the Azure tenant. In this case, we’ll create the account right from the Log Analytics workspace.

  1. To get started, click on the newly created workspace
  2. Then, navigate to Workspace Summary and click Add
  3. From the Marketplace screen, select Azure Automation

Select Azure Automation

  4. In the Automation screen, click Create
  5. Complete all the necessary fields and click Create

Azure Automation account

Make sure you select the same Resource Group where your Log Analytics workspace was created, and that the location is selected based on the mapping table provided by Microsoft. For example, if the Log Analytics workspace is in East US, then Azure Automation should be in East US 2 according to the mapping table.
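The Automation account can likewise be created from PowerShell rather than the Marketplace blade. A sketch using the article’s example names (the location must follow Microsoft’s region mapping table, as noted above):

```powershell
# Sketch: create the Automation account in the paired region of the workspace.
# Names and location are illustrative assumptions.
New-AzAutomationAccount `
    -ResourceGroupName "practical365resourcegroup" `
    -Name "Practical365Automation" `
    -Location "EastUS2"
```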

Link Azure Automation account with Log Analytics workspace

To link your Azure Automation account with the Log Analytics workspace, we will need to enable Inventory and Change Tracking within Azure Automation.

  • Click the Automation account (in our case “Practical365Automation”)
  • Navigate to Configuration Management -> Inventory and, from the Log Analytics workspace drop-down, select the workspace we created earlier (Practical365Workspace)
  • Click Enable

To check that your Azure Automation account is linked to the Log Analytics workspace correctly, follow the steps below:

  • Click on Azure Automation, in our case Practical365Automation
  • Under Related Resources, click the Linked workspace

You should see your Log Analytics workspace linked. In our case, as pictured above, we can see that Practical365Workspace is linked.

Configuring Azure Hybrid Worker Server and Hybrid Worker Group

The Hybrid Runbook Worker runs the script passed from the Azure Automation account; it plays a central role in the delivery mechanism for your management tasks. Every Hybrid Runbook Worker server must be part of a Hybrid Runbook Worker Group, which can contain one or more servers for redundancy.

Learn more: Office 365 Tenant Migration: How to Migrate Exchange Mailbox Permissions

In the lab environment, I have a Windows Server 2016 virtual machine (Name: Server01), which is part of the on-premises domain. I will use Server01 to install Hybrid Worker.

In the next few steps, we’ll cover:

  • How to configure the Hybrid Worker server
  • How to create a Hybrid Worker group
  • How to add the server to the group

Make sure your spelling matches the name of the workspace you created earlier exactly for the following steps.

On Server01, open a PowerShell window (as an administrator) and run the cmdlet below, ensuring your server has outbound access to the internet:

Next, run the PowerShell cmdlet below, specifying the workspace name:

PowerShell will prompt to provide additional information such as:

  • Resource Group Name: practical365resourcegroup
  • Subscription ID: XXXXXXX (your tenant’s subscription ID)
  • Automation Account Name: Practical365Automation
  • Hybrid Group Name: Practical365HybridGroup (type an appropriate name for the group)
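The cmdlets themselves are not reproduced above; for this step Microsoft publishes a New-OnPremiseHybridWorker script in the PowerShell Gallery that performs the registration. The following is a hedged sketch; exact parameter names can vary between script versions, so treat them as assumptions and check Get-Help after installing:

```powershell
# Sketch: install the registration script from the PowerShell Gallery,
# then register Server01 as a Hybrid Runbook Worker.
# Parameter names below are assumptions; verify against your script version.
Install-Script -Name New-OnPremiseHybridWorker -Force

New-OnPremiseHybridWorker.ps1 `
    -AAResourceGroupName "practical365resourcegroup" `
    -SubscriptionID "XXXXXXX" `
    -AutomationAccountName "Practical365Automation" `
    -HybridGroupName "Practical365HybridGroup" `
    -WorkspaceName "Practical365Workspace"
```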

During the installation process, you’ll need to authenticate against your Azure tenant.

To check that the installation was successful, navigate to the Azure tenant, open the Azure Automation account that we created earlier, and under Hybrid worker groups you should see a new group called Practical365HybridGroup.

Hybrid Worker Groups for Azure Automation

Now you’ve completed the configuration of the Hybrid worker server.

Remember that the PowerShell scripts you run on Server01 from the Azure Automation account will run under the Local System account, which means you can run management tasks directly on this server. For example, you can create a runbook to stop or start a service.

In practice, you’ll use this Hybrid Worker Server (Server01) to manage other servers. For example, you can enable or disable AD accounts on Domain controllers, provision mailboxes on Exchange server, etc. You’ll also need to set up a Run As account (domain account) in Azure Automation so that your Hybrid Worker server can manage other servers in the local network.

It’s important to note that when you configure a Run As account, it will be applied to the entire Hybrid Worker Group. Currently, there is no way to link a Run As account to each individual runbook in Azure Automation.
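Inside a runbook, the stored credential is retrieved with the Get-AutomationPSCredential activity. A sketch follows; the credential name and target computer name are hypothetical:

```powershell
# Sketch: pull the stored Run As credential and use it against another server.
# "Practical365RunAs" and "DC01" are hypothetical names.
$cred = Get-AutomationPSCredential -Name "Practical365RunAs"

Invoke-Command -ComputerName "DC01" -Credential $cred -ScriptBlock {
    # Hypothetical example task on the remote domain controller
    Get-Service -Name "NTDS"
}
```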

How to set up a Run As account

  1. Open the Azure Automation account
  2. Navigate to Shared Resources
  3. Click the Credentials link in the left navigation
  4. Add a credential and populate the required information. Here, I’m using my domain admin account (Domain\Account), which is not recommended for production. Following the principle of least privilege, you should create an account that has access to only what it needs to accomplish.

Azure Automation Hybrid Worker Group

  5. Go back to Azure Automation and select Hybrid Worker Groups
  6. Select the group associated with your on-premises environment (the one we created earlier in this article)
  7. Select Hybrid Worker Group Settings in the left property menu and switch the Run As field to Custom. From the drop-down, select the credentials that we created earlier.

Hybrid Worker Group Settings

You have now finished configuring your environment, and it’s ready to test. The test use case I’m using will run a PowerShell script from the cloud to disable an on-premises AD account.

You must ensure that your Hybrid Worker server has the appropriate PowerShell modules installed. In this example, I made sure that the Active Directory PowerShell module is installed on Server01.

I logged in to the Domain Controller and opened Active Directory users and computers. As you can see, I have Adam Smith’s account enabled.

Account enabled

Next, I’ll create a Runbook in my Azure Automation account to disable this on-premises user:

  1. Navigate to the Azure tenant and open the Azure Automation account that we created earlier
  2. Click Runbooks in the left navigation menu
  3. Click Create a runbook
  4. Complete the required information, making sure you select PowerShell as the type of the runbook

Create a runbook

  5. In the editor window, type the PowerShell cmdlet, then click Save and Publish.

Disable Account
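The article’s screenshot shows the runbook body; here is a minimal sketch of such a runbook, assuming the Active Directory module is present on Server01 and using a hypothetical sAMAccountName for Adam Smith’s account:

```powershell
# Sketch: disable an on-premises AD account from a cloud-hosted runbook.
# "adam.smith" is a hypothetical account name.
Import-Module ActiveDirectory

Disable-ADAccount -Identity "adam.smith"

# Confirm the change
Get-ADUser -Identity "adam.smith" | Select-Object Name, Enabled
```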

We’re now ready to run the PowerShell cmdlet. Click Start and, under Run on, switch to Hybrid Worker and select the group we created earlier. The PowerShell script will be queued and executed; wait until the job status shows Completed.

Let’s go back to the on-premises domain controller and check the Active Directory Users and Computers.

Check active users and computers

Your domain account is now disabled. As you can see, we were able to successfully run a cloud-hosted PowerShell command against an on-premises server without exposing it to the internet.

This use case demonstrates how we can manage an on-premises workload using scripts stored and initiated in the cloud. It opens endless opportunities to manage both on-premises and cloud resources from a centralized location.

Source Practical365


Australia Seeks Switch to Microsoft Azure-Based Cloud Records


Australia has announced governmental organizations will soon allow records to be accessed in the cloud. The push is part of an initiative by the Digital Transformation Agency (DTA) as it continues to digitize the country’s government.

Specifically, the Digital Records Transformation Initiative (DRTI) wants to modernize the public sector and improve productivity. Automation and cloud integration will drive the evolution. A focus will be placed on allowing government agencies to access data inter-organizationally.

Now the DTA is searching for a software-as-a-service platform to handle government digital practices and to solve current problems in records management.

“The customer wants to create an information management environment that is agile and suited to current and emerging records management needs,” the DTA wrote.

“The customer is seeking an innovative records management solution that connects to multiple business systems and repositories to automatically capture, classify, and securely dispose of records based on their content with minimal user input in accordance with the requirements of applicable records management legislation, standards, and authorities.”

In its statement, the DTA confirmed Microsoft Azure is the corporate environment underpinning the transformation. It is also leveraging Microsoft 365. Any successful solution would need to be based on this current environment.

Australia’s federal government has embarked on ambitious plans to overhaul the country’s digital capabilities.

“Government has made significant progress on our digital journey, however, there is a long way to go. If we are to deliver the high calibre of digital services that Australians deserve, we need to act quickly and strategically to lift digital capability,” the APSC, alongside the DTA, wrote in APS Digital Professional Stream Strategy.

Data Limitations in Australia?

Last year, Microsoft President and Chief Legal Officer Brad Smith claimed it was becoming increasingly difficult for companies to store data in Australia. He described the country’s encryption regulations as leaving companies “no longer comfortable”.

In 2018, the Australian government passed new data encryption laws. A first of its kind in the world, the legislation was created in an effort to stop crime and terrorism. Critics have said the laws could have a worse impact on security and will compromise user privacy.

The law forces companies to show data, even if it has been encrypted. This means companies are compelled to share information they don’t always have access to. Smith said this could mean companies stop storing data in Australia.

“But when I travel to other countries I hear companies and governments say ‘we are no longer comfortable putting our data in Australia.’ So, they are asking us to build more data centres in other countries, and we’ll have to sort through those issues.”

Source Winbuzzer


GitHub Mobile Version 1.2 Arrives on iOS and Android


Last year, Microsoft’s GitHub open-source code repository announced it was arriving on mobile. In March, the company’s new GitHub mobile apps landed on Android and iOS. Since then, the tool has been running in beta and has become popular.

Now GitHub mobile is receiving an update that bumps the service up to version 1.2 and adds a few interesting changes. One of the new tools is an improved pull request experience. This includes support for marking files as viewed, collapsing files, and viewing deleted files.

Furthermore, image, PDF, and markdown files will now render when selected. Elsewhere, GitHub mobile now has smoother performance when comments are being typed. Badges for organizations on profiles will now link directly back to the connected organization.

Below is the full changelog:

  • Improved pull request review experience, with support for marking files as viewed, collapsing files, deleted files, and more
  • Markdown, image, and PDF files now render if you click on them while browsing code
  • Typing new comments is now smoother than ever, with no jumping or flickering
  • Labels, user statuses, and commit messages that used emoji shortcodes now properly render emojis
  • Organization badges on user profiles link to the mentioned organization
  • You can view multiple author avatars for commits on pull request timelines
  • New fork badge on repository profile that links to the parent repository
  • New “Metaphorical Technology” custom app icon!
  • New support for iPad pointer effects
  • Fixed an iPad bug where keyboard dismisses while typing a review
  • Fixed voice-over bugs in the inbox filter view

GitHub Mobile

With the mobile apps, users of the Microsoft-owned code repository service can manage their projects on mobile devices. Furthermore, developers can provide feedback, respond to comments, organize tasks, and review pull requests.

It is also possible to view code, but that is as far as the app goes in terms of code management. You won’t be able to edit code in the app, at least not yet.

You can check out the GitHub mobile app on Google Play or the App Store.

Source Winbuzzer


How to move your process time from Office 365 to Azure Batch – Part 2


Many of the services offered by Azure can be used to extend the capabilities and improve the efficiency of Office 365. This is especially the case for SharePoint, both on-premises and in the cloud. One of the services offered by Azure is Azure Batch, a Platform as a Service (PaaS) solution that provides all the physical infrastructure plus the core software to run programs in batches, removing excessive loads from production servers.

In Part One of this article, we explored how Azure Batch works, how it works with SharePoint, and how to initiate it in your environment. In Part Two, we’ll be looking at the configuration of Azure Batch, its execution, and the results the solution creates. The complete source code for the application can be found in the GitHub repository.

The Configuration of Azure Batch

Step 8 – The MainAsync routine is the principal entry point to create the Azure infrastructure. Here you need to start the batch process, gather the results, and send them back to SharePoint. This element of the task is asynchronous and takes care of the creation and decommissioning of the relevant Azure resources in a proactive way.


Figure 6. The principal part of the MainAsync method

Note: the batch operations (inside the “batchClient” object creation) are defined as:


Figure 7. Definition of the Batch operations

Step 9 – The storageAccount variable contains a pointer to the Azure Storage using its connection string. The ConfigureAzureStorage routine creates all the necessary containers and blobs in the Azure Storage and returns a variable blobClient of type CloudBlobClient.

Here, you’ll create three blobs: one to contain the XML files from SharePoint, one for the executables that will process the XML files, and one for the results of the processing operation.


Figure 8. The Azure Storage Blob Containers are created dynamically

Step 10 – The “UploadFilesToAzureStorage” routine carries out two different tasks:

  • It reads the XML files for each item in the SharePoint List and uploads them to the data file container
  • It uploads the files of the executable CreatePdf.exe that will process the data (the code of this executable is explained at the end of the article). This routine returns a DotNet Tuple that contains two generic lists with objects of type ResourceFile containing references to each of the files uploaded to Storage

The first part of the routine loops through all the elements of the SharePoint List using the SharePoint Client Side Object Model. It then converts the attached XML file of each element to a stream, creates a generic list of Tuples that contains the element name and stream of the XML file, and uploads it to the Azure storage container. The second part is identical, but for the files of the processor.

Step 11 – To upload the files processed by the Batch, Azure needs the SAS (Shared Access Signature) of the output blob container to be used in Azure Storage. The SAS is calculated by the GetContainerSasUrl routine:

Figure 9. The calculation of the SAS for the output container
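The article’s SAS calculation is C# (Figure 9). Conceptually, the same thing can be done from PowerShell with the Az.Storage module; this sketch is not the article’s code, and the storage account and container names are assumptions:

```powershell
# Sketch: generate a write-capable SAS URL for the output container,
# valid for a few hours, so Batch tasks can upload their results.
# Account and container names are illustrative.
$ctx = New-AzStorageContext `
    -StorageAccountName "practical365batch" `
    -StorageAccountKey "<storage-account-key>"

New-AzStorageContainerSASToken -Name "output" -Permission "rw" `
    -ExpiryTime (Get-Date).AddHours(4) -Context $ctx -FullUri
```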

Step 12 – Initially, in the part corresponding to the creation of the infrastructure needed in Azure, the Pool that contains the processing Virtual Machines is created using the routine CreatePoolAsync:

Figure 10. The creation of the Virtual Machines Pool

During the creation of the Pool, you need to specify how many nodes (Virtual Machines) are needed (two in the example), the type of machine (“standard_d1_v2” in the example; the list of possible values can be found here), and the type of operating system (“6” in the example, for Windows Server 2019; the list of values can be found here).

It’s also possible to modify the configuration; that is, if more power is needed, the Pool itself creates additional Virtual Machines. The Pool specifies a command that is executed when each Virtual Machine starts. In this case, it indicates that two working variables must be created to contain the directories where the executables will be uploaded and where the working files are to be maintained.

Step 13 – Once we have the Pool, we need to create the Job that will contain the Tasks. This is done by the CreateJobAsync routine.

Figure 11. The creation of the Job

Step 14 – The AddTasksAsync method is responsible for creating the Tasks to be executed and started:


Figure 12. The creation of the Tasks

A Task must be defined for each data file that has been uploaded to the storage. Each of these Tasks is a command to run; in this case, the command starts CreatePdf.exe with two parameters: a specific data file and the SAS of the Storage output blob.

Azure Batch distributes the Tasks across the available Virtual Machines, balancing the load between them. Finally, the JobOperations method starts the Azure Batch job, creates the Virtual Machines, downloads the executable and data files into them, and gives the command to start the job.

Step 15 – Tasks can be monitored when they are finished. The MonitorTasks routine defines the timeout as a parameter. If the timeout fires before any Task has finished, a message is displayed. Also, if a Task ends smoothly or with an error, the corresponding message is also displayed.

Step 16 – When the task finishes running, it uploads the result to the Azure Storage output blob. When all tasks are finished, the DownloadBlobsFromContainerAsync routine uploads the PDF files to the corresponding SharePoint List items using, again, the SharePoint Client Side Object Model.

Figure 13. The results are uploaded to Azure and SharePoint

Step 17 – When the PDF files are uploaded to SharePoint, we do not need to keep the Azure Batch or Azure Storage infrastructure for longer (if the resources are up and running, you need to pay for them). To delete all those resources, the “CleanUpResources” function is called at the end of the process.

Figure 14. Clean up the Azure resources

Step 18 – The program run by the Tasks must be a Windows Console Application. Add a new Project to the Visual Studio Solution (called CreatePdf in the example). To create the PDFs, the example uses iText7, a C# library that allows you to create files of this type (it is a commercial product, but you can use the open-source version for testing).

You need to install the iText7 NuGet package in the project and add using directives for iText.Kernel.Pdf, iText.Layout, and iText.Layout.Element. Also, use the WindowsAzure.Storage NuGet package to upload the files to Azure Storage, and add a using directive for Microsoft.WindowsAzure.Storage.Blob.

This program simply reads the XML file coming from SharePoint and creates a PDF file:

Figure 15. The creation of the PDF

The UploadFileToContainer uploads the created PDF file to the Azure Storage output blob:

Figure 16. Upload of the PDF to the Azure Storage blob

Step 19 – Set the constant values at the beginning of the code to your real values, which completes the solution. Run the program and observe the messages in the console:

Figure 17. The output of the batch processing

When the application finishes working, the Azure resources are deleted, and each item in the SharePoint List has two files as attachments (the XML and PDF files) containing the field information:

Figure 18. The result files in SharePoint


Among the many ways to use Azure services to complement the operation of SharePoint Online or on-premises, Azure Batch enables you to move CPU- and/or memory-intensive operations outside the SharePoint tenant to remote processes in Azure Virtual Machines.

Azure Batch works in conjunction with Azure Storage to receive the data files to be processed, the executable files, and the results of processing. The system is scalable, with virtually no compute or memory limits. The use cases range from relatively straightforward examples such as the creation of PDFs, through extended engineering stress calculations for mechanisms and civil structures, to time- and resource-expensive algorithms for physics or astronomy problems.

Source Practical365


Microsoft Discusses Azure Cloud Capacity Changes It Made to Handle the COVID-19 Crisis


Throughout the COVID-19 crisis, Microsoft has been consistently tweaking Azure cloud capacity to keep up with the demand of stay-at-home workers. While some countries are emerging from the crisis, many are still locked in the grip of the COVID-19 pandemic.

This week, Microsoft offered an update on how it is handling Azure cloud capacity. As usual, the focus is on how the company manages demand for Microsoft Teams, which is based on Azure. Microsoft's workplace collaboration/video communication platform has seen a significant rise in users during COVID-19.

From a daily active userbase of around 20 million, Microsoft Teams usage has soared to over 75 million. That figure comes from Microsoft's financial announcement last month, so it is reasonable to presume Teams usage has continued to rise since then.

Either way, the jump in users has obviously put a greater load on the underlying cloud. Microsoft spoke in April about how it was increasing capacity for some services and reducing it for others based on importance.

“Teams went from a service that was cool and convenient and something that people realized was the future of communication to very quickly becoming a mission critical, can’t-live-without-it sort of thing,” said Mark Longton, a principal group program manager for Microsoft Teams. “Really what this did was accelerate us into the future.”

Upping Capacity

In an update this week, Microsoft says employees in its datacenters have been putting in long shifts to ensure new servers are installed and running. Of course, the company points out these round-the-clock shifts observed distancing guidelines of six feet.

According to the company, a priority was put on adding servers to the regions most affected by COVID-19. Furthermore, Microsoft doubled the capacity of one of its undersea cables that runs under the Atlantic. Microsoft also “negotiated with owners of another to open up additional capacity.”

The United States and Europe were two of the worst hit regions. Microsoft tripled deployment capacity through its America Europe Connect cable.

Source Winbuzzer


Microsoft Azure Expanding to Deal with COVID-19 Demand


Microsoft has revealed it is expanding the capacity of its cloud services and Microsoft Azure to keep up with demand amid the COVID-19 pandemic. In a blog post, the company says it is constantly working to ensure there is enough service performance for all users, but is prioritizing frontline workers, first responders, and healthcare organizations.

To help handle the load, Microsoft is boosting its worldwide datacenter output and says it is placing resource limits on new Azure customers.

On the Microsoft Azure blog, Microsoft describes how it is managing “business continuity with Azure”, including helping customers complete urgent work and continuing to expand Azure alongside ongoing demand.

At the end of last month, Microsoft said it was tweaking cloud services like Microsoft Teams to handle growing demand. In an Azure blog post, Microsoft said “We have seen a 775 percent increase of our cloud services in regions that have enforced social distancing or shelter in place orders.”

Microsoft Teams Demand

Last week, Microsoft corrected that 775% figure, clarifying that the data related only to Teams usage in Italy. And it seems Microsoft Teams is the focus of the company's continued push to manage Azure capacity to keep up with demand.

In the latest blog post, Microsoft explains how Teams continues to grow:

“Last month, the surging use of Teams for remote work and education due to the pandemic crossed into unprecedented territory. Although we had seen surges in specific data center regions or wider geographies before, such as in response to natural disasters, the substantial Teams demand increase from Asia and then quickly followed in Europe indicated that we were seeing something very different, and increasingly global.”

To refocus Teams and prepare it for the massive surge in users, Microsoft took these steps:

  • “Optimized and load-balanced Teams architecture without interrupting the customer experience.
  • Expediting additional server capacity to the specific regions that faced constraints.
  • Approving the backlog of customer quota requests.
  • Removing restrictions for new free and benefit subscriptions in several regions.
  • Refining our Azure demand models.”

Source Winbuzzer


Microsoft Azure’s Remote Rendering Service Reaches Public Preview


Over a year after its announcement, Azure’s Remote Rendering service is available to the public. A preview version of the tool was announced in February last year and has been polished by private users since then.

Remote Rendering leverages cloud streaming technology to render high-quality 3D content remotely and deliver it to even low-end devices. A clear target here is the HoloLens 2, which has little space for powerful internals but still has a demand for detailed visuals.

In an enterprise environment, the availability of high-fidelity renders can be especially important. A higher polygon count can help medical students better understand operating procedures, or give an engineer guidance at a fidelity close to that of the real world.

“A traditional approach to viewing 3D content on untethered devices is called decimation, which compresses the models and removes polygons. This simplifies the model to a point where it can run on slower GPU hardware,” explained Microsoft. “The result can be a loss of important detail that’s needed to make key business and design decisions. Azure Remote Rendering renders content in the cloud and streams it to devices in real time so that people can use interactive, high-quality 3D models with every detail intact and no compromise on quality.”

The experience is available via an SDK, which Microsoft says is easy to integrate into applications. Some of the same tech is likely being used in Project xCloud to deliver game streaming, so this is another example of a crossover between the company’s seemingly unrelated brands.

You can sign up for the Remote Rendering preview on the Azure site, with $280 in free credits available to new users and a 10-minute quickstart tutorial.

Source Winbuzzer

read more