Azure Advocates Weekly Round Up – Security, JavaScript, and more M365 this week!

This article is contributed. See the original author and article here.

How does accessibility fit into an MVP? | Creating Startups
Simona Cotin


This is a guest post by Obinna Ekwuno, a software engineer at Gatsby, and Marcy Sutton, a freelance web developer and accessibility specialist. Read more about Obinna and Marcy at the end of this article. Accessibility has become a buzzword for making sure that disabled users get a fair experience when interfacing with platforms on the web or on mobile.


 


OfficeDev/M365Bootcamp-TeamsOneProductivityHub
Ayca Bas


Use Microsoft Graph Toolkit to build a solution for Microsoft Teams that tracks your daily calendar, tasks, and e-mail in a Teams tab as one productivity hub. – OfficeDev/M365Bootcamp-TeamsOneProductivityHub


 


How to Develop Serverless Apps with GitHub Codespaces | A Cloud Guru
Alvaro Videla Godoy


Want to develop a serverless app with Codespaces? This tutorial will show you how to get an Azure Functions & Static Web App project ready.


 


Azure Stack Hub Partner Solutions Series – iVedha
Thomas Maurer


Aytra was born as an ISV solution aimed at enabling partners in their Azure Stack Hub journey. Check it out!


 


Intro to Mobile Development
Nitya Narasimhan


Mobile app development is both a huge opportunity and a constant challenge. In this talk we’ll look at the mobile development landscape – from native apps to multi-platform development and mobile web. We’ll talk about design challenges and personalizing user experiences to match diverse contexts. And we’ll look at emerging paradigms in dual-screen and multi-posture devices (e.g., Surface Duo) and talk about how we can leverage these technological advances to rethink the modern mobile app.


 


Microsoft 365 PnP Weekly – Episode 100
Waldek Mastykarz


A weekly discussion of the latest news and topics around Microsoft 365


 


What is infrastructure as code? | Azure DevOps Blog
Jay Gordon


What is infrastructure as code? Microsoft Azure provides you with a number of options to deploy your infrastructure. In the One Dev Question series, Cloud Developer Advocate Abel Wang explains how Azure DevOps provides developer services to support teams to plan work…


 


Top 10 Best Practices for Azure Security
Sonia Cuff


Mark Simos, Lead Cybersecurity Architect for Microsoft, explores the lessons learned from protecting both Microsoft’s own technology environments and those of our customers, and shares the top 10 (+1!) recommendations for Azure security best practices.


 


What is Azure Hybrid Benefit?
Sarah Lean


The Azure Hybrid Benefit can help you save money when running your workloads in Azure by leveraging your on-premises licenses. Find out more in this blog post.


 


Automanage for Azure virtual machines
Thomas Maurer


Azure Automanage for virtual machines is a service that helps to discover, onboard, and configure Azure Management services for Azure VMs.


 


Running A Virtual Conference: Roguelike Celebration’s AV Setup
Em Lazer-Walker


The Roguelike Celebration conference has been running for five years, but two weeks ago marks our fir… Tagged with streaming, events, conferences, twitch.


 


DanWahlin/FluidVue
Dan Wahlin


Contribute to DanWahlin/FluidVue development by creating an account on GitHub.


 


Adding Let’s Encrypt SSL Certificate to Azure Functions
Justin Yoo


This post shows how to generate an SSL certificate through Let’s Encrypt API and bind the certificate to the custom APEX domain on Azure Functions app.


 


AzUpdate: Microsoft 365 Apps Admin Center, Azure Site Recovery TLS Certificate Changes and more
Anthony Bartolo


Lots to share in the world of Microsoft services this week. News includes Microsoft 365 Apps Admin Center’s new inventory & monthly servicing feature currently in preview, Azure Cognitive Services has achieved human parity in image captioning, Azure Site Recovery TLS Certificate Changes, Static Web App PR Workflow for Azure App Service using Azure DevOps, and of course the Microsoft Learn Module of the week.


 


250 million reasons to build applications on Microsoft 365
Waldek Mastykarz


You might have heard of Microsoft Teams, Outlook, or SharePoint. But did you know that next to being some of the most popular applications from Microsoft, they are a part of a highly extensible development platform with a rich partner ecosystem?


 


Want to Learn JavaScript? We’ve Got a Series for You! | LINQ to Fail
Aaron Powell


Get ready to dive into all things JavaScript.


 

Single Node Data Exploration and ML on Azure Databricks

This article is contributed. See the original author and article here.

Oftentimes data scientists and other users working on smaller data sets in Azure Databricks explore data and build machine learning (ML) models using single-machine Python and R libraries. This exploration and modeling doesn’t always require the distributed computing power of the Delta Engine and Apache Spark offered in Azure Databricks. Doing this type of work on a traditional multi-node cluster often results in wasted or underutilized compute resources on worker machines, which in turn results in unnecessary cost.


 




 


Single Node is a new cluster mode that allows users to work with their favorite libraries like Pandas, Scikit-learn, and PyTorch without the unnecessary compute and cost associated with traditional multi-node clusters. Single Node clusters also support running Spark operations if needed: the single node hosts both the driver and the executors, spread across the available cores on the node. This provides the ability to load and save data using the efficient Spark APIs (with security features such as user credential passthrough) while doing efficient exploration and ML with the most popular single-machine libraries.
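As an illustration, here is a minimal notebook sketch of the kind of workflow Single Node clusters are aimed at: load data through the Spark APIs, then switch to pandas and scikit-learn for single-machine work. The spark session is the one Databricks notebooks provide; the table and column names are hypothetical placeholders.

```python
# Hypothetical notebook sketch for a Single Node cluster: load a placeholder
# table through the ambient `spark` session, then switch to pandas and
# scikit-learn for single-machine exploration and modeling.
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Load with the Spark APIs (credential passthrough still applies) ...
spark_df = spark.read.table("sensor_readings")   # hypothetical table name

# ... then bring the (small) data set down to a pandas DataFrame.
pdf = spark_df.toPandas()

X = pdf.drop(columns=["target"])                 # hypothetical columns
y = pdf["target"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestRegressor(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print(f"R^2 on held-out data: {model.score(X_test, y_test):.3f}")
```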


 




 


If/when a data scientist wants to use distributed compute to do things like hyperparameter tuning and AutoML or work with larger datasets, they can simply switch over to a standard cluster with more nodes. 


 


When the Single Node capability is combined with its other capabilities, Azure Databricks provides a truly unified experience intended to make data scientists and other analysts more efficient and effective.


 

Azure portal – Announcements and videos

This article is contributed. See the original author and article here.

Like any cloud service, the Azure portal itself also gets functionality updates and changes. So how do you keep up with what’s new?


 


Azure portal update blog


The Azure portal product team maintains a blog where, on a monthly basis, they post a summary of what’s in the latest update. Visit the Azure portal blog and follow them for notifications via Tech Community or via RSS feed.


 


Did you know that Cosmos DB now has a serverless capacity model for consumption-based billing (in public preview)? That was in the October 2020 update!


 


Azure portal “How To” Video Series


The Azure portal team also publishes short “how to” videos on YouTube, under the Microsoft Azure channel.


[Screenshot: Azure Portal How To.png]


Here are some of my favorites:


 


  • Improvements to the Linux Virtual Machine experience

  • How to monitor Azure Functions

  • How to connect to a storage account using private link


 


Learn more:



 


 

Learn TV Live Session – Develop secure IoT Solutions for Azure Sphere with IoT Central 20th Oct 2020

This article is contributed. See the original author and article here.

On 20th October at 1PM PDT / 9PM BST, Mustafa Saifee, a Microsoft Learn Student Ambassador from SVKM Institute of Technology, India, and Dave Glover, a Cloud Advocate from Microsoft, will livestream an in-depth walkthrough of how to develop a secure IoT solution with Azure Sphere and IoT Central on Learn TV.


 


The content will be based on a module on Microsoft Learn, our hands-on, self-guided learning platform, and you can follow along at https://docs.microsoft.com/en-us/learn/modules/develop-secure-iot-solutions-azure-sphere-iot-central/


 


You can follow along with us live on October 20th, or join the Microsoft IoT Cloud Advocates in our IoT TechCommunity throughout October to ask your questions about IoT Edge development.


 


Meet the presenters


 

 



Mustafa Saifee


Microsoft Learn Student Ambassador


SVKM Institute of Technology


 



Dave Glover 
Senior Cloud Advocate, Microsoft


IoT and Cloud Specialist


 


Session details



In this session, Dave and Mustafa will deploy an Azure Sphere application to monitor ambient conditions in a laboratory. The application will monitor the room’s environmental conditions, connect to IoT Hub, and send telemetry data from the device to the cloud. You’ll control cloud-to-device communications and undertake actions as needed.



Learning Objectives



In this module, you will:



  • Create an IoT Central Application

  • Configure your Azure Sphere application to IoT Central

  • Build and deploy the Azure Sphere application

  • Display the environment telemetry in the IoT Central Dashboard

  • Control an Azure Sphere application using Azure IoT Central properties and commands



 


Ready to go


Our livestream will be shown live on this page and on Microsoft Learn TV on Tuesday 20th October 2020, or the early morning of Wednesday 21st October in APAC time zones.



 

This is a Global event and can be viewed LIVE at Microsoft Learn TV https://docs.microsoft.com/en-us/learn/tv/ at these times:



 

USA 1pm PDT
USA 4pm EDT
UK 9pm BST
EU 10pm CEST

India 1.30am (21st October) IST






Let's Encrypt SSL Certificate to Azure Functions

This article is contributed. See the original author and article here.

Throughout this series, I’m going to show how an Azure Functions instance can map APEX domains, add an SSL certificate and update its public inbound IP address to DNS.


 



  • APEX Domains to Azure Functions in 3 Ways

  • Let’s Encrypt SSL Certificate to Azure Functions

  • Updating DNS A Record for Azure Functions Automatically

  • Deploying Azure Functions via GitHub Actions without Publish Profile


 


In my previous post, I discussed how to map a root domain or APEX domain with an Azure Functions instance. Let’s bind an SSL certificate to the custom domain, which is generated by Let’s Encrypt so that we can enable HTTPS connection through the custom domain.


 


Let’s Encrypt


 


Let’s Encrypt is a non-profit organisation that issues free SSL certificates. Although they’re free, they’re widely accepted and backed by many tech companies. There are a few limitations, though. Each certificate is valid for only three months. In other words, we MUST renew an SSL certificate issued by Let’s Encrypt every three months. But you know, we’ve got automation! So don’t worry about certificate renewal, as long as we’ve got an automation process for it.
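As an aside, if you ever want to confirm from code that a renewed certificate is actually being served, checking the expiry date of the certificate a site presents takes only a few lines of the Python standard library. This is just a sketch; the hostname is a placeholder.

```python
# Sketch: check when the certificate currently served by a site expires,
# e.g. to confirm an automated renewal actually happened.
import socket
import ssl
from datetime import datetime, timezone

def cert_expiry(hostname: str, port: int = 443) -> datetime:
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like 'Jun  1 12:00:00 2021 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    return expires.replace(tzinfo=timezone.utc)

print(cert_expiry("example.com"))   # placeholder hostname
```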


 


Azure App Service Site Extension


 


Azure App Service provides a site extension feature, and one of the extensions is the Let’s Encrypt Site Extension. It’s written in the Azure WebJob style so that the WebJob runs every three months to renew the certificate automatically. It’s a pretty useful extension.


 



 


However, this extension has a few critical drawbacks as well.


 



  • It only runs on Windows-based App Service instances (including Azure Functions) because WebJob basically relies on the Windows platform. No Linux-based App Service, unfortunately.

  • It shares the runtime environment with the App Service instance. Therefore, whenever we deploy a new App Service instance, we MUST always deploy the extension and configure it.

  • If we deploy an application with the “delete all files before deployment” option, the WebJob will get deleted.


 


It doesn’t seem suitable for production use. What else can we do to bind the SSL certificate for free?


 


Azure Functions App Only for SSL Certificate Management


 


We’re lucky enough to have Shibayan, who publishes an excellent Azure Functions app that manages Let’s Encrypt SSL certificates with no dependency on the App Service instances. Through the application, we can quickly generate and renew as many SSL certificates as we need and store them in Azure Key Vault. The stored SSL certificates are then directly bound to Azure Functions instances. How fantastic!


 


First of all, run the ARM template below to provision an Azure Functions app and Key Vault instance. But, if you like, you can write your own ARM template and run it.


 



 


The provisioned Azure Functions app instance has the Managed Identity feature enabled, so the app can directly access the Key Vault instance to store SSL certificates. Once all relevant resources are provisioned, follow the process below.


 



Let’s say the Azure Functions app instance for SSL certificate management is https://ssl-management.azurewebsites.net.



 


Authentication / Authorisation


 


The provisioned Azure Functions app includes an admin UI, which is only accessible through authentication. Therefore, activate the Authentication / Authorisation feature like below:


 



 


Then configure Azure Active Directory for authentication, using an account registered in Azure Active Directory. Set the management mode to Express and enter the app name. The default value of the app name is the Function app name, so we don’t need to change it.


 



 


Now we have the Azure Functions app configured for SSL certificate management.


 


Azure DNS Configuration


 


I’m assuming that we use Azure DNS for domain management. Go to the resource group where the Azure DNS instance is provisioned, select the Access control (IAM) blade, then assign a role to the Azure Functions app for SSL certificate management.


 



 



  • Role: DNS Zone Contributor

  • Assign access to: Function App

  • Selected members: Azure Functions app for SSL certificate management. Only apps with the Managed Identity feature enabled appear here.


 


SSL Certificate Generation


 


Open a web browser and go to the admin UI for SSL certificate management at https://ssl-management.azurewebsites.net/add-certificate. The first time you access it, you’ll be asked to log in.


 



 


Once logged in, the admin UI appears. For an APEX domain, leave the Record name field empty and click the Add button. If you want to issue a certificate for a subdomain, add the subdomain to the Record name field. You can also issue one certificate for as many domains as you want. Here, we generate one certificate for both cnts.com and dev.cnts.com.


 



 



If you prefer to create a separate certificate for each domain, cnts.com and dev.cnts.com, then run the registration twice.



 


Once completed, a pop-up like this appears:


 



 


Let’s go to the Azure Key Vault instance to check whether the SSL certificate has been generated or not.
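If you would rather verify from code than from the portal, the sketch below lists the certificates in the vault using the azure-identity and azure-keyvault-certificates packages. The vault URL is a placeholder; substitute whatever name your ARM template produced.

```python
# Sketch: list the certificates stored in the Key Vault used by the
# certificate-management Function app. The vault URL is a placeholder.
from azure.identity import DefaultAzureCredential
from azure.keyvault.certificates import CertificateClient

vault_url = "https://ssl-management-kv.vault.azure.net"  # hypothetical vault name
client = CertificateClient(vault_url=vault_url, credential=DefaultAzureCredential())

for cert in client.list_properties_of_certificates():
    # Each entry shows the certificate name and its expiry date.
    print(cert.name, cert.expires_on)
```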


 



 


SSL Certificate Binding to APEX Custom Domain on Azure Functions


 


We’ve got the custom APEX domain mapped from the previous post. Now it’s time to bind the certificate to the domain. Go to the Azure Functions instance you want to attach the certificate to and select the TLS/SSL settings blade. Click the Private Key Certificates (.pfx) tab, then the Import Key Vault Certificate button to import the one stored in our Key Vault instance.


 



 


Once imported, you can see the screen below. As we generated one certificate for both cnts.com and dev.cnts.com, it’s normal to see both domain names.


 



 


Let’s select the Custom domains blade. The domain is still not bound to the SSL certificate that we just imported. Click the Add binding link, choose cnts.com for the Custom domain field and cnts.com,dev.cnts.com for the Private Certificate Thumbprint field, and finally choose SNI SSL for the TLS/SSL Type field.


 



 


Now we can see the SSL certificate is properly bound to the custom APEX domain.


 



 




 


So far, we’ve walked through how a Let’s Encrypt SSL certificate can be bound to a custom APEX domain on an Azure Functions instance. In the next post, I’ll discuss how the inbound IP address of the Azure Functions instance can be automatically updated in the A record of Azure DNS.


 


This article was originally published on Dev Kimchi.

Monitoring O365 Service Status with Azure Sentinel

This article is contributed. See the original author and article here.

During the past few weeks, Microsoft has experienced some unfortunate outages in our cloud services. These outages led to a number of organizations I support reaching out and asking, "How can I better proactively monitor the status of Office 365?" This gave me an idea. But before we get to that, let's discuss where you can find service status information for Office 365 and Azure.


 


Office 365 Service Status


The primary location to find the status of Office 365 Services is inside the Admin Portal using the Service Health Dashboard (https://portal.office.com/Adminportal/Home#/servicehealth) .


 

 

In addition to this portal, if you are a Twitter user you can follow Microsoft 365 Status (@MSFT365Status) to get notifications of incidents within Microsoft 365:


[Screenshot: O365 Status Twitter.png]


If you are interested in the status of Microsoft Azure, you can leverage the Service Health blade (https://aka.ms/azureservicehealth):


[Screenshot: Azure Service Health.jpg]


These are all very effective methods of tracking service status, but what if I am leveraging Azure Sentinel as my SIEM and want to track the Office 365 service status there? Well, that was the question that got me started on this article. I find it easiest to learn new technology by having a problem to resolve or an actual goal to achieve. So I decided this was a good use case to learn more about how to get custom data, in this case REST API data, into Azure Sentinel, use that data to alert on service degradation, and then create a new workbook to visualize it. A pretty lofty goal for a guy with almost zero coding experience. Let's see how it worked out.


 


Step One:  Getting Office 365 Service Status via API


As with just about every other component of the Microsoft Cloud, Office 365 Service Status can be accessed via the Office 365 Management API ( https://docs.microsoft.com/en-us/office/office-365-management-api/office-365-service-communications-api-reference ).  I decided the most effective way to pull this data and send it to Azure Sentinel was to use an Azure Logic App.  If you are not familiar with Azure Logic Apps, it is a low code/no code cloud service that helps you schedule, automate, and orchestrate tasks, business processes, and workflows when you need to integrate apps, data, systems, and services across enterprises or organizations.  Azure Logic Apps are a sibling to Microsoft Power Automate that is part of Office 365, so learning one of these services translates to the other.  This was very helpful because a Microsoft MVP in the UK, Lee Ford, had written a blog post in 2019 on accessing the Service Status via Power Automate (which was called Flow at the time):  https://www.lee-ford.co.uk/get-latest-office-365-service-status-with-flow-or-powershell/ .  I built on Lee’s idea to create my Logic App:


 


I started by creating a new Logic App that runs on a schedule and connects to the Office 365 Management API to get the service status via an "HTTP" Action. I chose every 4 hours; you can decide how often you want to pull the data for your use case.


[Screenshot: Logic Apps 1.png]


 


The first thing that probably stands out to you is that the "Security Guy" hard-coded the authentication secrets into the Logic App. I only did this for ease of development. If I were building this application for a production environment, I would leverage Azure Key Vault to securely store this information (https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-azure-resource-manager-templates-overview#best-practices—workflow-definition-parameters ).
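For reference, the same call the "HTTP" Action makes can be sketched in a few lines of Python using the client-credentials flow against the Office 365 Management API. The tenant, client ID, and secret values are placeholders, and the response fields printed at the end are illustrative, so treat this as a rough sketch rather than a drop-in script.

```python
# Rough sketch of the call the Logic App's HTTP Action makes, using the
# client-credentials flow. All identifiers below are placeholders.
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-registration-client-id>"
CLIENT_SECRET = "<client-secret>"   # keep this in Key Vault for production

# 1. Acquire a token for the Office 365 Management API.
token_resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "resource": "https://manage.office.com",
    },
)
access_token = token_resp.json()["access_token"]

# 2. Call the Service Communications endpoint for the current status.
status_resp = requests.get(
    f"https://manage.office.com/api/v1.0/{TENANT_ID}/ServiceComms/CurrentStatus",
    headers={"Authorization": f"Bearer {access_token}"},
)

# Field names are illustrative; inspect the raw JSON for your tenant.
for workload in status_resp.json().get("value", []):
    print(workload.get("WorkloadDisplayName"), "-", workload.get("Status"))
```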


 


Next, I used a "Parse JSON" action to manipulate the information returned from the HTTP GET. I used the schema from Lee Ford's blog post as my sample payload.


[Screenshot: Logic Apps 2.png]


 

Now the last step is a little tricky. We need to take the returned JSON payload and send it to Azure Sentinel. This payload is an array, so it must be iterated through. Luckily, Logic Apps is built for people with minimal coding experience and helps guide you through the process. Since we want to send this data to Azure Sentinel, which is built on Azure Log Analytics, we choose the "Send Data to Log Analytics" Action. When I click in the box for "JSON Request body", I am provided a pick list of returned information to choose from. However, the item we need to use is not shown, so you need to click the "see more" option in the pick list. This will expose the "value" item, which is what we need.


[Screenshot: Logic Apps 3.png]


 

When we finish filling in the required parameters, Logic Apps will automatically recognize this is an array and create a For Each container to iterate through the values…pretty cool!


[Screenshot: Logic Apps 4.png]


 

We are not finished yet. We don't actually want "value" in the JSON Request Body field; we want whatever the "Current Item" is in the loop. So delete "value" in the Send Data action, go back to the bottom of your pick list, and choose Current Item.


[Screenshot: Logic Apps 5.png]


 

And that's it! You have now ingested the Office 365 service status into Azure Sentinel. One thing I forgot to point out: Azure Log Analytics will automatically create the custom log the first time the Logic App runs, adding a table called "yourname_CL".


[Screenshot: Sentinel Custom Log Table.png]
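Under the covers, the "Send Data to Log Analytics" action posts to the Azure Log Analytics HTTP Data Collector API. If you ever need to do the same thing outside of Logic Apps, a rough Python sketch looks like the following; the workspace ID, key, log type, and sample record are all placeholders (the Log-Type value determines the name of the resulting _CL table).

```python
# Rough sketch of posting a record to the Log Analytics HTTP Data Collector API.
# Workspace ID/key and the sample record are placeholders; keep the key secure.
import base64, datetime, hashlib, hmac, json
import requests

WORKSPACE_ID = "<log-analytics-workspace-id>"
SHARED_KEY = "<workspace-primary-key>"
LOG_TYPE = "O365ServiceStatus"          # becomes the O365ServiceStatus_CL table

def build_signature(date: str, content_length: int) -> str:
    string_to_hash = (
        f"POST\n{content_length}\napplication/json\nx-ms-date:{date}\n/api/logs"
    )
    decoded_key = base64.b64decode(SHARED_KEY)
    digest = hmac.new(decoded_key, string_to_hash.encode("utf-8"), hashlib.sha256).digest()
    return f"SharedKey {WORKSPACE_ID}:{base64.b64encode(digest).decode()}"

def post_record(record: dict) -> int:
    body = json.dumps(record)
    date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(date, len(body)),
        "Log-Type": LOG_TYPE,
        "x-ms-date": date,
    }
    url = f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
    return requests.post(url, data=body, headers=headers).status_code

# Each "Current Item" from the For Each loop would be posted like this:
print(post_record({"WorkloadDisplayName": "Exchange Online", "Status": "ServiceDegradation"}))
```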


 

Step Two:  Making use of the data


Now that we have ingested the service status data into Azure Sentinel, let’s do something with it.


 


First let’s write a simple KQL (Kusto Query Language) query to pull out the basic data we need:


[Screenshot: Simple KQL Query.png]
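If you prefer running the query from code instead of the portal, here is a hedged sketch using the azure-monitor-query package. The table and column names follow the custom-log naming convention (a _CL table with _s string columns) but are illustrative, so adjust them to match whatever your Logic App actually created.

```python
# Sketch: run a KQL query against the custom table from Python.
# Workspace ID, table name, and column names are illustrative placeholders.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
O365ServiceStatus_CL
| where Status_s != "ServiceOperational"
| project TimeGenerated, WorkloadDisplayName_s, Status_s, StatusDisplayName_s
"""

response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",
    query=query,
    timespan=timedelta(hours=4),
)

for table in response.tables:
    for row in table.rows:
        print(list(row))
```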


 


Now let’s create a scheduled query analytics rule that will create an incident when a service is degraded:


[Screenshots: Incident 1.png, Incident 2.png, Incident 3.png]


 

 

One of the cool new features in Azure Sentinel that you will notice above is the preview of what this query will produce. Based on the settings I have chosen, this will create 1 alert per day. You don't want to create an alert flood, but you do want to be notified appropriately. So, adjust the Query Scheduling to what makes sense for your organization.


[Screenshot: Incident 4.png]


 

I’m just going to use the defaults for Incident Settings.


[Screenshot: Incident 5.png]


 

You can even use an Azure Logic App playbook to take some automated action based on the Incident.


[Screenshot: Incident 6.png]


 

Done!  Now we will see an incident generated if there is a service degradation in Office 365.  See below:


[Screenshot: Incident Generated.png]


 

For a production environment, I would probably want to be a little more detailed in my incident generation, getting down to individual services, but hopefully this has shown you the “Art of the Possible” and you can take it further.


 


Step Three:  Bonus Step!  Let’s create a workbook in Azure Sentinel to display some of the information we have gathered.


Workbooks (https://docs.microsoft.com/en-us/azure/sentinel/tutorial-monitor-your-data ) provide a way to visualize your data in a custom dashboard experience. 


 


Let’s see what we can come up with.  First we need to create a new workbook:


[Screenshot: Create Workbook 1.png]


 

This will get you a workbook populated with some sample data to start with; let's edit it:


[Screenshot: Create Workbook 2.png]


 

Let’s start off by just making a simple grid of the query we already built to show degraded services in the past 4 hours:


[Screenshot: Create Workbook 3.png]


 

That will get us a simple workbook like this (I also edited the title before I captured the screenshot):


[Screenshot: Create Workbook 4.png]


 


That’s not very exciting, so let’s add another Query section and try to build a graph:


[Screenshot: Create Workbook 5.png]


 


We are going to build a "honeycomb" graph that will show which services are operational and which are degraded:


[Screenshot: Create Workbook 6.png]


 

Instead of creating multiple screenshots, I have highlighted in green the items I changed. Also, I used a query that returns all service statuses, not just degraded ones (see above).


 


When you click "Done Editing" you will get this visualization, which can be zoomed in and out of, as well as moved around. Not perfect, but it only took a few minutes to build. I'm sure you can come up with an even better one!


[Screenshot: Final Workbook.png]


 

Thank you for getting this far in my post; it went a little long! I hope you found this useful and that you can use it to build something for your organization. Please post comments or questions below.


Thanks, Tony!