What's New in Excel for the web


This article is contributed. See the original author and article here.

Our Excel team strives to provide our customers with rich experiences that increase productivity in Excel for the web. Recently, we shared how we’ve made it easier to navigate and manipulate your Excel files in a browser, making working in a workbook and other interactions faster and smoother. Today, we’re excited to introduce new features and capabilities to help you easily format your data with color and style, along with a new mini toolbar, table improvements, and more. In this article, we cover:


 



  • Custom color palettes

  • Cell styles gallery

  • Draw & erase borders

  • Mini toolbar

  • Table improvements

  • New printing experience (coming soon)


 


Custom color palette


Match your brand colors or fine-tune your color choices with custom color palettes:



  • Select from a wide range of color options in the More Colors dialog box by simply dragging the color slider

  • Change the color shade by dragging your cursor around the More Colors rectangle and view your selection in the preview box

  • Enter RGB or hex values directly for easy color selection


 


Custom-color-palette.gif


 


Cell styles gallery


Keep the formatting of your data consistent so it’s easy to read and understand by applying cell styles, which combine formatting options such as fonts, number formats, and cell borders and shading.


 


cell-styles.gif


 


Draw & erase borders


Highlight your data or differentiate one set of data from another by adding or removing cell borders.  Pick Draw Border to add outer borders, Draw Border Grid to add gridlines, or Erase Borders to erase them.


 Draw borders.png


 


Mini toolbar


Right-click to get quick access to the most common formatting commands via the new mini toolbar.


 


Mini Tool bar.png


 


Table improvements


Tables are muscle memory for many Excel users, and we want to continue bringing you a more consistent table experience across Excel on the desktop and Excel for the web, from design and styling to naming, total rows, and more:


 



  • Select table design and styling options


 


Table-Improvements_table-design-and-styling-options.png


 



  • Rename a table


 


Table-Improvements_table-rename.png


 



  • Add total row


 


Table-Improvements_add-total-rowpng.png


 



  • Format any data as a table


 


Table-Improvements_format-data-as-table.png


 


New printing experience (coming soon)


See what you’re printing and customize it the way you want it with the new printing experience in Excel for the web, now supporting print preview with page layout settings:



  • Set print area: active sheet, entire workbook, or current selection

  • Insert/delete page breaks


 


Print.gif


 


These are just some of the latest improvements; many more are coming soon!


 


Your feedback helps shape the future of Excel for the web. Please let us know what you like about a particular feature and what we can improve upon – send us a smile or frown.


 


Want to know more about Excel for the web?  See What’s new in Excel for the web and subscribe to our Excel Blog to get the latest updates. Stay connected with us and other Excel fans around the world – join our Excel Community and follow us on Twitter.


 


Thank you!


 

Test Automation and EasyRepro: 05 – Adding EasyRepro Tests to Azure DevOps


This article is contributed. See the original author and article here.

The following is the fifth and final article in a series by @Ali Youssefi that we have been cross-posting to this Test Community Blog for a couple of months now. These articles were first published by Ali in the Dynamics community, but since the topics relate to testing, quality, and Selenium, we thought it would make sense to publish them here as well.


 


If you didn’t get a chance to catch the previous parts of this series, please have a look at the links below:



Otherwise, please read ahead!


 


Summary


 


EasyRepro is an open source framework built on Selenium that allows automated UI tests to be performed against a specific Dynamics 365 organization. This article covers incorporating EasyRepro into an Azure DevOps Build pipeline, allowing us to begin the journey toward automated testing and quality. We will cover the settings needed to use the VsTest task and to configure the pipeline for continuous integration and dynamic variables. Finally, we will review the test result artifacts, which provide detailed information about each unit test and test run.


 


 


Getting Started


 


 


If you haven’t already, please review the previous articles showing how to create, debug, and extend EasyRepro. This article assumes that the steps detailed in the first article, under the section titled Cloning locally from Azure DevOps, have been followed. This approach allows our test design team to craft and share tests for quality assurance across our DevOps process.


 


 


The Run Settings File


The run settings file for Visual Studio unit tests allows variables to be passed in, similar to the app.config file. However, this file is specific to Visual Studio tests and can be used from the command line and through Azure DevOps pipelines and test plans. Here is a sample runsettings file.


 


 


Microsoft_Testing_Team_0-1619204872011.jpeg


 


 


 


 


The image above shows how to implement the TestRunParameters needed for EasyRepro unit tests. You can also find an example in the Microsoft.Dynamics365.UIAutomation.Sample project called easyrepro.runsettings. The runsettings file can be used to set the framework version, set paths for adapters, specify where the result artifacts will be stored, and so on. In the Settings File section below we will point to a runsettings file for use within our pipeline.


 


 


The ClassInitialize Data Attribute


 


 


The ClassInitialize data attribute marks a method that Visual Studio unit tests run once, before any tests in the class execute. This decoration, coupled with the runsettings file, allows us to receive a TestContext object containing the run parameters.


 


 


Properties


Microsoft_Testing_Team_1-1619204872048.jpeg


 


 


The configuration values from the runsettings file are included in the TestContext Properties collection, similar to the app.config file. For use with EasyRepro, we will want to leverage the .ToSecureString extension method, which helps when logging into the platform. Below is an example using this extension method.


 


 


Microsoft_Testing_Team_2-1619204872020.jpeg


 


 


Setting up the Build Pipeline


 


 


In the first article, Test Automation and EasyRepro: 01 – Getting Started, we discussed how to clone from GitHub to an Azure DevOps Git repository, which we can then clone locally. The examples below follow this approach and assume you have cloned locally from the Azure DevOps Git repo.


The first step in setting up the Build Pipeline is to navigate to the DevOps organization and project. Once inside the project, click the Pipelines button to create a Build Pipeline. The pipeline needs tasks to resolve the NuGet package dependencies, build the solution with MSBuild, and run the unit tests using the VsTest task, as shown in the image below.


Microsoft_Testing_Team_3-1619204872062.jpeg


 


The core task is VsTest, which runs our Visual Studio unit tests and allows us to dynamically pass in values from Build Pipeline variables or from files in source control. The section below goes into the VsTest task, specifically version 2.0.


 


 


Reviewing the VsTest task


Microsoft_Testing_Team_4-1619204872068.jpeg


 


 


Test Files


 


 


The Test files field needs to point to a working directory, dictated by the Search folder field, to locate the compiled unit test assemblies. When using the default task, this field looks for assemblies containing the word 'test'. If you’re starting with the Sample project from EasyRepro, you will need to change this to look for the word 'Sample', as shown above. When the task runs, you can confirm in its log whether the correct assembly was found.


 


 


Test Filter Criteria


 


 


The Test filter criteria field helps limit the scope of the unit tests run within the unit test assembly. Depending on the data attribute decorations, you can restrict the task to run only specific tests. The criteria can be somewhat challenging if you haven’t worked with them before, so I’d suggest testing locally from the Visual Studio Command Prompt to better understand how this will work in Azure DevOps Pipelines.


 


 


Microsoft_Testing_Team_5-1619204872144.jpeg


 


 


The above image shows an example of using the TestCaseFilter argument to run a specific test class. This argument can be used to target specific classes, priorities, test categories, and so on.


More information on the test filter criteria can be found here.


 


 


Settings File


 


 


The Settings file field corresponds to the vstest.console.exe “/Settings” argument but lets you pick a file directly from the repository. This field can also be customized to work with Build Pipeline variables, which I’ll describe next.


 


 


Microsoft_Testing_Team_6-1619204872125.jpeg


 


 


 


Override Test Run Parameters


 


 


Overriding test run parameters is useful when we want to reuse the same test settings but pass in variables from the Build Pipeline. In the example below, I’m replacing the parameters from the runsettings file on the left with Build Pipeline variables on the right.


 


 


Microsoft_Testing_Team_7-1619204872139.jpeg


 


 


Below are the pipeline variables I’ve defined. This allows me to check in a runsettings file without having to modify parameters before committing. The values can be plain or secure strings, which you will have to take into account if you plan to use one or the other. These variables can also be provided at run time when we queue the build.


 


 


Microsoft_Testing_Team_8-1619204872081.png


 


 


Enabling the Continuous Integration Trigger


Microsoft_Testing_Team_9-1619204872103.jpeg


 


 


Enabling the continuous integration trigger allows developers to craft their unit tests and have our build pipeline run whenever a commit is pushed. This is configured from the Triggers tab on the pipeline, which brings up the screen shown above. To enable it, check the ‘Enable continuous integration’ box and choose which branch should fire off the build. This doesn’t prevent a tester from queuing a build on demand, but it does help us move forward toward automation!


 


 


Running the Build Pipeline


 


 


To kick off the build pipeline, commit and push changes to your unit tests as you would any Git commit. Once the push is finished, you can navigate to the Azure DevOps org and watch the pipeline in action. Below is a recording of a sample run.


 


 


Microsoft_Testing_Team_10-1619204872251.gif


 


 


 


Exploring the Results File


 


 


The results of the unit tests can be found in the completed build in the build pipeline, along with the logs and artifacts. The test results and artifacts are also found in the Test Runs section of the Azure Tests area. The retention of these results is configurable within the Azure DevOps project settings. Each build can be reviewed at a high level for various test result statuses, as shown below:


 


 


Microsoft_Testing_Team_11-1619204872132.jpeg


 


 


 


The summary screen lists the unit tests that ran and information about failed tests that can be used to track when a regression began. The columns shown include the last time a test ran and the build in which it began to fail.


 


 


Microsoft_Testing_Team_12-1619204872158.jpeg


 


 


When investigating a failed unit test, a link to a new or existing bug can be added. This is useful for tracking regressions and assigning them to the appropriate team. Bugs can be associated from the test run or from a specific unit test, and they include links to the build, test run, and test plan. The exception message and stack trace are added automatically if the bug is linked from the failed unit test.


 


 


Microsoft_Testing_Team_13-1619204872114.jpeg


 


 


 


Each test run includes a test results file that can be downloaded and viewed with Visual Studio. The test artifacts can also be retained locally for archiving or reporting purposes. The contents of the results can be extracted and transformed for use by platforms such as Power BI or Azure Monitor.


 


 


Microsoft_Testing_Team_14-1619204872151.gif


 


 


It’s key to point out that when used with an Azure Test Run, these results can be retrieved via the API and reported on directly. Below is an image of the response from the Test Run.


 


 


Microsoft_Testing_Team_15-1619204872090.jpeg


 


 


Next Steps


 


 


Including Unit Tests as a Continuous Delivery Quality Gate


 


 


Building and running our EasyRepro tests with Build Pipelines represents an important first step in your journey into DevOps, as well as toward what has been called Continuous Quality. Key benefits here include immediate feedback and deep insight into high-value business transactions. Including these types of tests as part of the release of a Dynamics solution into an environment is paramount to understanding impact and providing insight.


 


 


Including Code Analysis as a Continuous Delivery Quality Gate


 


 


One thing I’d like to point out is that UI testing can help ensure quality, but it should be coupled with other types of testing such as security testing or code analysis. The PowerApps product group has a tremendously valuable code analysis tool called the PowerApps Project Checker. Project Checker can help identify well-documented concerns that may come up as part of our deployment. This tool can be used via PowerShell or from the PowerApps Build Tasks within the Visual Studio Marketplace. It can also be run manually from the PowerApps portal if desired.


 


 



This code quality step can be included as part of the extraction of modifications from your development or sandbox environments, or as a pre-step before packaging a managed solution for deployment. For additional detail, there is a wonderful post by the Premier Field Engineering team documenting how to include this task in your pipelines. Special thanks to Premier Field Engineers Paul Breuler and Tyler Hogsett for documenting and detailing the journey into DevOps.


I highly recommend incorporating this important step into any of your solution migration strategies, even if you are still deploying solutions manually, to better understand and track potential issues.


 


 


Scheduling Automated Tests


 


 


Tests can be scheduled in various ways, from build and release pipelines to Azure Test Plans. For pipelines, scheduled triggers can be used to run builds on a predetermined schedule. Azure Test Plans allow more flexibility, running a specific set of tests based on test cases linked to unit tests. To find out more about setting this up, refer to this article.


 


 


Conclusion


 


 


This article covers working from a local copy of the EasyRepro tests and how to incorporate them into an Azure DevOps Build Pipeline. It demonstrates how to configure the VsTest task, how to set up triggers for the Build Pipeline, and how to review test results. This should be a good starting point for your journey into DevOps; I look forward to hearing about it and answering any questions you have in the comments below.


 


 


On the Microsoft Test Team, we have followed this tutorial and often use EasyRepro for our UI-testing needs for Dynamics 365. Please stay tuned for a Tips and Tricks post on EasyRepro. Until next time!

Success for small and midsized businesses requires agility


This article is contributed. See the original author and article here.

If small and midsized businesses (SMBs) have one thing in common, it is that they are all unique. "Differentiate or fail" is especially true when it comes to small and midsized business strategy. The secret sauce is what helps leaders of these companies blaze new trails and disrupt industries with innovative products and services. However, disrupting industries requires more than passion and great ideas; SMBs also need the ability to quickly adapt business and operating models to deliver on their vision and brand promises. And the level of business agility required for success takes an ecosystem to deliver.

Adapt faster with Dynamics 365 Business Central

Microsoft Dynamics 365 Business Central provides a connected cloud business management solution for growing SMBs. Connected means you can bring together your finance, sales, services, and operations teams within a single application to get the insights needed to drive your business forward and be prepared for what’s next. While the out-of-the-box capabilities meet the needs for standard business operations, Dynamics 365 Business Central offers operational flexibility to help SMBs adapt faster to changing market conditions and customer expectations.

Our customers use Microsoft Power Platform, connected to Dynamics 365 Business Central, Office 365, Microsoft Azure, and hundreds of other apps, to further analyze data, build solutions, automate processes, and create virtual agents. Microsoft Power Apps turns ideas into organizational solutions by enabling everyone to build custom apps that solve unique business challenges.

We also work closely with our partners to support unique business processes, workflows, and operational models at scale. Our partners build solutions based on industry best practices, so you don’t have to reinvent the wheel. You can find over 1,400 apps on Microsoft AppSource to easily tailor and extend Dynamics 365 Business Central to meet unique business or industry-specific needs. Filter for Dynamics 365 Business Central on AppSource and you will find apps for everything from A to Z: from allocations, banking, and construction to warehousing, XML, yield, and zoning.

At our Directions North America partner conference on April 26, 2021, we will introduce some new applications that will be coming soon to Microsoft AppSource, including:

  • Bill.com streamlines accounts payable and receivable automation workflows and payments: Announced earlier this month, Bill.com will integrate with Dynamics 365 Business Central and Microsoft Dynamics GP. Bill.com is a leading provider of cloud-based software that automates complex back-office financial operations for SMBs. Mutual customers will now be able to take control of their financial processes, save time, and scale with confidence through the power of the integration’s intelligent accounts payable (AP) and accounts receivable (AR) automation workflows and payments. The Bill.com and Microsoft Dynamics 365 automatic sync is now live. Find out more information on the Bill.com Sync with Microsoft Dynamics 365.
  • Square lets you accept payments quickly, easily, and securely: Make accepting card payments fast, painless, and secure so that you don’t miss a sale. Whether you’re selling in person, online, or on the go, Square helps you get paid fast, every time. The Square Payments app from AppSource automatically syncs with Dynamics 365 Business Central so you can keep track of all of your payments in one place.
  • ODP Corporation (parent company of Office Depot) digital procurement platform: As announced on February 22, 2021, the ODP Corporation is partnering with Microsoft to transform how businesses buy and sell. ODP is working to bring the power of their new digital procurement technology platform to Dynamics 365 Business Central customers to help them realize immediate purchase savings and procurement automation. This exciting integration will be coming later in 2021.

Work smarter with Dynamics 365 Business Central

As Dynamics 365 Business Central is part of the Microsoft cloud, our customers also have the ability to extend the solution using Office 365, including Teams, Outlook, Excel, and Word. Don’t let disconnected processes and systems hold your people back. Connecting your business application with productivity tools can help support remote work, improve security, and control costs. Ensure your people can be more collaborative, productive, and impactful by seamlessly connecting Dynamics 365 Business Central with Microsoft 365 apps.

No matter how you choose to manage customer workflows, processes, and unique models, Dynamics 365 Business Central provides the operational foundation and flexibility that is required for success.

Independent Software Vendors (ISVs) and Business Central partners can learn more about Dynamics 365 Business Central at the Directions North America virtual event on April 26-28, 2021.

The post Success for small and midsized businesses requires agility appeared first on Microsoft Dynamics 365 Blog.


Best practices for leveraging Microsoft 365 Defender APIs – Episode Three


This article is contributed. See the original author and article here.

In the previous episode, we described how you can easily use Power BI to represent Microsoft 365 Defender data in a visual format. In this episode, we will explore another way you can interact with the Microsoft 365 Defender API. We will describe how to automate data analysis and hunting using a Jupyter notebook.


 


Automate your hunting queries 


While hunting and conducting investigations on a specific threat or IOC, you may want to use multiple queries to obtain wider optics on the possible threats or IOCs in your network. You may also want to leverage queries used by other hunters and use them as a pivot point to perform deep analysis and find anomalous behaviors. You can find a wide variety of examples in our Git repository, where various queries related to the same campaign or attack technique are shared.


In scenarios such as this, it is sensible to leverage the power of automation to run the queries rather than running individual queries one by one.


This is where a Jupyter notebook is particularly useful. The notebook takes a JSON file of hunting queries as input and executes all the queries in sequence. The results are saved to a .csv file that you can analyze and share.


 


Before you begin 


JUPYTER NOTEBOOK 


If you’re not familiar with Jupyter notebooks, you can start by visiting https://jupyter.org for more information. You can also get an excellent overview of how to use Microsoft 365 APIs with Jupyter notebooks by reading Automating Security Operations Using Windows Defender ATP APIs with Python and Jupyter Notebooks.


 


VISUAL STUDIO CODE EXTENSION 


If you currently use Visual Studio Code, make sure to check out the Jupyter extension.


msftdario_27-1619422918103.png


Figure 1. Visual Studio Code – Jupyter Notebook extension 


 


Another option for running Jupyter notebooks is the Microsoft Azure Machine Learning service.


Microsoft Azure Machine Learning is also the best way to share your experiments with others and collaborate.


Please refer to Azure Machine Learning – ML as a Service | Microsoft Azure for additional details.


msftdario_28-1619422918116.png


Figure 2. Microsoft Azure Machine Learning 


 


To create an instance, create a resource group and add the Machine Learning resource. The resource group lets you control all of the resources from a single entry point.


msftdario_29-1619422918122.png


Figure 3. Microsoft Azure Machine Learning – Resource 


 


When you’re done, you can run the same Jupyter notebook there that you run locally on your device.


msftdario_30-1619422918118.png


Figure 4. Microsoft Azure Machine Learning Studio 


 


App Registration 


The easiest way to access the API programmatically is to register an app in your tenant and assign the required permissions. This way, you can authenticate using the application ID and application secret.


Follow these steps to build your custom application. 



msftdario_31-1619422977477.png


Figure 5. App registration 


  


Select “NEW REGISTRATION“. 


  


msftdario_32-1619422977481.png


Figure 6. Register an application 


 


Provide the Name of your app, for example, MicrosoftMTP, and select Register. 


Once done, select “API Permission“. 


  


msftdario_33-1619422977495.png


Figure 7. API Permissions 


  


Select “Add a permission“. 


msftdario_34-1619422977484.png


Figure 8. Add permission 


 


Select the “APIs my organization uses“. 


  


msftdario_35-1619422977485.png


  Figure 9. Alert Status 


  


msftdario_36-1619422977486.png


Figure 10. Request API permission 


  


Search for Microsoft Threat Protection and select it. 


msftdario_37-1619422977487.png


Figure 11. Microsoft Threat Protection API 


 


Select “Application Permission“. 


msftdario_38-1619422977489.png


Figure 12. Application Permissions 


 


Then select: 



  • AdvancedHunting.Read.All 

  • Incident.Read.All 


 


msftdario_39-1619422977491.png


Figure 13. Microsoft 365 Defender API – Read permission 


 


Once done select “Add permissions“. 


msftdario_40-1619422977492.png


Figure 14. Microsoft 365 Defender API – Add permission 
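
With the app registered and the permissions granted (admin consent is required for application permissions), the notebook can authenticate with the application ID and secret. The snippet below is a minimal sketch of that client credentials request, not the notebook's exact code; the tenant ID, application ID, and secret values are placeholders you must replace with your own.

import requests

tenant_id = "00000000-0000-0000-0000-000000000000"   # your Azure AD tenant ID (placeholder)
app_id = "11111111-1111-1111-1111-111111111111"      # application (client) ID (placeholder)
app_secret = "<application-secret>"                  # application secret (placeholder)

# Request an app-only token for the Microsoft 365 Defender API
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
token_response = requests.post(token_url, data={
    "client_id": app_id,
    "client_secret": app_secret,
    "grant_type": "client_credentials",
    # The .default scope requests all application permissions granted to the app
    "scope": "https://api.security.microsoft.com/.default",
})
token_response.raise_for_status()
aad_token = token_response.json()["access_token"]

# The token is sent as a bearer token on every Microsoft 365 Defender API call
headers = {"Authorization": f"Bearer {aad_token}", "Content-Type": "application/json"}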


 


Get Started 


Now that we have the application ready to access the API via code, let’s see if any of the Qakbot queries shared in the Microsoft 365 Defender GitHub repository produce any results.


msftdario_41-1619423114158.png


 


Figure 15. Microsoft 365 Defender – Hunting Queries 


 


The following queries will be used in this tutorial:  


 


  • Javascript use by Qakbot malware

  • Process injection by Qakbot malware

  • Registry edits by campaigns using Qakbot malware

  • Self-deletion by Qakbot malware

  • Outlook email access by campaigns using Qakbot malware

  • Browser cookie theft by campaigns using Qakbot malware

  • Detect .jse file creation events


 


We need to grab the queries that we want to submit and populate a JSON file in the following format. Please be sure that you are properly managing the escape characters in the JSON file (if you use Visual Studio Code (VSCode), you can find extensions that make the escape/unescape process easier; just pick your favorite one). A short sketch after the sample shows one way to generate the file from code so the escaping is handled for you.


 


 


 


 


 

[
        {
            "Description": "Find Qakbot overwriting its original binary with calc.exe",
            "Name": "Replacing Qakbot binary with calc.exe",
            "Query": "DeviceProcessEvents | where FileName =~ \"ping.exe\" | where InitiatingProcessFileName =~ \"cmd.exe\" | where InitiatingProcessCommandLine has \"calc.exe\" and InitiatingProcessCommandLine has \"-n 6\" and InitiatingProcessCommandLine has \"127.0.0.1\" | project ProcessCommandLine, InitiatingProcessCommandLine, InitiatingProcessParentFileName, DeviceId, Timestamp",
            "Mitre": "T1107 File Deletion",
            "Source": "MDE"
        }
]
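
If you would rather not escape the embedded quotation marks by hand, one option (an aside, not part of the original walkthrough) is to build the JSON file from Python and let json.dump handle the escaping. The query below is the same sample shown above; the output file name is just an example.

import json

queries = [
    {
        "Description": "Find Qakbot overwriting its original binary with calc.exe",
        "Name": "Replacing Qakbot binary with calc.exe",
        # A plain Python string; json.dump escapes the embedded quotes for us
        "Query": 'DeviceProcessEvents | where FileName =~ "ping.exe" '
                 '| where InitiatingProcessFileName =~ "cmd.exe" '
                 '| where InitiatingProcessCommandLine has "calc.exe" '
                 'and InitiatingProcessCommandLine has "-n 6" '
                 'and InitiatingProcessCommandLine has "127.0.0.1" '
                 '| project ProcessCommandLine, InitiatingProcessCommandLine, '
                 'InitiatingProcessParentFileName, DeviceId, Timestamp',
        "Mitre": "T1107 File Deletion",
        "Source": "MDE",
    }
]

with open("queries.json", "w", encoding="utf-8") as f:
    json.dump(queries, f, indent=4)   # writes valid, properly escaped JSON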

 


 


 


 


 


Once you have filled in all your queries, provide the following parameters to the script to configure the correct credentials, the JSON file, and the output folder.


msftdario_42-1619423295313.png


Figure 16. Jupyter Notebook – Authentication 


 


Because we registered an Azure application and used the application secret to receive an access token, the token is valid for one hour. Within the code, we verify whether this token needs to be renewed before submitting each query.


msftdario_43-1619423295303.png


Figure 17. Application Token lifetime validation 
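
The renewal check itself is not reproduced here; one self-contained way to implement it (an illustrative sketch, not the notebook's code) is to read the exp claim from the access token, which is a JWT, and renew when expiry is near:

import base64
import json
import time

def token_expiry(jwt: str) -> float:
    """Return the 'exp' claim (Unix time) from the access token without validating it."""
    payload = jwt.split(".")[1]
    payload += "=" * (-len(payload) % 4)   # restore the base64url padding
    return json.loads(base64.urlsafe_b64decode(payload))["exp"]

def token_is_expiring(jwt: str, margin_seconds: int = 300) -> bool:
    """True when the token expires within the next few minutes and should be renewed."""
    return time.time() > token_expiry(jwt) - margin_seconds

# Before each query: if token_is_expiring(aad_token), repeat the client credentials
# request from the earlier sketch to obtain a fresh token.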


 


When building such a flow, we should take into consideration the Microsoft 365 Defender advanced hunting API quotas and resource allocation. For more information, see Advanced Hunting API | Microsoft Docs.


msftdario_44-1619423295312.png


Figure 18. Taking API quotas and resource allocation into consideration
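
The quotas themselves are listed in the documentation above. As one way of respecting them (an assumption about how you might handle throttling, not the notebook's exact logic), the query submission can back off and retry when the service returns HTTP 429 with a Retry-After header:

import time
import requests

HUNTING_URL = "https://api.security.microsoft.com/api/advancedhunting/run"

def run_query(query: str, headers: dict, max_attempts: int = 3) -> dict:
    """Submit one advanced hunting query, backing off when the API throttles us."""
    for attempt in range(max_attempts):
        response = requests.post(HUNTING_URL, headers=headers, json={"Query": query})
        if response.status_code != 429:
            response.raise_for_status()
            return response.json()   # contains the result Schema and Results rows
        # Throttled: wait for the period the service asks for, then try again
        time.sleep(int(response.headers.get("Retry-After", "60")))
    raise RuntimeError("Query was throttled on every attempt")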


 


We run the code by loading the queries from the JSON file we defined as input. We then view the progress and execution status on screen.


msftdario_45-1619423295315.png


Figure 19. Query Execution 


 


The blue message shows which query is currently running and its progress.


The green message shows the name of the query that is being run. 


The grey message shows the details of the submitted query. 


If there are any results, you will see the first 5 records; all of the records are then saved to a .csv file in the output folder you defined.


 


msftdario_46-1619423295309.png


Figure 20.  Query results – First 5 records 
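
Putting the pieces together, a simplified version of that loop might look like the following sketch. The run_query helper and headers come from the earlier sketches, and the input file and output folder names are placeholders.

import json
import os
import pandas as pd

INPUT_FILE = "queries.json"   # the JSON file of hunting queries (placeholder name)
OUTPUT_FOLDER = "output"      # where the .csv result files are written (placeholder name)

os.makedirs(OUTPUT_FOLDER, exist_ok=True)

with open(INPUT_FILE, encoding="utf-8") as f:
    queries = json.load(f)

for index, item in enumerate(queries, start=1):
    print(f"Running query {index} of {len(queries)}: {item['Name']}")
    result = run_query(item["Query"], headers)   # helper from the earlier sketch
    rows = pd.DataFrame(result["Results"])
    if rows.empty:
        print("  no results")
        continue
    print(rows.head(5))   # show the first 5 records on screen
    rows.to_csv(os.path.join(OUTPUT_FOLDER, f"{item['Name']}.csv"), index=False)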


 


Bonus 


You can post a summary of the query execution to a Teams channel. To do so, you need to add an Incoming Webhook to your team.


 


msftdario_47-1619423368892.png


Figure 21.  Incoming Webhook 


 


Then select the Teams channel you want to add the app to.


msftdario_48-1619423368929.png


Figure 22.  Incoming Webhook – add to a team 


 


Select “Set up a connector”. 


msftdario_49-1619423368932.png


Figure 23.  Incoming Webhook – Setup a connector 


 


Specify a name. 


msftdario_50-1619423368937.png


Figure 24.  Incoming Webhook – Config 


 


Now copy the URL and paste it into the Jupyter notebook.


msftdario_51-1619423368906.png


Figure 25.  Incoming Webhook – teamurl variable 


 


Then uncomment the last line in the code to send the message to Teams.


msftdario_52-1619423368909.png


Figure 26.  Incoming Webhook – teamsurl variable 
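
For reference, posting to an incoming webhook is just an HTTP POST of a small JSON payload. The snippet below is a minimal stand-in for that last line, with a made-up summary message; teams_url is the webhook URL you copied above.

import requests

teams_url = "https://outlook.office.com/webhook/..."   # paste your copied webhook URL here

summary = {
    "title": "Hunting query run completed",
    "text": "7 queries executed; 1 returned results. See the .csv files in the output folder.",
}

response = requests.post(teams_url, json=summary)
response.raise_for_status()   # a 200 response means Teams accepted the message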


 


You should receive a message like the following in the Teams channel:


msftdario_53-1619423368914.png


Figure 27.  Query result summary – Teams Message 


 


Conclusion 


In this post, we demonstrated how you can use the Microsoft 365 Defender APIs and a Jupyter notebook to automate the execution of a hunting query playbook. We hope you found this helpful!


 


Appendix  


For more information about Microsoft 365 Defender APIs and the features discussed in this article, please read: 




The sample notebook discussed in this post is available in the GitHub repository:
Microsoft-365-Defender-Hunting-Queries/M365D APIs ep3.ipynb at master · microsoft/Microsoft-365-Defender-Hunting-Queries (github.com)


 


As always, we’d love to know what you think. Leave us feedback directly on the Microsoft 365 security center or start a discussion in the Microsoft 365 Defender community.

April identity updates – Preview of embed Azure AD B2C sign-in interface in an iframe


This article is contributed. See the original author and article here.

Howdy folks,


 


I’m excited to share the latest Azure Active Directory news, including feature updates, support deprecation, and the general availability of new features that will streamline administrator, developer, and end-user experiences. These new features and feature updates show our commitment to simplifying identity and access management, while also enhancing the kinds of customization and controls our customers need.


 


 


New features



  • Embed Azure AD B2C sign-in interface in an iframe (Preview): Customers have told us how jarring it is to do a full-page redirect when users authenticate. Using a custom policy, you can now embed the Azure AD B2C experience within an iframe so that it appears seamlessly within your web application. Learn more in the documentation.


B2C iframe.png


 


 



  • Custom email verification for Azure AD B2C (GA): You can send customized email to users who sign up to use your customer applications, with a third-party email provider such as Mailjet or SendGrid. Using an Azure AD B2C custom policy, you can set up an email template, From: address, and subject, as well as support localization and custom one-time password (OTP) settings. Learn more in the documentation.


B2customemail.png


 


 


 



  • Additional service and client support for Continuous Access Evaluation (CAE): The MS Graph service and OneDrive clients on all platforms (Windows, web, Mac, iOS, and Android) started to support CAE at the beginning of April. Now OneDrive client access can be terminated immediately after security events, such as session revocation or password reset, if you have CAE enabled in your tenant.


 


 


We’re always looking to improve Azure AD in ways that benefit IT and end users. Often, these updates originate with the suggestions of users of the solution. We’d love to hear your feedback or suggestions for new features or feature updates in the comments or on Twitter (@AzureAD).



Alex Simons (@Alex_A_Simons)


Corporate VP of Program Management


Microsoft Identity Division


 


 


Learn more about Microsoft identity: