[Event Recap] Humans of IT @ Microsoft Ignite 2020

This article is contributed. See the original author and article here.

Did you manage to participate and tune into Microsoft Ignite 2020 this year? We hope you had a great experience! If you missed the sessions being played live, fret not: we’ve got you covered with this recap. Read on to get all caught up!

 

“One of the greatest digital experiences. Thank you #MSIgnite, you’ve proven that #EmpoweringCommunity is becoming stronger even while #COVID attempted to divide us “physically”. I enjoyed being part of the show. #CommunityRocks #HumansofIT” – Fatima Z. Benhamida, Office Apps and Services MVP

 

This year, for the first time in Microsoft Ignite history, humans around the world joined virtually for sessions on the Humans of IT track, spanning 48 hours. Topics covered a variety of critical themes centered on leveraging tech for good, creating social impact, hacking your career, and sustaining human connections in a virtual world as we navigate this new normal. Through #RealTalk conversations with leaders, influencers, and community members, attendees found a safe and welcoming space to relate and connect with one another on a personal level.

 

Want to hear what attendees have to say about Humans of IT at Microsoft Ignite? Check out all the tweets on Twitter via the #HumansofIT hashtag!

 

Here are a few of our favorite tweets:

 

[Tweet screenshots: Phil Erb, Jon Jerman, Abi Fidler]

 

Microsoft Ignite: Day 1 (Tuesday, September 22)

 

The Humans of IT track kicked off on an inspiring note, with Humans of IT Community Ambassador Dux Raymond Sy moderating a panel about tangible ways to leverage tech to empower nonprofits during these challenging times.

 

[Session slide: “Leveraging Tech for Good: Empowering Nonprofits in a Global Pandemic”]

 

Panelists included @Shingyu, @Mariotrentim, @foyinb, and Sam Spilsbury, who shared their own stories of leveraging tech for good through nonprofit organizations and how others in the community can also get involved:

  • Boys & Girls Club of America – As the Director of Youth Development programs at the Boys & Girls Club of America, Shing Yu has served youth through technology, helping provide Internet access, device donations, virtual programs, and website design. Want to volunteer? Connect with your local Boys & Girls Club, or email Shing Yu directly to learn more about opportunities within the US.
  • Tech Stylers – Foyin Olajie-Bello started an organization to empower young women in Africa to build apps, bots, websites, automation, and AI without learning to code. Want to volunteer? Help find ways to upskill others, and pay it forward. Check out the Tech Stylers Twitter handle for the latest news and announcements.
  • ITA Ex – Mario supports business continuity in education by creating resources, articles, videos, and guidance for non-profits in Brazil. Many MVPs in Brazil are working together to build a consortium of sorts, pooling tech talents and resources to help support nonprofits in the region.
  • Um Por Todos – Mario also helped nonprofits shift their digital strategy and processes by leveraging Power Apps to track donations for COVID-19 relief. How to get involved: Consider volunteering for either of these organizations through the websites listed.
  • St. John Ambulance – Sam helped build a dynamic reporting platform using Power BI to improve the volunteer experience at St John Ambulance. How to get involved: Consider building relationships with organizations that have volunteer opportunities, and engage with them to find out how you can best support them.
  • McLean Bible Church – McLean Bible Church helps distribute food and meals to needy families in the greater Washington, D.C. area, using Microsoft Teams and Power Apps to enable volunteers to serve their local community. How to get involved: Consider serving at local food banks, encourage your user group to adopt a local non-profit organization, and consider providing your products or services for free to non-profits. You can also check out the Microsoft for Nonprofits site for more resources.

[Photos: Tech Stylers, St John Ambulance, Boys & Girls Club]

We carried that same energy through the afternoon with the “Sustaining Human Connection in a Virtual World with Technology” panel, featuring individuals who shared their unique perspectives and research on the vital role technology plays in sustaining quality human connections that are fully accessible in the world of remote work. Megan Lawrence, Sr. Accessibility Evangelist, led an insightful discussion with panelists including Leah Katz-Hernandez, an employee who lives with hearing loss, and Allexa Laycock, a filmmaker with the Rooted in Rights organization who seeks to remove the stigma around disability, mental health, and chronic illness by spotlighting stories from the community. Panelists shared lessons about relationship building, staying focused and productive while remote, and building accessible technologies so that truly everyone, regardless of ability, can stay connected from remote places.

 

Excerpts from the panel:

  • Leah Katz-Hernandez, Communications Manager at the Office of the CEO, explained that, based on her own experience, disabilities are already commonly misunderstood in the workplace, and the new world of remote work may amplify that misunderstanding unless we are all intentional about how we engage with others and put more thought and effort into connecting with colleagues and building relationships. Understanding different communication styles is essential to successful remote work cultures in the long term.
  • Michael Bohan, from Microsoft’s Human Factors Engineering team, shared his team’s research, which helped identify key stressors that remote work can put on the human brain, which in turn impacts our relationships and ability to focus. For example, the brain has to work extra hard to internalize relationships in person when they were first established online.
  • Allexa Laycock, Creative Director at Rooted in Rights Washington, highlighted that the disability community often has compelling solutions to problems and wants to be included in conversations about technology, but tends to be overlooked. The current pandemic has brought these gaps to light, and although the global situation has been stressful, there is a silver lining: there are now opportunities to share solutions in ways there haven’t been before.

 

[Screenshots from the panel]

Lastly, Scott Hanselman and Grace Macjones had a candid conversation about “How to be a Social Technologist in a Digital World,” discussing how to grow an online presence and stay connected in a virtual world. They shared ideas on how to balance meetings, boost the quality of online social interactions, and protect productivity and home life. They also addressed the important topic of burnout, and watching for its early signs to help support mental wellness during these challenging times.

 


 

The broader Tech Community also hosted inspirational community table talks on human-centered topics, where community peers shared their personal stories and held fruitful discussions around community and tech careers.

 

Microsoft Ignite: Day 2 (Wednesday, September 23)

 

Day 2 of Microsoft Ignite Humans of IT sessions focused on topics such as navigating personal career journeys, discovering tech superpowers, mentoring future technologists, and using AI for Good.

 

We started the day with power-packed stories (excuse the pun!) from the Power Platform team, moderated by Jeremiah Marble. Panelists came from all walks of life and diverse backgrounds, including Gomolemo Mohapi, Ashlee Culmsee, Joe Camp, and Mary Thompson, who shared their perspectives on how you can “Be The Hero Of Your Own Story.” Their personal stories were truly inspiring, emphasizing that it’s never too late to try something new in your life and that everyone can find their way into IT if they are up for it, even if that means finding ways to cope with social anxiety.

 


 

The inspiration continued with an amazing group of leaders who are using AI for Good in innovative ways. During the “Intersection of AI and Humanity: Solving World Challenges through AI Innovation” session, we heard from a panel of speakers who shared the personal and professional experiences that have led them to successfully combine imagination and creativity with the power of AI to help solve health, societal, and humanitarian challenges.

 

Furthermore, Iasia Brown roller-skated (literally!) her way onto the screen to share her unique experience transitioning into tech during the “Choose Your Adventure: How to Hack Your Tech Career and Carve The Path YOU Want” panel with Chloe Brown, Karen Alarcon, and Ryen Macababbad. This session reignited feelings of excitement and validation for the audience as panelists discussed how individuals with different passions, interests, and backgrounds can successfully make their way into tech, navigate challenges, work through imposter syndrome, and overcome a fear of failure. Remember, to quote Iasia: “The only dead-end is death. As long as you’re alive, there are no dead ends; you can always make a career switch and pursue something new at ANY stage in life.”

 

Go chase that dream, Humans of IT!

 


The afternoon wrapped up with a session distilling the collective experience of 50 founders into just 30 minutes, with Annie Parker, Managing Director of Microsoft for Startups, and her co-host Lahini Arunachalam walking the audience through key findings from recent customer research. Additionally, we learned about the journeys of two incredible women founders:

  • Mikaela Jade, founder of Indigital, the first Indigenous digital skills training program, which teaches kids to bring over 80,000 years of Indigenous cultural knowledge, history, and language to life through augmented reality, Minecraft, and Python coding.
  • Kai Frazier, founder & CEO of Kai XR, an EdTech company that provides inclusive and accessible opportunities for underserved communities. Kai talked about her journey in securing funding as a Black female founder, finding investors, and carving opportunities for herself through sheer grit and determination.


 

To wrap up the event on a high note, we were thrilled to have Tammy Richardson, Microsoft Director of Merchandise Planning and an HBCU alumna, host an engaging panel with our HBCU student ambassadors from five schools in the Louisiana area. These students shared their experiences as students from underrepresented communities in tech during the “Mentoring Future Technologists: HBCU Students Get Real” session. They inspired attendees to think more deeply about the future of tech, how to accelerate tech careers for people of color, and how to begin your own tech journey and create opportunities for yourself despite the disadvantages you may face. The best quote came from an audience member who left the session feeling assured that “the future of tech is in good hands.” Kudos to all our Microsoft Ignite HBCU Student Ambassadors!

 

[Photo: Microsoft Ignite HBCU Student Ambassadors]

 

We are truly grateful to have been able to share the voices of so many amazing speakers, and to have (virtually) encouraged and inspired thousands of attendees this year, on a global scale, in a deeply meaningful, human way.

 

We hope you enjoyed the Humans of IT track @ Microsoft Ignite as much as we did, and we’ll see you next year!

 

Missed a session and want to catch up? Or perhaps you want to rewatch them all over again?

Access all of the Humans of IT session recordings here. If the session you want is not showing up yet, check back again in 24 hours as they are currently being batch-uploaded.

 

Community reflection / Homework (of course you knew there would be homework!)
Want to see what other community members thought of Humans of IT sessions at Microsoft Ignite? Read the responses posted here.

 

Share your Microsoft Ignite experience with us: 

  • What were YOUR favorite parts about this digital Microsoft Ignite?
  • What do you hope to see from Humans of IT at the next virtual conference?

We want to hear from you in the comments below. Who knows, a lucky winner might be picked for another random Humans of IT swag giveaway ;)

 

See you next time! 😊

 

#HumansofIT

#MSIgnite

Continued performance improvements to Excel in Microsoft 365: time to take another look


When planning an Office upgrade, Excel performance is often a top concern. We understand you need to be certain that your macros, formulas, and data models will execute as smoothly after your upgrade to Microsoft 365 Apps as they did before. That’s why the Excel team has been hard at work on a long list of performance upgrades aimed at making Excel in Microsoft 365 the new gold standard for Excel performance.  

 

This week at Ignite, we announced faster Aggregation and RealTimeData functions – a third round of enhancements following Speedy Lookups in 2018 and a host of CPU and memory-use improvements in 2017. And customers are taking note: even before this last round of improvements, an analyst at Bridgewater Associates told us that “Moving from 32-bit Excel 2016 to Microsoft 365 Apps has increased the speed in which we can run our custom tools and delivered greater stability and performance.” 

 

 

 

If concerns about Excel performance have been keeping you on an older version of Office, it’s time to take another look at what Excel in Microsoft 365 has to offer. Support for Office 2010 ends on October 13, 2020. Make the shift to Microsoft 365 Apps today. 

Use Azure IR to Tune ADF and Synapse Data Flows


Azure Integration Runtimes are ADF and Synapse entities that define the amount of compute you wish to apply to your data flows, as well as other resources. Here are some tips on how to tune data flows with proper Azure IR settings.

 

In all 3 of these examples, I tested my data flows with a demo set of mocked-up loans data in a CSV file located in my Blob Store container. There were 887k rows with 74 columns; in each case I read from the file and duplicated the data into 2 separate streams: one stream aggregated a row count, and the second masked PII data with a one-way hash. I then loaded the data into a destination blob store folder sink.
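As a rough illustration of that shape, here is a minimal Python sketch of splitting rows into two streams, one for a row count and one that masks PII with a one-way hash. The column names and the choice of SHA-256 are assumptions for illustration only; the real data flow uses built-in transformations.

```python
import hashlib

def mask_pii(value: str) -> str:
    # One-way hash: the original value cannot be recovered from the digest.
    # SHA-256 is an assumption; the data flow uses its own hash function.
    return hashlib.sha256(value.encode("utf-8")).hexdigest()

# Hypothetical loan rows standing in for the 887k-row CSV.
rows = [
    {"loan_id": "L-001", "ssn": "123-45-6789", "amount": 2500},
    {"loan_id": "L-002", "ssn": "987-65-4321", "amount": 7800},
]

# Stream 1: aggregate a row count.
row_count = len(rows)

# Stream 2: mask the PII column with the one-way hash.
masked = [{**r, "ssn": mask_pii(r["ssn"])} for r in rows]
```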

 


 

Each of these executions below was run from an ADF pipeline using the “Debug > Use Activity Runtime” setting so that I could manually adjust the number of cores and compute type for each run. This means that I am not using the warmed-up debug cluster session. The average start-up time for the Databricks cluster was 4.5 mins. I also left all optimization settings in the transformations to default / use current partitioning. This allowed data flows to rely fully on Spark’s best guess for partitioning my file-based data.

Compute Optimized

First, I ran the pipeline using the lowest-cost option, Compute Optimized. For very memory-intensive ETL pipelines, we do not generally recommend this category because it has the lowest RAM/core ratio for the underlying VMs. But it can be useful for cost savings and for pipelines that are not acting on very large data and do not have many joins or lookups. In this case, I chose 16 cores, which is 8 cores for the driver node and 8 cores for the worker nodes.

Results

  • Sink IO writing: 20s
  • Transformation time: 35s
  • Sink post-processing time: 40s
  • Data Flows used 8 Spark partitions based on my 8 core worker nodes.


 

General Purpose

Next, I tried the exact same pipeline using General Purpose with the small 8-core (4+4) option, which gives you 1 driver and 1 worker node, each with 4 cores. This is the small default debug cluster you are provided with the default auto Azure Integration Runtime. General Purpose is a very good middle option for data flows, with a better RAM-to-CPU ratio than Compute Optimized. But I would highly recommend much higher core counts than I used in this test. I am only using the default 4+4 to demonstrate that the default 8-core total is fine for small debugging, but not good for operationalized pipelines.

Results

  • Sink IO writing: 46s
  • Transformation time: 42s
  • Sink post-processing time: 45s
  • Data Flows partitioned the file data into 4 parts because in this case I scaled back to only 4 worker cores.


Memory Optimized

This is the most expensive option with the highest RAM-to-CPU ratio, making it very good for large workloads that you’ve operationalized in triggered pipelines. I gave it 80 cores (64 for workers, 16 for the driver) and naturally had the best individual stage timings with this option. The Databricks cluster took the longest to start up in this configuration, and the larger number of partitions led to a slightly higher post-processing time as the additional partitions were coalesced. I ended up with 64 partitions, one for each worker core.

Results

  • Sink IO writing: 19s
  • Transformation time: 17s
  • Sink post-processing time: 40s
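Summarizing the three runs: with the optimization settings left at default / current partitioning, each run produced one Spark partition per worker core for this file-based source. A small Python sketch of that relationship, using the driver/worker splits observed above:

```python
# Observed Azure IR configurations from the three test runs above.
runs = {
    "Compute Optimized": {"total_cores": 16, "driver_cores": 8},
    "General Purpose":   {"total_cores": 8,  "driver_cores": 4},
    "Memory Optimized":  {"total_cores": 80, "driver_cores": 16},
}

def worker_cores(run: dict) -> int:
    # Cores left for workers after the driver node takes its share.
    return run["total_cores"] - run["driver_cores"]

def expected_partitions(run: dict) -> int:
    # With default/current partitioning, each run produced one Spark
    # partition per worker core for this file-based data.
    return worker_cores(run)
```

So the 80-core Memory Optimized run yields 64 worker cores and 64 partitions, matching the results above.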


Azure portal September 2020 update


General
  • Increased auto-refresh rate options on dashboards
  • Improvements to the ARM template deployment experience in the Azure portal
  • Deployment of templates at the tenant, management group and subscription scopes

Database

  • Configuration of Always On availability groups for SQL Server virtual machines

Management + Governance > Resource Graph Explorer

  • New features in the Resource Graph Explorer

Storage > Storage account

  • Azure blob storage updates
    • Object replication for blobs now generally available
    • Blob versioning now generally available
    • Change feed support for blob storage now generally available
    • Soft delete for containers now in public preview
    • Point-in-time restore for block blobs now in public preview

Azure mobile app

  • Azure alerts visualization

Intune

  • Updates to Microsoft Intune

 

Let’s look at each of these updates in greater detail.

General

Increased Auto-Refresh Rate Options on Dashboards

We’ve updated the auto-refresh rate options to include 5, 10, and 15 minutes.  

 

  1. Go to the left navigation and choose “Dashboard”

    autorefresh 1.png

  2. Select the auto-refresh rate you’d like and click “Apply”. Now your dashboard will refresh at the interval you selected.

    autorefresh 2.png

  3. You can check when your dashboard was last refreshed at the top right of your dashboard.

    autorefresh 3.png

This Azure portal “how to” video will show you how to set up auto-refresh rates.    

 

General

Improvements to the ARM template deployment experience in the Azure portal

The custom template deployment experience in the Azure portal allows customers to deploy an ARM template. This experience has been updated with the following improvements:

  1. Easier navigation – There are now multiple tabs, which allow users to go back and select a different template without reloading the entire page.

    ARM template.png

  2. Review + create tab – The popular “Review + create” tab has been added to the custom deployment experience, allowing users to review parameters before starting the deployment.
  3. Multi-line JSON and comment support – The Edit template view now supports multi-line JSON and JSON with comments, in accordance with ARM’s capabilities.

 

General

Deployment of templates at the tenant, management group and subscription scopes  

The custom template deployment experience in the Azure portal now supports deploying templates at the tenant, management group, and subscription scopes. The Azure portal looks at the schema of the ARM template to infer the scope of the deployment. The correct deployment template schema for deployments at different scopes can be found here. The Deployment scope section of the Basics tab will automatically update to reflect the scope inferred from the deployment template.
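Conceptually, the scope inference keys off the template’s $schema value. The sketch below is illustrative, not the portal’s actual code; the schema file names match the documented deployment template schemas for each scope.

```python
def infer_deployment_scope(template: dict) -> str:
    # Map an ARM template's $schema to a deployment scope, mirroring
    # the portal's behavior. Unrecognized schemas fall back to
    # resource group scope.
    schema = template.get("$schema", "")
    if "tenantDeploymentTemplate" in schema:
        return "tenant"
    if "managementGroupDeploymentTemplate" in schema:
        return "managementGroup"
    if "subscriptionDeploymentTemplate" in schema:
        return "subscription"
    return "resourceGroup"

# A minimal subscription-scope template skeleton.
subscription_template = {
    "$schema": "https://schema.management.azure.com/schemas/2018-05-01/subscriptionDeploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "resources": [],
}
```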

 

deployment dir.png

Deployment scope section of the Basics tab when the schema of the template indicates that it is a deployment at tenant scope.

 

deployment mgmt.png

Deployment scope section of the Basics tab when the schema of the template indicates that it is a deployment at management group scope.

 

deploy sub.png

Deployment scope section of the Basics tab when the schema of the template indicates that it is a deployment at subscription scope.

 

Steps:

  1. Navigate to the custom template deployment experience in the Azure portal.
  2. Choose the “Build your own template in the editor” option in the “Select a template” tab.
  3. Author an ARM deployment template that deploys at the tenant, management group or subscription scope and save.
  4. Complete the Basics tab by providing values for deployment parameters.
  5. Review the parameters and trigger a deployment in the Review + create tab.

 

Databases > SQL Virtual Machines

Configuration of Always On availability groups for SQL Server virtual machines

It is now possible to set up an Always On availability group (AG) for your Azure SQL Server Virtual Machines (VM) from the Azure portal. Also available with the use of an ARM template and Azure SQL VM CLI, this new experience simplifies the process of manually configuring availability groups into a few simple steps.

 

You can find this experience in any of your existing SQL virtual machines as long as they are registered with the SQL VM resource provider and are running SQL Server 2016 Enterprise or higher. To get started, the experience outlines the prerequisites that must be completed outside the portal, including joining the VMs to the same domain. After meeting the prerequisites, you can create and manage the Windows Server Failover Cluster, the availability groups, and the listener, as well as manage the set of VMs in the cluster and the AGs, all from the same portal experience.

 

Follow the detailed step-by-step documentation to set up availability groups. Here we will outline a few of the steps to access this capability:

  1. Sign in to the Azure portal.
  2. In the top search bar, search for “Azure SQL” and select “Azure SQL” under the list of Services.
  3. In the Azure SQL browse list, find and select a SQL VM running SQL Server 2016 Enterprise or higher that you would like to use as your primary VM.
  4. Select “High availability” in the left-side menu of the SQL VM resource.
  5. Make sure the VM is domain-joined, then select “New Windows Server Failover Cluster” at the top of the page.

    always 1.png

  6. Once you have created or onboarded to a Windows Server Failover Cluster, you can create your first availability group, where you will name the availability group, configure a listener, and select from a list of viable VMs to include in the AG.

    always 2.png

  7. From there you can add databases and other VMs to the availability group, create more availability groups, and manage the configurations of all related settings.

    always 3.png

Management + Governance > Resource Graph Explorer

New features in the Resource Graph Explorer

We’ve added three features to the Resource Graph Explorer:

  1. Keyboard shortcuts are now available, such as Shift+Enter to run a query. A list of the shortcuts can be found here.
  2. You can now see more than 1000 results by paging through the list.

    resource.png

  3. “Download as CSV” will now save up to 5000 results.
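The paging behavior can be pictured as a continuation-token loop: fetch a page, keep its rows, and repeat until no token remains. The sketch below uses an in-memory stand-in for the query API; the names and response shape are assumptions for illustration, not the Resource Graph SDK.

```python
RESULTS = [f"resource-{i}" for i in range(2500)]
PAGE_SIZE = 1000  # the explorer pages results 1000 rows at a time

def fetch_page(skip: int = 0) -> dict:
    # Stand-in for one query call: returns a page of rows plus a
    # continuation token (None once all results are exhausted).
    page = RESULTS[skip:skip + PAGE_SIZE]
    nxt = skip + PAGE_SIZE
    return {"data": page, "skip_token": nxt if nxt < len(RESULTS) else None}

def fetch_all() -> list:
    # Follow the continuation token until every page has been read.
    rows, token = [], 0
    while token is not None:
        page = fetch_page(token)
        rows.extend(page["data"])
        token = page["skip_token"]
    return rows
```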

 

Storage/Storage account

Azure blob storage updates

Object Replication for Blobs now generally available

Object replication is a new capability for block blobs that lets you replicate your data from your blob container in one storage account to another anywhere in Azure.  This feature unblocks a new set of common replication scenarios:

  • Minimize latency – Users consume the data locally rather than issuing cross-region read requests.
  • Increase efficiency – Compute clusters process the same set of objects locally in different regions.
  • Optimize data distribution – Data is consolidated in a single location for processing/analytics and resulting dashboards are then distributed to your offices worldwide.
  • Minimize cost – Tier down your data to archive upon replication completion using lifecycle management policies to minimize the cost.

Learn more

 

blob updates.png

View any existing replication rules on your account by navigating to the Object replication resource menu item in your storage account.  Here you can edit or delete existing rules or download them to share with others.  This is also the entry point to either create a new rule or upload an existing rule that’s been shared with you.

 

blob  updates 2.png

Create a replication rule by specifying the destination and source.

 

blob updates 3.png

Upload a replication rule that has been shared with you.

 

Blob Versioning now generally available

Blob Versioning for Azure Storage automatically maintains previous versions of an object and identifies them with version IDs. You can list both the current blob and previous versions using version ID timestamps. You can also access and restore previous versions as the most recent version of your data if they were erroneously modified or deleted by an application or other users. 

 

Together with our existing data protection features, Azure Blob storage provides the most complete set of user-configurable settings to protect your business-critical data.

Enabling versioning is free, but when versions are created, there will be costs associated with additional data storage being used. 

 

Learn more by viewing documentation and the “how to” video

blob vers.png

Enable versioning while creating your storage account. Versioning can be enabled or disabled after creation as well, under the Data Protection resource menu item for your storage account.

 

blob vers 2.png

View and manage blob versions.

 

Change feed support for blob storage now generally available

Change feed provides a guaranteed, ordered, durable, read-only log of all the creation, modification, and deletion change events that occur to the blobs in your storage account.

Change feed is the ideal solution for bulk handling of large volumes of blob changes in your storage account, as opposed to periodically listing and manually comparing for changes. It enables cost-efficient recording and processing by providing programmatic access such that event-driven applications can simply consume the change feed log and process change events from the last checkpoint.
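The checkpoint pattern the change feed enables looks roughly like this: consume only the events after the last checkpoint, then advance it. The event shape and checkpoint handling below are illustrative assumptions, not the actual change feed client library.

```python
# A toy, ordered change feed log: each record describes a blob change.
feed = [
    {"offset": 1, "event_type": "BlobCreated", "blob": "logs/a.txt"},
    {"offset": 2, "event_type": "BlobCreated", "blob": "logs/b.txt"},
    {"offset": 3, "event_type": "BlobDeleted", "blob": "logs/a.txt"},
]

def process_from_checkpoint(feed: list, checkpoint: int):
    # Consume only events recorded after the last checkpoint, then
    # return the new checkpoint for the next run.
    pending = [e for e in feed if e["offset"] > checkpoint]
    new_checkpoint = pending[-1]["offset"] if pending else checkpoint
    return pending, new_checkpoint
```

An event-driven application would persist the returned checkpoint and pass it back on its next invocation.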

Learn more

change feed.png

Enable change feed while creating your storage account. Change feed can be enabled or disabled after creation as well, under the Data Protection resource menu item for your storage account.

 

change feed 2.png

Change feed logs are written to the $blobchangefeed container in your storage account. See the change feed documentation to understand how the log is organized.

 

 

Soft delete for containers now in public preview 

Soft delete for containers protects your data from being accidentally or erroneously modified or deleted. When container soft delete is enabled for a storage account, any deleted container and its contents are retained in Azure Storage for the period that you specify. During the retention period, you can restore previously deleted containers and any blobs within them by calling the Undelete Container operation.

Learn more

softdelete.png

Enable soft delete for containers while creating your storage account. Soft delete can be enabled or disabled after creation as well, under the Data Protection resource menu item for your storage account.

 

softdelete 2.png

View and restore deleted containers.

 

 

Point-in-time restore for block blobs now in public preview 

Point-in-time restore provides protection against accidental deletion or corruption by enabling you to restore block blob data to an earlier state. This feature is useful in scenarios where a user or application accidentally deletes data or where an application error corrupts data. Point-in-time restore also enables testing scenarios that require reverting a data set to a known state before running further tests.

 

Point-in-time restore requires the following features to be enabled:

  • Soft delete
  • Change feed
  • Blob versioning

 

Point-in-time restore is currently supported for preview for general purpose v2 storage accounts in the following regions:

  • Canada Central
  • Canada East
  • France Central

Learn more.

point.png

Enable point-in-time restore while creating your storage account. Point-in-time restore can be enabled or disabled after creation as well, under the Data Protection resource menu item for your storage account.

 

point2.png

Kick off a point-in-time restoration and roll back containers to a specified time and date.

 

 

Azure mobile app

Azure alerts visualization

The Azure mobile app now has a chart visualization for Azure alerts in the Home view. You can now choose between rendering a list view of your fired Azure alerts or displaying them as a chart. The chart view arranges the alerts by severity so you can quickly check the alerts that are active on your Azure environment.

 

In order to choose the Azure alerts chart visualization in the mobile app:

  • On Android: Turn on the “Chart” toggle in the Latest alerts card in the Home view.
  • On iOS: Tap the “Chart” tab in the Latest alerts card in the Home view.

mobile.png

 

Intune

Updates to Microsoft Intune

The Microsoft Intune team has been hard at work on updates as well. You can find the full list of updates to Intune on the What’s new in Microsoft Intune page, including changes that affect your experience using Intune.

 

Azure portal “how to” video series

Have you checked out our Azure portal “how to” video series yet? The videos highlight specific aspects of the portal so you can be more efficient and productive while deploying your cloud workloads from the portal.  Check out our most recently published videos:

 

 

Next steps

The Azure portal has a large team of engineers that wants to hear from you, so please keep providing us your feedback in the comments section below or on Twitter @AzurePortal.

 

Sign in to the Azure portal now and see for yourself everything that’s new. Download the Azure mobile app to stay connected to your Azure resources anytime, anywhere.  See you next month!

Released: Public preview of Azure Arc enabled SQL Server


Azure Arc enabled SQL Server is now in public preview. It extends Azure services to SQL Server instances deployed outside of Azure: in the customer’s datacenter, on the edge, or in a multi-cloud environment.

 

The preview includes the following features:

– Use the Azure portal to register and track the global inventory of your SQL Server instances across different hosting infrastructures. You can register an individual SQL instance or register a set of servers at scale using the same auto-generated script.

– Use Azure Security Center to produce a comprehensive report of vulnerabilities in SQL Servers and get advanced, real-time security alerts for threats to SQL Servers and the OS.

– Investigate threats in SQL Servers using Azure Sentinel.

– Periodically check the health of SQL Server configurations and provide comprehensive reports and remediation recommendations using the power of Azure Log Analytics.

 

The following diagram illustrates the architecture of Azure Arc enabled SQL Server:

[Diagram: Azure Arc enabled SQL Server architecture]

 

SQL Server can be installed on a virtual or physical machine running Windows or Linux that is connected to Azure Arc via the Connected Machine agent. The agent is installed and the machine registered automatically as part of the SQL Server instance registration. The agent maintains secure communications with Azure Arc over outbound port 443, directly or via an HTTP proxy. Any SQL Server instance, version 2012 or higher, can be registered with Azure Arc.
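When sweeping an estate for instances to register, the “2012 or higher” rule is the eligibility filter. A hypothetical inventory check (the host names and record structure are made up for illustration):

```python
def arc_eligible(version_year: int) -> bool:
    # Any SQL Server instance version 2012 or higher can be registered
    # with Azure Arc.
    return version_year >= 2012

# Hypothetical inventory spanning on-premises, edge, and multi-cloud hosts.
inventory = [
    {"host": "sql-onprem-01", "version": 2008},
    {"host": "sql-edge-02",   "version": 2016},
    {"host": "sql-aws-03",    "version": 2019},
]

eligible = [s["host"] for s in inventory if arc_eligible(s["version"])]
```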

 

How to get started

Check out Azure Arc enabled SQL Server documentation for more details on how to register and manage your SQL Server instances using Azure Arc.