Lesson Learned #170: Connection Pooling caches IP address

This article is contributed. See the original author and article here.

Today I discussed a very interesting situation with a colleague. Everything points to the IP address being part of the data cached in each connection pooling slot.

To test this, I developed a small PowerShell script that opens a connection and executes a "SELECT 1" command thousands of times. With connection pooling, the connection is reused every time instead of opening a new port on each iteration.

While the PowerShell script is running, we use the hosts file to modify the IP address resolution of our Azure SQL server, servername.database.windows.net, hosted in North Europe.
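For illustration, a hypothetical way to add such an entry from an elevated PowerShell session (the IP below is a placeholder for any unreachable address):

#Illustrative only: redirect the Azure SQL server name to a fake, unreachable IP.
Add-Content -Path "$env:SystemRoot\System32\drivers\etc\hosts" `
            -Value "10.0.0.1  servername.database.windows.net"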

Results using connection pooling:

  • Using a fake IP, we get connection errors; after removing the fake IP, the execution becomes successful. But if I change the name resolution back to the fake address, new connections served from the same connection pooling slot keep working, so the cached IP address has not been updated.

  • However, repeating the same procedure without connection pooling gives the expected behavior: every time I changed the IP resolution to the fake IP, I got an error.

Lesson Learned: in case of failovers or IP changes, if you are using connection pooling, everything indicates that the connection pooling slot keeps using the previous IP. To avoid this issue, I implemented the following workaround:

  • On every connection, check the DNS resolution of the server name.

  • If the IP has changed, execute ClearPool (or ClearAllPools) on SqlConnection to flush the cached connections, unless both IPs belong to the same datacenter (round-robin access), in which case there is no need to clear the pool. A distilled sketch of this check follows.
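As a minimal sketch (reusing the CheckDns and DetectIP helper functions from the full script below), the core of the workaround looks like this:

$CurrentIP = CheckDns($DatabaseServer)
if ($CurrentIP -ne $LocalInitialIP)
{
  #Clear the pools only when the new IP belongs to a different datacenter.
  if ((DetectIP($LocalInitialIP)) -ne (DetectIP($CurrentIP)))
  {
    [System.Data.SqlClient.SqlConnection]::ClearAllPools()
  }
  $LocalInitialIP = $CurrentIP
}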

Enjoy!

Code:

$DatabaseServer = "servername.database.windows.net"
$Database = "DatabaseName"
$Username = "UserName"
$Password = "Password" 
$Pooling = $true
$NumberExecutions = 100000
$FolderV = "C:\MyFolder\"   #Folder for the log file (note the trailing backslash).
[string]$LocalInitialIP = ""

#-------------------------------
#Map a resolved IP address to its Azure datacenter (WE = West Europe, NE = North Europe).
#The gateway IPs below come from the Azure SQL connectivity architecture article.
Function DetectIP([Parameter(Mandatory=$true, Position=0)]
                  [string] $IP)
{ 
  $IP=$IP.Trim()
  if(($IP -eq "40.68.37.158") -or ($IP -eq "104.40.168.105") -or ($IP -eq "52.236.184.163")) {return "WE"}

  if(($IP -eq "40.113.93.91") -or ($IP -eq "52.138.224.1") -or ($IP -eq "13.74.104.113")) {return "NE"}

  return "Unknown"

  ##See URL: https://docs.microsoft.com/en-us/azure/azure-sql/database/connectivity-architecture
}



#-------------------------------
Function DeleteFile{ 
  Param( [Parameter(Mandatory)]$FileName ) 
  try
   {
    $FileExists = Test-Path $FileName
    if($FileExists -eq $True)
    {
     Remove-Item -Path $FileName -Force 
    }
    return $true 
   }
  catch
  {
   return $false
  }
 }

function GiveMeSeparator
{
Param([Parameter(Mandatory=$true)]
      [System.String]$Text,
      [Parameter(Mandatory=$true)]
      [System.String]$Separator)
  try
   {
    [hashtable]$return=@{}
    $Pos = $Text.IndexOf($Separator)
    $return.Text= $Text.substring(0, $Pos) 
    $return.Remaining = $Text.substring( $Pos+1 ) 
    return $Return
   }
  catch
  {
    $return.Text= $Text
    $return.Remaining = ""
    return $Return
  }
}

#--------------------------------------------------------------
#Create a folder 
#--------------------------------------------------------------
Function CreateFolder
{ 
  Param( [Parameter(Mandatory)]$Folder ) 
  try
   {
    $FileExists = Test-Path $Folder
    if($FileExists -eq $False)
    {
     $result = New-Item $Folder -type directory 
     if($result -eq $null)
     {
      logMsg("Imposible to create the folder " + $Folder) (2)
      return $false
     }
    }
    return $true
   }
  catch
  {
   return $false
  }
 }

#--------------------------------
#Validate Param
#--------------------------------

function TestEmpty($s)
{
if ([string]::IsNullOrWhitespace($s))
  {
    return $true;
  }
else
  {
    return $false;
  }
}

Function GiveMeConnectionSource([Parameter(Mandatory=$false)][ref]$InitialIP)
{ 
  for ($i=1; $i -lt 10; $i++)
  {
   try
    {
      logMsg( "Connecting to the database...Attempt #" + $i) (1)
      if( (TestEmpty $InitialIP.Value) -eq $true)
       {$InitialIP.Value = CheckDns($DatabaseServer)}
      else
      {
       [string]$OtherIP = CheckDns($DatabaseServer)
       If( $OtherIP -ne $InitialIP.Value )
       {
         $PreviousOneDC = DetectIP($InitialIP.Value)
         $NewOneDC = DetectIP($OtherIP)

         If($PreviousOneDC -ne $NewOneDC)
         { 
           #The IP moved to a different datacenter: the pooled connections still
           #point to the old IP, so flush them.
           [System.Data.SqlClient.SqlConnection]::ClearAllPools()
           logMsg("IP change noticed. Cleaning pools...") (1)
         }
         else
         { 
           logMsg("IP change noticed, same DC (round-robin)...") (1)
         }
         $InitialIP.Value = $OtherIP  #Remember the new IP for the next comparison.
       }   
      }

      $SQLConnection = New-Object System.Data.SqlClient.SqlConnection 
      $SQLConnection.ConnectionString = "Server="+$DatabaseServer+";Database="+$Database+";User ID="+$Username+";Password="+$Password+";Connection Timeout=15" 
      if( $Pooling -eq $true )
        {
          $SQLConnection.ConnectionString = $SQLConnection.ConnectionString + ";Pooling=True"
        } 
      else
        {
          $SQLConnection.ConnectionString = $SQLConnection.ConnectionString + ";Pooling=False"
        }

      $SQLConnection.Open()
      logMsg("Connected to the database...") (1)
      return $SQLConnection
    }
   catch
    {
     logMsg("Not able to connect - Retrying the connection..." + $Error[0].Exception) (2)
     Start-Sleep -s 5
    }
  }
}

#--------------------------------
#Check DNS
#--------------------------------
function CheckDns($sReviewServer)
{
try
 {
    $sAddress = ""   #Accumulates every IP returned by DNS, space separated.
    $IpAddress = [System.Net.Dns]::GetHostAddresses($sReviewServer)
    foreach ($Address in $IpAddress)
    {
        $sAddress = $sAddress + $Address.IPAddressToString + " "
    }
    logMsg("ServerName:" + $sReviewServer + " has the following IP:" + $sAddress) (1)
    return $sAddress
 }
  catch
 {
  logMsg("Impossible to resolve the name - Error: " + $Error[0].Exception) (2)
  return ""
 }
}

#--------------------------------
#Log the operations
#--------------------------------
function logMsg
{
    Param
    (
         [Parameter(Mandatory=$true, Position=0)]
         [string] $msg,
         [Parameter(Mandatory=$false, Position=1)]
         [int] $Color
    )
  try
   {
    $Fecha = Get-Date -format "yyyy-MM-dd HH:mm:ss"
    $msg = $Fecha + " " + $msg
    if( -not (TestEmpty $LogFile) )  #Skip the file until $LogFile is defined.
     {
      Write-Output $msg | Out-File -FilePath $LogFile -Append
     }
    $Colores = "White"
    If($Color -eq 1 )
     {
      $Colores = "Cyan"
     }
    If($Color -eq 3 )
     {
      $Colores = "Yellow"
     }

    if($Color -eq 2)
     {
       Write-Host -ForegroundColor White -BackgroundColor Red $msg 
     } 
    else 
     {
       Write-Host -ForegroundColor $Colores $msg 
     } 
   }
  catch
  {
    Write-Host $msg 
  }
}



cls


  $sw = [diagnostics.stopwatch]::StartNew()


  logMsg("Creating the folder " + $FolderV) (1)
   $result = CreateFolder($FolderV) 
   If( $result -eq $false)
    { 
     logMsg("Was not possible to create the folder") (2)
     exit;
    }
logMsg("Created the folder " + $FolderV) (1)

$LogFile = $FolderV + "Results.Log"    #Logging the operations.

logMsg("Deleting Log File") (1)
   $result = DeleteFile($LogFile) #Delete Log file
logMsg("Deleted Log File") (1)

$query = @("SELECT 1")

$LocalInitialIP = CheckDns($DatabaseServer)

  for ($i=0; $i -lt $NumberExecutions; $i++)
  {
   try
    {

      $SQLConnectionSource = GiveMeConnectionSource([ref]$LocalInitialIP) #Connecting to the database.
      if($SQLConnectionSource -eq $null)
      { 
        logMsg("It is not possible to connect to the database") (2)
      }
      else
      {
       
       $SQLConnectionSource.StatisticsEnabled = 1 
       $command = New-Object -TypeName System.Data.SqlClient.SqlCommand
       $command.CommandTimeout = 60
       $command.Connection=$SQLConnectionSource
                  
         for ($iQuery=0; $iQuery -lt $query.Count; $iQuery++) 
         {
           $start = get-date
             $command.CommandText = $query[$iQuery]
             $command.ExecuteNonQuery() | Out-Null 
           $end = get-date
           $data = $SQLConnectionSource.RetrieveStatistics()
           logMsg("-------------------------")
           logMsg("Query                 :  " + $query[$iQuery]) 
           logMsg("Iteration             :  " +$i) 
           logMsg("Time required (ms)    :  " +(New-TimeSpan -Start $start -End $end).TotalMilliseconds) 
           logMsg("NetworkServerTime (ms):  " +$data.NetworkServerTime) 
           logMsg("Execution Time (ms)   :  " +$data.ExecutionTime) 
           logMsg("Connection Time       :  " +$data.ConnectionTime) 
           logMsg("ServerRoundTrips      :  " +$data.ServerRoundtrips) 
           logMsg("-------------------------")
          }
          $SQLConnectionSource.Close()

      }
    }
    catch
   {
    logMsg("Error during the execution") (2)
    logMsg($Error[0].Exception) (2)
   }
   
}
logMsg("Time spent (ms) Procces :  " +$sw.elapsed) (2)
logMsg("Review: https://docs.microsoft.com/en-us/dotnet/framework/data/adonet/sql/provider-statistics-for-sql-server") (2)

Azure Percept enables simple AI and computing on the edge

This article is contributed. See the original author and article here.

The combination of artificial intelligence and computing on the edge enables new high-value digital operations across every industry: retail, manufacturing, healthcare, energy, shipping/logistics, automotive, and more. Azure Percept is a new zero-to-low-code platform that includes sensory hardware accelerators, AI models, and templates to help you build and pilot secured, intelligent AI workloads and solutions on edge IoT devices. This post is a companion to the new Azure Percept technical deep-dive on YouTube, providing more detail for the video:

Hardware Security

The Azure Percept DK hardware is an inexpensive and powerful 15-watt AI device you can easily pilot in many locations. It includes a hardware root of trust to protect AI data and privacy-sensitive sensors such as cameras and microphones. This added hardware security is based upon a Trusted Platform Module (TPM) version 2.0, an industry-wide ISO standard from the Trusted Computing Group. Please see the Trusted Computing Group website for more information, including the complete TPM 2.0 and ISO/IEC 11889 specification. The Azure Percept boot ROM ensures the integrity of the firmware between the ROM and the operating system (OS) loader, which in turn ensures the integrity of the other software components, creating a chain of trust.

The Azure Device Provisioning Service (DPS) uses this chain of trust to authenticate and authorize each Azure Percept device to Azure cloud components. This enables an AI lifecycle for Azure Percept: AI models and business logic containers with enhanced security can be encrypted in the cloud, downloaded, and executed at the edge, with properly signed output sent back to the cloud. This signing attestation provides tamper-evidence for all AI inferencing results, creating a more fully trustworthy environment. More information on how the Azure Percept DK is authenticated and authorized via the TPM can be found here: Azure IoT Hub Device Provisioning Service – TPM Attestation.

Example AI Models

This example showcases a Percept DK semantic segmentation AI model (GitHub source link) based upon U-Net, trained to recognize the volume of bananas in a grocery store. In the video below, the bright green highlights are the inferencing results from the U-Net AI model running on the Azure Percept DK:

Semantic segmentation AI models label each pixel in a video with the class of object for which they were trained, which means they can compute the two-dimensional size of irregularly shaped objects in the real world. This could be the size of an excavation at a construction site, the volume of bananas in a bin, or the density of packages in a delivery truck. Since you can perform AI inferencing over these items periodically, this yields time-series data on the changing shape of the detected objects. How fast is the hole being excavated? How efficiently is the cargo space in the delivery truck being utilized? In the banana example above, this time-series data allows retailers to reduce food waste by creating more efficient supply chains with less safety stock. In turn, this reduces CO2 emissions through less transportation of fresh food and less fertilizer required in the soil.
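As an illustrative aside (not code from the Percept sample), the idea of turning per-pixel labels into a size estimate can be sketched in PowerShell; the mask values and the centimeters-per-pixel scale below are made-up numbers:

#Hypothetical 2D mask of per-pixel class labels (1 = banana, 0 = background).
$mask = @(
  @(0,0,1,1),
  @(0,1,1,1),
  @(0,0,1,0)
)
$bananaClass = 1
$pixels = 0
foreach ($row in $mask) {
  foreach ($p in $row) { if ($p -eq $bananaClass) { $pixels++ } }
}
#Assume, purely for illustration, that each pixel covers 0.5 cm x 0.5 cm.
$cmPerPixelSide = 0.5
$areaCm2 = $pixels * $cmPerPixelSide * $cmPerPixelSide
Write-Host "Labeled pixels: $pixels; estimated area: $areaCm2 cm^2"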

This example also showcases the Bring Your Own Model (BYOM) capabilities of Azure Percept. BYOM allows you to bring your own custom computer vision pipeline to your Azure Percept DK. This tutorial shows how to convert your own TensorFlow, ONNX, or Caffe models for execution on the Intel® Movidius™ Myriad™ X within the Azure Percept DK, and then how to subclass the video pipeline IoT container to integrate your inferencing output. Many of the free, pre-trained open-source AI models in the Intel Model Zoo will run on the Myriad X.

People Counting Reference Application

Combining edge-based AI inferencing and video with cloud-based business logic can be complex. Egress, storage, and synchronization of the edge AI output and H.264 video streams in the cloud make it even harder. Azure Percept Studio includes a free, open-source reference application which detects people and generates their {x, y} coordinates in a real-time video stream. The application also provides a count of people in a user-defined polygon region within the camera's viewport. This application showcases best practices for security- and privacy-sensitive AI workloads.

The overall topology of the reference application is shown below. The left side of the illustration contains the set of components which run at the edge within Azure Percept DK. The H.264 video stream and the AI inferencing results are then sent in real-time to the right side which runs in the Azure public cloud:

[Image: overall topology of the reference application]

Because the Myriad X performs hardware encoding of both the compressed full-motion H.264 video stream and the AI inferencing results, hardware-synchronized timestamps in each stream make it possible to provide downstream frame-level synchronization for applications running in the public cloud. The Azure Websites application provided in this example is fully stateless, simply reading the video and AI streams from their separate storage accounts in the cloud and composing the output into a single user interface:

[Animation: the people detector user interface]

The reference application also includes a fully automated deployment model utilizing an Azure ARM template. If you click the “Deploy to Azure” button, then “Edit Template”, you will see this code:

[Image: the ARM template source]

This ARM template deploys the containers to the Azure Percept DK, creates the storage accounts in the public cloud, connects the IoT Hub message routes to the storage locations, deploys the stateless Azure Websites, and then connects the website to the storage locations. This example can be used to accelerate the development of your own hybrid edge/cloud workloads.
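For reference, deploying an exported copy of such a template from Azure PowerShell might look like the following sketch; the resource group name, location, and template file name are placeholders, not values from the reference application:

#Placeholders only; the actual template comes from the "Deploy to Azure" button.
New-AzResourceGroup -Name "percept-demo-rg" -Location "westus2"
New-AzResourceGroupDeployment -ResourceGroupName "percept-demo-rg" `
                              -TemplateFile ".\azuredeploy.json"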

Get Started Today

Order your Percept DK today and get started with our easy-to-understand samples and tutorials. You can rapidly solve your business modernization scenarios no matter your skill level, from beginner to experienced data scientist. With Azure Percept, you can put the cloud and years of Microsoft AI solutions to work as you pilot your own edge AI solutions.

Friday Five: Top Azure, Teams Tips!

This article is contributed. See the original author and article here.

The Ultimate How To Guide for Presenting Content in Microsoft Teams

Vesku Nopanen is a Principal Consultant in Office 365 and Modern Work and is passionate about Microsoft Teams. He helps and coaches customers to find benefits and value when adopting new tools, methods, ways of working, and practices into the daily work-life equation. He focuses especially on Microsoft Teams and how it can change how organizations work. He lives in Turku, Finland. Follow him on Twitter: @Vesanopanen

Backup all WSP SharePoint Solutions using PowerShell

Mohamed El-Qassas is a Microsoft MVP, SharePoint StackExchange (StackOverflow) Moderator, C# Corner MVP, Microsoft TechNet Wiki Judge, Blogger, and Senior Technical Consultant with over 10 years of experience in SharePoint, Project Server, and BI. In SharePoint StackExchange, he was elected as the first moderator in the GCC, Middle East, and Africa, and is ranked as the second top contributor of all time. Check out his blog here.

Getting started with Azure Bicep

Tobias Zimmergren is a Microsoft Azure MVP from Sweden. As the Head of Technical Operations at Rencore, Tobias designs and builds distributed cloud solutions. He has been co-founder and co-host of the Ctrl+Alt+Azure Podcast since 2019, and was co-founder and organizer of the Sweden SharePoint User Group from 2007 to 2017. For more, check out his blog, newsletter, and Twitter @zimmergren

Azure: Talk about Private Links

George Chrysovalantis Grammatikos is based in Greece and works for Tisski Ltd. as an Azure Cloud Architect. He has more than 10 years’ experience in technologies such as professional-level BI and SQL Server solutions, Azure, networking, and security. He writes technical posts for his blog “cloudopszone.com“, contributes TechNet Wiki articles, and participates in discussions on TechNet and other technical blogs. Follow him on Twitter @gxgrammatikos.

Teams Real Simple with Pictures: Hyperlinked email addresses in Lists within Teams

Chris Hoard is a Microsoft Certified Trainer Regional Lead (MCT RL), Educator (MCEd) and Teams MVP. With over 10 years of cloud computing experience, he is currently building an education practice for Vuzion (Tier 2 UK CSP). His focus areas are Microsoft Teams, Microsoft 365 and entry-level Azure. Follow Chris on Twitter at @Microsoft365Pro and check out his blog here.

Export Power Apps and Power Automate user licenses

This article is contributed. See the original author and article here.

Context

During a Power Platform audit for a customer, I needed to export all user license data into a single file to analyze it with Power BI. In this article, I’ll show you an easy way to export Power Apps and Power Automate user licenses with PowerShell!

Download user licenses

To do this, we use PowerApps PowerShell, and more particularly the Power Apps admin module.

To install this module, execute the following command as a local administrator:

Install-Module -Name Microsoft.PowerApps.Administration.PowerShell

Note: if this module is already installed on your machine, you can use the Update-Module command to update it to the latest available version.
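For example:

Update-Module -Name Microsoft.PowerApps.Administration.PowerShell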

Then, to export the user license data, you just need to execute the following command, replacing the target file path:

Get-AdminPowerAppLicenses -OutputFilePath <PATH-TO-CSV-FILE>

Note: you will be prompted for your Microsoft 365 tenant credentials; you need to sign in as a Power Platform Administrator or Global Administrator to execute this command successfully.

After this, you can easily use the generated CSV file in Power BI Desktop for further data analysis:

[Image: license data loaded in Power BI Desktop]
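If you want a quick sanity check of the export before loading it into Power BI, Import-Csv can preview the first rows (the path below is a placeholder):

#Preview the exported license data.
Import-Csv -Path "C:\Temp\licenses.csv" | Select-Object -First 5 | Format-Table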

Happy reporting everyone!

You can read this article on my blog here.

Resources

https://docs.microsoft.com/en-us/powershell/powerapps/get-started-powerapps-admin

https://docs.microsoft.com/en-us/powershell/module/microsoft.powerapps.administration.powershell/get-adminpowerapplicenses

AzUpdate: Deploying HoloLens 2, App Gateway URL rewrite, Ubuntu 16.04 End of Support and more

This article is contributed. See the original author and article here.

The team is excited this week to share what we’ve been working on based on all your input. News covered includes Application Gateway URL Rewrite general availability, end of support for Ubuntu 16.04, using Azure Migrate with private endpoints, an overview of HoloLens 2 deployment, and a security-focused Microsoft Learn Module of the Week.

Application Gateway URL Rewrite General Availability

Azure Application Gateway can now rewrite the host name, path, and query string of the request URL. You can also rewrite the URL of all or some of the client requests based on matching one or more conditions, as required. Administrators can choose to route the request based on either the original URL or the rewritten URL. This feature enables several important scenarios, such as path-based routing for query string values and support for hosting friendly URLs.
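As a rough sketch of what this looks like with the Az.Network PowerShell cmdlets (the path, query string, and rule name below are illustrative; the linked documentation remains the authoritative reference):

#Illustrative values only; see the documentation link below for the full syntax.
$urlConfig = New-AzApplicationGatewayRewriteRuleUrlConfiguration `
                 -ModifiedPath "/new-path" -ModifiedQueryString "category=shoes"
$actionSet = New-AzApplicationGatewayRewriteRuleActionSet -UrlConfiguration $urlConfig
$rule      = New-AzApplicationGatewayRewriteRule -Name "rewrite-url" -ActionSet $actionSet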

Learn more here: Rewrite HTTP Headers and URL with Application Gateway

Upgrade your Ubuntu server to Ubuntu 18.04 LTS by 30 April 2021

Ubuntu is ending standard support for Ubuntu 16.04 LTS on 30 April 2021. Microsoft will replace the Ubuntu 16.04 LTS image with an Ubuntu 18.04 LTS image for new compute instances and clusters to ensure continued security updates and support from the Ubuntu community. If you have long-running compute instances or non-autoscaling clusters (min nodes > 0), please follow the instructions here for manual migration before 30 April 2021. After that date, support for Ubuntu 16.04 ends and no security updates will be provided, so please migrate to Ubuntu 18.04 immediately; note that Microsoft will not be responsible for any security breaches after the deprecation.

Azure Migrate with private endpoints

You can now use the Azure Migrate: Discovery and Assessment and Azure Migrate: Server Migration tools to privately and securely connect to the Azure Migrate service over an ExpressRoute private peering or a site-to-site VPN connection via Azure Private Link. The private endpoint connectivity method is recommended when there is an organizational requirement not to cross public networks to reach the Azure Migrate service and other Azure resources. You can also use the private link support with existing ExpressRoute private peering circuits for better bandwidth or latency requirements.

Learn more here: Using Azure Migrate with private endpoints

HoloLens 2 Deployment Overview

Our team recently partnered with the HoloLens team to collaborate on and curate documentation for HoloLens 2 deployment. Just like any device that requires access to an organization’s network and data, the HoloLens in most cases requires management by that organization’s IT department.

This write-up covers all the services required to get started: An Overview of How to Deploy HoloLens 2

Community Events

MS Learn Module of the Week

AZ-400: Develop a security and compliance plan

Build strategies around security and compliance that enable you to authenticate and authorize your users, handle sensitive information, and enforce proper governance.

Modules include:

  • Secure your identities by using Azure Active Directory

  • Create Azure users and groups in Azure Active Directory

  • Authenticate apps to Azure services by using service principals and managed identities for Azure resources

  • Configure and manage secrets in Azure Key Vault

  • and more

Learn more here: Develop a security and compliance plan

Let us know in the comments below if there are any news items you would like to see covered in the next show. Be sure to catch the next AzUpdate episode and join us in the live chat.