Load testing databases with Azure Load Testing

This article is contributed. See the original author and article here.

In today’s fast-paced digital world, databases are at the heart of almost every application or service. They store and manage vast amounts of data, provide real-time access to information, and power critical business operations. As the amount of data and the number of users accessing databases continue to grow, it is essential to ensure that databases can handle the expected workload and perform efficiently under heavy traffic. Whether you are launching a new application or getting ready for peak traffic, load testing helps you ensure that your database can handle the load and deliver reliable performance.


 


While most database queries typically happen through an application endpoint, there are situations where it is beneficial to test the database directly, without involving intermediaries. One such scenario is when you want to assess the performance of a specific query without exercising every query in the system, or to evaluate how a new query behaves under heavy load. Your database might also be shared by multiple applications.


 


In this blog post, we will look at load testing Azure SQL Database using Azure Load Testing. You can use a similar approach to test other databases on Azure, such as MongoDB or PostgreSQL. We will cover everything you need to know, from setting up your JMeter script to running the load test and identifying performance bottlenecks.


Setting up Azure SQL Database


The first step in load testing an Azure SQL Database is setting it up. You can use an existing Azure SQL Database instance. For this blog post, we’ll use a sample database. You can create your own sample instance using the Azure portal.


Once you have created the instance, note down the server name, database name, and login credentials. You will need these details later to connect JMeter to the database. Make sure to allow Azure services and resources to access your Azure SQL server as shown below.


 


Screenshot of Networking tab in Azure SQL Database


Setting up the JMeter script


To load test your Azure SQL database, you will need the Microsoft JDBC Driver for SQL Server. You can download the driver here. Follow the steps below, using the artifacts from the samples repository, to set up the load test.



  1. Open JMeter and create a new test plan. In the Test Plan element, browse to and add the JDBC driver jar to the classpath.
    Screenshot of JDBC driver configuration in JMeter GUI

  2. Add a JDBC Connection Configuration element to the test plan.

  3. The server name, database name, and login credentials for the Azure SQL Database instance are parameterized and can be provided using environment variables and secrets. You can store the password in Azure Key Vault and access it in your JMeter script using the GetSecret function. See the docs on using secrets in Azure Load Testing for more detail.

    Screenshot of database configuration in JMeter GUI
    Screenshot of user defined variables in JMeter GUI

  4. Add a Thread Group element to the test plan.

  5. Configure the Thread Group element to simulate the desired number of users and requests. In this script we have parameterized the concurrent threads (users) and duration as environment variables.

  6. Add a JDBC Request element to the Thread Group.

  7. Enter the SQL query that you want to execute on the Azure SQL Database instance. You can add multiple JDBC Request elements for multiple queries. A standalone sketch for validating the connection details and query outside JMeter follows these steps.

  8. If your queries require input data, you can add a CSV input file to provide data to the JMeter script.
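
Before running the script in Azure Load Testing, it can be useful to sanity-check the JDBC URL, driver class, and query with a few lines of standalone Java against the same driver jar. This is only a sketch; the server name, database name, query, and environment variable names below are placeholders rather than values from the sample script:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SqlConnectivityCheck {
    public static void main(String[] args) throws Exception {
        // Same driver class name that the JMeter JDBC Connection Configuration expects.
        Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");

        // Placeholder connection details; keep credentials out of source code.
        String url = "jdbc:sqlserver://<your-server>.database.windows.net:1433;"
                   + "databaseName=<your-database>;encrypt=true;trustServerCertificate=false;loginTimeout=30;";
        String user = System.getenv("SQL_USER");
        String password = System.getenv("SQL_PASSWORD");

        try (Connection conn = DriverManager.getConnection(url, user, password);
             Statement stmt = conn.createStatement();
             // Placeholder query; use the query you intend to load test.
             ResultSet rs = stmt.executeQuery("SELECT TOP 10 ProductID, Name FROM SalesLT.Product")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1) + " " + rs.getString(2));
            }
        }
    }
}

Note that running this from a local machine requires adding your client IP address to the SQL server’s firewall rules; the ‘Allow Azure services and resources to access this server’ setting shown earlier only covers traffic originating in Azure, such as the Azure Load Testing engines.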


Running the load test


You can now run this script on Azure Load Testing.



  1. Create an Azure Load Testing resource if you don’t already have one.

  2. Create a test by selecting Upload a JMeter script.

  3. Upload the JMeter script, the JDBC driver, and the CSV file. You need to upload the driver because it is not pre-installed on the test engines.
    Screenshot of test creation in Azure Load Testing

  4. In the Parameters tab, add the following (the script-side mapping for these values is shown after these steps).

    1. The environment variable values:

      1. threads – the number of concurrent users per engine

      2. duration – the duration of the test

      3. Database – the database URL

      4. Username – the username used to log in to the database

    2. The password as a secret. Enter the secret name and the secret identifier from the Azure Key Vault (AKV). Remember to grant the load testing resource’s managed identity ‘Get’ permission on secrets in the AKV.
      Screenshot of parameters in Azure Load Testing



  5. In the Monitoring tab, select your Azure SQL Database instance. By default, you can view the CPU percentage, failed connections, and deadlocks for your SQL database.

  6. Select Review + Create to create and run the test.
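
For context, this is roughly how the values entered in the Parameters tab are consumed inside the JMeter script. Azure Load Testing exposes environment variables to the script, which can be read with the __BeanShell (or __groovy) function, and secrets are retrieved with the __GetSecret function. The secret name sqlPassword below is an example, not necessarily the name used in the samples repository:

  Number of Threads (users):   ${__BeanShell( System.getenv("threads") )}
  Duration (seconds):          ${__BeanShell( System.getenv("duration") )}
  Database URL:                ${__BeanShell( System.getenv("Database") )}
  Username:                    ${__BeanShell( System.getenv("Username") )}
  Password:                    ${__GetSecret(sqlPassword)}

In the sample script, expressions like these are typically defined once in a User Defined Variables element (as in the earlier screenshot) and then referenced from the JDBC Connection Configuration and Thread Group.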


Monitoring 


Once the test run starts, you can monitor the client-side and server-side metrics on the dashboard in real time. In this run, the load ramped up slowly to 150 virtual users; after the maximum number of virtual users was reached, the database started returning errors of the type ‘request limit exceeded’.


Screenshot of test results in Azure Load Testing with errors


You can monitor the server-side metrics as well to understand the reason for the errors. Click Configure metrics to add more metrics for monitoring the performance of your database. As you can see, the average CPU percentage and average DTU percentage peaked after some time. Azure SQL Database recommends setting alerts that fire if the average CPU or DTU percentage goes above 80%.


Screenshot of test results in Azure Load Testing with high CPU and DTU percent


Fixing performance bottlenecks


Once you have identified performance issues, you can take steps to optimize your Azure SQL Database. Some tips to improve its performance include:



  1. Index optimization: Ensure that your database has the appropriate indexes to speed up query execution.

  2. Query optimization: Optimize your SQL queries to ensure that they are as efficient as possible.

  3. Scaling: Consider scaling up or out to increase the capacity of your Azure SQL Database.


In this case, I know that my database is not able to handle the load because of the DTU limit. I scaled up the Azure SQL Database to 200 DTUs, then re-ran the test in Azure Load Testing and monitored the metrics.


Screenshot of test results in Azure Load Testing with no errors


Now I see that there were no errors and the average CPU and DTU percentages were within acceptable limits.


Screenshot of test results in Azure Load Testing with low CPU and DTU percent


Conclusion


In conclusion, load testing is an essential aspect of database performance testing. It helps to identify performance bottlenecks, improve database performance, and ensure that your database can handle the expected workload. Remember, load testing should be an ongoing process, so make sure to integrate load tests into your CI/CD workflows to identify issues early in the development lifecycle and keep your database’s performance optimized.


If you have any feedback on Azure Load Testing, let us know through our feedback forum. Refer to the previous blogs on Azure Load Testing here. The resources used in this blog post are available in the Azure Load Testing samples repository.


 


Happy Load Testing!

Guest Access with Field Service Mobile: Introducing Tenant Switcher for Field Service Mobile

This article is contributed. See the original author and article here.

A common scenario for Field Service organizations is to augment their staff with external vendor resources. Leveraging Azure Active Directory B2B guest access, vendors can be added to the organizational directory without being created as full first-party users within the organization. This allows a clean delineation of users for managing security and data access.

Dynamics 365 has made this vendor onboarding process even easier in 2023 release wave 1 by introducing Tenant Switcher for Field Service Mobile. Tenant Switcher provides a user interface where guest users can easily switch between their home and guest tenants.

Other considerations to note: 

  • Guest Users require a Field Service license and appropriate Security role for access to Field Service Mobile. 
  • Model-driven application authentication supports work or school accounts. AAD B2B guest users configured with a personal account would not be able to authenticate and access the Field Service Mobile application directly.

Field Service (Dynamics 365) mobile app overview  – Dynamics 365 Field Service | Microsoft Learn

The post Guest Access with Field Service Mobile: Introducing Tenant Switcher for Field Service Mobile appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Error reference article now available in Microsoft Learn

This article is contributed. See the original author and article here.

The Viva Insights team recently published a new article on Microsoft Learn: Error messages in Viva Insights | Microsoft Learn


 


If you’re seeing a message related to one of these issues, check out our new document:



  • Page access

  • License assignment

  • Data upload

  • Organization insights


As always, if this article doesn’t have the answers you’re looking for, our support team is happy to help.

How Microsoft Teams helped the Breakthru app bring wellbeing to 45,000 organizations

This article is contributed. See the original author and article here.

The Breakthru app in Teams is available to more than 300 million potential monthly active users in 500,000 organizations. Finding the right audience is critical for independent software vendors (ISVs), and just three years after launching on Teams, Breakthru reaches more than 45,000 organizations worldwide, with a growing customer base.

The post How Microsoft Teams helped the Breakthru app bring wellbeing to 45,000 organizations appeared first on Microsoft 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Lesson Learned #368: Connection Retry-Logic using ODBC API code

This article is contributed. See the original author and article here.

This week I had a service request where our customer didn’t have connection retry logic implemented in their application code to handle connection failures to Azure SQL. I would like to share an example of how to implement it.


 


First, the C# code using the ODBC API:


 


 

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Runtime.InteropServices;
using System.Diagnostics;

namespace DotNetExample
{
    class ClsODBCAPI
    {
        // Import the ODBC API functions using P/Invoke
        [DllImport("odbc32.dll")]
        private static extern short SQLAllocHandle(short handleType, IntPtr inputHandle, out IntPtr outputHandle);

        [DllImport("odbc32.dll")]
        private static extern short SQLSetEnvAttr(IntPtr environmentHandle, int attribute, IntPtr valuePtr, int stringLength);

        [DllImport("odbc32.dll")]
        private static extern short SQLConnect(IntPtr connectionHandle, string serverName, short nameLength1, string userName, short nameLength2, string password, short nameLength3);

        [DllImport("odbc32.dll")]
        private static extern short SQLExecDirect(IntPtr statementHandle, string query, int textLength);

        [DllImport("odbc32.dll")]
        private static extern short SQLFetch(IntPtr statementHandle);

        [DllImport("odbc32.dll")]
        private static extern short SQLGetData(IntPtr statementHandle, short columnIndex, short targetType, IntPtr targetValue, int bufferLength, out int indicatorValue);

        [DllImport("odbc32.dll")]
        private static extern short SQLDisconnect(IntPtr connectionHandle);

        [DllImport("odbc32.dll")]
        private static extern short SQLFreeHandle(short handleType, IntPtr handle);

        [DllImport("odbc32.dll")]
        private static extern short SQLGetDiagRec(
            short handleType,
            IntPtr handle,
            short recordNumber,
            IntPtr sqlState,
            out int nativeError,
            IntPtr messageText,
            short bufferLength,
            out short textLength
        );

        public void Main()
        {
            // Initialize ODBC environment handle
            IntPtr environmentHandle = IntPtr.Zero;
            SQLAllocHandle(1, IntPtr.Zero, out environmentHandle);   // 1 = SQL_HANDLE_ENV
            SQLSetEnvAttr(environmentHandle, 200, (IntPtr)3, 0);     // 200 = SQL_ATTR_ODBC_VERSION, 3 = SQL_OV_ODBC3

            // Initialize ODBC connection and statement handles
            IntPtr connectionHandle = IntPtr.Zero;
            IntPtr statementHandle = IntPtr.Zero;
            short retcode;
            retcode = SQLAllocHandle(2, environmentHandle, out connectionHandle);   // 2 = SQL_HANDLE_DBC

            try
            {
                // Connect to the database
                retcode = RetryLogicUsingODBCAPI(connectionHandle);

                // 0 = SQL_SUCCESS, 1 = SQL_SUCCESS_WITH_INFO; anything else means the connection retries were exhausted
                if (retcode != 0 && retcode != 1)
                {
                    return;
                }

                retcode = SQLAllocHandle(3, connectionHandle, out statementHandle);   // 3 = SQL_HANDLE_STMT
                // Prepare and execute a query (-3 = SQL_NTS, a null-terminated statement string)
                SQLExecDirect(statementHandle, "SELECT top 200 TextToSearch FROM PerformanceVarcharNVarchar", -3);

                // Fetch and display the result set
                int id = 0;
                while (SQLFetch(statementHandle) == 0)
                {
                    // Retrieve data for each column
                    id = id + 1;
                    int nameLength = 200;
                    IntPtr namePtr = Marshal.AllocHGlobal(nameLength);
                    SQLGetData(statementHandle, 1, 1, namePtr, nameLength, out nameLength);
                    string name = Marshal.PtrToStringAnsi(namePtr);

                    Console.WriteLine("ID: " + id);
                    Console.WriteLine("Name: " + name);

                    Marshal.FreeHGlobal(namePtr);
                }
            }
            catch (Exception ex)
            {
                // Handle any errors that occur
                Console.WriteLine("Error: " + ex.Message);
            }
            finally
            {
                // Disconnect and free resources
                SQLDisconnect(connectionHandle);
                SQLFreeHandle(3, statementHandle);
                SQLFreeHandle(2, connectionHandle);
                SQLFreeHandle(1, environmentHandle);
            }

        }

        private short RetryLogicUsingODBCAPI(IntPtr connectionHandle)
        {
            int maxRetryAttempts = 5;
            int retryIntervalSeconds = 10;
            int retryCount = 0;
            short retcode = 0;

            TimeSpan ts;
            string elapsedTime;

            Stopwatch oConnTime = new Stopwatch();
            oConnTime.Start();   // start timing so the reported connection time is meaningful

            while (retryCount < maxRetryAttempts)
            {
                try
                {
                    retryCount++;
                    retcode = SQLConnect(connectionHandle, "DSNName", 7, "username", 8, "Password", 8);

                    if (retcode == 0 || retcode == 1)   // SQL_SUCCESS or SQL_SUCCESS_WITH_INFO
                    {
                        ts = oConnTime.Elapsed;
                        elapsedTime = String.Format("{0:00}:{1:00}:{2:00}.{3:00}", ts.Hours, ts.Minutes, ts.Seconds, ts.Milliseconds / 10);
                        Console.WriteLine("Connected to the database. Time Spent:" + elapsedTime);
                        return retcode;
                    }
                    else
                    {
                        Console.WriteLine("SQLConnect failed with retcode: " + retcode);
                        GetODBCErrorDetails(connectionHandle);
                        Console.WriteLine("Retrying connection...in...{0} ms", (1000 * retryIntervalSeconds));
                        System.Threading.Thread.Sleep(1000 * retryIntervalSeconds);
                        retryIntervalSeconds = Convert.ToInt32(retryIntervalSeconds * 1.5);
                    }

                }
                catch (Exception ex)
                {
                    Console.WriteLine("Error: " + ex.Message);
                }
            }
            return -1;
        }

        static void GetODBCErrorDetails(IntPtr handle)
        {
            const int SQL_HANDLE_ENV = 1;
            const int SQL_HANDLE_DBC = 2;

            IntPtr sqlStatePtr = Marshal.AllocHGlobal(6);
            IntPtr messageTextPtr = Marshal.AllocHGlobal(1024);
            int nativeError;
            short textLength;

            short retcode = SQLGetDiagRec(
                SQL_HANDLE_DBC,
                handle,
                1,
                sqlStatePtr,
                out nativeError,
                messageTextPtr,
                1024,
                out textLength
            );

            if (retcode == 0)
            {
                string sqlState = Marshal.PtrToStringAnsi(sqlStatePtr);
                string messageText = Marshal.PtrToStringAnsi(messageTextPtr, textLength);
                Console.WriteLine("ODBC Error Details:");
                Console.WriteLine("SQLState: " + sqlState);
                Console.WriteLine("Native Error: " + nativeError);
                Console.WriteLine("Message: " + messageText);
            }
            else
            {
                Console.WriteLine("Failed to retrieve ODBC error details.");
            }

            Marshal.FreeHGlobal(sqlStatePtr);
            Marshal.FreeHGlobal(messageTextPtr);
        }
    }

}

 


 


 


The first part of the code declares and imports the required functions from the odbc32.dll library using P/Invoke. These functions are used to interact with the ODBC API.


 


In the Main method, the ODBC environment handle is initialized using the SQLAllocHandle function. The SQLSetEnvAttr function is used to set the ODBC version attribute. Then, the ODBC connection and statement handles are allocated using SQLAllocHandle.


 


Inside the try block, the RetryLogicUsingODBCAPI method is called to establish a connection to the database. If the connection is successful (retcode is SQL_SUCCESS or SQL_SUCCESS_WITH_INFO), a query is executed using SQLExecDirect. The result set is fetched using SQLFetch, and the data is displayed.


In case of any errors, the catch block handles and displays the exception message. The finally block is used to disconnect from the database and free the allocated resources.


 


The RetryLogicUsingODBCAPI method is responsible for implementing the connection retry logic. It attempts to connect to the database using SQLConnect within a while loop. If the connection is successful (retcode is SQL_SUCCESS or SQL_SUCCESS_WITH_INFO), it returns the retcode. Otherwise, it displays the failure details, waits for the retry interval, and increases the interval for subsequent attempts (a simple backoff).


 


The GetODBCErrorDetails method retrieves ODBC error details using the SQLGetDiagRec function. It takes the connection handle as input and retrieves the SQLState, native error code, and message text associated with the error.


 


Enjoy!