Lesson Learned #452: Understanding CPU Time and Elapsed Time in SQL Query Execution

This article is contributed. See the original author and article here.

Have you ever wondered why some SQL queries take forever to execute, even when CPU usage is relatively low? In our latest support case, we encountered exactly that scenario: a client was puzzled by a persistently slow query. Initially the suspicion fell on CPU performance, but the real culprit lay elsewhere. A deep dive into the query’s behavior revealed that the delay was not due to CPU processing time but to the sheer volume of data being processed, a fact that became crystal clear when we looked at the elapsed time. The eye-opener was the output of SET STATISTICS TIME, which told a telling tale: SQL Server Execution Times: CPU time = 187 ms, elapsed time = 10768 ms. Join us as we unravel the critical distinction between CPU time and elapsed time, and how understanding it can transform your database optimization strategies.




Introduction


In the realm of database management, performance tuning is a critical aspect that can significantly impact the efficiency of operations. Two key metrics often discussed in this context are CPU time and elapsed time. This article aims to shed light on these concepts, providing practical SQL scripts to aid database administrators and developers in monitoring and optimizing query performance.




What is CPU Time?


CPU time refers to the amount of time for which a CPU is utilized to process instructions of a SQL query. In simpler terms, it’s the actual processing time spent by the CPU in executing the query. This metric is essential in understanding the computational intensity of a query.




What is Elapsed Time?


Elapsed time, on the other hand, is the total time taken to complete the execution of a query. It includes CPU time and any additional time spent waiting for resources (like IO, network latency, or lock waits). Elapsed time gives a more comprehensive overview of how long a query takes to run from start to finish.




Why Are These Metrics Important?


Understanding the distinction between CPU time and elapsed time is crucial for performance tuning. A query with high CPU time could indicate computational inefficiency, whereas a query with high elapsed time but low CPU time might be suffering from resource waits or other external delays. Optimizing queries based on these metrics can lead to more efficient use of server resources and faster query responses.
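A quick way to see the gap in practice is to compare the two counters for a request while it is running. A minimal sketch, assuming you have VIEW SERVER STATE permission, using the sys.dm_exec_requests DMV:

```sql
-- CPU time vs. elapsed time for currently executing requests (both in ms).
-- A large non_cpu_ms value means the request is waiting, not computing.
SELECT
    r.session_id,
    r.cpu_time,                                       -- ms spent on the CPU
    r.total_elapsed_time,                             -- ms since the request started
    r.total_elapsed_time - r.cpu_time AS non_cpu_ms,  -- time spent off the CPU
    r.wait_type,                                      -- what it is waiting on right now
    r.wait_time                                       -- ms in the current wait
FROM sys.dm_exec_requests AS r
WHERE r.session_id <> @@SPID;                         -- exclude this diagnostic query
```

Run this from a second session while a slow query executes; a query that is CPU-bound shows cpu_time tracking total_elapsed_time closely, while a wait-bound query shows them diverging.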




Practical SQL Scripts


Let’s delve into some practical SQL scripts to observe these metrics in action.


Script 1: Table Creation and Data Insertion

-- Create a wide table so the test query has a large volume of data to read.
CREATE TABLE EjemploCPUvsElapsed (
    ID INT IDENTITY(1,1) PRIMARY KEY,
    Nombre VARCHAR(5000),
    Valor INT,
    Fecha DATETIME
);

-- Insert 200,000 padded rows, one at a time.
DECLARE @i INT = 0;
WHILE @i < 200000
BEGIN
    INSERT INTO EjemploCPUvsElapsed (Nombre, Valor, Fecha)
    VALUES (CONCAT(REPLICATE('N', 460), @i),  -- ~460 characters of padding per row
            RAND() * (100 - 1) + 1,           -- pseudo-random value between 1 and 100
            GETDATE());
    SET @i = @i + 1;
END;

This script creates a table and populates it with sample data, setting the stage for our performance tests.
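The row-by-row loop is intentionally simple; if you want to build larger test sets quickly, a set-based insert is far faster. A sketch, assuming SQL Server 2022 or later for GENERATE_SERIES (on older versions a numbers table serves the same purpose):

```sql
-- Set-based equivalent of the WHILE loop above.
INSERT INTO EjemploCPUvsElapsed (Nombre, Valor, Fecha)
SELECT CONCAT(REPLICATE('N', 460), s.value),
       ABS(CHECKSUM(NEWID())) % 100 + 1,  -- per-row random 1-100; RAND() evaluates once per statement here
       GETDATE()
FROM GENERATE_SERIES(0, 199999) AS s;
```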


Script 2: Enabling Statistics


Before executing our queries, we enable time and IO statistics so that SQL Server reports CPU time, elapsed time, and read counts for each statement.

SET STATISTICS TIME ON;
SET STATISTICS IO ON;



Script 3: Query Execution


We execute a sample query to analyze CPU and elapsed time. The query reads every row and sorts the whole table by a random GUID, so most of the work is reading and sorting a large volume of data rather than raw computation.

SELECT *
FROM EjemploCPUvsElapsed
ORDER BY NEWID() DESC; -- sorting by a random GUID forces a full scan and an expensive sort
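With both statistics options enabled, the Messages tab reports the two metrics for the statement. In the support case described in the introduction, the output looked like this (your figures will differ):

```text
 SQL Server Execution Times:
   CPU time = 187 ms,  elapsed time = 10768 ms.
```

The roughly 10.5 seconds between the two numbers is time spent off the CPU, the signature of a query dominated by data volume and waits rather than computation.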


Script 4: Fetching Performance Metrics


Finally, we use the following script to fetch the average CPU and elapsed time for the queries cached in the plan store. Note that sys.dm_exec_query_stats reports these counters in microseconds, not milliseconds.

SELECT 
    sql_text.text,
    stats.execution_count,
    stats.total_elapsed_time / stats.execution_count AS avg_elapsed_time, -- microseconds
    stats.total_worker_time / stats.execution_count AS avg_cpu_time       -- microseconds
FROM 
    sys.dm_exec_query_stats AS stats
CROSS APPLY 
    sys.dm_exec_sql_text(stats.sql_handle) AS sql_text
ORDER BY 
    avg_elapsed_time DESC;
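When elapsed time dwarfs CPU time, the follow-up question is what the server was waiting on. A minimal sketch, assuming VIEW SERVER STATE permission, is to look at the instance-wide wait statistics (the filter here is illustrative, not an exhaustive list of benign waits):

```sql
-- Top waits accumulated since the last restart (or since the stats were cleared).
-- High PAGEIOLATCH_* waits point at disk IO; LCK_* waits point at blocking.
SELECT TOP (10)
    wait_type,
    wait_time_ms,
    signal_wait_time_ms,   -- part of the wait spent queuing for a CPU afterwards
    waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type NOT LIKE '%SLEEP%'  -- crude filter for idle waits
ORDER BY wait_time_ms DESC;
```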



Conclusion


Understanding and differentiating between CPU time and elapsed time in SQL query execution is vital for database performance optimization. By utilizing the provided scripts, database professionals can start analyzing and improving the efficiency of their queries, leading to better overall performance of the database systems.

General Availability of Database Migration Service for Oracle in Azure Data Studio



We are thrilled to announce that our Database Migration Service capability now supports the Oracle to SQL scenario, along with the general availability of the Oracle assessment and database schema conversion toolkit. In tune with the changing landscape of user needs, we’ve crafted a powerful capability that seamlessly blends efficiency, precision, and simplicity, promising to make your migration journey smoother than ever.

Why Migrate?


Shifting from Oracle to SQL opens a world of advantages, from heightened performance and reduced costs to enhanced scalability.


 


Introducing the Database Migration Service Pack for Oracle


At the core of our enhanced Database Migration Service is the seamless integration with Azure Data Studio Extensions. This dynamic fusion marries the best of Microsoft’s Azure platform with the user-friendly interface of Azure Data Studio, ensuring a migration experience that’s both intuitive and efficient.


 




 


What’s Inside the Service Pack:


 


Holistic Assessment:


 


Gain deep insights into your Oracle database with comprehensive assessment tools.


Identify potential issues, optimize performance, right-size your target, and enjoy automated translation of Oracle PL/SQL to T-SQL.


[Screenshot: Oracle to SQL assessment]


Automated Conversion of Complex Oracle Workloads:


 


Effortlessly convert Oracle schema to SQL Server format.


The conversion wizard guides you through the process, providing a detailed list of successfully converted objects and highlighting areas that may need manual intervention.


[Screenshot: Oracle to SQL code conversion]


Reusable Interface:


 


The database conversion employs SQL Project, delivering a familiar development experience.


Reuse and deploy your previous development work, minimizing the learning curve and maximizing efficiency.


Elevate Your Database Experience


Our Database Migration capability is not just a tool; it’s a solution for transitioning from Oracle to SQL seamlessly. Ready to embark on a migration journey that exceeds expectations? Keep an eye out for updates, tutorials, and success stories as we unveil this transformative capability.


 


Modernize your database, upgrade your possibilities.

* TIPS & TRICKS & NEW CAMPAIGNS * Digital Marketing Content (DMC) On Demand tool



 




 


Digital Marketing Content (DMC) OnDemand works as a personal digital marketing assistant, delivering fresh, relevant, customized content to share on social, email, website, or blog. It runs 3-to-12-week digital campaigns that include to-customer content and to-partner resources, along with an interactive dashboard that lets partners track both campaign performance and leads generated in real time, and schedule campaigns in advance.


 


TIPS AND TRICKS


DMC campaigns were created to assist you with your marketing strategies in an automated way. However, we understand that you want the focus to remain on your business as customers and prospects discover your posts. There are several ways you can customize campaigns to put the focus on your business and offerings:



  • Customize the pre-written copy | Although we provide you with copy for your social posts, emails, and blog posts, pivoting this copy to highlight your unique value can help ensure customers and prospects understand more about your business and how you can help solve their current pain points.  


  • Upload your own content throughout the campaigns | If you have access to the Partner GTM Toolbox co-branded assets, you can create your own content quickly and easily through customizable templates. Choose your colors, photography, and copy to help customers and prospects understand more about your business. Alternatively, you can learn more about how to create your own content by reading the following blog posts: one-pagers and case studies. Once complete, click on “Add new content” within any campaign under “Content to share”.


  • Engage with your audience | Are people replying to your LinkedIn, Facebook, and X (formerly Twitter) posts? Take some time to respond to build a rapport.  


  • Access customizable content | Many campaigns in PMC contain content that was designed for you to customize. Microsoft copy is included, but designated sections are left blank for your copy and placeholders are added to ensure you are following co-branding guidelines. You can find examples here.  


  • Upload your logos | Cobranded content is being added on a regular basis, so make sure you’re taking advantage of this recently added functionality to extend your reach.


 


NEW CAMPAIGNS


NOTE: To access localized versions, click the product area link, then select the language from the drop-down menu.



Unlock new value with your Dynamics 365 data in Microsoft 365 and Fabric, while safeguarding your data and assets



Navigating the complex world of business data in the age of AI presents unique challenges to enterprises. Organizations grapple with harnessing disparate data sources and with addressing security and compliance risks while maintaining cost effectiveness. These overwhelming challenges often lead to inefficiencies and missed opportunities. Dynamics 365 and Dataverse offer a unified platform for managing data effectively and securely at hyperscale, while empowering low code makers and business users of Microsoft Dynamics 365 and Power Platform.  

Customers in different industries, from finance to retail, have trusted their data and processes to Microsoft Dynamics 365 and Power Platform for years. We are excited to announce multiple features that help IT administrators navigate rising AI, data, and security challenges. With Microsoft Purview integration, data governance and compliance risks are significantly reduced. Microsoft Sentinel provides vigilant monitoring against threats, while enhanced logging to the Microsoft 365 unified audit log tackles insider threats head-on. With Dataverse Link to Fabric, administrators can enable simpler data integration for low code makers, without the need to build and govern complex data pipelines. Moreover, the introduction of Dataverse elastic tables and long-term data retention promises a substantial improvement in both hyperscale management and ROI, reinforcing a robust, secure, and cost-efficient data ecosystem.  

Protect your data and assets in the age of AI 

Growing cyber risks and increased corporate liability exposure for breaches have driven an increased focus on security in many organizations. To address this, Dataverse provides a comprehensive platform to secure your data and assets. At Ignite 2023 we are announcing several new security capabilities:  

Govern Dynamics 365 and Power Platform data in Dataverse through Microsoft Purview integration 

Dataverse integration with Microsoft Purview’s Data Map, available shortly in public preview, enables automated data discovery and sensitive data classification. The integration will help your organization to understand and govern their business applications data estate, safeguard that data, and improve their risk and compliance posture. Learn more here: http://aka.ms/DataversePurviewIntegration 


Monitor and react to threats with Sentinel  

Microsoft Sentinel solution for Microsoft Power Platform will be in public preview across regions over the next few weeks. Microsoft Sentinel is also integrated with Dynamics 365, with recently added OOB analytics rules. With Sentinel integration, customers can detect various suspicious activities such as Microsoft Power Apps execution from unauthorized geographies, suspicious data destruction by Power Apps, mass deletion of Power Apps, phishing attacks (via Power Apps), Power Automate flows activity by departing employees, Microsoft Power Platform connectors added to an environment, and the update or removal of Microsoft Power Platform data loss prevention policies. Learn more here: http://aka.ms/DataverseSentinelIntegration  

Manage the risk of insider threats via enhanced logging to the Microsoft 365 Unified Audit Log 

To manage the risk of insider threats, all administrator actions in Power Platform are logged to the Microsoft 365 Unified Audit Log, giving the security teams that manage compliance and the insider risk management teams that act on events the ability to mitigate risks in the organization. Learn more here: http://aka.ms/PowerPlatformAdminAuditLogging 

Seamlessly integrate your Dynamics 365 data with Fabric and Microsoft 365  

Empower your users to do more with Link to Fabric  

We are excited to announce general availability of Link to Fabric, a no-copy, no-ETL direct link between Dataverse and Microsoft Fabric. If your organization uses Dynamics 365 or Power Platform with Power BI, Link to Fabric enables simple data integration, and allows low code makers to securely derive AI driven insights across business apps and data sources and drive action.  

Low code makers can link to Microsoft Fabric from the Analyze menu in the Power Apps maker portal command bar. The system validates the configuration and lets you choose a Fabric workspace without leaving the maker portal. When you confirm, the system securely links all your tables into a system-generated Synapse Lakehouse in the Fabric workspace you selected. Your data stays in Dataverse while you work with all Fabric workloads, like SQL, Python, and Power BI, without making copies or building pipelines. As data gets updated in Dataverse, changes are reflected in Fabric in near real time. 


We are also excited to announce joint partner and ISV solutions with Dynamics 365, Power Platform, and Microsoft Fabric. Partners and system integrators are leveraging the Dataverse Link to Fabric to provide value-added solutions that combine business functionality with insights and built-in actions.  

MECOMS, a top recommended solution for energy and utility companies across the globe, enables next-generation cities with smart energy and utility solutions that manage consumption from “meter to cash”. MECOMS 365, built on Dynamics 365 and Microsoft Fabric, gathers smart meter data from homes and businesses, processes billing and reconciliation in Dynamics 365, and integrates with Dynamics 365 for customer engagement. Smart cities can not only provide billing and excellent service, but also give customers insights into how they can lower consumption and save money.   

Ian Bruyninckx, Lead Product Architect, MECOMS, A Ferranti Company 

ERP customers can extend their insights and reduce TCO by upgrading to Synapse Link  

If you are a Dynamics 365 for Finance and Operations (F&O) customer, we have exciting news to share. The Synapse Link for Dataverse service built into Power Apps, the successor to the Export to Data Lake feature in finance and operations apps, is now generally available. By upgrading from the Export to Data Lake feature in F&O to Synapse Link, you can benefit from improved configuration and enhanced performance, which translates to a reduction in the total cost of ownership (TCO).

To learn more about upgrading to Synapse Link, refer to https://aka.ms/TransitiontoSynapseLinkVideos.

Reference Dataverse records with Microsoft 365 Context IQ to efficiently access enterprise data 

One of the most time-consuming tasks for anyone who uses email is sharing information from your line-of-business applications with colleagues. You must jump out of your Outlook web experience, open your line-of-business app, navigate to a record, and then copy and paste the link into your email, an incredibly time-consuming set of steps and actions. 

We are excited to announce general availability of Dataverse integration with Microsoft 365 Context IQ, a new feature that makes it possible for users to access their most recently used data directly from the Outlook web client using a simple gesture.   


For setup details, please refer to Use /mention to link to records in Outlook web client – Power Platform | Microsoft Learn. 

Efficiently hyper(scale) your business applications  

In the age of AI, customers are challenged with managing data at hyperscale from a plethora of sources. As a polyglot hyperscale data platform for structured relational and unstructured non-relational data, Dataverse can support all your business scenarios with security, governance, and life cycle management, with no storage limitations.  

For very high scale data scenarios, such as utilizing third-party data for time-sensitive marketing campaigns or driving efficiency with predictive maintenance for your IoT business, Dataverse elastic tables are a powerful addition to standard tables, supporting both relational and non-relational data. You can even optimize large-volume data storage with a time-defined auto-delete capability. 

Additionally, while Dataverse can support your business growth with no limit on active data, to meet your company’s compliance, regulatory, or other organizational policies you can retain inactive data with Dataverse long-term data retention and save 50% or more on storage capacity.  

Click here to learn more about Microsoft Dataverse. 

The post Unlock new value with your Dynamics 365 data in Microsoft 365 and Fabric, while safeguarding your data and assets appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Enhance agent utilization and engagement with least active routing 



Organizations want to get the best out of their agents by maximizing their utilization, distributing work evenly, and providing enough breaks between calls. Least active routing, formerly known as most-idle routing, is an assignment strategy that can help achieve this. It assigns work to agents based on when they ended their last conversation, giving agents who are working on longer or more complex conversations a chance to take a break while new conversations are distributed to other agents. Doing so helps improve workforce utilization and engagement. 

How does least active routing help? 

Contoso Health is a multinational healthcare and insurance provider. It has a large customer support organization covering more than 20 product lines, handled by more than 5,000 agents worldwide. Customers call the contact center to get their queries resolved.

In the contact center, an agent can talk to only one customer at a time. Eugenia, the director of customer support at Contoso, observes that some of her agents are utilized for up to 95% of their schedule, while others are occupied for only 70-75%, and she wants to solve this imbalance. While doing that, she also wants to make sure not to impact key metrics like customer satisfaction and SLAs. She comes across the least active routing assignment method and tries it for a queue.  

Kayla and Finn are two agents working in a voice queue. Kayla has a call that comes in at 1:00 PM. Finn takes a call at 1:05 PM. Kayla’s issue is complex and takes her 15 minutes to close. Finn solves his customer’s problem in five minutes. The next call comes in at 1:20 PM. The round robin method would assign the new call to Kayla since it is her turn, and she is available. But with the least active routing, the system considers the idle time of agents and assigns the call to Finn, as his last call ended earlier than Kayla’s. This kind of assignment, considering the idle time, improves the agent utilization.  
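The tie-break above can be sketched in T-SQL. This is a hypothetical illustration, not how Dynamics 365 implements it; the table variable, names, and timestamps are invented for the example:

```sql
-- Two agents and the time each one ended their last call (timings from the
-- example above; the date itself is arbitrary).
DECLARE @Agents TABLE (AgentName VARCHAR(20), LastCallEnd DATETIME);
INSERT INTO @Agents VALUES
    ('Kayla', '2023-11-20T13:15:00'),  -- her 1:00 PM call took 15 minutes
    ('Finn',  '2023-11-20T13:10:00');  -- his 1:05 PM call took 5 minutes

-- Least active routing: the 1:20 PM call goes to whoever has been idle
-- longest, i.e. the earliest LastCallEnd (Finn in this example).
SELECT TOP (1) AgentName
FROM @Agents
ORDER BY LastCallEnd ASC;
```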

[Diagram: least active routing assignment]

The Least active option is available in the Assignment method section of the queue.

[Screenshot: configuration screen for least active routing]

The least active routing assignment method is currently available only for voice channel queues and is the default selection for new voice queues. Least active routing can also be used as an Order by condition in the custom assignment methods. 

Build custom reports to monitor an agent’s last capacity release time 

The least active assignment method works based on when agents ended their last call. This data about the agent’s last call end time, or last capacity release time, is available in the Dataverse entity ‘msdyn_agentchannelstateentity’. Organizations can use the Model customization feature in Dynamics 365 Customer Service to build a custom report that provides a view of this data.

Learn more

To learn more about least active routing, read the documentation: Assignment methods for queues | Microsoft Learn

The post Enhance agent utilization and engagement with least active routing  appeared first on Microsoft Dynamics 365 Blog.
