Introducing Microsoft Dynamics 365 Copilot, bringing next-generation AI to every line of business



Introducing the world’s first AI copilot in both CRM and ERP

Today we announced Microsoft Dynamics 365 Copilot, providing interactive, AI-powered assistance across business functions, from sales, service, and marketing to supply chain. With Dynamics 365 Copilot, we're introducing the world's first AI copilot natively built into both CRM and ERP applications.

Copilot brings the power of next-generation AI capabilities and natural language processing to Dynamics 365, working alongside business professionals to help them create ideas and content faster, complete time-consuming tasks, and get insights and next best actions, just by describing what's needed.

Copilot features are in preview across Dynamics 365 applications and Microsoft Viva Sales. View demos of the capabilities and learn how to access the previews below.

Copilot in Microsoft Dynamics 365 Sales and Viva Sales

Many sales teams struggle to scale seller experiences. AI can help any sales organization be more productive and effective, enabling sellers to automate the sales process and know their customers more deeply.

Microsoft Viva Sales, which is included in Dynamics 365 Sales and available for purchase separately for other CRM systems, including Salesforce, revolutionizes the way sellers work by integrating with Microsoft Outlook and Teams to augment a seller’s actions and decisions with AI-powered insights and actions. New Copilot features help sellers save time, boost productivity, lighten workloads, and stay focused on what matters most: connecting with customers and closing deals.

Communicate with customers more effectively with AI-generated emails

Last month, we announced that Viva Sales now generates content suggestions based on customer emails, such as a reply to an inquiry or a request for a proposal, complete with data specifically relevant to the recipient, such as pricing, promotions, and deadlines. The seller simply selects the option to suit their needs, and a reply is generated for the seller to review, edit to their liking, and send. The reply is enriched with the combined data from Microsoft Graph, which provides access to people-centric data and insights in the Microsoft Cloud, and the CRM system (Microsoft Dynamics 365 or Salesforce).

Today we are announcing an update: email replies will now move into general availability, and on March 15 we will add enhancements to create customizable emails. For example, a seller can generate an email that proposes a meeting time with a customer, complete with a proposed meeting date and time based on availability on the seller's Outlook calendar.

In addition, a new feedback mechanism allows sellers to rate the AI-generated content with a thumbs up or thumbs down, which helps refine and improve future replies. Sellers can also refine generated results by providing a new prompt that creates an updated response that builds on the previously suggested draft and new context.

With auto-suggested, customizable content, sellers can spend less time composing emails and searching for sales data from colleagues and databases.

Follow up with customers promptly with AI-generated meeting summaries

Sellers often spend hours each day on calls with prospects and customers, and nearly as much time recapping action items for follow-through. To help with that, conversation intelligence provides automated summaries of key topics, issues, and concerns discussed during the meeting.

Now available for public preview in Viva Sales, the new Copilot feature uses natural language capabilities powered by Azure OpenAI Service to intelligently draft a recap of a call with action items and follow-up dates, based on CRM and meeting data. Summaries can be generated for a range of meeting types in Teams, including multi-participant and internal calls, helping sellers stay organized and on top of critical, content-heavy sales meetings.


If you’re not a current customer, start a free 1-month trial of Viva Sales to try Copilot features.

Copilot in Microsoft Dynamics 365 Customer Service

Customer service agents are critical for maintaining customer loyalty, but they often face pressure to resolve multiple customer cases quickly, leading to burnout and decreased customer satisfaction. To address these challenges, agents need tools that can help them streamline tasks across both simple and complex cases, while still providing personalized service that demonstrates their commitment to resolving each issue thoroughly and efficiently.

Now available for limited preview, we're announcing Copilot in Microsoft Dynamics 365 Customer Service: a range of next-generation AI capabilities that can expedite resolving customer issues and increase satisfaction scores. Copilot provides agents with 24/7 AI-powered assistance that helps them find resources to resolve issues faster, handle cases more efficiently, and automate time-consuming tasks so they can focus on delivering high-quality service to their customers.

Turn service agents into superagents

With Copilot, agents can quickly craft a draft email or chat response to customers with a single click. Copilot understands the context based on the current live conversation; identifies relevant information from trusted websites and internal documents, including knowledgebase articles and previously resolved cases; and crafts a response that the agent can review and send to the customer.

For agents working on emails, Copilot can help create relevant and personalized email responses to customer queries in seconds. Once Copilot synthesizes information and suggests an email draft, agents can review and modify the content before sending it to the customer. Using a conversational chat experience, agents can ask Copilot to help diagnose more complex customer issues, discover resolutions, and summarize draft responses with the right tone for the customer, across all communication channels.

With comprehensive and efficient assistance from Copilot, agents can significantly reduce the amount of time spent searching for content and drafting their response, resulting in improved agent productivity and customer experience.


Copilot features in Dynamics 365 Customer Service are now in limited preview.

Enrich self-service with AI-powered conversational assistance

Organizations can provide an even more powerful conversational experience by leveraging Power Virtual Agents, now enhanced with generative AI capabilities and available today in preview. Customers can self-serve and get their needs resolved more easily with highly intelligent conversational bots that use trusted websites and the company's internal data to resolve customer issues. Additionally, with the continued investment in the open, flexible, and composable Microsoft Digital Contact Center Platform, Nuance is announcing new AI capabilities in Nuance Mix. Dynamics 365 Customer Service, together with Teams, Microsoft Power Platform, Nuance, and Azure, delivers truly transformative experiences for both agents and customers through the contact center.

Copilot in Microsoft Dynamics 365 Customer Insights and Marketing

To meet customer expectations, marketers need to deliver personalized marketing experiences across physical and digital channels. Next-generation AI empowers marketers to proactively target any audience segment in lockstep with market trends and customer demand.

Get to data insights faster and easier with natural language

Marketers have traditionally relied on data analysts, leveraging their skillset to write queries in SQL, to uncover insights from their customer data, a process that can take weeks to yield results. With Copilot in Dynamics 365 Customer Insights, marketing teams can engage directly with customer data and discover new information that they may not have been aware of, democratizing access to insights.

With simple prompts, marketers can ask questions using natural language to explore, analyze, and understand customer segment sizes and preferences, with no need to be a SQL expert or wait for other teams to process their requests. For example, a marketer might want to identify customers that reside in San Francisco, California, with a high customer lifetime value, who have also made a purchase in the last 90 days. With a few clicks, Copilot produces the results, along with information such as the customers' average age, product preferences, or average purchase price. These insights can then be configured into a segment to support a campaign. With Copilot, marketers can now get a deeper understanding of their customers in near real time.


Copilot features in Dynamics 365 Customer Insights are now in limited preview.

Use natural language to create audience segments

Market segmentation is an important step to create personalized campaigns based on interests and needs. It also can be a time-consuming process, often requiring knowledge of the complex data structures defined by the marketing database. Query assist, a Copilot capability in Dynamics 365 Marketing, uses Azure OpenAI Service to save time creating or enhancing segments.  

Marketers can describe the target audience characteristics by typing a segment description, such as "all contacts under the age of 30 living in New York City." Rather than manually selecting data tables, a marketer can simply type a description of the contacts they wish to engage with, and then add the results to the segment builder. By using this simple query approach, marketers can create real-time segments without needing knowledge of their back-end data, saving hours of time.

Create relevant, compelling, and personalized marketing content faster

Email marketing is a powerful way to engage audiences if the email content is compelling and relevant. Too often, content can begin to feel stale or repetitive over time. Content ideas, a Copilot feature in Dynamics 365 Marketing, inspires marketers by turning topics into suggested copy, helping them move from concept to completion faster.

When editing an email, a marketer can prompt Copilot with up to five key points to get across in the email. The content ideas capability uses Azure OpenAI Service to generate a set of text suggestions: unique content that can be used as a starting point when composing marketing emails. It can analyze the organization's existing emails, in addition to a range of internet sources, to increase the relevance of generated ideas. With Copilot, marketers can save hours of time brainstorming and editing, while keeping content fresh and engaging.


Content ideas and query assist are both available in public preview today. If you are not a current customer, start a free trial for Microsoft Dynamics 365 Marketing to experience the new Copilot features.

Copilot in Microsoft Dynamics 365 Business Central

Compelling product descriptions in online stores boost sales by making products stand out from the crowd. However, producing consistently engaging descriptions on-demand can be challenging, especially when writing product descriptions for dozens or even hundreds of similar products.

Copilot in Microsoft Dynamics 365 Business Central helps small and medium-sized businesses bring new products to market faster by producing AI-generated product descriptions. This Copilot feature suggests copy for engaging product descriptions, tailored to your brand using a product's title and key attributes, such as color, material, and size. Easily customize the text to your preferred writing style by choosing the tone and length, and make any needed edits before saving. Business Central customers can seamlessly publish the new product descriptions to their Shopify store with just a few clicks.


Copilot in Dynamics 365 Business Central is now in limited preview.

Copilot in Microsoft Supply Chain Center and Dynamics 365 Supply Chain Management

In recent years, many businesses discovered that their current supply chain technologies are ill-equipped for an environment characterized by ongoing disruptions, constraints, and shortages. AI-enabled supply chain management can provide unprecedented visibility and insights, helping to solve disruptions before they happen.

Today we are announcing new Copilot capabilities for Microsoft Supply Chain Center, available to Dynamics 365 Supply Chain Management customers, to better predict and act on disruptions that occur across suppliers, weather, and geographies. Use intelligence from the news module to proactively flag external issues such as weather, financial, and geo-political news that may impact key supply chain processes. Predictive insights surface impacted orders across materials, inventory, carrier, distribution network, and more.

Copilot turns these insights into action with contextual email outreach generated by Azure OpenAI Service to help solve problems in real-time and with ease. With a custom and contextual reply, supply chain managers can save time and collaborate with impacted suppliers to quickly identify new ETAs and re-route purchase orders based on an identified disruption, like a weather event. They can also fulfill high-priority customer orders via an alternate distribution center to ensure they can meet customer demand in full and on time.


Copilot features for Microsoft Supply Chain Center are now in limited preview.

Supercharge the entire customer journey with AI, starting today

With every biannual release wave, Microsoft is enhancing Dynamics 365 applications with new, more powerful AI capabilities to help organizations drive business outcomes, improve operational efficiency, and create exceptional customer experiences. 

The capabilities spotlighted today are just the start of AI-powered features to follow across business functions. Stay tuned to this blog, as well as the Dynamics 365 and Power Platform release planner, for further details.


Microsoft is committed to ensuring that AI systems are developed responsibly and in ways that warrant people’s trust. Learn about our AI principles that empower impactful responsible AI practices at Microsoft, as well as provide a framework for implementing responsible AI practices at our customers’ organizations.  


Visualizing Top GitHub Programming Languages in Excel with Microsoft Graph .NET SDK



Hi, I am Samson Amaugo, a Microsoft MVP and a Microsoft Learn Student Ambassador. I love writing and talking about all things DotNet. I am currently a student at the Federal University of Technology, Owerri. We can connect on LinkedIn via my LinkedIn profile.

Have you ever thought about going through all your GitHub repositories, taking note of the languages used, aggregating them, and visualizing it in Excel?


 


Well, that is what this post is all about, except you don't have to do it manually.


With the aid of the Octokit GraphQL library and the Microsoft Graph .NET SDK, you can code up a cool solution that automates this process.


 


To build out this project in a sandboxed environment with the appropriate permissions, I signed up on the Microsoft 365 Dev Center to create an account that I could use to interact with Microsoft 365 products.



The outcome of the project can be seen below:


[Screenshot: the resulting Excel chart of top GitHub programming languages]


 


Steps to Build


Generating an Access Token From GitHub



  • I went to https://github.com/settings/apps and selected the Tokens (classic) option. This is because that is what the Octokit GraphQL library currently supports.

  • I clicked on the Generate new token dropdown.

  • I clicked on the Generate new token (classic) option.




 



  • I filled in the required Note field and selected the repo scope which would enable the token to have access to both my private and public repositories. After that, I scrolled down and selected the Generate token button.




 



  • Next, I copied the generated token to use during my code development.




 


Creating a Microsoft 365 Developer Account



[Screenshot: creating a Microsoft 365 developer account]


 


Registering an Application on Azure



  • To interact with Microsoft 365 applications via the graph API, I had to register an app on Azure Active Directory.

  • I signed in to https://aad.portal.azure.com/ (Azure Active Directory Admin Center) using the developer email obtained from the Microsoft 365 developer subscription (i.e. samson2ky@q8w6.onmicrosoft.com).

  • I clicked on the Azure Active Directory menu on the left menu pane.

  • I clicked on the App Registration option.

  • I clicked on the New registration option.




 



  • I filled in the application name and clicked on the Register button.

     


     



  • I copied the clientId and the tenantId.

  • I clicked on the API permissions menu on the left pane.

     




 


 



  • To allow the registered application to manipulate files, I had to grant it read and write access to my files.

  • I clicked on the Add a permission option.

     



  • I clicked on the Microsoft Graph API.




 



  • Since I wanted the application to run in the background without signing in as a user, I selected the Application permissions option, typed files in the available field for easier selection, checked the Files.ReadWrite.All permission and clicked on the Add permissions button.




 



  • At this point, I had to grant the application ADMIN consent before it would be permitted to read and write to my files.


 




 



  • Next, I had to generate the client’s secret by clicking on the Certificates & secrets menu on the left panel and clicking on the New client secret button.


     



  • I filled in the client secret description and clicked on the Add button.

     



  • Finally, I copied the Value of the generated secret key.

     




Writing the Code



  • I created a folder, opened a terminal in the directory and executed the command below to bootstrap my console application.
    dotnet new console

  • I installed some packages, which you can easily install by adding an item group like the one below in your .csproj file within the Project tag (the exact versions are only examples; use the latest versions that suit your project).


    <!-- Package references required by the code in this post; versions are examples -->
    <ItemGroup>
        <PackageReference Include="Azure.Identity" Version="1.8.0" />
        <PackageReference Include="Microsoft.Graph" Version="4.52.0" />
        <PackageReference Include="Microsoft.Extensions.Configuration.Binder" Version="7.0.0" />
        <PackageReference Include="Microsoft.Extensions.Configuration.Json" Version="7.0.0" />
        <PackageReference Include="Octokit.GraphQL" Version="0.2.0-beta" />
    </ItemGroup>
 

 



  • Next, execute the command below to install them
    dotnet restore

  • To have my necessary configuration in a single place, I created an appsettings.json file at the root of my project and added the JSON data below:


 

{
    "AzureClientID": "eff50f7f-6900-49fb-a245-168fa53d2730",
    "AzureClientSecret": "vUx8Q~plb15_Q~2ZscyfxKnR6VrWm634lIYVRb.V",
    "AzureTenantID": "33f6d3c4-7d26-473b-a7f0-13b53b72b52b",
    "GitHubClientSecret": "ghp_rtPprvqRPlykkYofA4V36EQPNV4SK210LNt7",
    "NameOfNewFile": "chartFile.xlsx"
}

 



You will need to replace the credentials above with your own.



  • In the above JSON, you can see that I populated the values with the saved secrets and IDs obtained during the GitHub token registration and from the Azure Active Directory application I created.

  • To be able to bind the values in the JSON above to a C# class, I created a Config.cs file in the root of my project and added the code below:


 

using Microsoft.Extensions.Configuration;
namespace MicrosoftGraphDotNet
{
    internal class Config
    {
        // Define properties to hold configuration values
        public string? AzureClientId { get; set; }
        public string? AzureClientSecret { get; set; }
        public string? AzureTenantId { get; set; }
        public string? GitHubClientSecret { get; set; }
        public string? NameOfNewFile { get; set; }
        // Constructor to read configuration values from appsettings.json file
        public Config()
        {
            // Create a new configuration builder and add appsettings.json as a configuration source
            IConfiguration config = new ConfigurationBuilder()
                .SetBasePath(Directory.GetCurrentDirectory())
                .AddJsonFile("appsettings.json")
                .Build();

            // Bind configuration values to the properties of this class
            config.Bind(this);
        }
    }
}

 


 


In the Program.cs file I imported the necessary namespaces that I would be needing by adding the code below:


 


 

// Import necessary packages
using System.Text.Json;
using Octokit.GraphQL;
using Octokit.GraphQL.Core;
using Octokit.GraphQL.Model;
using Azure.Identity;
using Microsoft.Graph;
using MicrosoftGraphDotNet;

 



  • Next, I instantiated my Config class:


 

// retrieve the config
var config = new Config();

 


 



  • I made use of the Octokit GraphQL API to query all my GitHub repositories for repository names and the languages used in individual repositories. Then I created a variable to hold the list of distinct languages available in all my repositories. After that, I deserialized the response into an array of a custom class I created (Repository class).



 

// Define user agent and connection string for GitHub GraphQL API
var userAgent = new ProductHeaderValue("YOUR_PRODUCT_NAME", "1.0.0");
var connection = new Connection(userAgent, config.GitHubClientSecret!);

// Define GraphQL query to fetch repository names and their associated programming languages
var query = new Query()
    .Viewer.Repositories(
        isFork: false,
        affiliations: new Arg<IEnumerable<RepositoryAffiliation?>>(
            new RepositoryAffiliation?[] { RepositoryAffiliation.Owner })
    ).AllPages().Select(repo => new
    {
        repo.Name,
        Languages = repo.Languages(null, null, null, null, null).AllPages().Select(language => language.Name).ToList()
    }).Compile();

// Execute the GraphQL query and deserialize the result into a list of repositories
var result = await connection.Run(query);
var languages = result.SelectMany(repo => repo.Languages).Distinct().ToList();
var repoNameAndLanguages = JsonSerializer.Deserialize<Repository[]>(JsonSerializer.Serialize(result));

 


 



  • Since I am using top-level statements in my code, I made sure the custom class I created is the last thing in the Program.cs file.


 

// Define a class to hold repository data
class Repository
{
    public string? Name { get; set; }
    public List<string>? Languages { get; set; }
}

 


 



  • Now that I’ve written the code to retrieve my repository data, the next step is to write the code to create an Excel file, create a table, create and populate rows and columns with data, and use that data to plot a chart that visualizes the top GitHub programming languages used throughout my repositories.

  • I initialized the Microsoft Graph .NET SDK client using a client secret credential:


 

// Define credentials and access scopes for Microsoft Graph API
var tokenCred = new ClientSecretCredential(
config.AzureTenantId!,
config.AzureClientId!,
config.AzureClientSecret!);
var graphClient = new GraphServiceClient(tokenCred);

 


 



  • Next, I created an Excel file in OneDrive:


 

// Define the file name and create a new Excel file in OneDrive
var driveItem = new DriveItem
{
    Name = config.NameOfNewFile!,
    File = new Microsoft.Graph.File
    {
    }
};
var newFile = await graphClient.Drive.Root.Children
.Request()
.AddAsync(driveItem);

 


 



  • I created a table that spans the length of the data I have horizontally and vertically:


 

// Define the address of the Excel table and create a new table in the file
var address = "Sheet1!A1:" + (char)('A' + languages.Count) + repoNameAndLanguages?.Count();
var hasHeaders = true;
var table = await graphClient.Drive.Items[newFile.Id].Workbook.Tables
.Add(hasHeaders, address)
.Request()
.PostAsync();

 



  • I created a 2-Dimensional List that would represent my data in the format below


[Table layout: a header row of "Repository Name" followed by each language name; one row per repository with 1/0 flags per language; a final "Total" row with per-language counts]


 


The code that represents the data above can be seen below:


 

// Define the first row of the Excel table with the column headers
var firstRow = new List<string> { "Repository Name" }.Concat(languages).ToList();

// Convert the repository data into a two-dimensional list
List<List<string>> totalRows = new List<List<string>> { firstRow };
foreach (var value in repoNameAndLanguages!)
{
    var row = new List<string> { value.Name! };
    foreach (var language in languages)
    {
        row.Add(value.Languages!.Contains(language) ? "1" : "0");
    }
    totalRows.Add(row);
}

// Add a new row to the table with the total number of repositories for each language
var languageTotalRow = new List<string>();
// Add "Total" as the first item in the list
languageTotalRow.Add("Total");
// Loop through each programming language in the header row
for (var languageIndex = 1; languageIndex < totalRows[0].Count; languageIndex++)
{
    // Set the total count for this language to 0
    var languageTotal = 0;
    // Loop through each repository in the table
    for (var repoIndex = 1; repoIndex < totalRows.Count; repoIndex++)
    {
        // If the repository uses this language, increment the count
        if (totalRows[repoIndex][languageIndex] == "1")
        {
            languageTotal++;
        }
    }

    // Add the total count for this language to the languageTotalRow list
    languageTotalRow.Add(languageTotal.ToString());
}

// Add the languageTotalRow list to the bottom of the table
totalRows.Add(languageTotalRow);

 


 



  • I added the rows of data to the table using the code below:


 

// Create a new WorkbookTableRow object with the totalRows list serialized as a JSON document
var workbookTableRow = new WorkbookTableRow
{
    Values = JsonSerializer.SerializeToDocument(totalRows),
    Index = 0,
};

// Add the new row to the workbook table
await graphClient.Drive.Items[newFile.Id].Workbook.
Tables[table.Id].Rows
.Request()
.AddAsync(workbookTableRow);

 


 



  • Finally, I created a ColumnClustered chart using my data and logged the URL of the spreadsheet.


 

// Add a new chart to the worksheet with the language totals as data
await graphClient.Drive.Items[newFile.Id].Workbook.Worksheets["Sheet1"].Charts
.Add("ColumnClustered", "Auto", JsonSerializer.SerializeToDocument($"Sheet1!B2:{(char)('A' + languages.Count)}2, Sheet1!B{repoNameAndLanguages.Count() + 3}:{(char)('A' + languages.Count)}{repoNameAndLanguages.Count() + 3}"))
.Request()
.PostAsync();

// Print the URL of the new file to the console
Console.WriteLine(newFile.WebUrl);

 


 



  • After executing the command dotnet run, I got the URL of the Excel file as output.




 



  • On clicking the link I was able to view the awesome visualization of the languages used across my GitHub repositories.




 


And that’s the end of this article. I hope you enjoyed it and got to see how I used Microsoft Graph .NET SDK to automate this process.


 


To learn more about Microsoft Graph API and SDKs:
  • Microsoft Graph: https://developer.microsoft.com/graph
  • Develop apps with the Microsoft Graph Toolkit – Training


Hack Together: Microsoft Graph and .NET 

Hack Together is a hackathon for beginners to get started building scenario-based apps using .NET and Microsoft Graph. In this hackathon, you will kick-start learning how to build apps with Microsoft Graph and develop apps based on the given Top Microsoft Graph Scenarios, for a chance to win exciting prizes while meeting Microsoft Graph Product Group Leaders, Cloud Advocates, MVPs, and Student Ambassadors. The hackathon starts on March 1st and ends on March 15th. Participants are encouraged to follow the Hack Together Roadmap for a successful hackathon.



Demo/Sample Code
You can access the code for this project at https://github.com/sammychinedu2ky/MicrosoftGraphDotNet


 

MTC Weekly Roundup – March 3



Happy Friday, MTC’ers! I hope you’ve had a good week and that March is treating you well so far. Let’s dive in and see what’s going on in the community this week!


 


MTC Moments of the Week


 


First up, we are shining our MTC Member of the Week spotlight on a community member from Down Under, @Doug_Robbins_Word_MVP! True to his username, Doug has been a valued MVP for over 20 years and an active contributor to the MTC across several forums, including M365, Word, Outlook, and Excel. Thanks for being awesome, Doug!


 


Jumping to events, on Tuesday, we closed out February with a brand new episode of Unpacking Endpoint Management all about Group Policy migration and transformation. Our hosts @Danny Guillory and @Steve Thomas (GLADIATOR) sat down for a fun policy pizza party with guests @mikedano, @Aasawari Navathe, @LauraArrizza, and @Joe Lurie from the Intune engineering team. We will be back next month to have a conversation about Endpoint Data Loss Prevention and Zero Trust – click here to RSVP and add it to your calendar!


 


Then on Wednesday, we had our first AMA of the new month with the Microsoft Syntex team. This event focused on answering the community’s questions and listening to user feedback about Syntex’s new document processing pay-as-you-go metered services with a stacked panel of experts including @Ian Story, @James Eccles, @Ankit_Rastogi, @jolenetam, @Kristen_Kamath, @Shreya_Ganguly, and @Bill Baer. Thank you to everyone who contributed to the discussion, and stay tuned for more Syntex news!


 


 And last but not least, over on the Blogs, @Hung_Dang has shared a skilling snack for you to devour during your next break, where you can learn how to make the most of your time with the help of Windows Autopilot!


 


Unanswered Questions – Can you help them out?


 


Every week, users come to the MTC seeking guidance or technical support for their Microsoft solutions, and we want to help highlight a few of these each week in the hopes of getting these questions answered by our amazing community!


 


In the Exchange forum, @Ian Gibason recently set up a Workspace mailbox that is not populating the Location field when using Room Finder, leading to user requests being denied. Have you seen this before?


 


Perhaps you can help @infoatlantic over in the Intune forum, who’s looking for a way to sync a list of important support contacts to company owned devices.


 


Upcoming Events – Mark Your Calendars!


 



 



 


——————————–


 


For this week’s fun fact…


 




 


Did you know that the little wave-shaped blob of toothpaste you squeeze onto your toothbrush has a name? It’s called a “nurdle” (the exact origin of this term isn’t clear), but there was even a lawsuit in 2010 that Colgate filed against Glaxo (the maker of Aquafresh) to prohibit them from depicting the nurdle in their toothpaste packaging. Glaxo countersued, and the case was settled after a few months. The more you know!


 


Have a great weekend, everyone!

How to generate pgBadger report from Azure Database for PostgreSQL Flexible Server



PgBadger is one of the most comprehensive Postgres troubleshooting tools available. It allows users to gain insight into a wide variety of events happening in the database, including:


 



  • (Vacuums tab) Autovacuum actions: how many ANALYZE and VACUUM actions were triggered by the autovacuum daemon, including the number of tuples and pages removed per table.

  • (Temp Files tab) Distribution of temporary files and their sizes, and the queries that generated temporary files.

  • (Locks tab) Types of locks in the system, the most frequent waiting queries, and the queries that waited the longest; unfortunately, there is no information about which query holds the lock, only the blocked queries are shown.

  • (Top tab) Slow query log.

  • (Events tab) All the errors, fatals, warnings, etc., aggregated.

  • And many more


You can see a sample pgBadger report here.


 


You can generate a pgBadger report from Azure Database for PostgreSQL Flexible Server in multiple ways:



  1. Using Diagnostic Settings and redirecting logs to a storage account; mount storage account onto VM with BlobFuse.

  2. Using Diagnostic Settings and redirecting logs to a storage account; download the logs from storage account to the VM.

  3. Using Diagnostic Settings and redirecting logs to Log Analytics workspace.

  4. Using plain Server Logs*


*Coming soon!


 


In this article we will describe the first solution: using Diagnostic Settings and redirecting logs to a storage account. At the end of the exercise, we will have a storage account filled with the logs from Postgres Flexible Server and an operational VM with direct access to the logs stored in the storage account, as shown in the picture below:


 


[Diagram: generate pgBadger report from Azure Database for PostgreSQL Flexible Server]


 


To be able to generate the report you need to configure the following items:



  1. Adjust Postgres Server configuration

  2. Create storage account (or use existing one)

  3. Create Linux VM (or use existing one)

  4. Configure Diagnostic Settings in Postgres Flexible Server and redirect logs to the storage account.

  5. Mount storage account onto VM using BlobFuse

  6. Install pgBadger on the VM

  7. Ready to generate reports!


 


Step 1 Adjust Postgres Server configuration


Navigate to the Server Parameters blade in the portal and modify the following parameters:


 


log_line_prefix = '%t %p %l-1 db-%d,user-%u,app-%a,client-%h '  #Please mind the space at the end!
log_lock_waits = on
log_temp_files = 0
log_autovacuum_min_duration = 0
log_min_duration_statement = 0 # 0 is recommended only for test purposes, for production usage please consider much higher value, like 60000 (1 minute) to avoid excessive usage of resources

[Screenshot: Adjust Postgres Server configuration]


After the change, hit "Save":


[Screenshot: Save changed Postgres parameters]
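If you prefer scripting the change instead of clicking through the portal, the same parameters can also be set with the Azure CLI. The sketch below is only a minimal example; <resource-group> and <server-name> are placeholders, and log_line_prefix is easiest to set in the portal because of the required trailing space:


# Set the logging parameters used by pgBadger (replace the placeholders with your own values)
az postgres flexible-server parameter set --resource-group <resource-group> --server-name <server-name> --name log_lock_waits --value on
az postgres flexible-server parameter set --resource-group <resource-group> --server-name <server-name> --name log_temp_files --value 0
az postgres flexible-server parameter set --resource-group <resource-group> --server-name <server-name> --name log_autovacuum_min_duration --value 0
az postgres flexible-server parameter set --resource-group <resource-group> --server-name <server-name> --name log_min_duration_statement --value 0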


 


Step 2 Create storage account (or use existing one)


Please keep in mind that the storage account needs to be created in the same region as Azure Database for PostgreSQL Flexible Server. Please find the instruction here.


 


Step 3 Create Linux VM (or use existing one)


In this blog post we will use Ubuntu 20.04 as an example, but nothing stops you from using an rpm-based system; the only difference will be in the way BlobFuse and pgBadger are installed.


 


Step 4 Configure Diagnostic Settings in Postgres Flexible Server and redirect logs to the storage account.


Navigate to Diagnostic settings page in the Azure Portal, Azure Database for PostgreSQL Flexible Server instance and add a new diagnostic setting with storage account as a destination:


[Screenshots: adding a diagnostic setting with the storage account as the destination]


 


Hit the Save button.


 


Step 5 Mount storage account onto VM using BlobFuse


In this section you will mount storage account to your VM using BlobFuse. This way you will see the logs on the storage account as standard files in your VM. First let’s download and install necessary packages. Commands for Ubuntu 20.04 are as follows (feel free to simply copy and paste the following commands):


wget https://packages.microsoft.com/config/ubuntu/20.04/packages-microsoft-prod.deb
sudo dpkg -i packages-microsoft-prod.deb
sudo apt-get update -y
sudo apt-get install blobfuse -y

For other distributions please follow the official documentation.    


 


Use a ramdisk for the temporary path (Optional Step)


The following example creates a ramdisk of 16 GB and a directory for BlobFuse. Choose the size based on your needs. This ramdisk allows BlobFuse to open files up to 16 GB in size.


sudo mkdir /mnt/ramdisk
sudo mount -t tmpfs -o size=16g tmpfs /mnt/ramdisk
sudo mkdir /mnt/ramdisk/blobfusetmp
sudo chown <youruser> /mnt/ramdisk/blobfusetmp


 


Authorize access to your storage account


You can authorize access to your storage account by using the account access key, a shared access signature, a managed identity, or a service principal. Authorization information can be provided on the command line, in a config file, or in environment variables. For details, see Valid authentication setups in the BlobFuse readme.


For example, suppose you are authorizing with the account access keys and storing them in a config file. The config file should have the following format:


accountName myaccount
accountKey storageaccesskey
containerName insights-logs-postgresqllogs


 


Please prepare the following file in an editor of your choice. You will find the values for accountName and accountKey in the Azure Portal; the container name is the same as in the example above. The accountName is the name of your storage account, not the full URL.


Please navigate to your storage account in the portal and then choose Access keys page:


 

[Screenshots: storage account Access keys page]
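If you would rather stay in the terminal, the account key can also be retrieved with the Azure CLI; a minimal sketch, with the storage account and resource group names as placeholders:


# Print the first access key of the storage account
az storage account keys list --account-name <storage-account-name> --resource-group <resource-group> --query "[0].value" -o tsv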


Copy the accountName and accountKey values into the file. Save the content as fuse_connection.cfg in your home directory, then mount your storage account container onto a directory in your VM:




vi fuse_connection.cfg
chmod 600 fuse_connection.cfg
mkdir ~/mycontainer
sudo blobfuse ~/mycontainer --tmp-path=/mnt/resource/blobfusetmp --config-file=/home/<youruser>/fuse_connection.cfg -o attr_timeout=240 -o entry_timeout=240 -o negative_timeout=120

sudo -i
cd /home/<youruser>/mycontainer/
ls # check if you see the container mounted
# Please use the tab key for directory autocompletion; do not copy and paste!
cd resourceId=/SUBSCRIPTIONS/<your subscription id>/RESOURCEGROUPS/PG-WORKSHOP/PROVIDERS/MICROSOFT.DBFORPOSTGRESQL/FLEXIBLESERVERS/PSQLFLEXIKHLYQLERJGTM/y=2022/m=06/d=16/h=09/m=00/
head PT1H.json # to check if the file is not empty




At this point you should be able to see some logs being generated.


 


Step 6 Install pgBadger on the VM


Now we need to install the pgBadger tool on the VM. For Ubuntu, simply use the command below:




sudo apt-get install -y pgbadger



For other distributions please follow the official documentation.


 


Step 7 Ready to generate reports!


Choose the file you want to generate the pgBadger report from and go to the directory where the chosen PT1H.json file is stored. For instance, to generate a report for 2022-05-23 at 9 o'clock, you need to go to the following directory:




cd /home/pgadmin/mycontainer/resourceId=/SUBSCRIPTIONS/***/RESOURCEGROUPS/PG-WORKSHOP/PROVIDERS/MICROSOFT.DBFORPOSTGRESQL/FLEXIBLESERVERS/PSQLFLEXIKHLYQLERJGTM/y=2022/m=05/d=23/h=09/m=00



Since PT1H.json is a JSON file and the Postgres log lines are stored in the message and statement values of the JSON, we need to extract the logs first. The most convenient tool for the job is jq, which you can install on Ubuntu using the following command:


sudo apt-get install jq -y

Once jq is installed, we need to extract the Postgres log from the JSON file and save it in another file (PT1H.log in this example):


jq -r '.properties | .message + .statement' PT1H.json > PT1H.log

Finally, we are ready to generate the pgBadger report:


pgbadger --prefix='%t %p %l-1 db-%d,user-%u,app-%a,client-%h ' PT1H.log -o pgbadgerReport.html
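The command above covers a single hour. If you would like one report for a whole day instead, a rough sketch like the one below extracts every hourly PT1H.json under the day's directory and feeds the combined log to pgBadger in a single run; the angle-bracketed path elements are placeholders for your own user, subscription, resource group, and server:


# Collect the Postgres log lines from every hourly PT1H.json file of the chosen day
DAY_DIR="/home/<youruser>/mycontainer/resourceId=/SUBSCRIPTIONS/<subscription-id>/RESOURCEGROUPS/<resource-group>/PROVIDERS/MICROSOFT.DBFORPOSTGRESQL/FLEXIBLESERVERS/<server-name>/y=2022/m=05/d=23"
find "$DAY_DIR" -name PT1H.json -print0 | while IFS= read -r -d '' f; do
    jq -r '.properties | .message + .statement' "$f"
done > day.log

# Generate one pgBadger report for the whole day
pgbadger --prefix='%t %p %l-1 db-%d,user-%u,app-%a,client-%h ' day.log -o pgbadgerDailyReport.html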

Now you can download your report either from the Azure Portal (via your storage account) or by using the scp command:
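For the scp route, a minimal example, assuming the VM is reachable over SSH (the user name, host, and report path are placeholders):


# Copy the generated report from the VM to the current directory on your local machine
scp <vm-user>@<vm-public-ip>:/path/to/pgbadgerReport.html .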


 




 


 

Happy Troubleshooting!