In this episode of MidDay Café, hosts Tyrelle Barnes and Michael Gannotti discuss the human/AI partnership. Many organizations trying to figure out an AI strategy seem to take a technology- and product-first approach; Tyrelle and Michael discuss how to anchor on people and employees first, with AI and technology in a supporting role.
Introduction
Azure OpenAI models provide a secure and robust solution for tasks like creating content, summarizing information, and various other applications that involve working with human language. Now you can use these models in the context of your own data. Try Azure OpenAI Studio today to interact naturally with your data and publish it as an app from within the studio.
Getting Started
Follow this quickstart tutorial for the prerequisites and for setting up your Azure OpenAI environment.
To try the capabilities of the Azure OpenAI model on private data, I am uploading an e-book to the Azure OpenAI chat model: “Serverless Apps: Architecture, Patterns, and Azure Implementation” by Jeremy Likness and Cecil Phillip. You can download the e-book here.
Before uploading your own data
Prior to uploading this particular e-book, the model’s response to a question about serverless design patterns is shown below. While the response is relevant, let’s examine whether the model picks up the e-book’s content during the next iteration.
After uploading your own data
This e-book has an exclusive section that discusses design patterns such as scheduling, CQRS, and event-based processing in detail.
After grounding the model on this PDF data, I asked a few questions, and the following responses were nearly accurate. I also limited the model to supply only information from the uploaded content. Here’s what I found.
When I then asked about the contributors to this e-book, it listed everyone correctly.
Read more
Enterprise data can run to large volumes, so it is not practical to supply it all in the context of a prompt to these models. Instead, the setup leverages Azure services to create a repository of your knowledge base and uses Azure OpenAI models to interact naturally with it.
The Azure OpenAI Service on your own data uses the Azure Cognitive Search service in the background to index and rank your custom data, and a storage account to host your content (.txt, .md, .html, .pdf, .docx, .pptx). Your data source is used to help ground the model with specific data. You can select an existing Azure Cognitive Search index or Azure Storage container, or upload local files, as the source from which the grounding data is built. Your data is stored securely in your Azure subscription.
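To see how these pieces fit together programmatically, here is a minimal sketch of querying a chat deployment grounded on your own data through the chat completions extensions endpoint. The resource name, deployment name, keys, index name, and API version are placeholders, and the exact request and response shapes may differ across API versions; verify against the current Azure OpenAI reference before relying on this.

import requests

# Placeholder values - replace with your own resource details.
AOAI_ENDPOINT = "https://<your-resource>.openai.azure.com"
AOAI_KEY = "<your-azure-openai-key>"
DEPLOYMENT = "<your-chat-deployment>"
SEARCH_ENDPOINT = "https://<your-search>.search.windows.net"
SEARCH_KEY = "<your-cognitive-search-key>"
SEARCH_INDEX = "<your-index>"  # index built from the uploaded content

url = (
    f"{AOAI_ENDPOINT}/openai/deployments/{DEPLOYMENT}"
    "/extensions/chat/completions?api-version=2023-06-01-preview"
)

body = {
    "dataSources": [
        {
            "type": "AzureCognitiveSearch",
            "parameters": {
                "endpoint": SEARCH_ENDPOINT,
                "key": SEARCH_KEY,
                "indexName": SEARCH_INDEX,
                # Restrict answers to the uploaded content, as in the
                # example above.
                "inScope": True,
            },
        }
    ],
    "messages": [
        {
            "role": "user",
            "content": "What serverless design patterns does the e-book describe?",
        }
    ],
}

response = requests.post(url, headers={"api-key": AOAI_KEY}, json=body)
response.raise_for_status()

# Depending on the API version, the assistant reply may appear under
# "message" or in a "messages" list; handle both shapes defensively.
choice = response.json()["choices"][0]
if "message" in choice:
    print(choice["message"]["content"])
else:
    for msg in choice.get("messages", []):
        if msg.get("role") == "assistant":
            print(msg["content"])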
We also have another enterprise GPT demo that lets you piece together all the Azure building blocks yourself. An in-depth blog post written by Pablo Castro details the steps here.
Getting started directly from Azure OpenAI Studio allows you to iterate on your ideas quickly. At the time of writing this blog, the completions playground offers 23 different use cases that take advantage of different models under Azure OpenAI; one of them, natural language to SQL, is sketched in code after the list.
Summarize issue resolution from conversation
Summarize key points from a financial report (extractive)
Summarize an article (abstractive)
Generate product name ideas
Generate an email
Generate a product description (bullet points)
Generate a listicle-style blog
Generate a job description
Generate a quiz
Classify text
Classify and detect intent
Cluster into undefined categories
Analyze sentiment with aspects
Extract entities from text
Parse unstructured data
Translate text
Natural language to SQL
Natural language to Python
Explain a SQL query
Question answering
Generate insights
Chain of thought reasoning
Chatbot
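To illustrate the natural language to SQL use case mentioned above, here is a rough sketch of the equivalent REST call against a completions deployment. The endpoint, key, deployment name, and prompt below are illustrative placeholders rather than the playground’s exact template; check the API version and request shape against the current Azure OpenAI reference.

import requests

# Placeholder values - replace with your own resource details.
AOAI_ENDPOINT = "https://<your-resource>.openai.azure.com"
AOAI_KEY = "<your-azure-openai-key>"
DEPLOYMENT = "<your-completions-deployment>"

url = (
    f"{AOAI_ENDPOINT}/openai/deployments/{DEPLOYMENT}"
    "/completions?api-version=2023-05-15"
)

# A simplified natural-language-to-SQL prompt: describe the schema,
# state the request, and let the model complete the query.
prompt = (
    "### SQL Server table: Sales(OrderId, CustomerId, OrderDate, Amount)\n"
    "### Request: total sales amount per customer in 2023\n"
    "SELECT"
)

body = {"prompt": prompt, "max_tokens": 150, "temperature": 0}

response = requests.post(url, headers={"api-key": AOAI_KEY}, json=body)
response.raise_for_status()
print("SELECT" + response.json()["choices"][0]["text"])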
Resources
There are different resources to get you started on Azure OpenAI. Here are a few:
Busy sales account managers prioritize their activities by mining information from their accounts. But manually making sense of all that unstructured data takes time and can lead to inaccurate assumptions. They can end up focusing on the wrong activities, which results in a lower impact on business outcomes. The most productive and successful account managers are the ones who focus on the right customers with the right priority. Dynamics 365 Sales account-based seller insights can help.
Account-based seller insights help drive priorities
Account-based seller insights help you set priorities and formulate the best engagement plan for your customers. These are automated, actionable insights that are derived from multiple sources of unstructured data and presented to you in the right context. For instance, you might be shown an upsell insight for an account based on past won opportunities for similar accounts, along with guidance on the next best action to take. Seller insights help you proactively manage the customer journey, from the first engagement to the final sale.
Behind the scenes with seller insights
Account-based seller insights can be generated in three ways:
Bring your own model. Use your own AI model, trained on your data, to generate insights, and work with them in the Dynamics 365 Sales accelerator.
Use out-of-the-box models. The account-based seller insights solution comes with its own models, which mine the data in Dynamics 365 Sales to generate insights.
Build a back-end rule framework. You can build your own rule framework that uses Power Automate flows to generate insights when certain conditions are met.
How seller insights boost productivity
How can seller insights help you be a more effective sales account manager? Let’s take a look.
Insight list and actions
First, you get curated insights for all your accounts:
You see only the insights that are relevant to you, not those assigned to your team members.
The insights have expiration dates so that you know the information is fresh and relevant.
You can see the reasons an insight appears in the list.
And after you acknowledge an insight, you’re guided through the next best steps to act on it, optimizing the sales workflow for better results. You can also collaborate with team members while you’re working on your insights.
Insight assignment and distribution
Second, although your insights are curated, that doesn’t mean they’re siloed. Insights are assigned to the account owner. If the owner of an entity is a team, an insight can be automatically assigned to the appropriate salesperson on the team, based on role, through the flexible rule framework. Ownership can be transferred from one seller to another, and multiple sellers can work on a single insight.
Insight action history
Finally, you can find all the insights that have been generated for an account on the account’s Insights tab. The list includes status, type, due date, and other helpful information. Filter and sort it to focus on what’s most important. You can easily identify all seller activities for the insights on the timeline view of the account.
By helping you identify your most important and profitable accounts, understand their needs and preferences, tailor your messages and offers, and nurture long-term relationships with them, account-based seller insights can lead to higher revenues, shorter sales cycles, and better customer satisfaction.
It is often necessary to copy data from on-premises sources to Azure SQL Database, Azure SQL Managed Instance, or another data store for analytics, or simply to migrate data from on-premises data sources to Azure database services. You will most likely want to do this data movement at scale, with minimal coding and complexity, using a simple, automated approach.
In the following example, I am copying two tables from an on-premises SQL Server 2019 database to Azure SQL Database using Microsoft Fabric. The entire migration is driven by a metadata table that holds information about the tables to copy from the source, so the copy pipeline is simple and easy to deploy. We have used this approach to copy hundreds of tables from one database to another efficiently, and the monitoring UI provides the flexibility and convenience to track progress and rerun the migration after any failures.
Architecture diagram
This architecture diagram shows the components of the solution, from on-premises SQL Server to Microsoft Fabric.
I intend to copy two tables, Customer and Sales, from the source to the target. Let us insert these entries into the metadata table, one row per table, as sketched below.
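The exact shape of the metadata table is not prescribed in this walkthrough, so here is a minimal sketch that creates and populates it in the target Azure SQL Database with pyodbc. The table name dbo.TableMetadata is hypothetical; the column names are taken from the pipeline expressions used later in this article.

import pyodbc

# Placeholder connection string for the target Azure SQL Database.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-server>.database.windows.net;Database=<your-db>;"
    "Uid=<user>;Pwd=<password>;Encrypt=yes;"
)
cur = conn.cursor()

# Hypothetical metadata table; the column names are the ones the
# pipeline expressions reference (@item().SourceSchemaName, etc.).
cur.execute("""
IF OBJECT_ID('dbo.TableMetadata') IS NULL
CREATE TABLE dbo.TableMetadata (
    SourceSchemaName sysname NOT NULL,
    SourceTableName  sysname NOT NULL,
    TargetSchemaName sysname NOT NULL,
    TargetTableName  sysname NOT NULL
);
""")

# One row per table to copy: Customer and Sales.
rows = [
    ("dbo", "Customer", "dbo", "Customer"),
    ("dbo", "Sales", "dbo", "Sales"),
]
cur.executemany("INSERT INTO dbo.TableMetadata VALUES (?, ?, ?, ?)", rows)
conn.commit()
conn.close()

Keeping the source and target names in one row per table is what lets a single parameterized copy activity fan out over hundreds of tables.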
Ensure that the table is populated. The data pipelines will use this table to drive the migration.
Create Data Pipelines:
Open Microsoft Fabric and click the Create button to see the items you can create with Microsoft Fabric.
Click on “Data pipeline” to start creating a new data pipeline.
Let us name the pipeline “Copy_Multiple_Tables”.
Click on “Add pipeline activity” and add a Lookup activity to read the metadata table; it is referenced later in this walkthrough as Get_Table_List.
Choose Azure SQL Database from the list, since the metadata table is held in the target database.
Ensure that the settings are as shown in the screenshot.
Click the preview data button and check that you can view the data from the table.
Let us now create a new connection to the source. From the list of available connections, choose SQL Server, as we intend to copy data from SQL Server 2019 on-premises. Ensure that the gateway cluster and connection are already configured and available.
Add a ForEach activity and set its batch count to copy tables in parallel.
We now need to set the ForEach activity’s Items property, which is populated dynamically at runtime. To set it, click the button shown in the screenshot and set the value to:
@activity('Get_Table_List').output.value
Add a copy activity inside the ForEach container.
Set the source table attributes in the copy activity as shown in the screenshot. Click the edit button, then click the “Add dynamic content” button. Ensure that you paste the text only after clicking “Add dynamic content”; otherwise, the text will not be evaluated dynamically at runtime.
Set the Table schema name to:
@item().SourceSchemaName
Set the Table name to:
@item().SourceTableName
Click on the destination tab and set the destination attributes as shown in the screenshot.
Set the Table schema name to:
@item().TargetSchemaName
Set the Table name to:
@item().TargetTableName
We have configured the pipeline. Now click Save to publish it.
Run pipeline:
Click the Run button on the top menu to execute the pipeline and ensure that it completes successfully. This copies both tables from source to target; a quick row-count check, sketched below, can confirm the result.
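As a sanity check after the run, you can compare row counts between source and target, driven by the same metadata table. This is a rough sketch with placeholder connection strings and the hypothetical dbo.TableMetadata table from earlier.

import pyodbc

# Placeholder connection strings for source and target.
src = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<on-prem-server>;Database=<src-db>;Trusted_Connection=yes;"
)
tgt = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-server>.database.windows.net;Database=<tgt-db>;"
    "Uid=<user>;Pwd=<password>;Encrypt=yes;"
)

# Read the table list from the (hypothetical) metadata table on the target.
meta = tgt.cursor().execute(
    "SELECT SourceSchemaName, SourceTableName, TargetSchemaName, TargetTableName "
    "FROM dbo.TableMetadata"
).fetchall()

# Compare row counts table by table and flag any mismatch.
for s_schema, s_table, t_schema, t_table in meta:
    s_count = src.cursor().execute(
        f"SELECT COUNT(*) FROM [{s_schema}].[{s_table}]").fetchval()
    t_count = tgt.cursor().execute(
        f"SELECT COUNT(*) FROM [{t_schema}].[{t_table}]").fetchval()
    status = "OK" if s_count == t_count else "MISMATCH"
    print(f"{s_schema}.{s_table}: source={s_count}, target={t_count} -> {status}")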
Summary:
In the example above, we used Microsoft Fabric pipelines to copy data from an on-premises SQL Server 2019 database to Azure SQL Database. You can modify the sink/destination in this pipeline to copy to other targets, such as Azure SQL Managed Instance or Azure Database for PostgreSQL. If you are interested in copying data from a mainframe z/OS database, you will find this blog post from our team very helpful.
Dynamics 365 Customer Service is a powerful tool for managing your contact center. Its built-in analytic dashboards, such as the recently launched Omnichannel real-time analytics dashboard, provide a wealth of industry standard KPIs and metrics to help you monitor and improve performance. These dashboards are built on Power BI with two components: a data model (or data set) that houses the KPIs, and reports that visualize the data for viewers. Dynamics 365 Customer Service reads the data from Dataverse, performs transformation logic for each of the KPIs, and makes these KPIs available for you within the data model. You can customize data models using data within Dynamics or external data to view metrics tailored to your needs.
Every Dynamics organization that has analytics enabled gets its own copy of this solution, deployed and available only to it. While the data model is not editable, the reports are fully customizable through visual customization. This way, you can see and use the data in ways that make sense for your organization: you can view metrics beyond what is in the out-of-box reports, and you can create additional pivots and dimensions to slice the data as needed.
We have received a lot of feedback from you about the need for customization. You want to modify the data or logic used to calculate metrics in the dataset, create your own metrics in addition to the out-of-box metrics available in the data model, and create variants of existing metrics or calculate metrics differently based on your organization’s unique processes. Another frequent request has been guidance on building custom dashboards that combine KPIs from Dynamics 365 Customer Service with other applications.
To address these scenarios, Dynamics 365 Customer Service has launched model customization. This feature deploys a copy of the dataset used by the out-of-box reports into your organization’s Power BI workspace, so you can build composite models that connect to the Dynamics data model.
By leveraging the out-of-box model and creating only the metrics that are unique to your organization, you reduce the risk of metric definitions going stale as Dynamics updates its capabilities, and you save valuable time and development effort. Furthermore, model customization lets you build custom reports and dashboards that combine data from multiple applications, giving you a more complete picture of your contact center’s performance.
Overall, Dynamics 365 Customer Service provides a powerful set of tools for managing your contact center. Its built-in analytic dashboards offer the specific insights you need to improve contact center performance. And with model customization, you can tailor these to your specific needs.