AI in CRM and ERP systems: 2024 trends, innovations, and best practices


A new chapter in business AI innovation 

As we begin a new year, large companies and corporations need practical solutions that rapidly drive value. Modern customer relationship management (CRM) and enterprise resource planning (ERP) systems fit perfectly into this category. These solutions build generative AI, automation, and other advanced AI capabilities into the tools that people use every day. Employees can experience new, more effective ways of working and customers can enjoy unprecedented levels of personalized service.  


If you’re a business leader who has already embraced—or plans to embrace—AI-powered CRM and ERP systems in 2024, you’ll help your organization drive business transformation, innovation, and efficiency in three key ways: 

  • Streamline operations: Transform CRM and ERP systems from siloed applications into a unified, automated ecosystem, enhancing team collaboration and data sharing. 
  • Empower insightful decisions: Provide all employees with AI-powered natural language analysis, allowing them to quickly generate insights needed to inform decisions and identify new market opportunities. 
  • Elevate customer and employee experiences: Personalize customer engagements using 360-degree customer profiles. Also, boost productivity with AI-powered chatbots and automated workflows that free employees to focus on more strategic, high-value work. 

The time has come to think about AI as something much more than a technological tool. It’s a strategic imperative for 2024 and beyond. In this new year, adopting CRM AI for marketing, sales, and service and ERP AI for finance, supply chain, and operations is crucial to competing and getting ahead. 

2023: A transformative year for AI in CRM and ERP systems 

Looking back, 2023 was a breakthrough year for CRM AI and ERP AI. Microsoft rolled out new AI-powered tools and features in its CRM and ERP applications, and other solution providers soon followed. Among other accomplishments, Microsoft launched—and continues to enhance—Microsoft Copilot for Dynamics 365, the world’s first copilot natively built for CRM and ERP systems.

The evolution of AI technologies to this point was years, even decades, in the making. As leaders watched AI in business gradually gain momentum, many took steps to prepare. Some applied new, innovative AI tools and features in isolated pilot projects to better understand the business case for AI, including return on investment (ROI) and time to value. Others forged ahead and adopted AI broadly. All wrestled with the challenges associated with AI adoption, such as issues around security, privacy, and compliance.

In one example, Avanade, a Microsoft solutions provider with more than 5,000 clients, accelerated sales productivity by empowering its consultants with Microsoft Copilot for Sales. Consultants used to manually update client records in their Microsoft Dynamics 365 CRM system and search across disconnected productivity apps for insights needed to qualify leads and better understand accounts. Now, with AI assistance at their fingertips, they can quickly update Dynamics 365 records, summarize emails and meetings, and prepare sales information for client outreach. 

In another example, Domino’s Pizza UK & Ireland Ltd. helped ensure exceptional customer experiences—and optimized inventory and deliveries—with AI-powered predictive analytics in Microsoft Dynamics 365 Supply Chain Management. Previously, planners at Domino’s relied on time-consuming, error-prone spreadsheets to forecast demand at more than 1,300 stores. By using intelligent demand-planning capabilities, they improved their forecasting accuracy by 72%. They can also now quickly generate the insights needed to ensure each store receives the right resources at the right times to fill customer orders.  

All signs indicate that in the years to come organizations will continue to find new, innovative ways to use CRM AI and ERP AI—and that their employees will embrace the shift. 

In recent research that looks at how AI is transforming work, Microsoft surveyed hundreds of early users of generative AI. Key findings showed that 70% of users said generative AI helped them to be more productive, and 68% said it improved the quality of their work. Also, 64% of salespeople surveyed said generative AI helped them to better personalize customer engagements and 67% said it freed them to spend more time with customers.1 

Looking forward, the momentum that AI in business built in 2023 is expected to only grow in 2024. In fact, IDC predicts that global spending on AI solutions will reach more than USD 500 billion by 2027.2

Some of the specific AI trends to watch for in 2024 include: 

  • Expansion of data-driven strategies and tactics. User-friendly interfaces with copilot capabilities and customizable dashboards with data visualizations will allow employees in every department to access AI-generated insights and put them in context. With the information they need right at their fingertips, employees will make faster, smarter decisions.  
  • Prioritization of personalization and user experiences. Predictive sales and marketing strategies will mature with assistance from AI in forecasting customer behaviors and preferences and mapping customer journeys, helping marketers be more creative and sellers better engage with customers. Also, AI-powered CRM platforms will be increasingly enriched with social media and other data, providing deeper insights into brand perception and customer behavior.  
  • Greater efficiencies using AI and cloud technologies. Combining the capabilities of AI-powered CRM and ERP tools with scalable, flexible cloud platforms that can store huge amounts of data will drive new efficiencies. Organizations will also increasingly identify new use cases for automation, then quickly build and deploy them in a cloud environment. This will further boost workforce productivity and process accuracy. 
  • Increased scrutiny of AI ethics. Responsible innovation requires organizations to adhere to ethical AI principles, which may require adjustments to business operations and growth strategies. To guide ethical AI development and use, Microsoft has defined responsible AI principles. It also helps advance AI policy, research, and engineering. 

AI innovations on the horizon for CRM and ERP systems

Keep an eye on technological and other innovations in the works across the larger AI ecosystem. For example, watch for continued advancements in low-code/no-code development platforms. With low-code/no-code tools, nontechnical and technical users alike can create AI-enhanced processes and apps that allow them to work with each other and engage with customers in fresh, new ways. 

Innovations in AI will also give rise to new professions, such as AI ethicists, AI integrators, AI trainers, and AI compliance managers. These emerging roles—and ongoing AI skills development—will become increasingly important as you transform your workforce and cultivate AI maturity.  

To learn more about the innovations that will drive—and be driven by—generative AI, read the Gartner® Hype Cycle™ for Artificial Intelligence, 2023.3

Best practices for AI adoption in 2024 

To drive transformation with AI in CRM and ERP systems, you should carefully plan and implement an approach that works best for your organization. The following best practices for AI adoption, which continue to evolve, can help guide you: 

  • Strategic implementation: Formulate a long-term AI implementation strategy to empower employees and optimize business processes, emphasizing data-driven culture, relevant skills development, and scalable, user-friendly AI tools in CRM and ERP systems. 
  • Ethical adoption: Adhere to evolving ethical guidelines, starting with AI-enhanced process automation and progressing toward innovative value creation, while ensuring your organization is hyperconnected. 
  • Data quality and security: Maintain high data integrity and security standards, regularly auditing AI training data to avoid biases and ensure trustworthiness. 
  • Alignment with business goals: Align AI initiatives with strategic objectives, measure their impact on business outcomes, and proactively manage any potential negative effects on stakeholders. 

As you and your organization learn more about AI and discover what you can do with it, don’t lose sight of the importance of human and AI collaboration. Strongly advocate for using AI to augment—rather than replace—human expertise and decision-making across your organization. Remember, although employees will appreciate automated workflows and AI-generated insights and recommendations, AI is not infallible. Successful business still depends on people making intelligent, strategic decisions.  

The importance of embracing AI in business 

Immense opportunities exist for organizations across industries to use AI-powered CRM and ERP systems to accelerate business transformation, innovation, and efficiency. According to Forrester Research, businesses that invest in enterprise AI initiatives will boost productivity and creative problem solving by 50% in 2024.4 Yet, without leaders who are fully engaged in AI planning and implementation, many organizations will struggle to realize AI’s full potential.  

Be a leader who prioritizes and champions AI in your business strategies for 2024. Your leadership must be visionary, calling for changes that span across roles and functions and even your entire industry. It must be practical, grounded in purposeful investments and actions. It must be adaptable, remaining open and flexible to shifting organizational strategies and tactics as AI technologies evolve.  

Team up with a leader in AI innovation 

Wherever your organization is in its AI adoption journey, take the next step by learning more about how AI works with Microsoft Dynamics 365, a comprehensive and customizable suite of intelligent CRM and ERP applications. 

With copilot and other AI-powered capabilities in Dynamics 365, your organization can create unified ecosystems, accelerate growth, and deliver exceptional customer experiences. It can also continually improve operational agility while realizing greater productivity and efficiency. Get started today to make 2024 a transformative year for your organization. 


End notes 

1 What Can Copilot’s Earliest Users Teach Us About Generative AI at Work? 

2 IDC Blog, Top 10 Worldwide IT Industry 2024 Predictions: Mastering AI Everywhere, 1 November 2023. 

3 Gartner, Hype Cycle for Artificial Intelligence, 2023, Afraz Jaffri, 19 July 2023.  

Gartner is a registered trademark and service mark, and Hype Cycle is a registered trademark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and are used herein with permission. All rights reserved. 

4 Forrester 2024 Predictions: Exploration Generates Progress, Forrester Research, Inc., October 2023. 


Logic Apps Aviators Newsletter – March 2024

Ace Aviator of the Month


 


March’s Ace Aviator: Luís Rigueira


 



What is your role and title? What are your responsibilities associated with your position?


I am an Integration Developer, and my key responsibilities consist of working with my team and alongside clients, making the transition and integration of their products and services smoother.


 


Can you provide some insights into your day-to-day activities and what a typical day in your role looks like?


Sure. Merging a portion of my activities, a typical day looks like this: I start by checking for any issues in our clients’ production environments to ensure everything’s running smoothly, and then my main activity is implementing cloud integration solutions with Azure Integration Services. Occasionally, I also help the team on on-premises projects using BizTalk Server.


 


Also, one of my big activities is going deep into Enterprise Integration features and crafting new ways to achieve specific tasks: doing proofs of concept with new features, exploring existing or new services, testing those solutions, and finding alternatives, for example, creating Azure Functions as an alternative to the Integration Account and testing how to use those Azure Functions inside Logic App flows.


 


I’m always on the hunt for new solutions to any problems we face, and in doing so, there’s a lot of documenting everything we do. This documentation is more than just busy work; it really helps by streamlining our processes and guiding our team and community through troubleshooting. Because knowledge sharing is so important, I actively produce informative content for our blog and YouTube channel. This includes writing posts and creating videos that share our experiences, solutions, and insights with a broader audience.


 


I also contribute to enhancing our team’s productivity by creating tools tailored to address specific issues or streamline processes that are later shared with the community.


 


What motivates and inspires you to be an active member of the Aviators/Microsoft community?


What really drives me to engage with the Aviators/Microsoft community is my passion for tackling challenges and finding solutions. There’s something incredibly rewarding about cracking a tough problem and then being able to pass on that knowledge to others. I believe we’ve all had that moment of gratitude towards someone who’s taken the time to document and share a solution to the exact issue we were facing. That cycle of giving and receiving is what motivates me the most. It’s about contributing to a community that has been so important in my own learning and problem-solving journey, and I’m inspired to give back and assist others in the same way.


 


Looking back, what advice do you wish you would have been told earlier on that you would give to individuals looking to become involved in STEM/technology?


I could say something about always having a passion for new technologies and staying up to date with whatever you are pursuing. There would be nothing wrong with that, but those sound like ready-made phrases that get exchanged without considering each individual’s current situation.


 


For a moment, in a world where mental health is so important, let me share a simple tale that resonates with anyone at the crossroads of their career, whether they are new and confused about what to do, just starting out, or contemplating a shift in direction. It’s a gentle reminder that venturing into new territories can be daunting but immensely rewarding, and that, at times, we may not even realize that our current paths could be detrimental to our well-being, professional growth, and personal relationships.


 


“There was once a man who went into the wilds of Africa, having believed himself to be a hunter for many years. Despite his efforts, he found himself unable to catch any game. Overwhelmed by frustration and feeling lost, he sought the guidance of a Shaman from a nearby tribe.


Confessing to the Shaman, he said, “Hunting is what I live for, but I’m hitting a wall. There’s simply nothing out there for me to hunt, and I don’t know what to do.”


The Shaman, who had seen many seasons and had a kind of wisdom you don’t come across every day, simply put his arm on the hunter’s shoulder, looked him in the eyes and said, “Really? Nothing to hunt for? This land has fed us for generations. There is plenty to hunt out there, and yet you cannot see it? Maybe the problem isn’t the land… allow me to ask you something very important: do you genuinely desire to be a hunter?”


 


This narrative goes much deeper than the act of hunting. It’s a reflection on our passions, how we confront our challenges, and the realization that our perspective might need a shift.


 


If our passions no longer ignite us, or if our efforts to chase them lead nowhere, it might be a sign to let go, not in defeat, but in liberation. In the end, I want everyone to be happy with the career path they have chosen. So that would be my advice: read this simple tale, apply it to your current situation, and ask yourself, “Do I really want to keep doing what I am doing right now?” And if you find that your current path is not worth pursuing, if your mental health is not in shape, or if you are hitting a wall, then yes, it is time to take the step!


 


Imagine you had a magic wand that could create a feature in Logic Apps. What would this feature be and why?


In a world where AI is moving at such a fast pace, one feature that I would personally like to have in Logic Apps is prompt-based, AI-generated Logic App flows. What that would mean is you give the designer a prompt describing what you intend, and it would generate the most efficient flow for what you have described. Of course, you will still need to configure some things, but I think AI-generated flows could outline and cover many scenarios, making our processes faster and more efficient.


AI is here to stay whether we like it or not; it isn’t going away, so we can either take advantage of it to create better, faster, and more efficient products or fall behind while we watch others do it.


 


What are some of the most important lessons you’ve learned throughout your career that surprised you?


One of the most surprising yet vital lessons from my career is the central role of relationships in keeping the ship sailing smoothly. Having positive communication and nurturing a positive work environment are crucial elements that empower a team to deliver top-notch results, remain driven, and maximize their daily potential. A car has four tires, and you need them all to get home safely.




Customer Corner:


Power transmission operator ELES gains agility with Microsoft Dynamics 365


 




 


Check out this customer success story on how Microsoft is helping to keep Slovenia’s lights on by improving and modernizing ELES’ operations. In 2012, ELES turned to Microsoft when they needed a new enterprise resource planning (ERP) solution. Today, ELES uses Azure Logic Apps to connect their ERP with other systems, improving collaboration between departments, streamlining operations, and better managing their energy resources and infrastructure. 




News from our product group:

Announcing the availability of TLS 1.3 in Azure API Management in Preview   


Read more about updates to TLS 1.3, the latest version of the internet’s most deployed security protocol. 





IBM MQ Built-in (In-App) connector and Queue Handles: The math behind Queue Handles   


For those using the IBM MQ Built-in (In-App) connector available in Logic Apps Standard, check out this article explaining more about queue handles and how to calculate the max value to set on your IBM MQ server. 



Public Preview of Azure OpenAI and AI Search in-app connectors for Logic Apps (Standard)   


Check out this article on our two new built-in connectors (Azure OpenAI and Azure AI Search) that are now available for public preview. 





Common issues of the Azure Automation connector in Logic App (Consumption) and Logic App Standard


Learn about various issue scenarios related to the Azure Automation connector in both Logic App Consumption and Standard, along with their causes and resolutions. 



Identify Logic Apps Consumption that are using deprecated V1 Actions and Triggers for SQL Connector   


V1 Actions/Triggers of the SQL Connector for Logic Apps will be deprecated by the end of March 2024. In this article, learn how to use a PowerShell Script to identify the Logic Apps still using the deprecated SQL Connectors so that you can change them to the V2 equivalent.
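
For orientation only, here is a rough, hedged sketch of the general idea rather than the article’s actual script. It assumes the Az.LogicApp PowerShell module is installed and uses a placeholder resource group name; it simply flags Consumption workflows whose definitions reference the SQL managed connector so you can review them for V1 actions and triggers.

# Hedged sketch (not the article's script). The resource group name is a placeholder,
# and the exact shape of the Definition property may vary by module version.
Connect-AzAccount
$workflows = Get-AzLogicApp -ResourceGroupName "my-resource-group"

foreach ($wf in $workflows) {
    # Serialize the workflow definition and look for the SQL managed connector API id.
    $definitionText = $wf.Definition.ToString()
    if ($definitionText -match "managedApis/sql") {
        Write-Output "$($wf.Name): references the SQL connector - review for deprecated V1 actions/triggers"
    }
}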







Integration Service Environment to Logic Apps Standard Migration    


ISE’s retirement date is August 31st, 2024, so make sure you migrate any Logic Apps running on ISE to Logic Apps Standard. Check out this guide video from our FastTrack team that walks you through the whole process!



 




News from our community:


From BizTalk to Logic Apps: A Real-World Integration Journey – February 2024 


Post by Houston Azure User Group 


 


Check out this recording from the February 2024 meetup for the Houston Azure User Group where Azure customers dive into their journey from on-premises BizTalk to Azure Logic Apps hosted in an Integration Service Environment (ISE).


 


Deploying Multi-Line Values in ARM Templates   


Post by Harris Kristanto 


 


If you’re deploying an ARM template that uses multi-line secrets stored in Azure Key Vault, then make sure to read this article from Harris!


 


Friday Fact: When a HTTP Request is Received trigger can accept more http methods other than POST   


Post by Luís Rigueira and Sandro Pereira


 


Check out this post and video from collaborators Luís and Sandro and learn about other HTTP methods you can use in your Logic Apps. 
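
For a quick taste of what the post covers: the trigger’s workflow-definition JSON accepts a "method" property that limits which HTTP verb can invoke the endpoint. Below is a minimal, hedged sketch of such a trigger definition, held in a PowerShell here-string purely for illustration; the trigger name "manual" and the empty schema are placeholders.

# Hedged sketch of a Request trigger restricted to GET; the trigger name and schema are placeholders.
$requestTriggerJson = @'
{
  "manual": {
    "type": "Request",
    "kind": "Http",
    "inputs": {
      "method": "GET",
      "schema": {}
    }
  }
}
'@
Write-Output $requestTriggerJson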


 


Building an API to call Logic Apps Consumption   


Post by Mike Stephenson 


 


Want to make life a bit easier and learn how to build an APIM API that exposes Logic App Consumption with Terraform? Then check out Mike’s new video.


 


Using Managed Identity with Consumption and Standard Logic Apps with Azure Service Bus   


Post by Stephen W Thomas 


 


Read Stephen’s quick tips and tricks for granting specific Consumption or Standard Logic Apps access to Azure Service Bus using managed identities.


 


Easy Auth | Standard Logic App with Azure API Management   


Post by Andrew Wilson 


 


Learn about using Easy Auth with Logic App Standard and restricting to API Management as a Backend.


 


Migrating from MuleSoft to Azure Integration Services (AIS): Why and How   


Post by Derek Marley and Tim Bieber 


 


Watch this recording from a webinar hosted by Derek and Tim as they talk about the benefits of Azure’s ecosystem and a step-by-step strategy for a smooth transition from MuleSoft to AIS.


 


Microsoft Fabric: Automating Fabric Capacity Scaling with Azure Logic Apps   


Post by Soheil Bakhshi 


 


Learn from Soheil about how to efficiently upscale and downscale your capacity based on specific workloads in this blog post.


 


What is Azure NAT Gateway | What is SNAT Port Exhaustion | Azure Standard Logic App Static Outbound IP Address   


Post by Sri Gunnala 


 


Learn from Sri about Azure NAT Gateway and how to use it with Azure Logic Apps to solve business problems.

Process Advisor for Supply Chain and Warehousing.


Editor: Denis Conway

Who should use it, why and for what?

Introduction

Performance evaluation has been revolutionized by technology, extending its reach to the individual level. Consider health apps on your smartphone. They gather data breadcrumbs from your daily activities, providing analysis of your movement patterns. This isn’t a generic data compilation, but a near-accurate reflection of your physical activity during a specific period.

In the future, it’s conceivable that these apps might be equipped with an AI companion, or Copilot, to guide your next steps based on your past activities. It could suggest rest days or additional exercise to help you achieve your personal health goals.

This concept of performance evaluation based on collected data is the bedrock of process mining and process comparison. Our Copilot functionality adds a layer of assistance, enabling you to make informed decisions about your warehouse operations.

In this context, Copilot can help you optimize warehouse processes. It can identify bottlenecks in certain processes or compare different methods to achieve the same goal, empowering you to choose the most optimal method for your specific case.

In this blog, we’ll explore the essence of this feature, its intended audience, and how and why you should leverage it for your warehousing operations.

Process Mining Insights:

At first glance, using Process Advisor for material movement analysis is easy. The setup process is straightforward:

  1. Go to Warehouse Management > Setup > Process Mining > Warehouse material movement configuration. In the taskbar, select Deploy New Process.
  2. The configuration wizard will open. Press Next, then enter the name of the process in the Process Name field, choose the company, choose the number of months to load (12 months = data from the latest 12 months), and choose the appropriate Activity. Press Next.
  3. The process is deployed.

The configuration wizard looks like this:

Image: Configuration wizard screenshot.

The easy part is now complete. We have set up a process, named it, and loaded 12 months of data to prepare for our analysis. The difficult part is making sense of our data and using it to make decisions to improve our warehouse output.

Therefore, we will provide you with some real-life examples on how to use the data analysis functionality to understand your processes, and a scenario where we evaluate two different methods and use the Process Advisor to figure out which method would be preferred for our business operations.

Analysis of data

There are multiple ways to analyze your process data to understand and compare your processes.

  1. Start with opening Power Automate and go to the tab Process Mining. The report is accessible on the main page.
  2. Report: When the report is loaded, it can look like this:
Image: Process Mining Case Summary.

3. Select Map

Select the Map tab to display the process map:

Image: Process Mining Map.

This is a screenshot of the process map from our example. On the map, there are separate locations at which actions (tasks) have taken place, as well as the time spent at each location and between locations. You can change the metric to, say, mean duration, to see how long each activity at a particular location takes on average.  

4. Use the Copilot to get started.

We provide you with suggestions for frequent prompts, but you can of course choose to enter whatever you want. In this case, we will use the suggested “provide the top insights” prompt.  

Image: Process Mining map with Copilot.

5. Copilot Generates

The Copilot generates a response based on the data in your process map. In the example, we can see that the Copilot has found the “BULK” as the longest running activity, and provided us with a list of the activities with the greatest number of repetitions:

Image: Process Mining map and Copilot generated answer.

6. Copilot Follow Up

We can also ask the Copilot follow-up questions. In this case, we will follow up with the suggested “How to identify my bottleneck?” and “Find my Bottleneck” prompts. The Copilot generates a message explaining what the bottleneck is and its mean duration. In this instance, since we have selected the metric Mean duration, we will generate an answer reflecting this metric.

Image: Process Mining map with Copilot generated answer.

The message we receive tells us that the Variant with the highest duration is “Variant 2” with a mean duration of 2 minutes and 47 seconds.
It also tells us that the activity with the highest mean duration is “BULK” with a mean duration of 15 minutes.

From this, we can draw the conclusion that “Variant 2” is the variant that takes the longest time to complete, and that the most amount of time is spent in the “BULK” location.

By using the Process Advisor for warehouse material movement analysis, we can streamline warehouse operations and ensure we don’t spend more time than we need on a particular task or operation.
Another example where the Process Advisor can be utilized to enhance operational fluidity in your warehouse is by comparing different methods of achieving a similar goal, to understand which method is more effective for reaching your desired goal. We will explain how to conduct such a comparison with a test case.

In our test-case, we will compare two different methods of picking goods in the Warehouse to figure out which picking method takes less time, so we can increase the Warehouse output.

Test Case: “Single order picking” vs. “Cluster picking”

In this test case, the user wants to know which method of picking is faster: “Single order picking” or “Cluster picking”. To compare the two, the user goes through the following steps. First, the user creates a hypothesis for the purpose of this test case: determining which picking method is faster.

Secondly, the user decides the scope of the test. For both methods, the user will have 5 sales orders with one to five different items per order, in different quantities. Both methods will use identical sales orders for test purposes.
In the Work Details screen, we can see the work details for the work that has been created.
The Variants are the different variants of work. In this instance, for work ID USMF-002327 with order number 002375 (displayed in the picture), the worker will “Pick” 1 piece of item LB0001 in 5 different variations (in this case, colors), then “Put” these 5 items away in the packing area (location “PACK”).

Image: Work details screenshot.
Image: Work details label(s).

With the “Single order picking” method, the worker picks one order at a time and puts it in the packing location. To clarify, the warehouse worker will go to each location where an item is located, pick and scan that item, repeat the process for each item in that order, take the order to the pack location, and then repeat with the next order. 

The worker goes to 5 different locations to pick items, then proceeds to the “PACK” location to put items away for packing. Then, the worker repeats the process for the other orders.

Image: Picking locations

After we have constructed our hypothesis and determined the scope, we can go ahead and prepare for the analysis.

First, we will have to deploy our process comparison. We head into Warehouse Management > Setup > Process Mining > Warehouse material process configuration, and in the taskbar, we select Deploy New Process. We select a fitting description as the Process Name, select company and number of months to load. In this test case, we will only be loading one month of data since we don’t need more for this test’s purposes.

Usually, you would want as much correct data (not corrupted or faulty data, since this will affect the analysis) and as much necessary data (the scope determines how much and what is necessary) as possible to get a high-quality analysis.  
When our process has been deployed, we can move on to the analysis and evaluate this process.

We load our process map into Power Automate, and in the beginning, it will look something like this:

Image: Process Map Starting view.

We can press the Play Animation button to get a representation of the process.

Image: Process Map Starting view.

In the Statistics tab, we can see basic information of the process.

Image: Process mining statistics tab overview.

In the Variants tab, we can view the different work-Variants. By selecting one, we can get an in-depth view of, in this case, “Variant 3”. We can see that in this variant, 6 cases occurred, the total duration was 8 minutes and 15 seconds, and the total active time was 8 minutes and 14 seconds.
In this case, the attribute selected is Zone. If we look closely at the Variants, we can see that “Variant 2” has 2 cases and the others have 1.

This means that two pieces of “work” that were scheduled were so similar that they could be grouped. From a warehouse management perspective, the operation is identical: the worker goes to one location, picks item(s) 1, goes to another location and picks item(s) 2, then puts them away in “PACK”. Thus, it is two “Pick” operations and one “Put”, and therefore they are grouped in this view.    

Image: Process mining variants tab zone overview.

We can also change the Variants’ view by changing the Attribute selected. In this case, we will change the attribute from Zone to Order number. This will change our view so that we see different Variants based on order number. It will in this case show us 5 variants, which at first can seem confusing. A new variant is displayed with these settings because the view now groups Variants by order number instead of zone, which means that we get one variant for each sales order we created, since all of them were different from each other. 

Image: Process mining variants tab order number overview.

In this instance, we can see the order numbers in the legend on the right side. This view tells us that we have 5 different order numbers, and the boxes below Variants Overview represent the number of work operations performed per order number. Looking at the case count per order number, in the case of “Variant 2” a total of 6 operations were performed (pick, pick, pick, pick, pick, put, as mentioned previously), and in the case of Variants 4 and 5, a total of 3 (Pick, Pick, Put).

For this scenario, it can be helpful to see how much work we are performing per event. To get that view, we can switch the Attribute to Work Quantity, which lets us see the quantity of work that needs to be performed for each event. In the example of “Variant 2”, the interface tells us that 6 events have taken place; in 5 of the events the quantity was 1, and in one of the events the quantity was 5. To put this into a warehouse perspective, this means that we performed 5 of the events once each, which for Variant 2 is “Pick item 1, Pick item 2, Pick item 3, Pick item 4, Pick item 5”, and one event where we “Put” away these items 5 times.
That single operation is performed 5 times and counts as one event because it is the same event occurring multiple times, whilst the other events, even though they are all “Pick” events, count as individual events because they pick different products, which are all in different locations. When we “Put” items away in the “PACK” location, we don’t put them in different locations, so it counts as one event.

Image: Process mining variants tab work quantity overview.

If we select Attribute by Work type, this becomes clear:

Image: Process mining variants tab work type overview.

We might want to see the location where the events took place. To do that, we can set Attribute to Location, and the view will show us the locations of the events below the header Variants overview.

Image: Process mining variants tab work location overview.

In this image, we can see the variants based on location. To put this into context, “Variant 6” tells us 6 events have taken place, all in different parts of the warehouse. For “Variant 10”, we can see that one event took place in “LEGOLOC301” and one in “PACK”.

Now, after we have made ourselves comfortable within the report, we can start analyzing our process. To do that, press the Process Compare button below Variants.

A view similar to this one will appear:

Image: Process compare variants tab location map overview.

In the process map displayed on the screen, we have set the Mining attribute to Location, and the Metric to Total duration. This will allow us to see the total amount of time spent in each location.

By changing the Metric to Total count, we can see the number of times an event took place in each location, as the picture below displays:

Image: Process compare variants tab location map overview.

The total amount of time spent in one location and number of cases per location might be valuable, but a more telling metric could be how much time we spent on average per location.

By switching the metric to mean duration, we can see the average time spent per location. This gives us yet another hint as to which part of the process takes the most time to manage. And if we want to see how it looks from a proportional perspective, we can toggle the percentage sign next to the Metric drop-down menu to achieve exactly that.

Image: Process compare variants tab location and mean duration map overview.

As we can see from the image above, LEGOLOC 201 is the location in which we spend the largest percentage of our time.
If we want to further examine what is going on in that location, we can do so by pressing the bar. This will change the view slightly, and a card with detailed information will appear on the right of the screen.  

Image: Process compare variants tab location map detailed view.

In the highlighted red box, we can see detailed performance data to further assess the performance in this location.

 Now, we have enough information to draw some conclusions on our own. We have identified zone LEGOLOC 201 as our “time-thief”, and we know that more than 1/3 of the time was spent on picking items in this zone.
To make the analysis process easier, Microsoft’s Copilot has been built into this feature.
By pressing the Copilot sign in the top-right corner, you will open the dialogue box where you can create a prompt and ask the Copilot about your process. The Copilot will suggest some common prompts, but you can of course create your own. In this case, we will ask the Copilot to summarize our process.   

Image: Process compare map and Copilot dialogue.
Image: Process compare map and Copilot generated answer.

As displayed in the picture, the Copilot will give us a summary of the process. Because we have selected to compare our first part of the test vs our default value (the red locations), it also summarizes the default value’s process.

We do get some information on how many events took place and so on, but we did not get the total case time, which was the value we wanted to find to confirm or deny our hypothesis. By asking the Copilot what the average case duration and the total case duration were, we received the answer that the mean case duration was 4 minutes and 18 seconds, and the total duration was 21 minutes and 31 seconds.

So, our answer in this case is that the Single order picking took 21 minutes and 31 seconds to complete.

Image: Process compare map and Copilot generated answer.

Now, we will compare the result to the cluster picking method, to see how they compare.

For context, cluster picking differs from single order picking in the sense that in cluster picking, workers pick multiple orders simultaneously and not one at a time. In this case, it means the worker will pick all 5 sales orders, then put them all away in the packing station at the same time, rather than picking an order, putting them away in the packing station, and repeating for next orders.

Image: Work clusters screenshot.

In this image, we can see the main difference between these picking methods. For cluster picking, we can see that the warehouse worker is tasked with picking 8 pieces of red Lego blocks (left image), and in the second screenshot (right) we can see how many and from which specific positions items should be picked.

Image: Work clusters screenshot with illustrations.

When all items have been picked, the Work status will be updated so all Cluster positions are “In process”.

Image: Work Cluster in progress.

Next task is to put all items in the packing station. When we have done that, all Cluster position Work statuses will be changed to Closed.

Image: Cluster Put screenshot.

As we can see in the image below, work status has been changed to Closed across the board.

Image: Work Clusters status closed.

Now, let’s jump back to the analysis. Start by creating a new process in the same way we did for single order picking and open the process map in Power Automate. In our test case, this is what we are shown on our screen.

Image: Process Compare map.

As we have already covered how choosing different metrics affects the process map and the information on display, we will not do that for this part of the test, since we know we need to compare location as the Mining attribute, and total duration as the Metric.

We will again use the help of the Copilot to evaluate the process map. Once again, we ask for a summary of the process.

Image: Process Compare map and Copilot generated insight.

Test Case Results

The summary from the Copilot tells us that this process started November 6th and ended after 8 minutes and 45 seconds.

This means we have successfully confirmed our hypothesis by using process mining and the Process Advisor.
Now we know for a fact that for one picker with 5 sales orders constructed in this manner, cluster picking is a much more efficient picking method compared to single order picking, since an identical amount of work took significantly less time to complete (8 minutes and 45 seconds versus 21 minutes and 31 seconds). Therefore, we can draw the conclusion that for all work with similar characteristics, we should prefer cluster picking over single order picking, at least if we want to increase warehouse output.

Keep in mind, harnessing the power of Process Advisor requires an analytical mindset and a structured approach. The sheer volume of headers, variants, locations, and numbers can be overwhelming. To navigate this complexity, emulate the structured methodology illustrated in this example. By having a clear understanding of your comparison and measurement objectives, and a strategy to achieve them, you’ll significantly enhance the outcomes derived from Process Advisor.

Essential skills for effective process mining:

  • Use a fact-based approach with warehouse data as the base.
  • Use a strategic and tactical approach throughout the analysis.
  • Unlike this example, a great way of using process mining is continuous analysis, where you monitor something over time, rather than the one-time analysis shown here (which it can also be used for).
  • Use quick data for immediate insights, and big data for continuous and conclusive analysis.
  • Master filtering to gain valuable insights and sort out what you believe is important.

Wealth of achievements made possible through process mining:

  • Identify areas in which processes can be improved.
  • Validate conformance of processes.
  • Do process simulation and predictive analysis.
  • Discover the most optimal paths for automation.

Conclusion:

The power of Process Advisor extends far beyond what we’ve explored in this blog. It’s a versatile tool that can be adapted to a myriad of scenarios, and this guide merely scratches the surface of its potential. We’ve used it here to streamline warehouse operations, but the possibilities are truly limitless.

We encourage you to dive in and experiment with Process Advisor. Use the scenario we’ve outlined as a starting point, but don’t stop there. Input your own warehouse data and see firsthand how Process Advisor can illuminate opportunities for efficiency and growth. The journey towards optimizing your warehouse output begins with the Process Advisor.

Learn More

Related documentation:

Overview of process mining in Power Automate – Power Automate | Microsoft Learn


MGDC for SharePoint FAQ: Is OneDrive included?


1. SharePoint datasets and OneDrive


 


When I describe the SharePoint datasets in Microsoft Graph Data Connect to someone, I frequently get this question: do Sites and Sharing Permissions cover only SharePoint or do they include OneDrive? The short answer is that OneDrive is included, but there is much more to say here…


 


2. OneDrive is a type of SharePoint site


 


For most technical intents and purposes, a OneDrive in your Microsoft 365 tenant is a SharePoint site with a specific template and permissions. It is basically a SharePoint site collection for personal use that comes preconfigured with permissions for the owner and nobody else. After that, you can upload/create files and decide to keep them private or share with others from there.


 


This special type of site was initially called a “Personal Site”, later was referred to as a “My Site” or “MySite”, then a “OneDrive for Business” (commonly abbreviated to “ODfB” or simply “ODB”). These days, we usually just call it a OneDrive and you can figure out if we’re talking about the consumer or business variety based on context.


 


Along the way, the purpose has always been the same: to allow someone in a tenant to store the information needed for their personal work, with the ability to share with others as necessary. As the name suggests, it’s your single drive in the cloud to store all your business-related personal files.


 


The personal sites for each user are typically created only when the user tries to access their OneDrive for the first time. SharePoint does offer administrators a mechanism to pre-provision accounts. You can read more about it at https://learn.microsoft.com/en-us/sharepoint/pre-provision-accounts.
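
As a hedged illustration (not taken from the linked article), pre-provisioning is commonly done from the SharePoint Online Management Shell. The admin URL below reuses the example tenant from the PowerShell sample later in this post, and the email addresses are placeholders.

# Pre-provision OneDrive (personal sites) for specific users.
# The tenant admin URL and the email addresses are placeholders.
Connect-SPOService -Url https://archimedes-admin.sharepoint.com
Request-SPOPersonalSite -UserEmails "user1@archimedes.onmicrosoft.com","user2@archimedes.onmicrosoft.com"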


 


But keep in mind that, when you use the Microsoft Graph Data Connect to pull the Sites dataset, you get all types of sites in the tenant and that does include OneDrives.


 


3. How can you tell them apart?


 


In the Sites dataset, you can tell a site is a OneDrive by looking at the RootWeb.WebTemplate (which is “SPSPERS” for OneDrive) or the RootWeb.WebTemplateId (which is 21 for OneDrive). Note that these are properties of the Root Web for the site (more on this later).


 


For the other Microsoft Graph Data Connect for SharePoint datasets, you can use the SiteId property to join with the Sites dataset and find the Template or Template Id. This is a reliable method and the recommended one.


 


Some of the datasets might also have a URL property which can be used to identify a OneDrive. For the Sharing Permissions dataset, for instance, an ItemURL that starts with “personal/” indicates a permission for a OneDrive. You can read more about OneDrive URLs at https://learn.microsoft.com/en-us/sharepoint/list-onedrive-urls.


 


Using the URL is probably OK for most tenants using OneDrive but might not work for other site types.
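
To make that concrete, here is a minimal, hedged sketch. It assumes the Sites dataset has been landed locally as a JSON Lines file (one site object per line; the file name is a placeholder) and uses the RootWeb.WebTemplateId value described above to count OneDrives.

# Hedged sketch: count OneDrive sites in a locally exported Sites dataset.
# The file name and JSON Lines layout are assumptions; RootWeb.WebTemplateId 21 = OneDrive (SPSPERS).
$allSites = Get-Content .\sites-dataset.json | ForEach-Object { $_ | ConvertFrom-Json }
$oneDrives = $allSites | Where-Object { $_.RootWeb.WebTemplateId -eq 21 }
Write-Output ("OneDrive sites: {0} of {1} total sites" -f $oneDrives.Count, $allSites.Count)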


 


4. Root Web


 


It is good to clarify why the Template and TemplateId properties come from the RootWeb property and are not properties of the site itself.


 


For starters, it’s important to understand the main SharePoint entities:



  1. There are many tenants.

  2. Tenants have Sites, also known as Site Collections.

  3. Sites (Site Collections) have Webs, also known as Subsites.

  4. Webs (Subsites) have Lists, some of which are called libraries or document libraries.

  5. Lists have List Items (document libraries have folders and documents)


 


As you can see, there is a hierarchy.


 


Hierarchy


 


 



The relationship between Sites and Webs is particularly interesting. When you create a Site, you must tell SharePoint the type of Site you want. That is used to create the Site and the main Web inside, called the RootWeb.


 


Every Site Collection has at least one Web and most have only one (the Root Web). The Site’s name and type (template) end up being stored in the Root Web. Most templates don’t even have an option to add more webs (subsites). I would recommend keeping things simple and having only one web per site.


 


Note: You will sometimes hear people refer to Webs as Sites, which is a term normally used for Site Collections. Since most Site Collections have only one Web, that is typically not a big issue. That can get a little confusing at times, so you might want to stick to using the unambiguous terms “Site Collections” and “Webs” to be extra clear.


 


5. Web Templates


 


When you create a Site Collection and its corresponding Root Web, you must choose a Web Template. Each Web Template comes with a few default lists and libraries.


 


Some of these Web Templates (like Team Sites and Communication Sites) help you get started with a new Site. Others are not meant to be created by end users but are used for specific scenarios (like the Compliance Policy Center, the Search Center or the Tenant Admin Site). As we mentioned before, one of these templates is the Personal Site or OneDrive.


 


Here’s a list of some common Web Templates used by SharePoint Online:

Web Template Id | Web Template       | Description
1               | STS                | Classic Team Site
16              | TENANTADMIN        | Tenant Admin Site
18              | APPCATALOG         | App Catalog Site
21              | SPSPERS            | OneDrive (Personal Site)
54              | SPSMSITEHOST       | My Site Host
56              | ENTERWIKI          | Enterprise Wiki
64              | GROUP              | Office 365 group-connected Team Site
68              | SITEPAGEPUBLISHING | Communication site
69              | TEAMCHANNEL        | Team Channel
90              | SRCHCENTERLITE     | Basic Search Center
301             | REDIRECTSITE       | Redirect Site
3500            | POLICYCTR          | Compliance Policy Center


 


Note: There are many more of these templates, not only the ones listed above. You can get a list of the templates available to you using the Get-SPOWebTemplate PowerShell cmdlet:


 


> Install-Module -Name Microsoft.Online.SharePoint.PowerShell
> Connect-SPOService -url https://archimedes-admin.sharepoint.com
> Get-SPOWebTemplate | Select Name,Title | Sort Name | Format-List

Name : BDR#0
Title : Document Center

Name : BICenterSite#0
Title : Business Intelligence Center

Name : BLANKINTERNETCONTAINER#0
Title : Publishing Portal

Name : COMMUNITY#0
Title : Community Site

Name : COMMUNITYPORTAL#0
Title : Community Portal

Name : DEV#0
Title : Developer Site

Name : EHS#1
Title : Team Site – SharePoint Online configuration

Name : ENTERWIKI#0
Title : Enterprise Wiki

Name : OFFILE#1
Title : Records Center

Name : PRODUCTCATALOG#0
Title : Product Catalog

Name : PROJECTSITE#0
Title : Project Site

Name : SITEPAGEPUBLISHING#0
Title : Communication site

Name : SRCHCEN#0
Title : Enterprise Search Center

Name : SRCHCENTERLITE#0
Title : Basic Search Center

Name : STS#0
Title : Team site (classic experience)

Name : STS#3
Title : Team site (no Microsoft 365 group)

Name : visprus#0
Title : Visio Process Repository


 


6. They are all in there…


 


So, I hope it’s clear that the Microsoft Graph Data Connect for SharePoint datasets (like Sites, Sharing Permissions and Groups) include information for all types of sites in the tenant, regardless of the Template they use. You can use the Sites dataset to understand Team Sites, OneDrives, and Communication Sites. The Sharing Permissions dataset includes permissions for all these different types of sites.


 


Note: For more information, visit the main blog at Links about SharePoint on MGDC.

Unlocking AI Skills: A Guide for Tech Students, Microsoft UK AI Challenge

Artificial intelligence (AI) is transforming the world of work, creating new opportunities and challenges for businesses and workers alike. According to a recent report by Microsoft and PwC, AI could boost the UK economy by £232 billion by 2030, but it also requires a significant upskilling of the workforce to ensure that everyone can benefit from it.


If you are a technology student or a young professional who wants to develop AI skills and prepare for the future of work, here are some tips and resources that can help you. The Microsoft UK AI & Copilot Skills Challenge starts February 20, 2024 at 8:00 AM GMT and ends on March 31, 2024 at 11:00 PM GMT.




  • Learn the basics of AI and its applications. AI is a broad field that encompasses many subdomains, such as machine learning, computer vision, natural language processing, and more. To get started, you can take online courses, such as Microsoft Learn, edX, or Coursera, that cover the fundamentals of AI and how it can be used to solve real-world problems. You can also explore Microsoft Learn AI resources, learning paths, and hands-on labs for various AI scenarios and communities.

  • Get hands-on experience with AI tools and platforms. To apply your AI knowledge and skills, you need to familiarize yourself with the tools and platforms that enable you to build, deploy, and manage AI solutions. For example, you can use Azure AI Studio, a cloud-based service that provides a comprehensive set of AI capabilities, such as cognitive services, machine learning, and conversational AI. You can also use Power Platform, a low-code/no-code platform that allows you to create AI-powered apps, workflows, and chatbots without writing code.

  • Join AI communities and events. One of the best ways to learn and grow your AI skills is to connect with other AI enthusiasts and experts, who can offer you guidance, feedback, and inspiration. You can join online or local AI communities, such as the Global AI Community, where you can network, share ideas, and collaborate on projects. You can also attend AI events, where you can hear from industry leaders, discover the latest trends, and showcase your work.

  • Keep up with the ethical and social implications of AI. As AI becomes more pervasive and powerful, it also raises important ethical and social questions, such as how to ensure fairness, accountability, transparency, and human dignity in AI systems. To be a responsible AI practitioner, you need to be aware of these issues and how to address them in your work. You can read books, articles, and reports, such as The Future Computed, AI Ethics, or Responsible AI, that explore the ethical and social dimensions of AI. You can also take courses that teach you how to design and implement AI solutions that align with ethical principles and social values.


AI is a fast-growing and exciting field that offers many opportunities for technology students and professionals. By following these tips and resources, you can develop AI skills that will help you succeed in the future of work. Remember, AI is not only about technology, but also about people, society, and the world. So, be curious, be creative, and be ethical, and you will be ready to make a positive impact with AI. 



Learn and develop essential AI and Copilot skills with the UK AI Skills Challenge


Get ahead with immersive and curated AI, Generative AI and Copilot training content across Microsoft products and services with four engaging themed challenges. Once you complete a challenge, you will receive a Microsoft UK AI & Copilot Skills Challenge badge of completion. For more info refer to the official rules.


As you progress through the challenges, you’ll have the chance to explore additional experiences tailored to your learning preferences and goals. Join the vibrant technical community in your local region, attend live sessions, build a powerful network, and build in-demand AI skills for today’s job market.


 







Generative AI


This challenge focuses on understanding Generative AI and Large Language Models. Discover the fundamentals of generative AI and get started with Azure OpenAI Service. You’ll learn more about prompt engineering, generating code with Azure OpenAI Service, large language models, and prompt flow to develop large language model apps.


Go to challenge



Copilot for Microsoft 365 – IT Pro Administration


This challenge is tailored for IT Pro Administrators seeking to leverage Copilot for Microsoft 365 effectively in their work environments. The series of modules covers a range of topics from basic introductions to advanced management techniques, ensuring a comprehensive learning experience.


Go to challenge





Copilot for Developers


This challenge is tailored for developers who want to learn how to build apps for Microsoft Teams and get to know Microsoft Copilot Studio. It includes a series of modules that will give you practical experience and valuable knowledge about creating, launching, and improving apps on these platforms.


Go to challenge



Machine Learning Challenge


Machine learning is at the core of artificial intelligence, and many modern services depend on predictive machine learning models. Learn how to use Azure Machine Learning to create and publish models without writing code. You’ll also explore the various developer tools you can use to interact with the workspace.


Go to challenge