Setup Power Platform Pipelines

Given that you need to set up Power Platform Pipelines, here’s a post for you! This post will walk you through how you can set up Power Platform Pipelines. Pre-Requisites Here’s what you need to set up in order to enable Power Platform Pipelines – Setting up Environments Here’s how you can set up your Environments in the – … Continue reading Setup Power Platform Pipelines


Logic Apps Aviators Newsletter – March 2024


Ace Aviator of the Month


 


March’s Ace Aviator: Luís Rigueira


 



What is your role and title? What are your responsibilities associated with your position?


I am an Integration Developer, and my key responsibilities consist of working with my team and alongside clients, making the transition and integration of their products and services smoother.


 


Can you provide some insights into your day-to-day activities and what a typical day in your role looks like?


Sure. Merging a portion of my activities, a typical day looks like this: I start by checking for any issues in our clients’ production environments to ensure everything’s running smoothly, and then my main activity is implementing cloud integration solutions with Azure Integration Services. Occasionally, I also help the team on on-premises projects using BizTalk Server.


 


One of my big activities is also going deep into Enterprise Integration features and crafting new ways to achieve specific tasks: doing proofs of concept for new features, exploring existing or new services, testing those solutions, and finding alternatives, for example, creating Azure Functions as an alternative to the Integration Account and testing those functions inside Logic App flows.


 


I’m always on the hunt for new solutions to any problems we face, and in doing so, there’s a lot of documenting everything we do. This documentation is more than just busy work; it streamlines our processes and guides our team and the community through troubleshooting. Because knowledge sharing is so important, I actively produce informative content for our blog and YouTube channel, writing posts and creating videos that share our experiences, solutions, and insights with a broader audience.


 


I also contribute to enhancing our team’s productivity by creating tools tailored to address specific issues or streamline processes that are later shared with the community.


 


What motivates and inspires you to be an active member of the Aviators/Microsoft community?


What really drives me to engage with the Aviators/Microsoft community is my passion for tackling challenges and finding solutions. There’s something incredibly rewarding about cracking a tough problem and then being able to pass on that knowledge to others. I believe we’ve all had that moment of gratitude towards someone who’s taken the time to document and share a solution to the exact issue we were facing. That cycle of giving and receiving is what motivates me the most. It’s about contributing to a community that has been so important in my own learning and problem-solving journey, and I’m inspired to give back and assist others in the same way.


 


Looking back, what advice do you wish you would have been told earlier on that you would give to individuals looking to become involved in STEM/technology?


I could say something about always having a passion for new technologies and staying up to date with what you are pursuing. There would be nothing wrong with that, but those sound like ready-made phrases, exchanged without considering each individual’s current situation.


 


In a world where mental health is so important, let me share a simple tale that resonates with anyone at a career crossroads, whether they are new and unsure what to do, just starting out, or contemplating a shift in direction. It’s a gentle reminder that venturing into new territories can be daunting but immensely rewarding, and that, at times, we may not even realize that our current path could be detrimental to our well-being, professional growth, and personal relationships.


 


“There was once a man who went into the wilds of Africa, believing himself to be a hunter for many years. Despite his efforts, he found himself unable to catch any game. Overwhelmed by frustration and feeling lost, he sought the guidance of a Shaman from a nearby tribe.


Confessing to the Shaman, he said, “Hunting is what I live for, but I’m hitting a wall. There’s simply nothing out there for me to hunt, and I don’t know what to do.”


The Shaman, who had seen many seasons and had a kind of wisdom you don’t come across every day, simply put his arm on the hunter’s shoulder, looked him in the eyes and said, “Really? Nothing to hunt for? This land has fed us for generations. There is plenty to hunt out there, and yet you cannot see it? Maybe the problem isn’t the land… allow me to ask you something very important: do you genuinely desire to be a hunter?”


 


This narrative goes much deeper than the act of hunting. It’s a reflection on our passions, how we confront our challenges, and the realization that our perspective might need a shift.


 


If our passions no longer ignite us, or if our efforts to chase them lead nowhere, it might be a sign to let go, not in defeat but in liberation. In the end, I want everyone to be happy with the career path they have chosen. So that would be my advice: read this simple tale, apply it to your current situation, and ask yourself, “Do I really want to keep doing what I am doing right now?” And if you find that your current path is not worth pursuing, if your mental health is not in shape, or if you are hitting a wall, then yes, it is time to take the step!


 


Imagine you had a magic wand that could create a feature in Logic Apps. What would this feature be and why?


In a world where AI moves at such a fast pace, one feature that I would personally like to have in Logic Apps is prompt-based, AI-generated Logic App flows. That means you would give the designer a prompt describing what you intend, and it would generate the most efficient flow for what you have described. Of course, you would still need to configure some things, but I think AI-generated flows could outline and cover many scenarios, making our processes faster and more efficient.


AI is here to stay, whether we like it or not; it isn’t going away, so we can either take advantage of it to create better, faster, and more efficient products, or fall behind while we watch others do it.


 


What are some of the most important lessons you’ve learned throughout your career that surprised you?


One of the most surprising yet vital lessons from my career is the central role of relationships in keeping the ship sailing smoothly. Having positive communication and nurturing a positive work environment are crucial elements that empower a team to deliver top-notch results, remain driven, and maximize their daily potential. A car has four tires, and you need them all to get home safely.




Customer Corner:


Power transmission operator ELES gains agility with Microsoft Dynamics 365


 




 


Check out this customer success story on how Microsoft is helping to keep Slovenia’s lights on by improving and modernizing ELES’ operations. In 2012, ELES turned to Microsoft when they needed a new enterprise resource planning (ERP) solution. Today, ELES uses Azure Logic Apps to connect their ERP with other systems, improving collaboration between departments, streamlining operations, and better managing their energy resources and infrastructure.




News from our product group:



Announcing the availability of TLS 1.3 in Azure API Management in Preview   


Read more about updates to TLS 1.3, the latest version of the internet’s most deployed security protocol. 





IBM MQ Built-in (In-App) connector and Queue Handles: The math behind Queue Handles   


For those using the IBM MQ Built-in (In-App) connector available in Logic Apps Standard, check out this article, which explains Queue Handles and how to calculate the maximum value to set on your IBM MQ server.



Public Preview of Azure OpenAI and AI Search in-app connectors for Logic Apps (Standard)   


Check out this article on our two new built-in connectors (Azure OpenAI and Azure AI Search) that are now available for public preview. 





Common issues of the Azure Automation connector in Logic App (Consumption) and Logic App Standard


Learn about various issue scenarios related to the Azure Automation connector in both Logic App Consumption and Standard, along with their causes and resolutions.



Identify Logic Apps Consumption that are using deprecated V1 Actions and Triggers for SQL Connector   


V1 actions and triggers of the SQL Connector for Logic Apps will be deprecated by the end of March 2024. In this article, learn how to use a PowerShell script to identify the Logic Apps still using the deprecated SQL operations so that you can change them to the V2 equivalents.
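
The linked post ships the official script; as a rough, hedged sketch of the general approach, the fragment below walks all Consumption workflows in the current subscription and flags definitions whose SQL connector actions still use V1-style paths. The "/datasets/default" path heuristic and the loose "sql" match are illustrative assumptions, not the official detection logic; it requires the Az PowerShell modules and a prior Connect-AzAccount.

# Minimal sketch (assumptions noted above).
$workflows = Get-AzResource -ResourceType 'Microsoft.Logic/workflows'
foreach ($wf in $workflows) {
    $app  = Get-AzLogicApp -ResourceGroupName $wf.ResourceGroupName -Name $wf.Name
    $json = $app.Definition.ToString()
    # V2 SQL operations call paths under /v2/datasets/...; V1 operations
    # call /datasets/default/... instead, so treat those as suspects.
    if ($json -like '*sql*' -and $json -like '*"/datasets/default*') {
        [pscustomobject]@{ LogicApp = $wf.Name; ResourceGroup = $wf.ResourceGroupName }
    }
}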







Integration Service Environment to Logic Apps Standard Migration    


ISE’s retirement date is August 31st, 2024, so make sure you migrate any Logic Apps running on ISE to Logic Apps Standard. Check out this video guide from our FastTrack team that walks you through the whole process!



 




News from our community:


From BizTalk to Logic Apps: A Real-World Integration Journey – February 2024 


Post by Houston Azure User Group 


 


Check out this recording from the February 2024 meetup of the Houston Azure User Group, where Azure customers dive into their journey from on-premises BizTalk to Azure Logic Apps hosted in an Integration Service Environment (ISE).


 


Deploying Multi-Line Values in ARM Templates   


Post by Harris Kristanto 


 


If you’re deploying an ARM template that references multi-line secrets stored in Azure Key Vault, then make sure to read this article from Harris!


 


Friday Fact: When a HTTP Request is Received trigger can accept more http methods other than POST   


Post by Luís Rigueira and Sandro Pereira


 


Check out this post and video from collaborators Luís and Sandro and learn about other HTTP methods you can use in your Logic Apps. 
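
For a quick illustration, in a Consumption workflow definition the accepted verb is controlled by the trigger’s method input. The snippet below restricts the trigger to GET; the trigger name “manual” is the designer default, and the empty schema is just a placeholder:

"triggers": {
  "manual": {
    "type": "Request",
    "kind": "Http",
    "inputs": {
      "method": "GET",
      "schema": {}
    }
  }
}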


 


Building an API to call Logic Apps Consumption   


Post by Mike Stephenson 


 


Want to make life a bit easier and learn how to build an APIM API that exposes a Consumption Logic App, all with Terraform? Then check out Mike’s new video.


 


Using Managed Identity with Consumption and Standard Logic Apps with Azure Service Bus   


Post by Stephen W Thomas 


 


Read Stephen’s quick tips and tricks for using Azure Service Bus to grant access to specific consumption or standard edition Logic Apps.


 


Easy Auth | Standard Logic App with Azure API Management   


Post by Andrew Wilson 


 


Learn about using Easy Auth with Logic App Standard and restricting access to API Management as a backend.


 


Migrating from MuleSoft to Azure Integration Services (AIS): Why and How   


Post by Derek Marley and Tim Bieber 


 


Watch this recording from a webinar hosted by Derek and Tim as they talk about the benefits of Azure’s ecosystem and a step-by-step strategy for a smooth transition from MuleSoft to AIS.


 


Microsoft Fabric: Automating Fabric Capacity Scaling with Azure Logic Apps   


Post by Soheil Bakhshi 


 


In this blog post, learn from Soheil how to efficiently scale your Fabric capacity up and down based on specific workloads.


 


What is Azure NAT Gateway | What is SNAT Port Exhaustion | Azure Standard Logic App Static Outbound IP Address   


Post by Sri Gunnala 


 


Learn from Sri about Azure NAT Gateway and how to use it with Azure Logic Apps to solve business problems.

Process Advisor for Supply Chain and Warehousing.


Editor: Denis Conway

Who should use it, why and for what?

Introduction

Performance evaluation has been revolutionized by technology, extending its reach to the individual level. Consider health apps on your smartphone. They gather data breadcrumbs from your daily activities, providing analysis of your movement patterns. This isn’t a generic data compilation, but a near-accurate reflection of your physical activity during a specific period.

In the future, it’s conceivable that these apps might be equipped with an AI companion, or Copilot, to guide your next steps based on your past activities. It could suggest rest days or additional exercise to help you achieve your personal health goals.

This concept of performance evaluation based on collected data is the bedrock of process mining and process comparison. Our Copilot functionality adds a layer of assistance, enabling you to make informed decisions about your warehouse operations.

In this context, Copilot can help you optimize warehouse processes. It can identify bottlenecks in certain processes or compare different methods to achieve the same goal, empowering you to choose the most optimal method for your specific case.

In this blog, we’ll explore the essence of this feature, its intended audience, and how and why you should leverage it for your warehousing operations.

Process Mining Insights:

At first glance, using Process Advisor for material movement analysis is easy. The setup process is straightforward:

  1. Go to Warehouse Management > Setup > Process Mining > Warehouse material movement configuration. In the taskbar, select Deploy New Process.
  2. The configuration wizard will open. Press Next, then enter a name in the Process Name field, choose the company, choose the number of months of data to load (12 months = data from the latest 12 months), and choose the appropriate Activity. Press Next.
  3. The process is deployed.

The configuration wizard looks like this:

Image: Configuration wizard screenshot.

The easy part is now complete. We have set up a process, named it, and loaded 12 months of data to prepare for our analysis. The difficult part is making sense of our data and using it to make decisions to improve our warehouse output.

Therefore, we will provide some real-life examples of how to use the data analysis functionality to understand your processes, plus a scenario where we evaluate two different methods and use Process Advisor to figure out which method would be preferred for our business operations.

Analysis of data

There are multiple ways to analyze your process data to understand and compare your processes.

  1. Start by opening Power Automate and going to the Process Mining tab. The report is accessible on the main page.
  2. Report: When the report is loaded, it can look like this:
Image: Process Mining Case Summary.

3. Select Map

Select the Map tab to display the process map:

Image: Process Mining Map.

This is a screenshot of the process map from our example. On the map, there are separate locations where actions (tasks) have taken place, as well as the time spent at each location and between locations. You can change the time unit to, say, mean duration, to see how long each activity at a particular location takes on average.

4. Use Copilot to get started.

We provide suggestions for frequent prompts, but you can of course enter whatever you want. In this case, we will use the suggested “provide the top insights” prompt.

Image: Process Mining map with Copilot.

5. Copilot Generates

Copilot generates a response based on the data in your process map. In the example, we can see that Copilot has found “BULK” to be the longest-running activity and has provided us with a list of the activities with the greatest number of repetitions:

Image: Process Mining map and Copilot generated answer.

6. Copilot Follow Up

We can also ask Copilot follow-up questions. In this case, we will follow up with the suggested “How to identify my bottleneck?” and “Find my Bottleneck” prompts. Copilot generates a message explaining what the bottleneck is and its mean duration. In this instance, since we have selected the Mean duration metric, the answer reflects this metric.

Image: Process Mining map with Copilot generated answer.

The message we receive tells us that the Variant with the highest duration is “Variant 2” with a mean duration of 2 minutes and 47 seconds.
It also tells us that the activity with the highest mean duration is “BULK” with a mean duration of 15 minutes.

From this, we can draw the conclusion that “Variant 2” is the variant that takes the longest time to complete, and that the most amount of time is spent in the “BULK” location.

By using Process Advisor for warehouse material movement analysis, we can streamline warehouse operations and ensure we don’t spend more time than we need on a particular task or operation.
Another example where Process Advisor can be used to enhance operational fluidity in your warehouse is comparing different methods of achieving a similar goal, to understand which method is more effective at reaching your desired outcome. We will explain how to conduct such a comparison with a test case.

In our test case, we will compare two different methods of picking goods in the warehouse to figure out which picking method takes less time, so we can increase warehouse output.

Test Case: “Single order picking” vs “Cluster picking”

In this test case, the user wants to know which method of picking is faster: “Single order picking” or “Cluster picking”. To compare the two, the user goes through the following steps. First, the user formulates a hypothesis for the test case; here, the user wants to determine which picking method is faster.

Secondly, the user decides the scope of the test. For both methods, the user will have 5 sales orders with one to five different items per order, in different quantities. Both methods will use identical sales orders for test purposes.
In the Work Details screen, we can see the details of the work that has been created.
The Variants are the different variants of work. In this instance, for work ID USMF-002327 with order number 002375 (displayed in the picture), the worker will “Pick” 1 piece of item LB0001 in 5 different variations (in this case colors), then “Put” these 5 items away in the packing area (location “PACK”).

Image: Work details screenshot.
Image: Work details label(s).

With the “Single order picking” method, the worker picks one order at a time and puts it in the packing location. To clarify, the warehouse worker goes to each location where an item is located, picks and scans that item, repeats the process for each item in that order, takes the order to the pack location, and then repeats with the next order.

The worker goes to 5 different locations to pick items, then proceeds to the “PACK” location to put the items away for packing. Then, the worker repeats the process for the other orders.

Image: Picking locations

After we have constructed our hypothesis and determined the scope, we can go ahead and prepare for the analysis.

First, we have to deploy our process comparison. We head into Warehouse Management > Setup > Process Mining > Warehouse material process configuration, and in the taskbar we select Deploy New Process. We enter a fitting description as the Process Name and select the company and the number of months to load. In this test case, we will only load one month of data, since we don’t need more for this test’s purposes.

Usually, you would want as much correct data (not corrupted or faulty data, since that will affect the analysis) as is necessary (the scope determines how much and what is needed), to get a high-quality analysis.
When our process has been deployed, we can move on to the analysis and evaluate this process.

We load our process map into Power Automate, and in the beginning, it will look something like this:

Image: Process Map Starting view.

We can press the Play Animation button to get a representation of the process.

Image: Process Map Starting view.

In the Statistics tab, we can see basic information about the process.

Image: Process mining statistics tab overview.

In the Variants tab, we can view the different work variants. By selecting one, we can get an in-depth view of, in this case, “Variant 3”. We can see that in this variant, 6 cases occurred, the total duration was 8 minutes and 15 seconds, and the total active time was 8 minutes and 14 seconds.
In this case, the attribute selected is Zone. If we look closely at the variants, we can see that “Variant 2” has 2 cases and the others have 1.

This means that two pieces of scheduled work were so similar that they could be grouped, because from a warehouse management perspective the operation is identical: the worker goes to one location and picks item(s) 1, goes to another location and picks item(s) 2, then puts them away in “PACK”. Thus, each is two “Pick” operations and one “Put”, and therefore they are grouped in this view.

Image: Process mining variants tab zone overview.

We can also change the variants view by changing the selected attribute. In this case, we will change the attribute from Zone to Order number. The view now shows 5 variants, which can seem confusing at first; because the view groups variants by order number instead of zone, we get one variant for each sales order we created, since all of them were different from each other.

Image: Process mining variants tab order number overview.

In this instance, we can see the order numbers in the legend on the right side. This view tells us that we have 5 different order numbers, and the boxes below the Variants Overview represent the number of work operations performed per order number. Looking at the case count per order number: for “Variant 2” a total of 6 operations were performed (pick, pick, pick, pick, pick, put, as mentioned previously), while Variants 4 and 5 each have a case count of 3 (pick, pick, put).

For this scenario, it can be helpful to see how much work we perform per event. If we want a view showing work per event, we can switch the attribute to Work Quantity, which in this instance lets us see the quantity of work performed for each event. In the example of “Variant 2”, the interface tells us that 6 events took place: in 5 of them the quantity was 1, and in one of them the quantity was 5. To put this into a warehouse perspective, we performed 5 events once each (for Variant 2: “Pick item 1, Pick item 2, Pick item 3, Pick item 4, Pick item 5”) and one “Put” event where we put these items away 5 times.
That single put operation is performed 5 times yet counts as one event, because it is the same event occurring multiple times; the picks, even though they are all “Pick” events, count as individual events because they pick different products, which are all in different locations. When we put items away in the “PACK” location, we don’t put them in different locations, so it counts as one event.

Image: Process mining variants tab work quantity overview.

If we select Attribute by Work type, this becomes clear:

Image: Process mining variants tab work type overview.

We might want to see where the events took place. To do that, we can set the attribute to Location, and the view will show the locations of the events below the Variants overview header.

Image: Process mining variants tab work location overview.

In this image, we can see the variants based on location. To put this into context, “Variant 6” tells us 6 events have taken place, all in different parts of the warehouse. For “Variant 10”, we can see that one event took place in “LEGOLOC301” and one in “PACK”.

Now that we have made ourselves comfortable with the report, we can start analyzing our process. To do that, press the Process Compare button below Variants.

A view similar to this one will appear:

Image: Process compare variants tab location map overview.

In the process map displayed on the screen, we have set the Mining attribute to Location and the Metric to Total duration. This allows us to see the total amount of time spent at each location.

By changing the Metric to Total count, we can see the number of times an event took place in each location, as the picture below displays:

Image: Process compare variants tab location map overview.

The total amount of time spent in one location and the number of cases per location might be valuable, but a more telling metric could be how much time we spend on average per location.

By switching the metric to Mean duration, we can see the average time spent per location. This gives us yet another hint about which part of the process takes the most time to manage. And if we want to see it from a proportional perspective, toggling the percentage sign next to the Metric drop-down menu achieves exactly that.

Image: Process compare variants tab location and mean duration map overview.

As we can see from the image above, LEGOLOC 201 is the location where we spend the largest percentage of our time.
If we want to examine further what is going on in that location, we can press its bar. This changes the view slightly, and a card with detailed information appears on the right of the screen.

Image: Process compare variants tab location map detailed view.

In the highlighted red box, we can see detailed performance data to further assess the performance in this location.

Now we have enough information to draw some conclusions on our own. We have identified zone LEGOLOC 201 as our “time thief”, and we know that more than a third of the time was spent picking items in this zone.
To make the analysis process easier, Microsoft’s Copilot has been built into this feature.
By pressing the Copilot icon in the top-right corner, you open the dialogue box where you can write a prompt and ask Copilot about your process. Copilot will suggest some common prompts, but you can of course create your own. In this case, we will ask Copilot to summarize our process.

Image: Process compare map and Copilot dialogue.
Image: Process compare map and Copilot generated answer.

As displayed in the picture, Copilot gives us a summary of the process. Because we have chosen to compare the first part of our test against our default value (the red locations), it also summarizes the default value’s process.

We do get some information on how many events took place and so on, but we did not get the total case time, which was the value we wanted in order to confirm or reject our hypothesis. By asking Copilot for the average case duration and the total case duration, we received the answer that the mean case duration was 4 minutes and 18 seconds, and the total duration was 21 minutes and 31 seconds.

So, our answer in this case is that the Single order picking took 21 minutes and 31 seconds to complete.

Image: Process compare map and Copilot generated answer.

Now, we will compare the result to the cluster picking method, to see how they compare.

For context, cluster picking differs from single order picking in that workers pick multiple orders simultaneously rather than one at a time. In this case, it means the worker will pick all 5 sales orders, then put them all away at the packing station at the same time, rather than picking an order, putting it away at the packing station, and repeating for the next order.

Image: Work clusters screenshot.

In this image, we can see the main difference between these picking methods. For cluster picking, we can see that the warehouse worker is tasked with picking 8 pieces of red Lego blocks (left image), and in the second screenshot (right) we can see how many and from which specific positions items should be picked.

Image: Work clusters screenshot with illustrations.

When all items have been picked, the Work status will be updated so all Cluster positions are “In process”.

Image: Work Cluster in progress.

The next task is to put all items in the packing station. When we have done that, all cluster position work statuses change to Closed.

Image: Cluster Put screenshot.

As we can see in the image below, work status has been changed to Closed across the board.

Image: Work Clusters status closed.

Now, let’s jump back to the analysis. Start by creating a new process in the same way we did for single order picking and open the process map in Power Automate. In our test case, this is what we are shown on our screen.

Image: Process Compare map.

As we have already covered how choosing different metrics affects the process map and the information on display, we will not do that for this part of the test, since we know we need to compare location as the Mining attribute, and total duration as the Metric.

We will again use the help of the Copilot to evaluate the process map. Once again, we ask for a summary of the process.

Image: Process Compare map and Copilot generated insight.
Test Case Results

The summary from the Copilot tells us that this process started November 6th and ended after 8 minutes and 45 seconds.

This means we have successfully confirmed our hypothesis by using process mining and Process Advisor.
Now we know that for one picker with 5 sales orders constructed in this manner, cluster picking is a much more efficient picking method than single order picking, since an identical amount of work took significantly less time to complete: 8 minutes and 45 seconds versus 21 minutes and 31 seconds, roughly a 59% reduction in picking time. Therefore, we can draw the conclusion that for all work with similar characteristics, we should prefer cluster picking over single order picking, at least if we want to increase warehouse output.

Keep in mind, harnessing the power of Process Advisor requires an analytical mindset and a structured approach. The sheer volume of headers, variants, locations, and numbers can be overwhelming. To navigate this complexity, emulate the structured methodology illustrated in this example. By having a clear understanding of your comparison and measurement objectives, and a strategy to achieve them, you’ll significantly enhance the outcomes derived from Process Advisor.

Essential skills for effective process mining:

  • Use a fact-based approach with warehouse data as the base.
  • Use a strategic and tactical approach throughout the analysis.
  • A great way of using process mining is continuous analysis, where you monitor something over time; it can also be used for one-time analysis, as in this example.
  • Use quick data for immediate insights, and big data for continuous and conclusive analysis.
  • Master filtering to gain valuable insights and sort out what you believe is important.
Achievements made possible through process mining:
  • Identify areas in which processes can be improved.
  • Validate conformance of processes.
  • Perform process simulation and predictive analysis.
  • Discover the most optimal paths for automation.
Conclusion:

The power of Process Advisor extends far beyond what we’ve explored in this blog. It’s a versatile tool that can be adapted to a myriad of scenarios, and this guide merely scratches the surface of its potential. We’ve used it here to streamline warehouse operations, but the possibilities are truly limitless.

We encourage you to dive in and experiment with Process Advisor. Use the scenario we’ve outlined as a starting point, but don’t stop there. Input your own warehouse data and see firsthand how Process Advisor can illuminate opportunities for efficiency and growth. The journey towards optimizing your warehouse output begins with the Process Advisor.

Learn More

Related documentation:

Overview of process mining in Power Automate – Power Automate | Microsoft Learn

The post Process Advisor for Supply Chain and Warehousing. appeared first on Microsoft Dynamics 365 Blog.



Pass Entity parameter from Power Automate Flow to an Action

Let’s say that you want to run a Power Automate Flow on a set of Dataverse records, and those records will be referenced in your C# plugins. The next step is to create an Action for this and register it in the Plugin Registration Tool. In case you are new to plugins in CRM, … Continue reading Pass Entity parameter from Power Automate Flow to an Action


MGDC for SharePoint FAQ: Is OneDrive included?


1. SharePoint datasets and OneDrive


 


When I describe the SharePoint datasets in Microsoft Graph Data Connect to someone, I frequently get this question: do Sites and Sharing Permissions cover only SharePoint or do they include OneDrive? The short answer is that OneDrive is included, but there is much more to say here…


 


2. OneDrive is a type of SharePoint site


 


For most technical intents and purposes, a OneDrive in your Microsoft 365 tenant is a SharePoint site with a specific template and permissions. It is basically a SharePoint site collection for personal use that comes preconfigured with permissions for the owner and nobody else. After that, you can upload/create files and decide to keep them private or share with others from there.


 


This special type of site was initially called a “Personal Site”, later was referred to as a “My Site” or “MySite”, then a “OneDrive for Business” (commonly abbreviated to “ODfB” or simply “ODB”). These days, we usually just call it a OneDrive and you can figure out if we’re talking about the consumer or business variety based on context.


 


Along the way, the purpose has always been the same: to allow someone in a tenant to store information needed for their personal work, with the ability to share with others as necessary. As the name suggests, it’s your single drive in the cloud to store all your business-related personal files.


 


The personal sites for each user are typically created only when the user tries to access their OneDrive for the first time. SharePoint does offer administrators a mechanism to pre-provision accounts. You can read more about it at https://learn.microsoft.com/en-us/sharepoint/pre-provision-accounts.
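
As a hedged example of that pre-provisioning mechanism, the SharePoint Online management shell exposes the Request-SPOPersonalSite cmdlet; the tenant URL and user addresses below are placeholders, and the linked docs cover limits and details:

# Connect to the tenant admin endpoint, then queue OneDrive creation
# for specific users (placeholder tenant and addresses).
Connect-SPOService -Url https://contoso-admin.sharepoint.com
Request-SPOPersonalSite -UserEmails @("user1@contoso.com", "user2@contoso.com")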


 


But keep in mind that, when you use Microsoft Graph Data Connect to pull the Sites dataset, you get all types of sites in the tenant, and that does include OneDrives.


 


3. How can you tell them apart?


 


In the Sites dataset, you can tell a site is a OneDrive by looking at the RootWeb.WebTemplate (which is “SPSPERS” for OneDrive) or the RootWeb.WebTemplateId (which is 21 for OneDrive). Note that these are properties of the Root Web for the site (more on this later).


 


For the other Microsoft Graph Data Connect for SharePoint datasets, you can use the SiteId property to join with the Sites dataset and find the Template or Template Id. This is a reliable method and the recommended one.


 


Some of the datasets might also have a URL property which can be used to identify a OneDrive. For the Sharing Permissions dataset, for instance, an ItemURL that starts with “personal/” indicates a permission for a OneDrive. You can read more about OneDrive URLs at https://learn.microsoft.com/en-us/sharepoint/list-onedrive-urls.


 


Using the URL is probably OK for most tenants using OneDrive but might not work for other site types.
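
Putting the reliable pieces together, here is a minimal PowerShell sketch that classifies Sharing Permissions rows as OneDrive or not by joining on SiteId against the Sites dataset. It assumes you have landed both datasets as local JSON-lines files with the property names used in this post (Id, RootWeb.WebTemplateId, SiteId, ItemURL); the file names and layout are illustrative assumptions.

# Build a SiteId -> web template id lookup from the Sites dataset.
$templateBySite = @{}
Get-Content .\Sites.jsonl | ForEach-Object {
    $site = $_ | ConvertFrom-Json
    $templateBySite[$site.Id] = $site.RootWeb.WebTemplateId
}

# Template id 21 ("SPSPERS") marks a OneDrive; tag each permission row.
Get-Content .\SharingPermissions.jsonl | ForEach-Object {
    $perm = $_ | ConvertFrom-Json
    [pscustomobject]@{
        ItemURL    = $perm.ItemURL
        IsOneDrive = ($templateBySite[$perm.SiteId] -eq 21)
    }
}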


 


4. Root Web


 


It is worth clarifying why the Template and TemplateId properties come from the RootWeb property rather than being properties of the site itself.


 


For starters, it’s important to understand the main SharePoint entities:



  1. There are many tenants.

  2. Tenants have Sites, also known as Site Collections.

  3. Sites (Site Collections) have Webs, also known as Subsites.

  4. Webs (Subsites) have Lists, some of which are called libraries or document libraries.

  5. Lists have List Items (document libraries have folders and documents).


 


As you can see, there is a hierarchy.


 


Image: Hierarchy.


 


 



The relationship between Sites and Webs is particularly interesting. When you create a Site, you must tell SharePoint the type of Site you want. That is used to create the Site and the main Web inside, called the RootWeb.


 


Every Site Collection has at least one Web, and most have only one (the Root Web). The Site’s name and type (template) end up being stored in the Root Web. Most templates don’t even have an option to add more webs (subsites). I would recommend keeping things simple and having only one web per site.


 


Note: You will sometimes hear people refer to Webs as Sites, which is a term normally used for Site Collections. Since most Site Collections have only one Web, that is typically not a big issue. That can get a little confusing at times, so you might want to stick to using the unambiguous terms “Site Collections” and “Webs” to be extra clear.


 


5. Web Templates


 


When you create a Site Collection and its corresponding Root Web, you must choose a Web Template. Each Web Template comes with a few default lists and libraries.


 


Some of these Web Templates (like Team Sites and Communication Sites) help you get started with a new Site. Others are not meant to be created by end users but are used for specific scenarios (like the Compliance Policy Center, the Search Center or the Tenant Admin Site). As we mentioned before, one of these templates is the Personal Site or OneDrive.


 


Here’s a list of some common Web Templates used by SharePoint Online:

Id    Web Template        Description
1     STS                 Classic Team Site
16    TENANTADMIN         Tenant Admin Site
18    APPCATALOG          App Catalog Site
21    SPSPERS             OneDrive (Personal Site)
54    SPSMSITEHOST        My Site Host
56    ENTERWIKI           Enterprise Wiki
64    GROUP               Office 365 group-connected Team Site
68    SITEPAGEPUBLISHING  Communication Site
69    TEAMCHANNEL         Team Channel
90    SRCHCENTERLITE      Basic Search Center
301   REDIRECTSITE        Redirect Site
3500  POLICYCTR           Compliance Policy Center


 


Note: There are many more of these templates, not only the ones listed above. You can get a list of the templates available to you using the Get-SPOWebTemplate PowerShell cmdlet:


 


> Install-Module -Name Microsoft.Online.SharePoint.PowerShell
> Connect-SPOService -url https://archimedes-admin.sharepoint.com
> Get-SPOWebTemplate | Select Name,Title | Sort Name | Format-List

Name : BDR#0
Title : Document Center

Name : BICenterSite#0
Title : Business Intelligence Center

Name : BLANKINTERNETCONTAINER#0
Title : Publishing Portal

Name : COMMUNITY#0
Title : Community Site

Name : COMMUNITYPORTAL#0
Title : Community Portal

Name : DEV#0
Title : Developer Site

Name : EHS#1
Title : Team Site – SharePoint Online configuration

Name : ENTERWIKI#0
Title : Enterprise Wiki

Name : OFFILE#1
Title : Records Center

Name : PRODUCTCATALOG#0
Title : Product Catalog

Name : PROJECTSITE#0
Title : Project Site

Name : SITEPAGEPUBLISHING#0
Title : Communication site

Name : SRCHCEN#0
Title : Enterprise Search Center

Name : SRCHCENTERLITE#0
Title : Basic Search Center

Name : STS#0
Title : Team site (classic experience)

Name : STS#3
Title : Team site (no Microsoft 365 group)

Name : visprus#0
Title : Visio Process Repository


 


6. They are all in there…


 


So, I hope it’s clear that the Microsoft Graph Data Connect for SharePoint datasets (like Sites, Sharing Permissions and Groups) include information for all types of sites in the tenant, regardless of the Template they use. You can use the Sites dataset to understand Team Sites, OneDrives, and Communication Sites. The Sharing Permissions dataset includes permissions for all these different types of sites.


 


Note: For more information, visit the main blog at Links about SharePoint on MGDC.