This article is contributed. See the original author and article here.
Hey! Rob Greene again. Been on a roll with all things crypto as of late, and you are not going to be disappointed with this one either!
Background
Many know that Remote Desktop Services uses a self-signed certificate by default for the TLS connection from the RDS client to the RDS server over TCP port 3389. However, Remote Desktop Services can be configured to enroll for a certificate from an Enterprise CA instead of continuing to use those annoying self-signed certificates everywhere.
I know there are other blogs out there that cover setting up the certificate template, and the group policy, but what if I told you most of the blogs that I have seen on this setup are incomplete, inaccurate, and do not explain what is happening with the enrollment and subsequent renewals of the RDS certificate!? I know… Shocker!!!
If you are a pretty regular consumer of the AskDS blog content, you know how we love to recommend using one certificate on the server for a specific Enhanced Key Usage (EKU), and making sure that you have all the information required on the certificate so that it works with all applications that need to use it.
This certificate is no different. I would recommend that the certificate that is used ONLY has the EKU for Remote Desktop Authentication and DOES NOT have an EKU of Server Authentication at all. The reason is that this certificate should not be controlled or maintained via autoenrollment/renewal behaviors. It needs to be maintained by the Remote Desktop Configuration service, and you do not want certificates used by other applications being replaced by a service like this, as that will cause issues in the long run.
There is a group policy setting that can be enabled to configure the Remote Desktop Service to enroll for the specified certificate and give the NT AUTHORITY\NetworkService account permission to the certificate's private key, which is a requirement for this to work.
The interesting thing about this is that you would think the Remote Desktop Services service would be responsible for enrolling for this certificate; however, it is the Remote Desktop Configuration (SessionEnv) service that is responsible for initial certificate requests as well as certificate renewals.
It is common to see the RDS Authentication Certificate template configured for autoenrollment; however, this is one of the worst things you can do, and it WILL cause issues with Remote Desktop Services once the certificate renewal timeframe comes around. Autoenrollment will archive the existing certificate, causing RDS to no longer be able to find it; then, when you require TLS on the RDS listener, users will fail to connect to the server. At some point the Remote Desktop Configuration service will replace the newly issued certificate with yet another one, because it maintains the thumbprint of the certificate that RDS should be using within WMI: when it tries to locate the original thumbprint and cannot find it, it will attempt to enroll for a new certificate at the next service start. This is generally when the cases start rolling in to the Windows Directory Services team, because it appears to be a certificate issue even though it is a Remote Desktop Services configuration issue.
What we want to do is first make sure that all the steps are taken to properly configure the environment so that the Remote Desktop Configuration service is able to properly issue certificates.
The Steps
Like everything in IT, there is a list of steps that need to be completed to get this set up properly.
Configure the certificate template and add it to a Certification Authority to issue the template.
Configure the Group Policy setting.
Configuring the Certificate Template
The first step in the process is to create and configure the certificate template that we want to use:
1. Log on to a computer that has the Active Directory Certificate Services Tools from Remote Server Administration Tools (RSAT) installed, or a Certification Authority within the environment.
2. Launch CertTmpl.msc (the Certificate Templates MMC).
3. Find the template named Computer, right click on it, and select Duplicate Template.
4. On the Compatibility tab, select up to Windows Server 2012 R2 for both Certification Authority and Certificate recipient. Going above this might cause issues with CEP/CES environments.
5. On the General tab, we need to give the template a name and validity period.
a. Type in a good descriptive name in the Template display name field.
b. If you would like to change the Validity period, you can do that as well.
c. You should NOT check the box Publish certificate in Active Directory.
NOTE: Make sure to copy the value in the Template name field, as this is the name that you will need to type in the group policy setting. Normally it will be the display name without any spaces in the name, but do not rely on this. Use the value you see during template creation or when looking back at the template later.
6. On the Extensions tab, the Enhanced Key Usage / Application Policies need to be modified.
a. Select Application Policies, and then click on the Edit button.
b. Select Client Authentication and Server Authentication (together or individually) and click the Remove button.
c. Click the Add button, and then click on the New button if you need to create the Application Policy for Remote Desktop Authentication. Otherwise find the Remote Desktop Authentication policy in the list and click the OK button.
d. If you need to create the Remote Desktop Authentication application policy, click the Add button, and then for the Name type in Remote Desktop Authentication, and type in 1.3.6.1.4.1.311.54.1.2 for the Object identifier value, and click the OK button.
e. Verify the newly created Remote Desktop Authentication application policy, and then click the OK button twice.
7. The Remote Desktop service can use a Key Storage Provider (KSP). So, if you would like to change from a legacy Cryptographic Service Provider (CSP) to a Key Storage Provider, this can be done on the Cryptography tab.
8. Get the permissions set properly. To do this click on the Security tab.
a. Click the Add button and add any specific computer or computer groups you want to enroll for a certificate.
b. Then make sure to ONLY select the Allow Enroll permission. DO NOT select Autoenroll.
NOTE: Please keep in mind that Domain Controllers DO NOT belong to the Domain Computers group, so if you want all workstations, member servers, and Domain Controllers to enroll for this certificate, you will need Domain Computers and Enterprise Domain Controllers or Domain Controllers groups added with the security permission of Allow – Enroll.
9. When done making other changes to the template as needed, click the OK button to save the template.
Configure the Group Policy
After working through getting the certificate template created and configured to your liking, the next step in the process is to set up the Group Policy Object properly. The group policy setting that needs to be configured is located at: Computer Configuration\Policies\Administrative Templates\Windows Components\Remote Desktop Services\Remote Desktop Session Host\Security
Open the policy "Server authentication certificate template".
When adding the template name to this group policy setting, it will accept one of two things:
The certificate template name; again, this is NOT the certificate template display name.
The certificate template's Object Identifier value. Using this is not common; however, some engineers will recommend it over the template name.
If you use the certificate template display name instead, the Remote Desktop Configuration service (SessionEnv) will successfully enroll for the certificate; however, the next time the policy applies, it will enroll for a new certificate again. This causes repeated enrollments and can make a CA very busy.
Troubleshoot issues of certificate issuance
Troubleshooting problems with certificate issuance is usually easy once you have a good understanding of how Remote Desktop Services goes about doing the enrollment, and there are only a few things to check out.
Investigating which certificate Remote Desktop Services is configured to use
The first thing to investigate is figuring out what certificate, if any, Remote Desktop Services is currently configured to use. This is done by running a WMI query, via either PowerShell or good ol' WMIC. (Note: WMIC is deprecated and will be removed at a future date.)
WMIC: wmic /namespace:\\root\cimv2\TerminalServices PATH Win32_TSGeneralSetting Get SSLCertificateSHA1Hash
We are interested in the SSLCertificateSHA1Hash value that is returned. This will tell us the thumbprint of the certificate it is attempting to load.
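Because WMIC is deprecated, the same value can also be read with the CIM cmdlets in PowerShell. A minimal sketch, assuming the default RDP-Tcp listener name:

```powershell
# Read the thumbprint the RDP-Tcp listener is configured to use
# (CIM replacement for the deprecated WMIC query above).
$ts = Get-CimInstance -Namespace "root\cimv2\TerminalServices" `
    -ClassName Win32_TSGeneralSetting -Filter "TerminalName='RDP-Tcp'"
$ts.SSLCertificateSHA1Hash
```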
Keep in mind that if the Remote Desktop Service is still using the self-signed certificate, it can be found as follows:
Launch the local computer certificate store (CertLM.msc).
Once the computer store is open, look for the store named: Certificates – Local Computer\Remote Desktop\Certificates.
Double click on the certificate, click on the Details tab, and find the field named Thumbprint.
Then validate whether this value matches the value of SSLCertificateSHA1Hash from the output.
If there is no certificate in the Remote Desktop store, or if the SSLCertificateSHA1Hash value does not match the Thumbprint field of any certificate in that store, then it would be best to visit the Certificates – Local Computer\Personal\Certificates store next. Look for a certificate whose Thumbprint field matches the SSLCertificateSHA1Hash value.
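The same store search can be scripted from PowerShell. A sketch that reads the configured thumbprint and then checks both stores for a match:

```powershell
# Read the configured thumbprint, then look for a matching certificate in
# the Remote Desktop store and the Personal (My) store.
$hash = (Get-CimInstance -Namespace "root\cimv2\TerminalServices" `
    -ClassName Win32_TSGeneralSetting -Filter "TerminalName='RDP-Tcp'").SSLCertificateSHA1Hash
Get-ChildItem "Cert:\LocalMachine\Remote Desktop", "Cert:\LocalMachine\My" -ErrorAction SilentlyContinue |
    Where-Object { $_.Thumbprint -eq $hash }
```

If this returns nothing, the configured thumbprint does not correspond to any installed certificate.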
Does the Remote Desktop Service have permission to the certificate's private key?
Once the certificate has been tracked down, we then must figure out if the certificate has a private key and if so, does the account running the service have permission to the private key?
If you are using Group Policy to deploy the certificate template information and the computer has permission to enroll for the certificate, then the private key permissions should, in theory, be configured properly, with NT AUTHORITY\NetworkService granted Allow – Read permission on the private key.
If you are having this problem, then more than likely the environment is NOT configured to deploy the certificate template via the group policy setting, and it is instead relying on computer certificate autoenrollment and a certificate that is valid for Server Authentication. Relying on certificate autoenrollment will not configure the correct permissions on the private key or add the Network Service account.
To check this, follow these steps:
Launch the local computer certificate store (CertLM.msc).
Once the computer store is open, look for the store named: Certificates – Local Computer\Personal\Certificates.
Right click on the certificate that you are interested in, then select All Tasks, and click on Manage Private Keys.
4. Verify that the Network Service account has Allow – Read permission. If not, add it:
a. Click the Add button.
b. In the Select Users or Groups, click the Locations button, and select the local computer in the list.
c. Type in the name “Network Service”
d. Then click the Check Names button, and then click the OK button.
5. If the certificate does not appear to have a private key associated with it in the local computer certificate store snap-in, then you may want to run the following CertUtil command to see if you can repair the association: CertUtil -RepairStore My <CertThumbprint> (or * for all certificates in the store).
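If you prefer scripting the permission grant over clicking through the snap-in, the following sketch shows the idea for a certificate protected by a legacy RSA/CSP machine key; KSP-protected keys live under a different path, and the thumbprint value is a placeholder you would replace:

```powershell
# Grant NETWORK SERVICE read access to a certificate's private key file.
# Assumption: the key is an RSA/CSP machine key; the thumbprint is a placeholder.
$thumb = "CERTIFICATETHUMBPRINT"
$cert  = Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.Thumbprint -eq $thumb }
$key   = $cert.PrivateKey.CspKeyContainerInfo.UniqueKeyContainerName
icacls "$env:ProgramData\Microsoft\Crypto\RSA\MachineKeys\$key" /grant "NETWORK SERVICE:R"
```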
How to change the certificate that Remote Desktop Services is using
If you have determined that Remote Desktop Services is using the wrong certificate, there are a couple of things that we can do to resolve this.
1. We can delete the certificate from the computer Personal store and then cycle the Remote Desktop Configuration (SessionEnv) service. This causes immediate enrollment of a certificate using the certificate template defined in the group policy.
PowerShell:
$RDPSettings = Get-WmiObject -Class "Win32_TSGeneralSetting" -Namespace "root\cimv2\TerminalServices" -Filter "TerminalName='RDP-Tcp'"
CertUtil -DelStore My $RDPSettings.SSLCertificateSHA1Hash
Net Stop SessionEnv
Net Start SessionEnv
2. We could update the thumbprint value in WMI to reference another certificate's thumbprint.
WMIC: wmic /namespace:\\root\cimv2\TerminalServices PATH Win32_TSGeneralSetting Set SSLCertificateSHA1Hash="CERTIFICATETHUMBPRINT"
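The same update can be made from PowerShell; a sketch, with the thumbprint again a placeholder:

```powershell
# Point the RDP-Tcp listener at a different certificate by thumbprint.
$path = (Get-WmiObject -Class Win32_TSGeneralSetting `
    -Namespace "root\cimv2\TerminalServices" -Filter "TerminalName='RDP-Tcp'").__PATH
Set-WmiInstance -Path $path -Arguments @{ SSLCertificateSHA1Hash = "CERTIFICATETHUMBPRINT" }
```

Cycle the SessionEnv service afterwards so the listener picks up the change.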
Conclusion
The first thing to remember is that deploying certificates for Remote Desktop Services is best done via the Group Policy setting, and NOT by setting up the certificate template for autoenrollment. Setting the template up for autoenrollment will cause certificate issuance problems within the environment from multiple angles.
Unless you modify the certificate template's default Key Permissions setting, found on the Request Handling tab, the account running the Remote Desktop Service will not have permission to the private key if the certificate is acquired via autoenrollment. This is not something that we would recommend.
This will cause a scenario where even if the SSLCertificateSHA1Hash value is correct, it will not be able to use the certificate because it will not have permission to use the private key. If you do have the template configured for custom Private Key permissions, you could again still have issues with the WMI SSLCertificateSHA1Hash value not being correct.
The second thing to remember is to configure the group policy setting properly, as well as the certificate template. Managing this configuration via group policy ensures a consistent experience for all RDS connections.
I know that a lot of you might have deeper questions about how the Remote Desktop Configuration service does this enrollment process; however, please keep in mind that Remote Desktop Services is really owned by the Windows User Experience team in CSS, so we Windows Directory Services engineers may not have that deeper level of knowledge. We just get called in when the certificates do not work or fail to get issued. That is how we tend to know so much about the most common misconfigurations for this solution.
Rob “Why are RDS Certificates so complicated” Greene
Mv3 High Memory General Availability
Executing on our plan to have our third version of M-series (Mv3) powered by 4th generation Intel® Xeon® processors (Sapphire Rapids) across the board, we’re excited to announce that Mv3 High Memory (HM) virtual machines (VMs) are now generally available. These next-generation M-series High Memory VMs give customers faster insights, more uptime, lower total cost of ownership and improved price-performance for their most demanding workloads. Mv3 HM VMs are supported for RISE with SAP customers as well. With the release of this Mv3 sub-family and the sub-family that offers around 32TB memory, Microsoft is the only public cloud provider that can provide HANA certified VMs from around 1TB memory to around 32TB memory all powered by 4th generation Intel® Xeon® processors (Sapphire Rapids).
Key features on the new Mv3 HM VMs
The Mv3 HM VMs can scale for workloads from 6TB to 16TB.
Mv3 delivers up to 40% higher throughput than our Mv2 High Memory (HM) VMs, enabling significantly faster SAP HANA data load times for SAP OLAP workloads and significantly higher performance per core for SAP OLTP workloads over the previous generation Mv2.
Powered by Azure Boost, Mv3 HM provides up to 2x more throughput to Azure premium SSD storage and up to 25% improvement in network throughput over Mv2, with more deterministic performance.
Designed from the ground up for increased resilience against failures in memory, disks, and networking based on intelligence from past generations.
Available in both disk and diskless offerings allowing customers the flexibility to choose the option that best meets their workload needs.
During our private preview, several customers such as SwissRe unlocked gains from the new VM sizes. In their own words:
“Mv3 High Memory VM results are promising – in average we see a 30% increase in the performance without any big adjustment.”
SwissRe
Msv3 High Memory series (NVMe)

| Size | vCPU | Memory in GiB | Max data disks | Max uncached Premium SSD throughput: IOPS/MBps | Max uncached Ultra Disk and Premium SSD V2 disk throughput: IOPS/MBps | Max NICs | Max network bandwidth (Mbps) |
|---|---|---|---|---|---|---|---|
| Standard_M416s_6_v3 | 416 | 5,696 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M416s_8_v3 | 416 | 7,600 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M624s_12_v3 | 624 | 11,400 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832s_12_v3 | 832 | 11,400 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832s_16_v3 | 832 | 15,200 | 64 | 130,000/8,000 | 260,000/8,000 | 8 | 40,000 |
Msv3 High Memory series (SCSI)

| Size | vCPU | Memory in GiB | Max data disks | Max uncached Premium SSD throughput: IOPS/MBps | Max uncached Ultra Disk and Premium SSD V2 disk throughput: IOPS/MBps | Max NICs | Max network bandwidth (Mbps) |
|---|---|---|---|---|---|---|---|
| Standard_M416s_6_v3 | 416 | 5,696 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M416s_8_v3 | 416 | 7,600 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M624s_12_v3 | 624 | 11,400 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832s_12_v3 | 832 | 11,400 | 64 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832s_16_v3 | 832 | 15,200 | 64 | 130,000/8,000 | 130,000/8,000 | 8 | 40,000 |
Mdsv3 High Memory series (NVMe)

| Size | vCPU | Memory in GiB | Temp storage (SSD) GiB | Max data disks | Max cached* and temp storage throughput: IOPS/MBps | Max uncached Premium SSD throughput: IOPS/MBps | Max uncached Ultra Disk and Premium SSD V2 disk throughput: IOPS/MBps | Max NICs | Max network bandwidth (Mbps) |
|---|---|---|---|---|---|---|---|---|---|
| Standard_M416ds_6_v3 | 416 | 5,696 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M416ds_8_v3 | 416 | 7,600 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M624ds_12_v3 | 624 | 11,400 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832ds_12_v3 | 832 | 11,400 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832ds_16_v3 | 832 | 15,200 | 400 | 64 | 250,000/1,600 | 130,000/8,000 | 260,000/8,000 | 8 | 40,000 |
Mdsv3 High Memory series (SCSI)

| Size | vCPU | Memory in GiB | Temp storage (SSD) GiB | Max data disks | Max cached* and temp storage throughput: IOPS/MBps | Max uncached Premium SSD throughput: IOPS/MBps | Max uncached Ultra Disk and Premium SSD V2 disk throughput: IOPS/MBps | Max NICs | Max network bandwidth (Mbps) |
|---|---|---|---|---|---|---|---|---|---|
| Standard_M416ds_6_v3 | 416 | 5,696 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M416ds_8_v3 | 416 | 7,600 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M624ds_12_v3 | 624 | 11,400 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832ds_12_v3 | 832 | 11,400 | 400 | 64 | 250,000/1,600 | 130,000/4,000 | 130,000/4,000 | 8 | 40,000 |
| Standard_M832ds_16_v3 | 832 | 15,200 | 400 | 64 | 250,000/1,600 | 130,000/8,000 | 130,000/8,000 | 8 | 40,000 |
*Read IOPS are optimized for sequential reads.
Regional Availability and Pricing
The VMs are now available in West Europe, North Europe, East US, and West US 2. For pricing details, please take a look here for Windows and Linux.
We are thrilled to unveil the latest and largest additions to our Mv3-Series, Standard_M896ixds_32_v3 and Standard_M1792ixds_32_v3 VM SKUs. These new VM SKUs are the result of a close collaboration between Microsoft, SAP, experienced hardware partners, and our valued customers.
Key features on the new Mv3 VHM VMs
Unmatched Memory Capacity: With close to 32TB of memory, both the Standard_M896ixds_32_v3 and Standard_M1792ixds_32_v3 VMs are ideal for supporting very large in-memory databases and workloads.
High CPU Power: Featuring 896 cores in the Standard_M896ixds_32_v3 VM and 1792 vCPUs** in the Standard_M1792ixds_32_v3 VM, these VMs are designed to handle high-end S/4HANA workloads, providing more CPU power than other public cloud offerings.
Enhanced Network and Storage Bandwidth: Both VM types provide the highest network and storage bandwidth available in Azure for a full node VM, including up to 200-Gbps network bandwidth with Azure Boost.
Optimal Performance for SAP HANA: Certified for SAP HANA, these VMs adhere to the SAP prescribed socket-to-memory ratio, ensuring optimal performance for in-memory analytics and relational database servers.
Microsoft Fabric seamlessly integrates with generative AI to enhance data-driven decision-making across your organization. It unifies data management and analysis, allowing for real-time insights and actions.
With Real Time Intelligence, keeping grounding data for large language models (LLMs) up-to-date is simplified. This ensures that generative AI responses are based on the most current information, enhancing the relevance and accuracy of outputs. Microsoft Fabric also infuses generative AI experiences throughout its platform, with tools like Copilot in Fabric and Azure AI Studio enabling easy connection of unified data to sophisticated AI models.
Check out GenAI experiences with Microsoft Fabric.
00:00 — Unify data with Microsoft Fabric 00:35 — Unified data storage & real-time analysis 01:08 — Security with Microsoft Purview 01:25 — Real-Time Intelligence 02:05 — Integration with Azure AI Studio
As Microsoft’s official video series for IT, you can watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft.
-If you want to bring custom Gen AI experiences to your app so that users can interact with them using natural language, the better the quality and recency of the data used to ground responses, the more relevant and accurate the generated outcome.
-The challenge, of course, is that your data may be sitting across multiple clouds, in your own data center and also on the edge. Here’s where the complete analytics platform Microsoft Fabric helps you to unify data wherever it lives at unlimited scale, without you having to move it.
-It incorporates a logical multi-cloud data lake, OneLake, for unified data storage and access and separately provides a real-time hub optimized for event-based streaming data, where change data capture feeds can be streamed from multiple cloud sources for analysis in real time without the need to pull your data. Then with your data unified, data professionals can work together in a collaborative workspace to ingest and transform it, analyze it, and also endorse it as they build quality data sets.
-And when used with Microsoft Purview, this can be achieved with an additional layer of security, where you can classify and protect your schematized data, with protections flowing as everyone from your engineers and data analysts to your business users works with data in the Fabric workspace. Keeping grounding data for your LLMs up to date is also made easier by being able to act on it with Real Time Intelligence.
-For example, you might have a product recommendation engine on an e-commerce site and using Real Time Intelligence, you can create granular conditions to listen for changes in your data, like new stock coming in, and update data pipelines feeding the grounding data for your large language models.
-So now, whereas before the gen AI may not have had the latest inventory data available to it to ground responses, with Real Time Intelligence, generated responses can benefit from the most real-time, up-to-date information so you don’t lose out on sales. And as you work with your data, gen AI experiences are infused throughout Fabric. In fact, Copilot in Fabric experiences are available for all Microsoft Fabric workloads to assist you as you work.
-And once your data set is complete, connecting it from Microsoft Fabric to ground large language models in your gen AI apps is made easy with Azure AI Studio, where you can bring in data from OneLake seamlessly and choose from some of the most sophisticated large language models hosted in Azure to build custom AI experiences on your data, all of which is only made possible when you unify your data and act on it with Microsoft Fabric.
Discover how Copilot for Dynamics 365 Commerce can help you deliver personalized customer experiences, optimize product management, and streamline retail operations for store associates, managers, and back-office staff with AI.
Transformative AI in Dynamics 365 Commerce
The retail industry is facing significant challenges and opportunities in today’s digital world. AI can help you create value for your customers and stand out from your competitors. It can help you solve critical challenges such as improving customer service, refining product management, increasing store associate productivity, and simplifying finances.
Dynamics 365 Commerce now includes Copilot, which helps you improve customer satisfaction, boost sales, increase profit margins, and enhance workforce productivity with automated insights, validations, and summaries that reduce the clicks and searches needed to find information, creating a near “one-click” retail experience.
Watch this brief video to see Copilot in action.
Video of Copilot in Dynamics 365 Commerce announcement on YouTube.
Customer insights
Copilot simplifies the process of understanding your customers. It aggregates data on customer preferences from Dynamics 365 Commerce, giving you an in-depth and comprehensive view of your customers, such as their favorite product categories, preferred price ranges, and lifetime value based on recency, frequency, and monetary metrics. You can view their previous interactions with a quick glance, making it easier to resume conversations or give customized follow-ups for better customer relationships and more personalized and effective engagement. Learn more about Copilot customer insights.
Copilot customer insights in Dynamics 365 Commerce.
Product insights
Whether you’re introducing new products or welcoming new store employees, keeping everyone informed and prepared is essential. Copilot provides comprehensive product insights, including clear and concise descriptions, key benefits, inventory levels, and discount details, empowering your staff to elevate product sales. Additionally, employees can access information on related items like accessories and bundles, promoting a cross-selling environment that enhances the shopping experience and boosts sales. By equipping store employees with the knowledge and confidence to engage customers effectively, Copilot turns interactions into opportunities, increasing customer satisfaction, sales, and the all-important average order value.
Copilot product insights in Dynamics 365 Commerce.
Report insights
Envision a scenario where synthesizing insights to assess the performance of your retail channels becomes seamless. With Copilot, it’s easy. Copilot provides instant insights, generating narrative summaries for channel reports. You’ll receive a precise and succinct overview of critical metrics such as sales, revenue, profit, margin, and overall store performance—right at your fingertips. Copilot’s real-time analysis keeps you ahead, updating summaries as new data arrives, empowering your store associates to communicate results effectively, accurately, instantly. Embrace the future of retail intelligence with Copilot and revolutionize the way you interact with your data. Learn more about Copilot report insights in Dynamics 365 Commerce.
Copilot report insights in Dynamics 365 Commerce.
Retail statement insights
Copilot can summarize posted and unposted retail statements, highlighting key insights such as the number of affected transactions, total sales amount, and risks like returns without receipts, expense transactions, and price overrides. These insights into retail statements allow for straightforward and efficient management of financial reports and help you detect and correct discrepancies and risks in your retail statements by providing a clear summary of anomalies in transactional activity. By using Copilot-powered insights, you can identify issues without wading through numerous forms, promptly take corrective measures, and reduce the need for support inquiries to solve problems. Learn more about Copilot retail statement insights in Dynamics 365 Commerce.
Copilot statement insights in Dynamics 365 Commerce.
Merchandise more efficiently
Merchandising is a complex and time-consuming process that involves configuring products, categories, catalogs, and attributes for each channel. Merchandisers need to ensure that their products are displayed correctly and accurately on the online store, and that they comply with each channel’s business rules and policies. However, manual configuration is prone to human error, and it doesn’t scale for businesses that have millions of products, thousands of attributes, and hundreds of categories and catalogs across hundreds of stores.
Copilot enhances merchandiser efficiency by streamlining merchandising workflows, summarizing product settings, and automating data validation by checking for errors and inconsistencies in your product merchandising data. From the Copilot summaries, you can navigate to a list of issues and act without losing context to address problems promptly and efficiently. Your products are always correctly configured and displayed, enhancing customer satisfaction and boosting sales. Learn more about Copilot merchandising insights in Dynamics 365 Commerce.
Copilot merchandising insights in Dynamics 365 Commerce.
Ensuring the ethical use of AI technology
Microsoft is committed to the ethical deployment of AI technologies. Through our Responsible AI practices, we ensure that all AI-powered features in Dynamics 365 adhere to stringent data privacy laws and ethical AI usage standards, promoting transparency, fairness, and accountability.
Conclusion
Copilot features for Dynamics 365 Commerce are revolutionizing the retail experience by bringing AI to store associates, store managers, channel managers in the back office, and merchandisers. They’re simplifying complex data analysis, personalizing customer service, optimizing product management, and driving business growth by improving customer loyalty, increasing sales, and enhancing profitability.
If you’re ready to take your retail business to the next level, contact us today to learn more about how Copilot can help you transform your retail business.
Copilot functionalities in Store Commerce are available starting with the following versions:
10.0.39, from Proactive Quality Update 4 (PQU-4) onwards (CSU: 9.49.24184.3, Store Commerce App 9.49.24193.1)
10.0.40, from Proactive Quality Update 1 (PQU-1) onwards (CSU: 9.50.24184.2, Store Commerce App 9.50.24189.1)
Copilot functionalities in back office (Headquarters) are available starting with the following versions:
10.0.38 from Proactive Quality Update 5 (PQU-5) or subsequent updates
10.0.39 from Proactive Quality Update 3 (PQU-3) or later versions
This is the next segment of our blog series highlighting Microsoft Learn Student Ambassadors who achieved the Gold milestone, the highest level attainable, and have recently graduated from university. Each blog in the series features a different student and highlights their accomplishments, their experience with the Student Ambassador community, and what they’re up to now.
Today we meet Flora who recently graduated with a bachelor’s in biotechnology from Federal University of Technology Akure in Nigeria.
Responses have been edited for clarity and length.
When did you join the Student Ambassadors community?
July 2021
What was being a Student Ambassador like?
Being a student ambassador was an amazing experience for me. I joined the program at a crucial time when I was just beginning my tech journey and was on the verge of giving up on a tech career, thinking it might not be for me. Over the three years I served as an ambassador, I not only enhanced my technical skills but also grew as an individual. I transformed from a shy person into someone who could confidently address an audience, developing strong presentation and communication skills along the way. As an ambassador, I made an impact on both small and large scales, excelled in organizing events, and mentored other students to embark on their own tech career paths.
Was there a specific experience you had while you were in the program that had a profound impact on you and why?
One significant impact I made as an ambassador was organizing a Global Power Platform event in Nigeria, believed to be the largest in West Africa, with around 700 students attending. During this event, I collaborated with MVPs in the Power Platform domain to upskill students in Power BI and Power Apps. Leveraging my position as a Microsoft ambassador, I secured access to school facilities, including computer systems for students to use for learning. This pivotal experience paved the way for me to organize international events outside the ambassador program.
Beyond that, these experiences helped me develop skills in project management, networking, and making a large-scale impact.
Tell us about a technology you had the chance to gain a skillset in as a Student Ambassador. How has this skill you acquired helped you in your post-university journey?
During my time as an ambassador, I developed a strong skillset in data analytics. I honed my abilities using various Microsoft technologies, including Power BI, Excel, and Azure for Data Science, and shared this knowledge with my community through classes, which proved invaluable in my post-university journey. I also sharpened my technical writing skills by contributing to the Microsoft Blog, with one of my articles becoming one of the most viewed blogs of the year. This experience helped me secure an internship while in school and side gigs through freelancing, and ultimately land a job before graduating.
What is something you want all students, globally, to know about the Microsoft Learn Student Ambassador Program?
I want students worldwide to know that the Microsoft Learn Student Ambassador program is for everyone, regardless of how new they are to tech. It offers opportunities to grow, learn, and expand their skills, preparing them for success in the job market. They shouldn’t view it as a program only for geniuses, but as a place that will shape them in ways that traditional academics might not.
Flora and other Microsoft Learn Student Ambassadors in her university.
What advice would you give to new Student Ambassadors, who are just starting in the program?
I would advise students just starting in the program to give it their best and, most importantly, to look beyond the swag! Many people focus on the swag and merchandise, forgetting that there’s much more to gain, including developing both soft and technical skills. So, for those just starting out, come in, make good connections, and leverage those connections while building your skills in all areas.
Share a favorite quote with us! It can be from a movie, a song, a book, or someone you know personally. Tell us why you chose this. What does it mean for you?
Maya Angelou’s words deeply resonate with me: ‘Whatever you want to do, if you want to be great at it, you have to love it and be willing to make sacrifices.’ This truth became evident during my journey as a student ambassador. I aspired to be an effective teacher, presenter, and communicator. To achieve that, I knew I had to overcome my shyness and embrace facing the crowd. Making an impact on a large scale requires stepping out of my comfort zone. Over time, I transformed into a different person from when I first joined the program.
Tell us something interesting about you, about your journey.
One fascinating aspect of my involvement in the program and my academic journey was when I assumed the role of community manager. Our goal was to elevate the MLSA community to a prominent position within the school, making it recognizable to both students and lecturers. Through collaborative efforts and teamwork with fellow ambassadors, we achieved significant growth: the community expanded to nearly a thousand members, and we successfully registered it as an official club recognized by the Vice-Chancellor and prominent lecturers. I owe a shout-out to Mahmood Ademoye and the other ambassadors from FUTA who played a pivotal role in shaping our thriving community.