MVP’s Favorite Content: Microsoft Teams and DevOps


This article is contributed. See the original author and article here.

In this blog series dedicated to Microsoft’s technical articles, we’ll highlight our MVPs’ favorite articles along with their personal insights.


 


Onyinye Madubuko, M365 MVP, Ireland




Clear Teams cache – Microsoft Teams | Microsoft Learn


“This was helpful in resolving issues users were experiencing with the new Teams application.”


*Relevant Blog: Teams Window keeps flickering and not launching (techiejournals.com)


 


Laurent Carlier, M365 MVP, France




Overview of meetings, webinars, and town halls – Microsoft Teams | Microsoft Learn


“Teams meetings have evolved significantly over the past few years, with the end of Teams live events, the introduction of Town Halls, and the strengthening of Teams Premium features. It’s not always easy to understand what is and isn’t included in Teams Premium licences, or to explain the benefits of purchasing this new plan. This documentation and its comparison tables make my job a lot easier today.”


 


Edward Kuo, Microsoft Azure MVP, Taiwan




Introduction to Azure DevOps – Training | Microsoft Learn


“I am a DevOps expert and an Azure specialist, primarily responsible for guiding enterprises in using Azure DevOps and establishing DevOps teams.”


*Relevant Blog: DevOps – EK.Technology Learn (edwardkuo.dev)


 


Kazushi Kamegawa, Developer Technologies MVP, Japan




Managed DevOps Pools – The Origin Story – Engineering@Microsoft


“Using Azure Pipelines for CI/CD in a closed network environment requires the use of self-hosted agents, and managing these images was a very labor-intensive task. Even with automation, updates took 5-6 hours and had to be done once or twice a month. It was probably a challenge for everyone.


In this context, the announcement of the Managed DevOps Pools on this blog was very welcome news. It’s not just me; it’s likely the solution everyone was hoping for, and I am very much looking forward to it.”




*Relevant event: Azure DevOpsオンライン Vol.11 ~ Managed DevOps Pool解説 – connpass

Monitoring GPU Metrics in AKS with Azure Managed Prometheus, DCGM Exporter and Managed Grafana


This article is contributed. See the original author and article here.

Azure Monitor managed service for Prometheus provides a production-grade solution for monitoring without the hassle of installation and maintenance. By leveraging these managed services, you can focus on extracting insights from your metrics and logs rather than managing the underlying infrastructure.


 


The integration of essential GPU metrics—such as Framebuffer Memory Usage, GPU Utilization, Tensor Core Utilization, and SM Clock Frequencies—into Azure Managed Prometheus and Grafana enhances the visualization of actionable insights. This integration facilitates a comprehensive understanding of GPU consumption patterns, enabling more informed decisions regarding optimization and resource allocation.


 


Azure Managed Prometheus recently announced general availability of Operator and CRD support, which enables customers to customize metrics collection and to scrape metrics from workloads and applications using Service and Pod Monitors, similar to the OSS Prometheus Operator.


 


This blog will demonstrate how we leveraged the CRD/Operator support in Azure Managed Prometheus and used the Nvidia DCGM Exporter and Grafana to enable GPU monitoring.


 


GPU monitoring


 


As the use of GPUs has skyrocketed for deploying large language models (LLMs) for both inference and fine-tuning, monitoring these resources becomes critical to ensure optimal performance and utilization. Prometheus, an open-source monitoring and alerting toolkit, coupled with Grafana, a powerful dashboarding and visualization tool, provides an excellent solution for collecting, visualizing, and acting on these metrics.


 


Essential metrics such as Framebuffer Memory Usage, GPU Utilization, Tensor Core Utilization, and SM Clock Frequencies serve as fundamental indicators of GPU consumption, offering invaluable insights into the performance and efficiency of graphics processing units, and thereby enabling us to reduce our COGs and improve operations.


 


Using Nvidia’s DCGM Exporter with Azure Managed Prometheus


 


The DCGM Exporter is a tool developed by Nvidia to collect and export GPU metrics. It runs as a pod on Kubernetes clusters and gathers various metrics from Nvidia GPUs, such as utilization, memory usage, temperature, and power consumption. These metrics are crucial for monitoring and managing the performance of GPUs.


 


You can integrate this exporter with Azure Managed Prometheus. The sections below describe the steps and changes needed to deploy the DCGM Exporter successfully.


 


Prerequisites


 


Before we jump straight to the installation, ensure your AKS cluster meets the following requirements (an illustrative CLI sketch follows this list):



  1. GPU Node Pool: Add a node pool with the required VM SKU that includes GPU support.

  2. GPU Device Plugin: Ensure the NVIDIA device plugin for Kubernetes is running as a DaemonSet on your GPU nodes.

  3. Enable Azure Managed Prometheus and Azure Managed Grafana on your AKS cluster.
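
The prerequisites above can be satisfied from the command line. The following is a minimal, illustrative Azure CLI sketch; the resource group, cluster name, node pool name, VM size, taint, and label values are assumptions to adjust for your environment (the label and taint are chosen to line up with the values.yaml snippet shown later).

# Illustrative values only: myResourceGroup, myAKSCluster, gpunp, and the VM SKU are placeholders.
# 1. Add a GPU node pool with a label and taint that the DCGM Exporter pods can target.
az aks nodepool add \
  --resource-group myResourceGroup \
  --cluster-name myAKSCluster \
  --name gpunp \
  --node-count 1 \
  --node-vm-size Standard_NC6s_v3 \
  --labels accelerator=nvidia \
  --node-taints sku=gpu:NoSchedule

# 3. Enable Azure Managed Prometheus and link an Azure Managed Grafana instance.
az aks update \
  --resource-group myResourceGroup \
  --name myAKSCluster \
  --enable-azure-monitor-metrics \
  --azure-monitor-workspace-resource-id <azure-monitor-workspace-resource-id> \
  --grafana-resource-id <grafana-resource-id>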


 


Refactoring Nvidia DCGM Exporter for AKS: Code Changes and Deployment Guide


 


Updating API Versions and Configurations for Seamless Integration


As per the official documentation, the best way to get started with the DCGM Exporter is to install it using Helm. When installing over AKS with Managed Prometheus, you might encounter the following error:

Error: Installation Failed: Unable to build Kubernetes objects from release manifest: resource mapping not found for name: "dcgm-exporter-xxxxx" namespace: "default" from "": no matches for kind "ServiceMonitor" in version "monitoring.coreos.com/v1". Ensure CRDs are installed first.

To resolve this, follow these steps to make necessary changes in the DCGM code:


 



  1. Clone the Project: Go to the GitHub repository of the DCGM Exporter and clone the project or download it to your local machine.

  2. Navigate to the Template Folder: The code used to deploy the DCGM Exporter is located in the template folder within the deployment folder.

  3. Modify the service-monitor.yaml File: Find the file service-monitor.yaml. The apiVersion key in this file needs to be updated from monitoring.coreos.com/v1 to azmonitoring.coreos.com/v1. This change allows the DCGM Exporter to use the Azure managed Prometheus CRD.

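# azmonitoring.coreos.com/v1 is the API group served by Azure Managed Prometheus for its ServiceMonitor/PodMonitor CRDs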
apiVersion: azmonitoring.coreos.com/v1

4. Handle Node Selectors and Tolerations: GPU node pools often have tolerations and node selector tags. Modify the values.yaml file in the deployment folder to handle these configurations:

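# values.yaml: the label and taint below are assumed values and must match those set on your GPU node pool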
nodeSelector:
  accelerator: nvidia

tolerations:
- key: "sku"
  operator: "Equal"
  value: "gpu"
  effect: "NoSchedule"

Helm: Packaging, Pushing, and Installation on Azure Container Registry


We followed the MS Learn documentation for pushing and installing the package through Helm on Azure Container Registry. For a comprehensive understanding, you can refer to the documentation. Here are the quick steps for installation:


 


After making all the necessary changes in the deployment folder of the source code, stay in that directory to package the chart, then log in to your registry to proceed.


 


1. Package the Helm chart and login to your container registry:

helm package .
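# Note: the registry login server is omitted from the login and push commands in this post; for ACR it is typically <registry>.azurecr.io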
helm registry login  --username $USER_NAME --password $PASSWORD

2. Push the Helm Chart to the Registry:

helm push dcgm-exporter-3.4.2.tgz oci:///helm

3. Verify that the package has been pushed to the registry in the Azure portal.


 


4. Install the chart and verify the installation:

helm install dcgm-nvidia oci:///helm/dcgm-exporter -n gpu-resources
#Check the installation on your AKS cluster by running:
helm list -n gpu-resources
#Verify the DCGM Exporter:
kubectl get po -n gpu-resources
kubectl get ds -n gpu-resources

You can now check that the DCGM Exporter is running on the GPU nodes as a DaemonSet.
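
As a quick sanity check before wiring up Prometheus, you can query the exporter's metrics endpoint directly. This is a minimal sketch that assumes the chart's default metrics port (9400) and a DaemonSet named after the install above; confirm the actual DaemonSet name with the kubectl get ds command shown earlier.

# Port-forward one of the exporter pods (the DaemonSet name is an assumption; check kubectl get ds -n gpu-resources)
kubectl port-forward -n gpu-resources ds/dcgm-nvidia-dcgm-exporter 9400:9400

# In a second shell, confirm that DCGM metrics are exposed
curl -s localhost:9400/metrics | grep DCGM_FI_DEV_GPU_UTIL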


 


Exporting GPU Metrics and Configuring Azure Managed Grafana Dashboard


Once the DCGM Exporter DaemonSet is running across all GPU node pools, you need to export the GPU metrics generated by this workload to Azure Managed Prometheus. This is accomplished by deploying a PodMonitor resource. Follow these steps:


 



  1. Deploy the PodMonitor: Apply the following YAML configuration to deploy the PodMonitor:

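# Uses the azmonitoring.coreos.com/v1 API group so that Azure Managed Prometheus, rather than the OSS operator, picks up this PodMonitor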
apiVersion: azmonitoring.coreos.com/v1
kind: PodMonitor
metadata:
  name: nvidia-dcgm-exporter
  labels:
    app.kubernetes.io/name: nvidia-dcgm-exporter
spec:
  selector:
    matchLabels:
      app.kubernetes.io/name: nvidia-dcgm-exporter
  podMetricsEndpoints:
  - port: metrics
    interval: 30s
  podTargetLabels:

 
2. Check if the PodMonitor is deployed and running by executing:

kubectl get podmonitor -n 

3. Verify Metrics export: Ensure that the metrics are being exported to Azure Managed Prometheus on the portal by navigating to the “Metrics” page on your Azure Monitor Workspace.


 


[Screenshot: GPU metrics from the DCGM Exporter shown on the Metrics page of the Azure Monitor workspace]

Create the DCGM Dashboard on Azure Managed Grafana


The GitHub repository for the DCGM Exporter includes a JSON file for the Grafana dashboard. Follow the MS Learn documentation to import this JSON into your Managed Grafana instance.
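
If you prefer the command line to the portal, the dashboard can also be imported with the Azure CLI (amg extension). This is a sketch under the assumption that you have saved the dashboard JSON from the DCGM Exporter repository locally; the resource names and file name are placeholders.

# Import the DCGM Exporter dashboard JSON into Azure Managed Grafana (file name is illustrative)
az grafana dashboard import \
  --resource-group myResourceGroup \
  --name myGrafanaInstance \
  --definition ./dcgm-exporter-dashboard.json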


 


After importing the JSON, the dashboard displaying GPU metrics will be visible on Grafana.


[Screenshot: DCGM Exporter GPU dashboard in Azure Managed Grafana]

Azure Infra Girls LATAM: Certificación AZ-900 (Azure Fundamentals)


This article is contributed. See the original author and article here.

Cloud computing is revolutionizing many areas of technology, including programming, data, artificial intelligence (AI), and security. To help professionals specialize in this constantly evolving area, Microsoft is launching the Azure Infra Girls initiative. This program offers a series of four free, live classes in Spanish, taking place from September 3 to 24 at 12:30 PM (GMT-6, Mexico City).

REGISTER HERE: aka.ms/AzureInfraGirlsLATAM


 



During these sessions, participants will have the opportunity to deepen their knowledge of cloud computing and prepare for the AZ-900 (Azure Fundamentals) certification through a learning path with certified courses on Microsoft Learn.


 


All of our sessions start according to the Mexico City time zone.


 


Session schedule:

  • Cloud computing concepts with Microsoft Azure (Sept 3, 12:30 PM Mexico City, GMT-6): In the first episode of Azure Infra Girls, you will learn the basics of cloud computing. You will understand what public, private, and hybrid clouds are, their benefits, the types of services such as IaaS, PaaS, SaaS, and serverless, and how to use Azure to build your applications.

  • Azure architecture and services (Sept 11, 12:30 PM Mexico City, GMT-6): In the second session, we will go deeper into Azure architecture and services, with hands-on exercises to create an Azure Virtual Desktop and host a resource in Azure.

  • Azure management and governance (Sept 17, 12:30 PM Mexico City, GMT-6): In the third session, we will cover Azure management and governance services, looking at cost control, governance features and tools, compliance, and monitoring.

  • AZ-900 Azure Fundamentals practice exam (Sept 24, 12:30 PM Mexico City, GMT-6): In this session, we will run a mock exam with questions on the topics covered: cloud concepts, Azure architecture and services, and Azure management and governance.



 


For those who want to go deeper into cloud computing and prepare for the AZ-900 Azure Fundamentals certification, we have a complete learning path available on Microsoft Learn. This path covers the main areas of the certification, allowing you to study for free and at your own pace. In addition, when you complete the courses, you can earn certificates that can be added to your LinkedIn profile, highlighting your new skills and knowledge.


 


The Microsoft Learn modules for the AZ-900 certification include:



 


Practice for the exam:



  • Assess your knowledge: these assessments give you an overview of the style, wording, and difficulty of the questions you are likely to see on the exam. Through them you can gauge your readiness, determine where you need additional preparation, and fill knowledge gaps to increase your likelihood of passing the exam.

  • Exam experience demo: here you can see what the exam looks like before you take it. You will be able to interact with the different question types in the same user interface you will use during the exam.


 



Register and Participate

Don't miss this incredible opportunity to keep learning and advance your technology career. Join Azure Infra Girls and start your journey toward specializing in cloud computing with Microsoft. We hope to see you in our live sessions and to help you reach your professional goals in the world of technology.

REGISTER HERE: aka.ms/AzureInfraGirlsLATAM


Learn about AppJetty’s ISV Success for Business Applications solution in Microsoft AppSource

This article is contributed. See the original author and article here.

Microsoft ISV Success for Business Applications offers platforms, resources, and support designed to help partners develop, publish, and market business apps. Learn more about this offer from AppJetty:


MappyField 365: MappyField 365 is a powerful geo-mapping plugin for Microsoft Dynamics 365 that boosts business productivity with advanced features like live tracking, geographic data visualization, proximity search, auto-scheduling, auto check-ins, territory management, and heat maps. Accelerate your business across organizations with location intelligence from AppJetty.


Announcing public preview of Dynamics 365 Store Commerce Self-checkout


This article is contributed. See the original author and article here.

Despite the convenience of online shopping, many shoppers still value the hands-on experience of visiting retail stores. The instant gratification, social interactions, and serendipity of physical shopping continue to attract buyers. With the evolution of technology, retailers are seeking more automated ways to fulfill their customers’ shopping needs. Self-service checkout solutions have become a crucial component of retail businesses’ strategies aimed at enhancing the overall shopping experience.

Long lines at checkout can result in decreased sales and unhappy customers. Modern shoppers seek control, ease, and security while purchasing, leading to a preference for self-service. Retailers are adopting self-checkout (SCO) systems to offer more personal and confidential buying experiences. The growth in SCO is partly due to labor shortages and rising wage costs. RBR research predicts self-checkout terminals will grow by 90% annually worldwide, indicating a trend toward faster, self-reliant service.

While there are clear advantages, it’s essential to acknowledge and address some of the challenges through technology. These challenges encompass issues related to scanning, the overall usability of checkout devices, losses attributed to theft and inadvertent misuse, as well as the absence of personal interaction.

Discover the benefits of Microsoft Dynamics 365 Commerce Self-checkout, now available in public preview.

Self-checkout in Dynamics 365 Commerce

The new self-checkout solution in Dynamics 365 Commerce is the same point-of-sale application (Store Commerce app) in self-checkout mode. Payment integrations, localization support, hardware integrations, and any extensions built for the fixed till will also work for the self-checkout app. This allows retailers to quickly turn on self-checkout by leveraging their existing investments in the Store Commerce app for Windows. The Store Commerce app in self-checkout mode supports the following:

  • Simplified login that allows cashiers to access the registers while also allowing shoppers to self-checkout.
  • Out-of-box self-checkout layout for a quick start allowing users to scan items, support loyalty, and pay with credit or debit.
  • Intuitive interface for shoppers that provides only the supported actions in self-checkout while disabling store associate actions.
  • Call for assistance to allow shoppers to request assistance for elevated actions like voids, overrides and discounts.
  • Browse operation that allows shoppers to browse for products that are not scannable or too big or too small to scan.
  • The ability to restrict certain products from being purchased via self-checkout using a configuration in Headquarters.
  • Offline support for business continuity even during network outage.
  • Support for Store Commerce peripherals such as scanners, payment terminals, and printers for self-checkout.
  • Adyen payment integration out-of-box.

Self-checkout to meet every retailer’s need

Retail sectors have diverse needs for point of sale and self-service checkout systems. Fashion retailers might prefer kiosk-based solutions for efficient scan-and-pay transactions. Grocery stores require self-checkouts integrated with weighing and bagging scale capabilities, along with cash handling machines. The Store Commerce self-checkout solution is built on the Commerce SDK and is therefore fully extensible for customers. Here are a few ways retailers can tailor the solution to their business needs.

Retailers can easily configure the default self-checkout layout to add operations that fit their business needs. For example, they can include an operation to apply coupons.

The Store Commerce self-checkout system is hardware agnostic and works with a variety of hardware. Retailers can address unique requirements for specific peripherals by developing custom integrations with either the supported OPOS drivers or tailor-made SDKs.

Moreover, retailers have the advantage of integrating their existing localizations, payment methods, and additional extensions that are established within the cashier-managed workflows directly into the self-service checkout procedures.

In scenarios where cashier intervention might be necessary, such as when items with particular discounts are scanned, retailers can employ out-of-box extensions to promptly request cashier assistance.

Theft and losses in self-checkout

While self-checkout drives efficiency, there is still a high risk of theft and accidental loss, as it’s easy for customers to bypass scanning items or make honest mistakes. Retailers need to balance the efficiency of self-checkout with the need to thwart theft. Some retailers have achieved this by limiting the number of items at the checkout stand; others have eliminated cash as a payment method.

In addition, new theft detection systems are now available in the market that use cameras and algorithms to spot theft. Image-recognition algorithms are used in combination with multiple cameras to detect shoppers’ movements for signs of theft. Microsoft’s Azure Vision allows retailers to train a model with their own catalog and use camera-based image detection during checkout to identify and add items to the cart, thereby reducing the risk of theft.

Retailers can use additional mechanisms to trigger cashier intervention for dubious scenarios such as repeated scanning of identical barcodes, unusual scanning of multiple low-priced items, items missing in the bagging area, and so on.


Copilot

As we introduce Copilot features in Store Commerce, they could be leveraged easily for self-checkout. For instance, Copilot can play a role while a customer is doing a price check or browsing for product availability. Voice assistance in Copilot will help make the shopper experience smoother. Furthermore, we expect that Copilot scenarios such as product discovery, product suggestions, and personalized offers will be of high value for a shopper using a kiosk.

Future of self-checkout

Traditional self-checkout (SCO) methods often utilize kiosks, but retailers are also exploring scan-and-go options for added convenience. These allow customers to use their own devices or the store’s device to scan items and pay with their chosen method. Additionally, smart carts with integrated computerized screens are emerging, enabling shoppers to avoid traditional checkout lines for increased efficiency. However, while these innovations are gaining popularity, they might not be ideal for all merchandise types, could elevate theft risks, and might be more appropriate for stores with smaller footprints.

As Store Commerce self-checkout gains wide adoption by retailers in industry segments such as apparel and fashion, department stores, and grocery, we will keep a close eye on customers’ needs and incorporate their feedback into the product.

For instance, our customers have requested that self-checkout systems include interruption features for assistance calls tailored to the retailer’s specific requirements for theft prevention or validation and provide an option for shoppers to select their preferred language.

It is becoming clear that retailers favor a hybrid model that combines human interaction with automated convenience. With the power and efficiency of the Microsoft Dynamics 365 Store Commerce point of sale alongside Store Commerce self-checkout, we aim to provide customers and shoppers with an exceptional shopping experience.

To enable Store Commerce self-checkout today, please visit: Enable self-checkout in the Store Commerce app – Commerce | Dynamics 365 | Microsoft Learn.

The post Announcing public preview of Dynamics 365 Store Commerce Self-checkout appeared first on Microsoft Dynamics 365 Blog.
