Introducing Skills in Microsoft Viva, a new AI-powered service to grow and manage talent

This article is contributed. See the original author and article here.

We’re excited to announce a new AI-powered Skills in Viva service that will help organizations understand workforce skills and gaps, and deliver personalized skills-based experiences throughout Microsoft 365 and Viva applications for employees, business leaders, and HR.  

The post Introducing Skills in Microsoft Viva, a new AI-powered service to grow and manage talent appeared first on Microsoft 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Easily Bring your Machine Learning Models into Production with the AzureML Inference Server


Taking your machine learning (ML) models from local development into production can be challenging and time consuming. It requires creating an HTTP layer above your model to process incoming requests, integrate with logging services, and handle errors safely. What’s more, the code required for pre- and post-processing, model loading, and model inference varies across models and must integrate smoothly with the HTTP layer.

Today, we are excited to announce the General Availability (GA) and open sourcing of the Azure Machine Learning Inference Server. This easy-to-use Python package provides an extensible HTTP layer that enables you to rapidly prepare your ML models for production scenarios at scale. The package takes care of request processing, logging, and error handling. It also provides a score script interface that allows for custom, user-defined pre- and post-processing, model loading, and inference code for any model.

Summary of the AzureML Inference HTTP Server

The Azure Machine Learning Inference Server is a Python package that exposes your ML model as an HTTP endpoint. The package contains a Flask-based server run via Gunicorn and is designed to handle production-scale requests. It is currently the default server used in the Azure Machine Learning prebuilt Docker images for inference. And, while it is built for production, it is also designed to support rapid local development.

Figure 1: How the Azure Machine Learning Inference Server Handles Incoming Requests

Score Script

The score script (sometimes referred to as the “scoring script” or “user code”) is how you provide your model to the server. It consists of two parts: an init() function, executed on server startup, and a run() function, executed when the server receives a request to the “/score” route.

On server startup…

The init() function is designed to hold the code for loading the model from the filesystem. It is run only once.

On request to the “/score” route…

The run() function is designed to hold the code that handles inference requests. The code written here can be simple: passing raw JSON input to the model loaded in the init() function and returning the output. Or, it can be complex: running several pre-processing functions defined across multiple files, delegating inference to a GPU, and running content moderation on the model output before returning results to the user.

The score script is designed for maximum extensibility. Any code can be placed into init() or run(), and it will be executed when those functions are called as described above.

An example score script is available in the azureml-examples repository on GitHub.
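
To make the two functions concrete, here is a minimal, illustrative score script. It is a sketch, not a definitive implementation: the “model” is a stand-in so the example is self-contained, whereas a real init() would load model artifacts from disk (typically from the directory named by the AZUREML_MODEL_DIR environment variable).

```python
import json

model = None

def init():
    # Runs once, when the server starts. A real score script would load
    # model artifacts here (e.g. from AZUREML_MODEL_DIR); this stand-in
    # "model" just doubles its inputs so the sketch is self-contained.
    global model
    model = lambda xs: [2 * x for x in xs]

def run(raw_data):
    # Runs on every request to the /score route. raw_data is the raw
    # request body; here we expect JSON like {"data": [1, 2, 3]}.
    try:
        data = json.loads(raw_data)["data"]
        return {"result": model(data)}
    except Exception as exc:
        # Keep the response JSON-serializable even on failure.
        return {"error": str(exc)}
```

With a script like this saved as score.py, the server invokes init() once at startup and routes each POST to /score through run().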


 


Key AzureML Inference Server Scenarios

Local Score Script Development and Debugging

Developing a complex score script may require iterative debugging, and it is often not feasible to redeploy an online endpoint several times to debug potential issues. The AzureML Inference Server allows you to run a score script locally to test both model loading and inference request handling. It integrates easily with the VS Code debugger and allows you to step through potentially complex processing or inference steps.

Note: To test your Docker image in addition to your score script, please refer to the AzureML Local Endpoints documentation.

Validation Gates for a CI/CD Pipeline

The Azure Machine Learning Inference Server can also be used to create validation gates in a continuous integration and delivery (CI/CD) pipeline. For example, you can start the server with a candidate score script and run a test suite against this local instance directly in the pipeline, enabling a safe, efficient, and automatable deployment process.
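
As a sketch of such a gate (the file names and expected payloads here are hypothetical, and a fuller suite would also exercise the server over HTTP), a pipeline step can load the candidate score script and assert on a known request before any deployment happens:

```python
import importlib.util
import json
import pathlib
import tempfile

# Hypothetical candidate score script. In a real pipeline this file
# already lives in the repository; it is inlined here so the sketch
# is self-contained.
SCORE_PY = '''
import json

model = None

def init():
    global model
    model = lambda xs: [2 * x for x in xs]

def run(raw_data):
    data = json.loads(raw_data)["data"]
    return {"result": model(data)}
'''

def load_score_module(path):
    # Import the score script from an arbitrary file path.
    spec = importlib.util.spec_from_file_location("score", path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

def validation_gate(score_path):
    # Fail the pipeline if the candidate script cannot load its model
    # or answer a known request with the expected response.
    score = load_score_module(score_path)
    score.init()
    response = score.run(json.dumps({"data": [1, 2, 3]}))
    assert response == {"result": [2, 4, 6]}, response
    return True

with tempfile.TemporaryDirectory() as workdir:
    candidate = pathlib.Path(workdir) / "score.py"
    candidate.write_text(SCORE_PY)
    passed = validation_gate(candidate)
```

Running this as a required pipeline step means a score script that fails to load or answer correctly never reaches deployment.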


 


Production Deployments

The Azure Machine Learning Inference Server is designed to support production-scale inference. Once local testing is complete, you can feel confident using the score script you developed alongside the Azure Machine Learning prebuilt inference images to deploy your model as an AzureML Managed Online Endpoint.

Safely bring your models into production using the Azure Machine Learning Inference Server and AzureML Managed Inference by referencing the resources below.

Learn More

Microsoft is recognized as a Leader in the 2023 Gartner® Magic Quadrant™ for Cloud ERP for Product-Centric Enterprises


We are excited and honored that Gartner has recognized Microsoft as a Leader in their 2023 Magic Quadrant™ for Cloud ERP for Product-Centric Enterprises.* This evaluation of Microsoft was based on specific criteria that analyzed our overall Completeness of Vision and Ability to Execute. This is the third year in a row that we’ve been recognized as a Leader.

Agile enterprise resource planning (ERP) system for new ways of working

The way we do business has fundamentally changed. New business models are disrupting the way companies sell products and services, blurring industry lines and transforming customer experiences. ERP systems need to evolve from mere systems of transaction to systems of reasoning, offering their users prescriptive actions that they can take in their functional areas to accelerate growth.

Microsoft Dynamics 365 has already been helping thousands of organizations optimize finance and supply chains, creating a connected enterprise by infusing AI-powered automation and analytics into ERP processes. Now, with Dynamics 365 Copilot included in our ERP portfolio (Microsoft Dynamics 365 Supply Chain Management, Microsoft Dynamics 365 Finance, and Microsoft Dynamics 365 Project Operations), we can enable every person in every organization to be more productive and collaborative and to deliver high-performance results.

For instance, with Copilot, organizations can supercharge the productivity of procurement professionals and collections agents. Procurement professionals can efficiently handle purchase order changes at scale and assess the downstream impact on production and distribution before making a decision. Copilot enables quick collaboration with internal and external stakeholders, bringing relevant information into Outlook and Microsoft Teams and using natural language to meet customer and partner needs.

Collections managers with quick access to credit and payment history can prioritize and personalize customer communication, increasing successful collection rates while proactively keeping customers in good standing. With Copilot, project managers can rapidly create plans for new engagements in minutes, automate status reports, identify risks, and suggest mitigation plans on a continuous basis, saving significant time and preventing project delays and budget overruns.

At Microsoft, we are fully committed to revolutionizing the future of ERP systems by harnessing the power of intelligent, composable technologies. The ERP portfolio from Dynamics 365, powered by generative AI, can speed time to insight, intelligently automate processes, and foster productivity, ensuring that organizations stay ahead of the competition in an increasingly complex business landscape.

Cloud-native ERP systems on a composable platform

One of the key strengths of Dynamics 365 Supply Chain Management and Dynamics 365 Finance is their extensibility. The ERP portfolio is built on a composable platform, making it easy to extend the solution with Microsoft Power Platform, providing low-code tools like Microsoft Power Apps and Microsoft Power Automate.

Where ERP customizations were once a heavy, time-consuming task, these tools empower businesses to customize their solutions and build apps with a modern user experience, so that they can adapt to their bespoke, industry-specific needs and end users can work the way they want. Furthermore, companies can leverage prebuilt customizations and industry-specialized solutions from our ISV partner network to speed development even further.

One of our customers, Nestlé, chose Dynamics 365 as the preferred platform for agile and speedy business system requests for mergers and acquisitions (M&A) activities. Nestlé needed business applications that would provide the flexibility to adapt to different business models across geographies and that could be reused multiple times. The company needed rich out-of-the-box features that could be extended with low-code/no-code capabilities. With Dynamics 365, Nestlé was able to create reusable strategies and blueprints for migrating business data and operations, enabling faster and more efficient acquisitions and divestitures with limited disruption to customers and employees. This also helped them adhere to compliance, security, and data privacy regulations. Just four months after the project kicked off, Nestlé went live with Dynamics 365 Finance, Supply Chain Management, and Commerce.

AIM for the future with Microsoft today

In conclusion, running a business on Dynamics 365 offers numerous benefits for organizations. From seamless integration and enhanced productivity to real-time analysis and smart decision-making capabilities, Dynamics 365 empowers businesses to thrive in today’s dynamic market. Microsoft is committed to empowering customers to take advantage of AI capabilities in every line of business.

Organizations relying on on-premises applications will struggle to compete with peers embracing these AI-powered technologies in the cloud. It is paramount for companies to migrate their critical business processes to the cloud now. That is why we introduced AIM (Accelerate, Innovate, Move). AIM offers organizations a tailored path to move critical processes to the cloud with confidence, providing qualified customers with access to a dedicated team of migration advisors, expert assessments, investment offers, tools, and migration support.

Get started with AIM today.

For more information on generative AI-powered capabilities in Dynamics 365 ERP systems, you can request a demo or take a tour today.


Magic Quadrant reports are a culmination of rigorous, fact-based research in specific markets, providing a wide-angle view of the relative positions of the providers in markets where growth is high and provider differentiation is distinct. Providers are positioned into four quadrants: Leaders, Challengers, Visionaries, and Niche Players. The research enables you to get the most from market analysis in alignment with your unique business and technology needs. View a complimentary copy of the Magic Quadrant report to learn more.

*Gartner is a registered trademark and service mark and Magic Quadrant is a registered trademark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and are used herein with permission. All rights reserved. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

**This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from Microsoft.

Source: Gartner, “Magic Quadrant for Cloud ERP for Product-Centric Enterprises,” Greg Leiter, Robert Anderson, Dixie John, Tomas Kienast, David Penny, September 26, 2023.


The post Microsoft is recognized as a Leader in the 2023 Gartner® Magic Quadrant™ for Cloud ERP for Product-Centric Enterprises appeared first on Microsoft Dynamics 365 Blog.


In-memory table in Azure SQL DB doesn’t release memory- Msg 41823, Level 16, State 109, Line 1


Issue

We recently encountered a support case where a customer using in-memory tables in an Azure SQL Database received an error while trying to insert data into a table that also has a clustered columnstore index. The customer then deleted all data from the in-memory table (with the clustered columnstore index); however, the Index Unused Memory was still not released. Here is the memory allocation the customer could see:

(Screenshot: memory allocation showing the unreleased Index Unused Memory.)

Error

In addition to the error above, here is the error text:

Msg 41823, Level 16, State 109, Line 1

Could not perform the operation because the database has reached its quota for in-memory tables. This error may be transient. Please retry the operation. See 'http://go.microsoft.com/fwlink/?LinkID=623028' for more information.


Workaround

To reproduce the issue, we created two tables in our Premium tier Azure SQL Database: one with a clustered columnstore index, the other with a regular clustered index. The columnstore index was created with the option MEMORY_OPTIMIZED = ON.

(Screenshot: the table definitions.)

Then we inserted data into both tables and ran the script below to find the memory consumption of the indexes (notice the 97 MB reported by the Index Unused Memory column in the output below for the table containing the columnstore index):


 


IF (SELECT COUNT(1) FROM sys.data_spaces WHERE type = 'FX') > 0
BEGIN
    SELECT OBJECT_NAME(object_id) AS tblName,
           CAST(memory_used_by_table_kb / 1024.00 AS DECIMAL(10, 2)) AS [Total used Memory MB],
           CAST(memory_allocated_for_table_kb / 1024.00 AS DECIMAL(10, 2)) - CAST(memory_used_by_table_kb / 1024.00 AS DECIMAL(10, 2)) AS [Total Unused Memory MB],
           CAST(memory_used_by_indexes_kb / 1024.00 AS DECIMAL(10, 2)) AS [Index used Memory MB],
           CAST(memory_allocated_for_indexes_kb / 1024.00 AS DECIMAL(10, 2)) - CAST(memory_used_by_indexes_kb / 1024.00 AS DECIMAL(10, 2)) AS [Index Unused Memory MB]
    FROM sys.dm_db_xtp_table_memory_stats
    ORDER BY 2 DESC;
END;


 


(Screenshot: query output; the table with the columnstore index reports about 97 MB of Index Unused Memory.)


 


Next, we deleted all data from the table with the columnstore index and ran the same query again:

(Screenshots: the delete statement and the query output; the Index Unused Memory is still allocated.)


 


The test above shows that it is not the data in an in-memory table that consumes the memory, but rather the columnstore index, which holds on to the memory for as long as it remains on the table. Even after the data is deleted, the memory remains allocated as Index Unused Memory. The only way to release it is to drop the clustered columnstore index.

Moreover, it is recommended to use a columnstore index only for tables with a lot of data (millions or even billions of rows), and only if doing so helps achieve the overall performance levels expected.
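
For completeness, a sketch of that drop (the table and index names here are hypothetical; note that indexes on memory-optimized tables are dropped through ALTER TABLE rather than a standalone DROP INDEX statement):

-- Hypothetical names; substitute your own table and index.
ALTER TABLE dbo.MyInMemoryTable
    DROP INDEX cci_MyInMemoryTable;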


 


References

In-Memory OLTP in Azure SQL Database | Azure Blog | Microsoft Azure

In-memory technologies – Azure SQL | Microsoft Learn

Should table or stored procedure be ported to in-memory OLTP – SQL Server | Microsoft Learn

Meet a recent Microsoft Learn Student Ambassador graduate: Vidushi Gupta


This is the next segment of our blog series highlighting Microsoft Learn Student Ambassadors who achieved the Gold milestone, the highest level attainable, and have recently graduated from university. Each blog in the series features a different student and highlights their accomplishments, their experience with the Student Ambassador community, and what they’re up to now.  


Today we meet Vidushi Gupta, who recently graduated with a bachelor's degree in computer science from SRM Institute of Science and Technology in India.

Responses have been edited for clarity and length.


 


When did you join the Student Ambassadors community?  


 


I joined the Student Ambassadors community in January 2021. This was the time when I started to learn about tech communities, and MLSA was my first.

What was being a Student Ambassador like?  

Being a Microsoft Learn Student Ambassador was a transformative experience in my tech journey. It provided me with a supportive community and an exceptional program team, creating a safe space for me to learn and grow. Through this opportunity, I not only expanded my knowledge of new technologies but also made significant advancements in my existing tech skills. The program encouraged me to participate in hackathons, where I not only utilized my skills, but also emerged as a winner in some instances. Along the way, I had the privilege of meeting exceptional individuals who shared my passion for technology. Overall, being a Student Ambassador has been an incredible journey, filled with continuous learning, personal growth, and the development of unwavering confidence.


Was there a specific experience you had while you were in the program that had a profound impact on you and why?

During my time as a Microsoft Learn Student Ambassador, there were three experiences that had a profound impact on me. In 2021, I was awarded the Microsoft advocacy sponsorship to attend the Grace Hopper Celebration (GHC). This experience highlighted the importance of diversity and inclusion, and witnessing the safe space provided to women and gender minorities at the conference was inspiring. Since then, I have maintained my association with GHC, attending the conference in 2021 and serving as a mentor in 2022. I am currently aiming to attend the conference again this year.


 




Vidushi, Gold Student Ambassador, in Amsterdam during her student exchange program experience where she learned how to improve business by using data to drive decisions.


Tell us about a technology you had the chance to gain a skillset as a Student Ambassador. How has this skill you acquired helped you in your post-university journey?  


As a Student Ambassador, I collaborated with Jasleen, a fellow Ambassador, on Microsoft’s data science and machine learning curriculums. This experience enhanced my skills in R, a language not commonly used in many projects. Acquiring proficiency in R has been invaluable in developing my data science portfolio and giving me a head start in my career. It has equipped me with the confidence and practical knowledge to tackle data-driven challenges and extract insights from complex datasets.


What is something you want all students, globally, to know about the Microsoft Learn Student Ambassador Program?  


The MLSA program is an inclusive community with an amazing and supportive program team. It emphasizes the power of community and peer-to-peer learning, providing a safe space for diverse voices to be heard. Through MLSA, I learned the value of collaborating with fellow ambassadors, gaining support, guidance, and lifelong connections. I encourage all students worldwide to join this program and experience the transformative impact it can have on their tech journey.


What advice would you give to new Student Ambassadors, who are just starting in the program?  


Five words – Trust the process and learn.


 


Look beyond the swag, look at the network you’re going to build, and grow!


Share a favorite quote with us! It can be from a movie, a song, a book, or someone you know personally. Tell us why you chose this. What does it mean for you?  


“You v/s You.” This has always been my favorite quote. It reminds me that I am my only competition, which helps me work on being a little better than I was yesterday. It also keeps me out of the comparison loop, because yesterday's Vidushi is my only baseline and no one else is!


 




Vidushi with fellow Student Ambassadors at the Microsoft office.

Tell us something interesting about you, about your journey.  

When I joined, I had already experienced gender discrimination in tech. That experience led me to believe that women have to put in a lot of work to stay at the table. I was disheartened but wanted to get involved with a global community like MLSA to understand the importance of women and gender minorities in tech. I started off being doubtful about tech, even though I enjoyed it. Through my experience in the MLSA program, I became a confident public speaker, a mentor, a tech enthusiast, a data storyteller, a diversity and inclusion evangelist, and so much more!  

You can follow Vidushi here:

Personal portfolio website: https://vidushig.com

LinkedIn: https://www.linkedin.com/in/vidushi-gupta07

Instagram: https://www.instagram.com/vidzene_

GitHub: https://github.com/Vidushi-Gupta

Twitter: https://twitter.com/Vidushi_Gupta7