Recently, I received a question about unattended uninstall for SQL Server Express edition. This article describes how to perform this task.
There are a few things to take into consideration before proceeding:
The user who performs the process must be a local administrator with permissions to log on as a service. You can review more information about required permissions here.
If the machine has the minimum required amount of physical memory, increase the size of the page file to two times the amount of physical memory. Insufficient virtual memory can result in an incomplete removal of SQL Server.
On a system with multiple instances of SQL Server, the SQL Server Browser service is uninstalled only once the last instance of SQL Server is removed. The SQL Server Browser service can be removed manually from Programs and Features in the Control Panel.
Uninstalling SQL Server deletes tempdb data files that were added during the install process. Files with the tempdb_mssql_*.ndf name pattern are deleted if they exist in the system database directory.
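To perform the unattended uninstall itself, you can use the SQL Server setup command line with the Uninstall action. As a minimal sketch (assuming the default Express instance name SQLEXPRESS; adjust the instance name and feature list to match what is installed), run the following from an elevated command prompt in the SQL Server setup folder:

Setup.exe /Action=Uninstall /FEATURES=SQL /INSTANCENAME=SQLEXPRESS /Q

The /Q switch runs setup in quiet mode with no user interface, and /FEATURES lists the components to remove (for example, include Tools if the management tools were installed alongside the instance).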
As enterprises are asked to manage increasingly complex business processes and data environments, context-aware AI summarization by Copilot in Microsoft Dynamics 365 streamlines operations by synthesizing data from multiple sources across Supply Chain Management, Finance, Commerce, and Human Resources. By delivering clear, actionable insights from ERP data, this generative AI feature eliminates context-switching and allows users to make better decisions faster.
Transformative AI summarization in Dynamics 365
Copilot generative AI features are revolutionizing the user experience in Supply Chain Management, Finance, Commerce, and Human Resources. Insights that used to require literally dozens of clicks, searches, and views in multiple windows—and a lot of deep thinking about complex data—are now presented to the right user, at the right time, automatically. Let’s take a closer look at how Copilot aggregates data from multiple sources and displays it in easily digestible and context-aware summaries.
Vendor summary streamlines understanding of vendor performance and financials
What do we mean by “context-aware”? One meaning is that Copilot summarizes data based on the user’s role to deliver real-time, role-specific insights. Take the vendor summary, for example. Traditionally, procurement managers had to navigate multiple forms to understand vendor performance. Copilot summaries streamline these insights by providing quick access to crucial information, such as active contracts, purchase orders, late deliveries, and overdue payments. For accounts payable teams, however, the vendor summary presents essential financial details about a vendor. For both roles, the vendor summary enables faster, data-driven decisions for better vendor interactions.
Sales order and purchase order summaries pinpoint critical items in open orders
Another perspective on “context-aware” is AI summarization based on task. Consider purchase and sales orders. Procurement and sales teams often spend significant time following up on open orders. Getting a comprehensive overview or pinpointing lines that need attention can be challenging, because the necessary data is typically spread across multiple forms. Copilot summaries consolidate the information, enabling users to easily identify critical items.
It’s not just about summarizing data, though. AI summarization also facilitates quicker action on next steps. Copilot’s summary includes convenient one-click filtering options, allowing users to swiftly access the information they need to act.
Customer summary streamlines insights by role for more effective customer relationships
When it comes to customer information, “context-aware” refers to everything that creates a relationship between an organization and its customers—information that’s often found in multiple, disparate tables, reports, and modules. Copilot addresses the challenges faced by roles such as accounts receivable agents, sales order agents, and customer account managers, who need comprehensive and role-specific information about customers that’s often scattered across multiple systems. For example, while accounts receivable teams need quick access to open invoices, sales order teams require details on open orders and shipments. Copilot consolidates all relevant data into a single, context-aware summary that’s specific to each role, allowing agents and account managers to tailor their interactions with customers, strengthen relationships, and enhance operational efficiency.
Warehouse worker home screen brings warehouse teams up to speed quickly
“Context-aware” can also refer to a user’s surroundings and situation. Warehouse start-of-shift stand-up meetings can miss important updates, and they don’t cover changes that happen throughout the day. Copilot’s dynamic operational summary on the Warehouse Management home screen brings warehouse workers up to speed at the start of their shifts and keeps them on top of the situation as they go about their day, helping them quickly adapt to changes and ensure daily goals are met.
Workflow history summary streamlines review and approval of invoices and expense reports
AI summarization streamlines examination of workflows by providing a concise overview of recent actions and comments, allowing approvers to quickly act without navigating through separate detail screens. Copilot summaries apply to workflows in Dynamics 365 Supply Chain Management, Finance, Commerce, and Human Resources, aiding review and approval processes and supporting informed decisions for things like vendor invoices, time-off requests, and expense reports.
Product preview summary consolidates product details for quick consumption
Procurement managers typically must navigate multiple forms to gather product details such as name, description, dimensions, hierarchy, life cycle state, and release policy. Copilot consolidates this information and other key product attributes in a single, concise summary, making these details quick and easy to consume.
When a warehouse manager views the product detail page, Copilot’s summary focuses on relevant information that would take multiple clicks to find, such as on-hand inventory levels, purchase information like main vendor, and batch numbers that are expiring soon.
Employee workspace summary makes leave management easier for both HR and employees
An organization’s success relies on both employees and customers. Effective time-off management is crucial for employees to make informed decisions and for the organization to optimize time-off utilization and manage financial liabilities from unused leave. Time-off information is scattered across multiple screens in the employee self-service portal. Copilot consolidates key details like vacation and sick leave balances and potential forfeitures due to policy, and includes a link to submit leave requests, all in one summary view.
Retail statement summary provides insights about risky transactions across multiple stores
Physical stores send cash-and-carry transactions to Dynamics 365 Commerce for inventory and financial updates. The store operations team must ensure proper posting, but identifying pending transactions can be difficult across multiple stores. Summaries of posted and unposted retail statements highlight stores needing attention and flag risky transactions like returns without receipts or price overrides. Brief error summaries for failed statements aid in quick resolution, enhancing store management efficiency.
For retail merchandisers, the challenge lies in managing complex product configurations without errors. Copilot addresses this challenge by streamlining merchandising workflows, offering a clear summary of settings, automating data validation, and providing a risk preview to anticipate issues. Here, context-aware AI summarization enhances efficiency, reduces the risk of lost sales, and drives growth.
More benefits of context-aware AI summarization of ERP data
Beyond the specific benefits we described earlier, Copilot summaries in Dynamics 365 Supply Chain Management, Finance, Commerce, and Human Resources enhance user experience and operational efficiency in multiple ways.
Enhanced productivity: With key data points automatically summarized, users spend less time analyzing vast datasets and can focus on strategic decision-making and core activities.
Proactive problem-solving: With real-time summaries, users can anticipate challenges and address them proactively, improving business agility and resilience.
Improved accuracy and insight: Copilot highlights critical information and trends, reducing the risk of human error in interpreting complex data. Analysis is more accurate and insightful, crucial for effective decision-making.
Customized user experiences: Each summarization feature is tailored to the specific needs of different roles within an organization, ensuring that every user receives the most relevant and actionable insights.
Seamless integration: AI features integrate seamlessly into your existing Dynamics 365 framework, providing a smooth user experience without the need for extensive setup or training.
Scalable decision support: Whether for small tasks or large-scale strategic decisions, Copilot summaries meet the needs of businesses of all sizes, scenarios, and requirements.
These benefits collectively contribute to a more streamlined, efficient, and informed ERP environment, setting the stage for more advanced AI features to come.
Introducing generative AI responsibly
Integrating generative AI into ERP products presents challenges. It requires ensuring that the AI features are reliable and robust enough for mission-critical business settings. It also requires building customer trust in the AI capabilities. Our vision is an autonomous ERP system that automates and optimizes business processes with minimal human intervention. However, this is a journey we’re embarking on together to instill confidence in the results and encourage greater adoption over time.
Our approach is to gradually introduce low-risk AI features that provide immediate benefits and time savings, gather user feedback, and build excitement. This way, we can improve the AI features based on user needs and business operations, laying the foundation for more advanced AI features in the future. We prioritize the safe deployment and continuous improvement of AI features in our ERP suite and are leading the way for responsible and impactful integration of AI in the ERP landscape.
Ensuring the ethical use of AI technology
Microsoft is committed to the ethical deployment of AI technologies. Through our Responsible AI practices, we ensure that all AI-powered features in Dynamics 365 adhere to stringent data privacy laws and ethical AI usage standards, promoting transparency, fairness, and accountability.
Learn more about AI summarization in Dynamics 365
Interested in learning more about the power of AI summarization to transform your business processes with unparalleled efficiency and insight? Here’s how you can dive deeper:
In this post, we’ll cover some details on how to track the lifecycle of a SharePoint Site in the Microsoft Graph Data Connect (MGDC), using the date columns in the SharePoint Site dataset. If you’re not familiar with MGDC for SharePoint, start with https://aka.ms/SharePointData.
All Dates in the Sites Dataset
One of the most common scenarios in MGDC for SharePoint is tracking the lifecycle of a site, which includes understanding when the site was created, how it grows over time, when it stops growing and when it becomes inactive or abandoned.
The SharePoint Sites dataset includes several columns that can be used to understand the site lifecycle in general. The following sections cover the datetime columns available.
Site Created
Creation date is straightforward. There is a column (CreatedTime) with the date when the site was created. As with all other dates, it uses the UTC time zone.
Last Modified
In the Sites dataset, you also have the date and time when any items under the root web were last modified (RootWeb.LastItemModifiedDate). This includes the last time when files were created, updated or deleted. This is a great indication that the Site is still in active use.
You also have the date the site security was last modified (LastSecurityModifiedDate). This shows when permissions were granted, updated or revoked. That includes permissions granted through the manage access interface and permissions granted through sharing links.
Last Accessed
Last access is available at the site level (LastUserAccessDate). This shows when an item in the site was last accessed (this includes simply reading the file). This is an important indicator to help understand when the site is becoming inactive or abandoned.
Note that, while this date aims to capture only access performed directly by users, it might also include automated actions by applications, including internal SharePoint applications.
Snapshot Date
Please note that there is one more date (SnapshotDate), but that one is not relevant to the site lifecycle. The snapshot date simply tracks when the data was retrieved by MGDC.
File Actions
Besides what’s captured in these datetime columns in the Sites dataset, you also have the option to capture detailed file activity in the site using the SharePoint File Actions dataset and to accumulate those actions over time.
Keep in mind that MGDC for SharePoint only keeps actions for the last 21 days due to compliance issues. More specifically, you can get file actions between today minus 2 days and today minus 23 days. For instance, if today is June 30th, you can get file actions between June 8th and June 28th.
If you query this information daily, you could build a longer history of file actions over time. For instance, you could keep the last 90 days of data from the File Actions dataset. With that you could find recent access or otherwise say “no access in the last 90 days”.
This would also let you know more details about recent file activities, like who last accessed the site, which file or extension was last accessed, what was the last action, etc. You need to decide if you can rely solely on the date columns provided in the Sites dataset or if it is useful to keep these additional details.
Please do check with the compliance team in your company to make sure there are no restrictions on keeping this information for longer periods of time in your country. There might be regulatory restrictions on how long you can keep this type of personally identifiable information.
Calculated Columns
Keep in mind that these dates use a datetime data type, so grouping by one of them can sometimes be a challenge. If you’re using Power BI, you can show them as a date hierarchy and get a summary by year, quarter, month or day.
It might also be useful to create calculated columns to help with grouping and visualization. For instance, you can create a new date column (without the time portion) for daily summaries. Here’s how to calculate that in Power BI:
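As a minimal DAX sketch (assuming the Sites dataset has been loaded into Power BI as a table named Sites, and using LastUserAccessDate as the example column), a calculated column like this strips the time portion:

Last Access Date = DATE ( YEAR ( Sites[LastUserAccessDate] ), MONTH ( Sites[LastUserAccessDate] ), DAY ( Sites[LastUserAccessDate] ) )

The same pattern works for the other datetime columns, and the resulting date-only column can then be used directly for daily grouping or in a date hierarchy.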
I hope this clarifies what is available in MGDC for SharePoint to track the lifecycle of a SharePoint site.
Let us know in the comments if you think we should consider additional lifecycle information.
For further details about the schema of all SharePoint datasets in MGDC, including SharePoint Sites and SharePoint File Actions, see https://aka.ms/SharePointDatasets.
Azure SQL Managed Instance with Zone Redundancy to begin billing in Italy North, Israel Central, and West Europe
Microsoft Azure continues to enhance its services, ensuring that customers have access to the latest innovations and features. The latest update is particularly exciting for businesses operating in Italy, Israel, and West Europe: Azure SQL Managed Instance with Zone Redundancy is now available in these regions, and billing begins for all configurations.
What is Azure SQL Managed Instance?
Azure SQL Managed Instance is a fully managed database service that offers the best of SQL Server with the operational and financial benefits of an intelligent, fully managed service. It provides near-100% compatibility with the latest SQL Server (Enterprise Edition) database engine, making it easy to migrate your SQL Server databases to Azure without changing your apps.
Understanding Zone Redundancy
Zone Redundancy is a feature designed to improve the availability and resilience of your database instances. In the context of Azure SQL Managed Instance, Zone Redundancy means that your instances are replicated across multiple availability zones within a region. Each availability zone is a physically separate location with independent power, cooling, and networking. This separation ensures that even in the event of a data center outage, your database remains available and operational.
Benefits of Zone Redundancy
1. Increased Resilience
By replicating your data across multiple zones, you safeguard your applications from data center failures. This redundancy minimizes the risk of downtime and ensures that your critical business applications remain online, providing a more reliable service to your users.
2. Improved Business Continuity
With Zone Redundancy, you can achieve higher availability SLAs. For many businesses, this means meeting stringent uptime requirements and maintaining customer trust by ensuring their services are always available.
3. Cost Efficiency
While Zone Redundancy does come with additional costs, the benefits of reduced downtime and the potential financial impact of data loss often outweigh these expenses. In essence, investing in Zone Redundancy can save your business money in the long run by avoiding costly downtime and data recovery efforts.
How to Enable Zone Redundancy
Enabling Zone Redundancy for your Azure SQL Managed Instance is straightforward:
During Instance Creation:
When creating a new managed instance, you can specify Zone Redundancy in the configuration options.
For Existing Instances:
If you have an existing instance, you can modify its settings to enable Zone Redundancy in the configure blade.
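If you prefer scripting, the same setting can be applied from the command line. As a sketch only (using placeholder resource names, and assuming the --zone-redundant parameter is available in your Azure CLI version):

az sql mi update --name my-managed-instance --resource-group my-resource-group --zone-redundant true

A corresponding option can be supplied when creating a new instance with az sql mi create; check az sql mi create --help for the exact parameter name in your CLI version. Zone Redundancy also requires deploying into a region that supports availability zones, such as the regions listed above.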
Introduction
Welcome to Part 2 of our exploration into generative AI, where we delve deeper into the practical applications and creative potential of this innovative technology.
This article highlights concrete examples from student projects in the course ‘Prompt Engineering’ at Fondazione Bruno Kessler (FBK) in Trento (Italy). The aim is to showcase how students leveraged generative AI in unique ways. In particular, we’ll focus on two fascinating projects: “Generative Music” and “Personal Chef,” which exemplify the versatility and impact of generative AI in diverse fields.
A core element of these projects is the use of a structured framework known as the Card Model to define and organize generative AI tasks. In the context of this course, a card refers to a structured format or template used to define a specific task or objective for generating content or output using generative AI techniques. The Flow of these cards, meaning the logical sequence and interaction between them, is crucial for the coherent generation of complex outputs. For a detailed explanation of the Card and Flow concepts, read the first part of this blog series.
Our students have been actively experimenting with generative AI, producing remarkable results in their projects. Here, we present detailed insights and experiences from their hands-on work, demonstrating the practical applications of prompt engineering with non-tech students.
Generative Music
The “Generative Music” project leverages generative AI technology to innovate the music creation process. Central to this project is the use of Generative AI Cards that define various musical parameters and guide the AI in generating unique compositions. Generative AI Cards specify key musical elements such as genre, number of chords, melody length, key, and instrumentation, including bass and guitar (Fig. 1). Each card represents a distinct aspect of the music, allowing for precise control over the generated content. By configuring these cards, the team can tailor the AI’s output to meet specific creative goals.
Card Configuration
The process begins with the selection and configuration of these cards. Initial configurations often require multiple iterations to achieve satisfactory results. Each card’s parameters are adjusted to optimize the music generation, focusing on refining the elements to create a harmonious and appealing output.
Fig 1: Example of Cards from the Music Project.
Flow Generation
Flow generation involves the structured combination of these AI Cards to produce a coherent piece of music. This stage is crucial as it dictates the sequence and interaction of the different musical components defined by the cards. The project utilizes tools like Canva to help visualize and organize the flow of these components, ensuring a smooth and logical progression in the music. During flow generation, the team experimented with the order of the AI Cards to explore different musical outcomes. However, they found that altering the sequence did not significantly affect the final output, indicating that the cards’ individual configurations are more critical than their order.
Iterative Refinement and Human Interaction
A significant aspect of the project is the iterative refinement process, where generated music undergoes multiple evaluations and adjustments. Human intervention is essential at this stage to validate the quality of the output. Listening to the music is the primary method for assessing its adequacy, as human judgment is necessary to determine whether the AI’s creation meets the desired standards. The team continuously modifies the prompts and configurations of the AI Cards based on feedback, refining the generative process to improve the music quality. This iterative cycle of generation, evaluation, and adjustment ensures that the final product aligns with the creative vision (Fig. 2).
The “Generative Music” project demonstrates the potential of generative AI in the field of music creation. By using Generative AI Cards and structured flow generation, the project showcases a methodical approach to producing unique musical compositions. Despite the need for substantial human involvement in the refinement process, this innovative use of AI represents a significant step forward in integrating technology with artistic creativity.
Personal Chef
The “Personal Chef” project utilizes generative AI to assist individuals in planning balanced meals efficiently. The primary goal of this project is to save time and resources, enhance creativity, and provide valuable insights for meal planning. Generative AI Cards are central to this project, serving as modular components that define specific meal planning parameters. Each card encapsulates different aspects of meal creation, such as the type of dish (e.g., balanced dish, vegetarian alternative), the ingredients required, and the nutritional composition (Fig. 3). These cards help in structuring the meal planning process by providing detailed instructions and alternatives based on user preferences and dietary needs.
Card Configuration
For instance, one AI Card might focus on generating a list of high-protein foods, while another might ensure the meal components are seasonal. These cards are iteratively refined based on user feedback to ensure they deliver precise and relevant outputs. The language used in these cards is carefully chosen, as even small changes can significantly impact the results. The feedback loop is crucial here, as it allows continuous improvement and ensures that the AI provides more accurate and context-specific suggestions over time.
Fig 3: Example of Cards from the Personal Chef Project.
Flow Generation
Flow generation in this project involves the logical sequencing and combination of Generative AI Cards to create coherent and balanced meal plans. This process ensures that the output not only meets nutritional guidelines but also aligns with the user’s preferences and constraints. The flow of these cards is designed to cover various stages of meal planning, from selecting ingredients to proposing complete dishes (Fig. 4). For example, a flow might start with an AI Card that provides a balanced dish recipe, followed by another card that suggests a vegetarian alternative, and then a card that customizes the dish based on seasonal ingredients. This structured approach ensures a logical progression and maintains the relevance and coherence of the meal plans.
Fig 4: Example of Flows from the Personal Chef Project.
Iterative Refinement and Human Interaction
The project emphasizes the importance of human feedback in refining the AI-generated outputs. Users can interact with the system to customize the generated meal plans, adding or removing ingredients as needed. This iterative process ensures that the AI’s suggestions remain practical and tailored to individual preferences and dietary requirements. By continuously incorporating user feedback, the project aims to enhance the precision and utility of the Generative AI Cards, ultimately making the meal planning process more efficient and enjoyable.
Lessons Learnt
The “Personal Chef” project showcases how generative AI can be leveraged to support everyday tasks like meal planning. The use of Generative AI Cards allows for a modular and flexible approach, enabling users to create personalized and balanced meal plans. While the AI can provide valuable insights and save time, human interaction remains essential to validate and refine the outputs, ensuring they meet the users’ specific needs and preferences. This integration of AI and human expertise represents a significant advancement in making daily routines more manageable and creative.
Students Survey Results
The class consisted of 11 students (average age 23, 5 female) from various university faculties, namely Psychology, Cognitive Science, and Human-Computer Interaction.
As mentioned, it was a class of non-tech students. Indeed, most of them (7 out of 11) stated that they rarely (1-4 times in the last month) used tools such as ChatGPT. Only one student stated that he/she regularly (every day or almost every day) used these tools, for both study-related and unrelated purposes. One student admitted that he/she was not familiar with ChatGPT, having only heard about it but never used it.
To investigate the students’ knowledge of GenAI and its potential, and to assess whether the course was effective in increasing their knowledge of and ability to use GenAI, we administered a questionnaire at the beginning and at the end of the course and then compared the results.
The questionnaire consists of 50 items taken from existing surveys [1,2] investigating various dimensions concerning AI in general. These included: (a) AI Literacy, based on the level of knowledge, understanding, and ability to use AI; (b) Anxiety, related to the fear of not being able to learn how to use AI correctly, as well as of losing one’s reasoning and control abilities; and (c) Self-Efficacy, related to confidence both in one’s technical capabilities and in AI as a good aid to learning.
As the graph below shows (Fig. 5), comparing the answers given by the students before and after the course reveals that on average the students increased their literacy and self-efficacy and decreased their anxiety.
Fig. 5: Average scores of students’ literacy, anxiety and self-efficacy gathered at the beginning and the end of the Prompt Engineering course for non-tech students.
Furthermore, at the end of the course we asked the students to answer 10 additional questions aimed at gathering their feedback specifically about Generative AI. In particular, we asked them to score the following statements using a 7-point Likert scale, where 1 means “strongly disagree” and 7 means “strongly agree”:
1. I increased my knowledge and understanding of GenAI
2. I can effectively use GenAI
3. When interacting with technology, I am now aware of the possible use of GenAI
4. I am aware of the ethical implications when using AI-based applications
5. Taking a class on Prompt Engineering for Generative AI made me anxious
6. I am afraid that by using AI systems I will become lazy and lose some of my reasoning skills
7. AI malfunctioning can cause many problems
8. If used appropriately, GenAI is a valuable learning support
9. When using GenAI, I feel comfortable
10. I significantly increased my technological skills
Fig. 6 presents the average score and standard deviation for each item. As evident from the graph, after the course the students recognized the value of GenAI as a valuable tool to support learning (item 8). They also showed greater awareness of the possible uses of GenAI (item 3) and of the ethical implications of such uses (item 4), while reporting a low level of anxiety about attending the course (item 5) and little fear of losing reasoning skills (item 6).
Fig. 6: Average scores on a 7-point Likert scale given by non-tech students at the end of the Prompt Engineering course.
Conclusions
Generative AI is widely used in higher education and skills training. Articles like [3] highlight its benefits for productivity and efficiency, alongside concerns about overdependence and superficial learning. In Part 2 of our blog series, we delved into the practical applications and creative potential of this innovative technology. Through projects like “Generative Music” and “Personal Chef,” our students demonstrated the versatility and impact of generative AI across diverse fields. Central to these projects was the structured framework known as the Card Model and the Flow of the identified cards, which helped define and organize generative AI tasks.
The course significantly enhanced students’ understanding of prompt engineering, reducing their anxiety and increasing their self-efficacy. Survey results indicated improved AI literacy and decreased anxiety, with students feeling more confident in their technical abilities and recognizing the value of generative AI as a learning tool. Utilizing atomic cards to define and organize generative AI tasks facilitated the learning process. This structured approach allowed students to better grasp and control various aspects of content generation. In the “Generative Music” and “Personal Chef” projects, the cards provided a flexible and modular framework, enabling iterative refinement and improved output quality.
Looking ahead, future developments could further enhance the effectiveness of teaching generative AI. Developing specific tools and editors for configuring prompts could simplify the process, making it more intuitive for students. Establishing standard guidelines and metrics for evaluating generative outputs could provide more structured feedback, improving the learning process. Additionally, expanding the course content to include a broader range of diverse and complex case studies could help students explore more generative AI applications, deepening their understanding and innovative capabilities.
These advancements would not only improve the teaching of generative AI but also promote greater integration of technology and creativity, better preparing students for their future professional careers.
Antonio Bucchiarone
Motivational Digital System (MoDiS)
Fondazione Bruno Kessler (FBK), Trento – Italy
Nadia Mana
Intelligent Interfaces and Interaction (i3)
Fondazione Bruno Kessler (FBK), Trento – Italy
References
[1] Schiavo, Gianluca and Businaro, Stefano and Zancanaro, Massimo. Comprehension, Apprehension, and Acceptance: Understanding the Influence of Literacy and Anxiety on Acceptance of Artificial Intelligence. Available at SSRN: https://ssrn.com/abstract=4668256.
[2] Wang, Y. M., Wei, C. L., Lin, H. H., Wang, S. C., & Wang, Y. S. (2022). What drives students’ AI learning behavior: a perspective of AI anxiety. Interactive Learning Environments, 1–17. https://doi.org/10.1080/10494820.2022.2153147
[3] Hadi Mogavi, Reza and Deng, Chao and Juho Kim, Justin and Zhou, Pengyuan and D. Kwon, Young and Hosny Saleh Metwally, Ahmed and Tlili, Ahmed and Bassanelli, Simone and Bucchiarone, Antonio and Gujar, Sujit and Nacke, Lennart E. and Hui, Pan. ChatGPT in education: A blessing or a curse? A qualitative study exploring early adopters’ utilization and perceptions. Computers in Human Behavior: Artificial Humans, Vol. 2, N. 1, 2024. https://doi.org/10.1016/j.chbah.2023.100027