A Future With Safer Roads: Automatic Wheel Lug Nut Detection Using Machine Learning at the Edge



By Evan Rust and Zin Thein Kyaw



Introduction


 


Wheel lug nuts are such a tiny part of the overall automobile assembly that they’re easy to overlook, yet they serve a critical function in the safe operation of an automobile. In fact, it is not safe to drive with even one lug nut missing. A single missing lug nut puts increased pressure on the wheel, which in turn damages the wheel bearings and studs and can make the other lug nuts fall off.


 


Over the years there have been a number of documented safety recalls and issues involving wheel lug nuts. In some cases, it was only identified after the fact that the automobile manufacturer had installed lug nut types incompatible with the wheel, or had been inconsistent in installing the right type of lug nut. Even after delivery, years of wear and tear can loosen lug nuts until they fall off, making the automobile unsafe to keep in service. To reduce these incidents, both in quality control at manufacturing and in maintenance in the field, there is a huge opportunity to leverage machine learning at the edge to automate wheel lug nut detection.


 


This motivated us to create a proof-of-concept reference project that automates wheel lug nut detection by putting together a USB webcam, a Raspberry Pi 4, Microsoft Azure IoT, and Edge Impulse into an end-to-end wheel lug nut detection system based on object detection. This example use case and its derivatives will find a home in many industrial IoT scenarios where embedded machine learning can help improve the efficiency of factory automation and quality control processes, including predictive maintenance.


 


This reference project will serve as a guide for quickly getting started with Edge Impulse on the Raspberry Pi 4 and Azure IoT, to train a model that detects lug nuts on a wheel and sends inference conclusions to Azure IoT as shown in the block diagram below:


[Block diagram: USB webcam and Raspberry Pi 4 running the Edge Impulse model, with inference results sent to Azure IoT]


Design Concept: Edge Impulse and Azure IoT


 


Edge Impulse is an embedded machine learning platform that allows you to manage the entire Machine Learning Ops (MLOps) lifecycle, which includes 1) Data Acquisition, 2) Signal Processing, 3) ML Training, 4) Model Testing, and 5) Creating a deployable model that can run efficiently on an edge device.


 


For the edge device, we chose the Raspberry Pi 4 due to its ubiquity and the processing power available for efficiently running more sophisticated machine learning models such as object detection. By running the object detection model on the Raspberry Pi 4, we can keep the network connection to Azure IoT robust and scalable by sending only the inference conclusions, i.e. “How many lug nuts are on the wheel?”. Once the inference conclusions are available at the Azure IoT level, it becomes straightforward to feed these results into business applications that leverage other Azure services such as Azure Stream Analytics and Power BI.


In the next sections, we’ll discuss how you can set this up yourself.



Setting Up the Hardware



We begin by setting up the Raspberry Pi 4 to connect to a Wi-Fi network for our network connection, configuring it for camera support, and installing the Edge Impulse Linux CLI (command line interface) tools on the Raspberry Pi 4. This will allow the Raspberry Pi 4 to directly connect to Edge Impulse for data acquisition and finally, deployment of the wheel lug nut detection model. 




 


For starters, you’ll need a Raspberry Pi 4 with an up-to-date Raspberry Pi OS image that can be found here. After flashing this image to an SD card and adding a file named ‘wpa_supplicant.conf’:


 

ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
country=<Insert 2 letter ISO 3166-1 country code here>
network={
 ssid="<Name of your wireless LAN>"
 psk="<Password for your wireless LAN>"
}

 


along with an empty file named ‘ssh’ (both within the ‘/boot’ directory), you can go ahead and power up the board. 


 


Once you’ve successfully SSH’d into the device with


 

ssh pi@<IP_ADDRESS>

 


and the password ‘raspberry’, it’s time to install the dependencies for the Edge Impulse Linux SDK. Simply run the next three commands to set up the NodeJS environment and everything else that’s required for the edge-impulse-linux wizard: 


 

curl -sL https://deb.nodesource.com/setup_12.x | sudo bash -
sudo apt install -y gcc g++ make build-essential nodejs sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
npm config set user root && sudo npm install edge-impulse-linux -g --unsafe-perm

 


For more details on setting up the Raspberry Pi 4 with Edge Impulse, visit this link.


 


Since this project deals with images, we’ll need some way to capture them. The wizard supports both the Pi camera modules and standard USB webcams, so make sure to enable the camera module first with


 

sudo raspi-config

 


if you plan on using one. With that completed, go to Edge Impulse and create a new project, then run the wizard with 


 

edge-impulse-linux

 


and make sure your device appears within the Edge Impulse Studio’s device section after logging in and selecting your project.


 




 


Data Acquisition


Training accurate, production-ready machine learning models requires plenty of varied data, which means a lot of images are typically required. For this proof-of-concept, we captured around 145 images of a wheel that had lug nuts on it. The Edge Impulse Linux daemon allows you to directly connect the Raspberry Pi 4 to Edge Impulse and take snapshots using the USB webcam.


 




 


 


Using the Labeling queue in the Data Acquisition page, we then drew bounding boxes around each lug nut and each wheel in every image. To add some test data, we went back to the main Dashboard page and clicked the ‘Rebalance dataset’ button, which moves 20% of the training data to the test data bin.



Impulse Design and Model Training



Now that we have plenty of training data, it’s time to design and build our model. The first block in the Impulse Design is an Image Data block, which scales each image to a size of 320 by 320 pixels.


 




 


 


Next, image data is fed to the Image processing block that takes the raw RGB data and derives features from it.


 





Finally, these features are used as inputs to the MobileNetV2 SSD FPN-Lite Transfer Learning Object Detection model, which learns to recognize the lug nuts. The model is set to train for 25 cycles at a learning rate of 0.15, but this can be adjusted to fine-tune for accuracy. As you can see from the screenshot below, the trained model reports a precision score of 97.9%.


[Screenshot: training results showing a precision score of 97.9%]


 


Model Testing



If you’ll recall, in an earlier step we rebalanced the dataset so that 20% of the images we collected could be used for gauging how our trained model might perform in the real world. We use the model testing page to run a batch classification and see how we expect our model to perform. The ‘Live Classification’ tab also allows you to acquire new data directly from the Raspberry Pi 4 and see how the model measures up against the immediate image sample.




Versioning



An MLOps platform would not be complete without a way to archive your work as you iterate on your project. The ‘Versioning’ tab allows you to save your entire project, including the dataset, so you can always go back to a “known good version” as you experiment with different neural network parameters and project configurations. It’s also a great way to share your efforts, as you can designate any version as ‘public’ so that other Edge Impulse users can clone your entire project and use it as a springboard to add their own enhancements.






Deploying Models



In order to verify that the model works correctly in the real world, we’ll need to deploy it to the Raspberry Pi 4. This is a simple task thanks to the Edge Impulse CLI, as all we have to do is run


 

edge-impulse-linux-runner 

 


which downloads the model and creates a local webserver. From here, we can open a browser tab and visit the address listed after we run the command to see a live camera feed and any objects that are currently detected. Here’s a sample of what the user will see in their browser tab:


 


[Screenshot: live camera feed in the browser with detected objects]


 


 


Sending Inference Results to Azure IoT Hub


With the model working locally on the Raspberry Pi 4, let’s see how we can send the inference results from the Raspberry Pi 4 to an Azure IoT Hub instance. As previously mentioned, these results will enable business applications to leverage other Azure services such as Azure Stream Analytics and Power BI. On your development machine, make sure you’ve installed the Azure CLI and have signed in using ‘az login’. Then get the name of the resource group you’ll be using for the project. If you don’t have one, you can follow this guide on how to create a new resource group.




After that, return to the terminal and run the following commands to create a new IoT Hub and register a new device ID:


 

az iot hub create --resource-group <your resource group> --name <your IoT Hub name>
az extension add --name azure-iot
az iot hub device-identity create --hub-name <your IoT Hub name> --device-id <your device id>

 


 




 


Retrieve the connection string the Raspberry Pi 4 will use to connect to Azure IoT with: 


 

az iot hub device-identity connection-string show --device-id <your device id> --hub-name <your IoT Hub name>

 





 


Now it’s time to SSH into the Raspberry Pi 4 and set the connection string as an environment variable with:


 

export IOTHUB_DEVICE_CONNECTION_STRING="<your connection string here>"

 


Then, add the necessary Azure IoT device libraries with:


 

pip install azure-iot-device

 


(Note: if you do not set the environment variable or pass it in as an argument, the program will not work!) The connection string contains the information required for the Raspberry Pi 4 to establish a connection with the Azure IoT Hub service and communicate with it. You can then monitor output in the Azure IoT Hub with:


 

az iot hub monitor-events --hub-name <your IoT Hub name> --output table

 


or in the Azure Portal.




To confirm everything works, download and run this example on the Raspberry Pi 4 and check that you can see the test message.
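
If you’d rather sanity-check the connection with a few lines of your own, a minimal test along the following lines should work. This is a sketch, assuming the azure-iot-device package installed above and the IOTHUB_DEVICE_CONNECTION_STRING environment variable set as described earlier:

import os
from azure.iot.device import IoTHubDeviceClient, Message

# Read the connection string exported earlier on the Raspberry Pi 4
conn_str = os.environ["IOTHUB_DEVICE_CONNECTION_STRING"]

# Connect to Azure IoT Hub and send a single test message
client = IoTHubDeviceClient.create_from_connection_string(conn_str)
client.connect()
client.send_message(Message("test message from the Raspberry Pi 4"))
client.disconnect()

If the message shows up in the ‘az iot hub monitor-events’ output, the connection string and network path are working.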



For the second half of deployment, we’ll need a way to customize how our model is used within the code. Edge Impulse provides a Python SDK for this purpose. On the Raspberry Pi 4, install it with


 

sudo apt-get install libatlas-base-dev libportaudio0 libportaudio2 libportaudiocpp0 portaudio19-dev
pip3 install edge_impulse_linux -i https://pypi.python.org/simple

 


We’ve made available a simple example on the Raspberry Pi 4 that sets up a connection to the Azure IoT Hub, runs the model, and sends the inference results to Azure IoT.



Once you’ve either downloaded the zip file or cloned the repo into a folder, get the model file by running 


 

edge-impulse-linux-runner --download modelfile.eim

 


inside of the folder you just created from the cloning process. This will download a file called ‘modelfile.eim’. Now, run the Python program with 


 

python lug_nut_counter.py ./modelfile.eim -c <LUG_NUT_COUNT>

 


where <LUG_NUT_COUNT> is the correct number of lug nuts that should be attached to the wheel (you might have to use ‘python3’ if both Python 2 and 3 are installed).





Now, whenever a wheel is detected, the number of lug nuts is counted. If this count falls short of the target, a message is sent to the Azure IoT Hub.





And by sending messages only when something is wrong, we avoid wasting bandwidth on empty payloads.
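
To make the flow concrete, here is a condensed sketch of that logic. It is not the full lug_nut_counter.py example linked above; the label name ‘lug_nut’, the camera index, and the expected count are placeholder assumptions, and the real example also handles wheel detection and command-line arguments. It uses the ImageImpulseRunner from the Edge Impulse Python SDK installed earlier:

import json
import os
from azure.iot.device import IoTHubDeviceClient, Message
from edge_impulse_linux.image import ImageImpulseRunner

LUG_NUT_LABEL = "lug_nut"   # placeholder: use the label from your own project
EXPECTED_COUNT = 5          # placeholder for the -c <LUG_NUT_COUNT> value
CAMERA_ID = 0               # first camera the runner finds

client = IoTHubDeviceClient.create_from_connection_string(
    os.environ["IOTHUB_DEVICE_CONNECTION_STRING"])
client.connect()

with ImageImpulseRunner("modelfile.eim") as runner:
    runner.init()
    # classifier() yields (result, frame) tuples from the camera
    for res, img in runner.classifier(CAMERA_ID):
        boxes = res["result"].get("bounding_boxes", [])
        lug_nuts = [b for b in boxes if b["label"] == LUG_NUT_LABEL]
        # Report only when lug nuts are visible but some are missing
        if lug_nuts and len(lug_nuts) < EXPECTED_COUNT:
            payload = {"lug_nuts_detected": len(lug_nuts),
                       "expected": EXPECTED_COUNT}
            client.send_message(Message(json.dumps(payload)))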




 


Conclusion


We’ve just scratched the surface with wheel lug nut detection. Imagine using object detection for other industrial applications in quality control, for detecting ripe fruit among rows of crops, or for identifying malfunctioning machinery, all with devices powered by machine learning.

With any hardware, Edge Impulse, and Microsoft Azure IoT, you can design comprehensive embedded machine learning models and deploy them on any device, while authenticating each and every device with built-in security. You can set up individual identities and credentials for each of your connected devices to help retain the confidentiality of both cloud-to-device and device-to-cloud messages, revoke access rights for specific devices, upgrade device firmware remotely, and benefit from advanced analytics on devices running offline or with intermittent connectivity.

The complete Edge Impulse project is available here for you to see how easy it is to start building your own embedded machine learning projects today using object detection. We look forward to your feedback at hello@edgeimpulse.com or on our forum.

Design an Azure IoT Indoor Air Quality monitoring platform from scratch


This article is the first part of a series that explores an end-to-end pipeline for deploying an air quality monitoring application using off-the-shelf sensors, the Azure IoT ecosystem, and Python. We will begin by looking at the problem itself, some terminology, prerequisites, a reference architecture, and an implementation.


 


Indoor Air Quality – why does it matter and how to measure it with IoT?


 


Most people think of air pollution as an outdoor problem, but indoor air quality has a major impact on health and well-being since the average American spends about 90 percent of their time indoors. Proper ventilation is one of the most important considerations for maintaining good indoor air quality. Poor indoor air quality is known to be harmful to vulnerable groups such as the elderly, children or those suffering chronic respiratory and/or cardiovascular diseases. Here is a quick visual on some sources of indoor air pollution.


 


[Infographic: sources of indoor air pollution]


 


Post Covid-19, we are in a world where awareness of our indoor environments is key for survival. Here in Canada we are quite aware of the situation, which is why we have a set of guidelines from the Government of Canada, and a recent white paper from Public Health Ontario. The American Medical Association has put up this excellent document for reference. So now that we know what the problem is, how do we go about solving it? To solve something, we must be able to measure it, and currently we have some popular metrics for measuring air quality, viz. IAQ and VOC.


 


So what are IAQ and VOC exactly?

Indoor air quality (IAQ) is the air quality within and around buildings and structures. IAQ is known to affect the health, comfort, and well-being of building occupants. IAQ can be affected by gases (including carbon monoxide, radon, and volatile organic compounds), particulates, microbial contaminants (mold, bacteria), or any mass or energy stressor that can induce adverse health conditions. IAQ is part of indoor environmental quality (IEQ), which includes IAQ as well as other physical and psychological aspects of life indoors (e.g., lighting, visual quality, acoustics, and thermal comfort). In the last few years IAQ has received increasing attention from environmental governance authorities, and IAQ-related standards are getting stricter. Here is an IAQ blog infographic if you’d like to read more.


 


Volatile organic compounds (VOCs) are organic chemicals that have a high vapor pressure at room temperature. High vapor pressure correlates with a low boiling point, which relates to the number of the sample’s molecules in the surrounding air, a trait known as volatility. VOCs are responsible for the odor of scents and perfumes as well as pollutants. VOCs play an important role in communication between animals and plants, e.g. attractants for pollinators, protection from predation, and even inter-plant interactions. Some VOCs are dangerous to human health or cause harm to the environment. Anthropogenic VOCs are regulated by law, especially indoors, where concentrations are the highest. Most VOCs are not acutely toxic, but they may have long-term chronic health effects. Refer to this and this for further details.


 


The point is, in a post-pandemic world, having a centralized air quality monitoring system is an absolute necessity. Collecting this data and using the insights from it is crucial to living better. And this is where Azure IoT comes in. In this series we are going to explore how to create the moving parts of this platform with ‘minimum effort’. In this first part, we are going to concentrate our efforts on the overall architecture, hardware/software requirements, and IoT edge module creation.


 


Prerequisites


 


To accomplish our goal we will ideally need to meet a few basic criteria. Here is a short list.



  1. Air Quality Sensor (link)

  2. IoT Edge device (link)

  3. Active Azure subscription (link)

  4. Development machine

  5. Working knowledge of Python, SQL, Docker, JSON, IoT Edge runtime, VSCode

  6. Perseverance


Let’s go into a bit of detail about the aforementioned points, since there are many possibilities.


 


Air Quality Sensor


This is the sensor that emits the actual IAQ/VOC+ data. There are a lot of options in this category, and technically they should all produce similar results. However, the best sensors on the market are Micro-Electro-Mechanical Systems (MEMS). MEMS technology uses semiconductor fabrication processes to produce miniaturized mechanical and electro-mechanical elements that range in size from less than one micrometer to several millimeters. MEMS devices can vary from relatively simple structures with no moving elements to complex electromechanical systems with multiple moving elements. My choice was the uThing::VOC™ Air-Quality USB sensor dongle. This is mainly to ensure high-quality output and ease of interfacing: it is USB out of the box and does not require any installation. Have a look at the list of features available on this dongle. The main components are a Bosch proprietary algorithm and the BME680 sensor, which do all the hard work. It’s basically plug-and-play. The data is emitted in JSON format and is available at an interval of about 3 seconds on the serial port of your device. In my case it was /dev/ttyACM0, but it could be different on yours.


[Photo: uThing::VOC USB sensor dongle]


 


 


 


IoT Edge device


This is the edge system where the sensor is plugged in. Typical choices are Windows or Linux. If you are using Windows, be aware that some of these steps may be different and you will have to figure those out. In my case I am using Ubuntu 20.04 installed on an Intel NUC. The reason I chose the NUC is that many IoT modules require an x86_64 machine, which ARM devices (Jetson, Raspberry Pi, etc.) cannot provide. Technically this should work on any edge device with a USB port, but Windows, for example, has an issue mounting serial ports into containers. I suggest sticking with Linux unless Windows is a client requirement.


 




 


Active Azure subscription


Surely you will need this one, but as we know, Azure has an immense suite of products, and while ideally we would want to have everything, that may not be practically feasible. For practical purposes you might have to ask for access to particular services, meaning you have to know ahead of time exactly which ones you want to use. Of course the list of required services will vary between use cases, so we will begin with just the bare minimum. We will need the following:



  • Azure IoT Hub (link)

  • Azure Container Registry (link)

  • Azure blob storage (link)

  • Azure Stream Analytics (link)(future article)

  • Power BI / React App (link)(future article)

  • Azure Linux VM (link)(optional)


A few points before we move to the next prerequisite. For IoT Hub you can use the free tier for experiments, but I recommend using the standard tier instead. For ACR, get the usual tier and generate a username and password. For the storage account, it’s the standard tier. The ASA and Power BI products will be used in the reference architecture but are not discussed in this article. The final service, the Azure Linux VM, is an interesting one. Potentially the entire codebase can be run on a VM, but this is only good for simulations. Note, however, that it is an equally good idea to experiment with VMs first, as they have great integration and ease the learning curve.


 


Development machine


The development machine can be literally anything from which you have SSH access to the edge device. From an OS perspective it can be Windows, Linux, Raspbian, macOS, etc. Just remember two things: use a good IDE (a.k.a. VSCode) and make sure Docker can be run on it, optionally with privileges. In my case I am using a StarTech KVM, so I can shift between my Windows machine and the actual edge device for development purposes, but it is not necessary.


 


Working knowledge of Python, SQL, Docker, JSON, IoT Edge runtime, VSCode


This is where it gets tricky. A mix of these skills is essential to creating and scaling this platform. However, I understand you may not be proficient in all of them. On that note, I can tell from experience that being from a data engineering background has been extremely beneficial for me. In any case, you will need some Python skills, some SQL, and some JSON. Even knowing how to use the VSCode IoT extension is non-trivial. One notable mention: good Docker knowledge is extremely important, as the edge module is in fact simply a Docker container that is deployed through the deployment manifest (IoT Edge runtime).


 


Perseverance


In an ideal world, you read a tutorial, implement it, it works, and you make merry. The real world, unfortunately, will bring challenges that you have not seen anywhere. Trust me on this: many times you will make good progress simply by not quitting what you are doing. That’s it. That is the secret ingredient. It’s like applying gradient descent to your own mental model of a concept. Anytime any of this doesn’t work, simply have belief in Azure and yourself. You will always find a way. Okay, enough of that. Let’s get to business.


 


Reference Architecture


Here is a reference architecture that we can use to implement this platform. This is how I have done it. Please feel free to design your own.


 


[Diagram: reference architecture for the air quality monitoring platform]


 


 


Most of this is quite simple. Just go through the documentation for Azure and you should be fine. Following this, we go to what everyone is waiting for: the implementation.


 


Implementation


In this section we will see how we can use these tools to our benefit. For the Azure resources I may not go through the entire creation or installation process as there are quite a few articles on the internet for doing those. I shall only mention the main things to look out for. Here is an outline of the steps involved in the implementation.


 



  1. Create a resource group in Azure (link)

  2. Create an IoT Hub in Azure (link)

  3. Create an IoT Edge device in Azure (link)

  4. Install Ubuntu 18/20 on the edge device

  5. Plug the USB sensor into the edge device and check the blue light

  6. Install Docker on the edge device

  7. Install VSCode on the development machine

  8. Create a conda/pip environment for development

  9. Check that the serial USB device can be read and emits JSON at regular intervals

  10. Install the IoT Edge runtime on the edge device (link)

  11. Provision the device to Azure IoT using the connection string (link)

  12. Check that the IoT Edge runtime is running well on the edge device and in the portal

  13. Create an IoT Edge solution in VSCode (link)

  14. Add a Python module to the deployment (link)

  15. Mount the serial port to the module in the deployment

  16. Add code to read data from the mounted serial port

  17. Augment sensor data with business data

  18. Send output results as events to IoT Hub

  19. Build and push the IoT Edge solution (link)

  20. Create the deployment from the template (link)

  21. Deploy the solution to the device

  22. Monitor the endpoint for consuming output data as events


Okay, I know that is a long list, but you must have noticed that some of these are very basic steps. I mentioned them so everyone has a starting reference point for the sequence of steps to be taken. You have a high chance of success if you do it like this. Let’s go into some details now. It’s a mix of things, so I will just present them as flowing text.


 


90% of what’s mentioned in the list above can be done by following a combination of the documents in the official Azure IoT Edge documentation. I highly advise you to scour through these documents with eagle eyes multiple times. The main reason for this is that, unlike other technologies where you can literally ‘stackoverflow’ your way through things, you will not have that luxury here. I have been following every commit in their git repo for years and can tell you the tools/documentation change almost every single day. That means your wits and this document are pretty much all you have in your arsenal. The good news is that Microsoft makes very good documentation, and even though it’s impossible to cover everything, they make an attempt to do it from multiple perspectives and use cases. Special mention to the following articles.


 



 


Once you are familiar with the ‘build, ship, deploy’ mechanism using the copious SimulatedTemperatureSensor module examples from Azure Marketplace, you are ready to handle the real thing. The only real challenge you will have is at steps 9, 15, 16, 17, and 18. Let’s see how we can make things easy there. For 9, I can simply run a cat command on the serial port.


 

cat /dev/ttyACM0

 


This gives me output every 3 seconds or so.


 

{"temperature": 23.34, "pressure": 1005.86, "humidity": 40.25, "gasResistance": 292401, "IAQ": 33.9, "iaqAccuracy": 1, "eqCO2": 515.62, "eqBreathVOC": 0.53}

 


This is exactly the data that the module will receive when the serial port is successfully mounted onto the module. 


 

"AirQualityModule": {
            "version": "1.0",
            "type": "docker",
            "status": "running",
            "restartPolicy": "always",
            "settings": {
              "image": "${MODULES.AirQualityModule}",
              "createOptions": {
                "Env": [
                  "IOTHUB_DEVICE_CONNECTION_STRING=$IOTHUB_IOTEDGE_CONNECTION_STRING"
                ],
                "HostConfig": {
                  "Dns": [
                    "1.1.1.1"
                  ],
                  "Devices": [
                    {
                      "PathOnHost": "/dev/ttyACM0",
                      "PathInContainer": "/dev/ttyACM0",
                      "CgroupPermissions": "rwm"
                    }
                  ]
                }
              }
            }
          }

 


 


Notice the Devices block in the above extract from the deployment manifest. Using these keys/values, we are able to mount the serial port onto the custom module, aptly named AirQualityModule. So we have step 15 covered.


Adding the codebase to the module is quite simple too. When the module is generated by VSCode, it automatically gives you the Dockerfile (Dockerfile.amd64) and a sample main file. We will just create a copy of that file in the same repo and call it, say, air_quality.py. Inside this new file we will rewire the code to read the device output. However, before making any modifications to the code, we must edit requirements.txt. Mine looks like this:


 

azure-iot-device
psutil
pyserial

 


 


azure-iot-device provides the Azure IoT device SDK, pyserial is for reading the serial port, and psutil is for host metrics such as CPU temperature. The imports look like this:


 

import time, sys, json
# from influxdb import InfluxDBClient
import serial
import psutil
from datetime import datetime
from azure.iot.device import IoTHubModuleClient, Message

 


 


Quite self-explanatory. Notice the InfluxDB import is commented out, meaning you could send these readings there too through the module. To cover 16 we will need the final three pieces of code. Here they are:


 

message = ""
#uart = serial.Serial('/dev/tty.usbmodem14101', 115200, timeout=11) # (MacOS)
uart = serial.Serial('/dev/ttyACM0', 115200, timeout=11) # Linux
uart.write(b'Jn')
message = uart.readline()
uart.flushInput()
if debug is True:
  print('message...')
  print(message)
data_dict = json.loads(message.decode())

 


 


There, that’s it! With three pieces of code you have taken the data emitted by the sensor into your desired JSON format using Python. 16 is covered. For 17 we will just update the dictionary with business data, in my case as follows. I am attaching a sensor name and coordinates to find me.


 

data_dict.update({'sensorId':'roomAQSensor'})
data_dict.update({'longitude':-79.025270})
data_dict.update({'latitude':43.857989})
data_dict.update({'cpuTemperature':psutil.sensors_temperatures().get('acpitz')[0][1]})
data_dict.update({'timeCreated':datetime.now().strftime("%Y-%m-%d %H:%M:%S")})

 


 


For 18 it is as simple as 


 

print('data dict...')
print(data_dict)
msg=Message(json.dumps(data_dict))
msg.content_encoding = "utf-8"
msg.content_type = "application/json"
module_client.send_message_to_output(msg, "airquality")
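
Putting the three pieces together, the module’s main loop is essentially: connect once, then read, augment, and send forever. Here is a minimal sketch of that shape (simplified from the actual module code, with error handling omitted; create_from_connection_string matches the IOTHUB_DEVICE_CONNECTION_STRING variable injected by the deployment manifest above, and create_from_edge_environment() is the other common option):

import json
import os
from datetime import datetime

import serial
from azure.iot.device import IoTHubModuleClient, Message

# Connection string injected via the Env section of the deployment manifest
module_client = IoTHubModuleClient.create_from_connection_string(
    os.environ["IOTHUB_DEVICE_CONNECTION_STRING"])
module_client.connect()

uart = serial.Serial('/dev/ttyACM0', 115200, timeout=11)

while True:
    message = uart.readline()
    uart.flushInput()
    if not message:
        continue
    # Parse the sensor's JSON line, then augment it with business data
    data_dict = json.loads(message.decode())
    data_dict.update({'sensorId': 'roomAQSensor',
                      'timeCreated': datetime.now().strftime("%Y-%m-%d %H:%M:%S")})
    msg = Message(json.dumps(data_dict))
    msg.content_encoding = "utf-8"
    msg.content_type = "application/json"
    module_client.send_message_to_output(msg, "airquality")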

 


 


Before doing step 19, two things must happen. First, you need to replace the default main.py in the Dockerfile with air_quality.py. Second, you must use proper entries in the .env file to generate the deployment manifest and deploy successfully.
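
For the first of those, the change is a one-liner in Dockerfile.amd64: point the container’s start command at the new file. The exact contents of the generated Dockerfile may vary, but the relevant line looks something like this:

# Generated default was: CMD [ "python3", "-u", "./main.py" ]
CMD [ "python3", "-u", "./air_quality.py" ]

We can quickly check that the Docker image exists before the actual deployment.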


 

docker images
iotregistry.azurecr.io/airqualitymodule   0.0.1-amd64  030b11fce8af  4 days ago  129MB

 


 


Now you are good to deploy. Use this tutorial to help deploy successfully. At the end of step 22, this is what it looks like when consuming the endpoint through VSCode.


 

[IoTHubMonitor] Created partition receiver [0] for consumerGroup [$Default]
[IoTHubMonitor] Created partition receiver [1] for consumerGroup [$Default]
[IoTHubMonitor] [2:33:28 PM] Message received from [azureiotedge/AirQualityModule]:
{
  "temperature": 28.87,
  "pressure": 1001.15,
  "humidity": 38.36,
  "gasResistance": 249952,
  "IAQ": 117.3,
  "iaqAccuracy": 1,
  "eqCO2": 661.26,
  "eqBreathVOC": 0.92,
  "sensorId": "roomAQSensor",
  "longitude": -79.02527,
  "latitude": 43.857989,
  "cpuTemperature": 27.8,
  "timeCreated": "2021-07-15 18:33:28"
}
[IoTHubMonitor] [2:33:31 PM] Message received from [azureiotedge/AirQualityModule]:
{
  "temperature": 28.88,
  "pressure": 1001.19,
  "humidity": 38.35,
  "gasResistance": 250141,
  "IAQ": 115.8,
  "iaqAccuracy": 1,
  "eqCO2": 658.74,
  "eqBreathVOC": 0.91,
  "sensorId": "roomAQSensor",
  "longitude": -79.02527,
  "latitude": 43.857989,
  "cpuTemperature": 27.8,
  "timeCreated": "2021-07-15 18:33:31"
}
[IoTHubMonitor] Stopping built-in event endpoint monitoring...
[IoTHubMonitor] Built-in event endpoint monitoring stopped.

 


 


Congratulations! You have successfully deployed the most vital step in creating a scalable air quality monitoring platform from scratch using Azure IoT.


 


Future Work


Keep an eye out for a follow-up to this article, where I shall discuss how to continue the end-to-end pipeline and actually visualize the data in Power BI.

Improve seller productivity with the deal manager experience



The sales pipeline is a visual representation of where prospects are within the sales funnel. Managing the pipeline is one of the core activities of any seller; it helps sellers to stay organized and focused on moving deals forward. A seller who can successfully master the sales pipeline will drive more revenue.

But mastering a sales pipeline is not easy, especially when sellers must balance multiple active deals, an array of contacts, and conversations across multiple channels while trying to figure out when the next interaction will occur, what next steps are required, and which app or process will help accomplish the job.

The deal manager workspace is an updated user experience in Dynamics 365 Sales that puts the seller at the center of their workflows, and is now available for public preview.

This enhanced deal manager workspace allows sellers to get a full view of their pipeline, quickly gather context and take action, collaborate with their colleagues, and effectively work the way they want to work.

Watch this video for an overview of how to manage deals in Dynamics 365 Sales:

Visualizing the pipeline with charts

The easiest way to get an overview of the sales pipeline is by visualizing the deals on a chart. Charts form a key component of the deal manager. Not only do they provide key insights on opportunities, but in the deal manager these charts are interactive and allow sellers to quickly locate and focus on the right deals.

In the July release, two charts are available out of the box:

  • A bubble chart that allows sellers to track risky deals on a timeline.
  • A funnel chart that allows sellers to see where deals are in the sales process.

These charts are configurable by administrators. In future releases, we will introduce additional charts.

Keeping track of key metrics

Key performance indicators (KPIs) are another tool that keeps sellers informed. In the deal manager, we’ve introduced tools to track and highlight metrics that help sellers stay on top of their most important KPIs. Sellers can choose from a subset of metrics, re-order them, or even create their own metrics.

A modern, seller-optimized spreadsheet experience

When it comes to managing deals, it’s no wonder that sellers love spreadsheets. Spreadsheets provide a table view of all opportunities, with aggregation, quick filtering, sorting, grouping with pivot tables, re-ordering of columns, and the ability to edit fields inline easily. Unfortunately, data in a spreadsheet is static and not up to date within Dynamics 365.

The deal manager workspace comes with an inline grid that can be edited. This grid behaves just as a spreadsheet would. Sellers can:

  • Edit any cell inline.
  • Filter by any column (advanced querying is also supported).
  • Re-order columns by dragging them into place.
  • Freeze a column.
  • Hide and show any field from the table.
  • Summarize column data by SUM, MAX, MIN, or AVG.
  • Group the data by any field or option set.

The grid has also been optimized for salespeople. Sellers can add special “smart columns” to make their pipeline management even more insightful and efficient:

  • Close date: Intelligently combines the value of the Estimated Close Date and the Actual Close Date fields into one column for sellers to use and update.
  • Revenue: Intelligently combines the value of the Estimated Revenue and the Actual Revenue fields into one column for sellers to use and update.

Getting context without navigating away

With the deal manager workspace, useful information is easily accessible. When you select a record in the workspace, an optimized form appears in the side panel. This form contains a modern task management experience and provides useful information, such as:

  • Key entity fields
  • A timeline list of activities
  • An optimized list of notes

Administrators can customize the form to select the most relevant fields for your business.

Collaborate in context

Selling is a team sport. Now that Microsoft Teams is part of the Dynamics 365 experience, collaborating with your colleagues is seamless. Within the deal manager workspace, sellers are able to instantly follow up with any colleague regarding any deal of interest.

In this release, we have focused on empowering sellers with an intuitive workspace experience that allows sales reps to better manage their pipeline by getting to the information they need quickly.

Next steps

The public preview will gradually roll out over the next few weeks. For more information on how to enable the experience in your environment, read the documentation.



Azure Marketplace new offers – Volume 153



We continue to expand the Azure Marketplace ecosystem. For this volume, 84 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Applications



Asset Performance Management: By harnessing machine condition data (vibration, oil, thermography) and process operating data, Symphony Industrial AI provides accurate and predictive information on machine health and performance with a portfolio of integrated products.



AugmentedStore: Create an enhanced sales channel that helps customers understand more about your products in an interactive way. Support customers in their decision-making process with an at-home digital experience that improves your brand awareness and sales conversion rate.



Bedrock: Bedrock is a cloud-based enterprise artificial intelligence (AI) platform that helps you achieve a faster time to market for massive-scale AI engines. Inbuilt governance enables transparency and accountability of AI, the foundation for responsible AI deployments in enterprises.



BeyondMinds AI Platform: BeyondMinds enterprise artificial intelligence (AI) platform delivers hyper-customized, production-ready AI systems that enable companies to overcome the massive failure rate in AI adoption and rapidly implement ROI-positive transformations.



BIA Employee: Help employees manage their human resources activities anytime with this complete human capital management (HCM) platform. Enjoy powerful tools for reporting and data analysis with Microsoft Power BI and benefit from single sign-on with Microsoft 365 integration.



CENTERSIGHT scale – Your flexible IoT solution: CENTERSIGHT scale enables fast implementation of initial Internet of Things (IoT) projects so you can immediately see a return on investment. It links Microsoft Azure services with ready-to-use IoT applications for solutions based on a flexible framework.



CGI Renewables Management System (RMS): Need a real-time monitoring, control, and performance management solution for your power plants? RMS uses Microsoft Azure services like Azure IoT Hub and Azure Machine Learning to maximize availability, decrease energy losses, and boost your bottom line.



Cloud Native CMS & DXP for Joomla: Build websites, portals, intranets, extranets, and more with this image from VMLAB. It contains all the components required to deploy and run the open-source content management system (CMS) Joomla! on Microsoft Azure.



Connected Drums: This digital solution for cable drum management combines hardware, software, services, and engineering expertise to enable real-time pinpointing of drum location, allowing you to optimize drum management, logistics, and rotation cycle time while preventing loss and theft.



Content Collaboration Platform based on Seafile: Sync, share, and collaborate across devices and teams with this image from VMLAB. It contains all the components required to run the open-source file sync and share solution Seafile on Microsoft Azure, designed for high reliability and performance.



Coroban: Assess the risk of your medical patients falling by analyzing electronic records through the artificial intelligence Concept Insider, enabling uniform and objective judgments while reducing the burden on hospital or facility staff. This application is available only in Japanese.



Document Project CRM and Collaboration All in One: VMLab offers this preconfigured image of ONLYOFFICE Groups for Microsoft Azure. ONLYOFFICE Groups is an open source collaborative system developed to manage documents, projects, customer relationships, and email correspondence, all in one place.



Enterprise AI Bots: DBA LOUNGE’s Enterprise AI Bot on Microsoft Teams integrates with ERP and non-ERP business applications to create a conversational experience that interprets users’ intent, automating processes and delivering contextual responses to text commands.



Genpact Cora Finance Analytics: Built on a robust and scalable data foundation layer, Cora Finance Analytics is Genpact’s comprehensive analytics solution that provides finance teams with strategic, operational, and tactical capabilities to help them make better, faster, and more insightful decisions.



Graphical and Creative Programming Platform: VMLab provides this preconfigured image of Scratch GUI, which is a set of React components that compose the interface for creating and running Scratch 3.0 projects on Microsoft Azure.



HoloMuseum: HoloMuseum offers new ways to engage visitors during their remote or on-site museum tours. Visitors connect through Microsoft Teams or another collaborative platform and share the same point of view as the tour guide, who, thanks to Azure Spatial Anchors, can draw on several elements of each display.



INFRABIRD: Nexans INFRABIRD helps telecom service providers prevent unauthorized access to their fiber-to-the-home street cabinets. The keyless access and Internet of Things supervision system can be deployed in just a few minutes to turn passive cabinets into smart, cloud-connected assets.



JARVIS Video Analytics – Automating Existing CCTVs: Joint AI Research for Video Instances and Streams (JARVIS) is an AI-powered video analytics platform that can be used on existing CCTV infrastructure to automate safety, security, and operational SOPs in real time.



Klaviyo Power BI Connector: Innovoco’s Klaviyo Power BI Connector enables users to gain access to hidden data from Klaviyo, integrate it into the business intelligence ecosystem, and visualize it using Microsoft Power BI. Bridge the gap between the data you need and the dashboards you use.



Market For Help: Available only in Italian, GeckoWay’s Market For Help is a cloud platform available to associations and public administration for managing the social service of giving assistance to disadvantaged people.



Metallic Backup for Microsoft Dynamics 365: Metallic delivers enterprise-grade data backup and recovery with the simplicity of SaaS. With comprehensive coverage across production and sandbox environments, Metallic protects Microsoft Dynamics 365 data with ease – helping your business stay safe, compliant, and rapidly recoverable.



Metallic Database Backup: Metallic Database Backup offers a single solution to protect the structured data of database servers. The solution ensures backup as a service (BaaS) will enable customers to quickly and easily back up and recover database data from SQL Server, SAP HANA, and Oracle.



Metallic File & Object Backup: Metallic File & Object Backup offers a single solution to protect data stored on Windows, Linux, and Unix servers as well as in Microsoft Azure Blob Storage and Azure Files. Leverage cost-optimized protection for your unstructured data.



Metallic Salesforce Backup: Metallic SaaS Backup delivers powerful, enterprise-grade data backup and recovery. With broad-ranging coverage across the Salesforce Cloud, Metallic safeguards valuable data from deletion, corruption, and ransomware attack. Keep your cloud data secure and recoverable.



Metallic VM & Kubernetes Backup: Metallic VM & Kubernetes offers a single solution to protect workloads in hybrid virtual environments. Protect on-premises virtual machines running on Microsoft Hyper-V or VMware vSphere and cloud-native workloads running on a Microsoft Azure virtual machine.



MinIO Blob Storage Gateway (S3 API): MinIO Gateway provides an Amazon S3-compatible API for objects stored in Microsoft Azure Blob Storage. Enable applications to simultaneously use both the Azure Blob Storage API and the Amazon S3 API to access buckets and objects with the same credentials.



NousMigrator for Cognos to Power BI: Migrate Cognos reports to Microsoft Power BI faster with reduced risks and complexities using NousMigrator. NousMigrator partially automates report migration, reducing both the risk of human error when migrating and the time to market per report.



Open Source Cloud Native CRM for SuiteCRM: VMLab provides this preconfigured image of SuiteCRM with PHP runtime on Microsoft Azure. SuiteCRM delivers workflow, reporting, portal, quotes, invoices, accounts, contacts, and much more with a responsive mobile theme and Microsoft Outlook and Thunderbird integration.



Open Source Online Office Suite: VMLab provides this preconfigured image of ONLYOFFICE Docs Community on Microsoft Azure. ONLYOFFICE Docs Community is a powerful online editor for text documents, spreadsheets, and presentations. Supported formats include docx, xlsx, pptx, odt, ods, odp, pdf, rtf, html, and more.



Open Source Wiki and Knowledge Base Software: VMLab provides this preconfigured image of MediaWiki on Microsoft Azure. MediaWiki is an open-source wiki package written in PHP, originally for use on Wikipedia. It is used by several other projects of the nonprofit Wikimedia Foundation and by many other wikis.



OpenText EnCase Information Assurance: EnCase Information Assurance (formerly EnCase eDiscovery) is a data risk management solution designed to help corporations and government agencies locate sensitive or regulated information quickly across the entire IT infrastructure.



PostgreSQL Server: Cloud Infrastructure Services provides this preconfigured image of PostgreSQL Server and pgAdmin on Ubuntu Server 20.04. PostgreSQL is an enterprise-class, open source relational database system that supports both SQL (relational) and JSON (non-relational) querying.



PostgreSQL Server with pgAdmin: Cloud Infrastructure Services provides this preconfigured image of PostgreSQL Server and pgAdmin on CentOS Server 8.3. PostgreSQL is an enterprise-class, open source relational database system that supports both SQL (relational) and JSON (non-relational) querying.



Process Performance Optimization: Symphony Industrial AI’s Performance 360 channels the power of rapidly evolving IIOT, artificial intelligence, and big data technologies to optimize the performance of process units and plants, increasing reliability and availability, minimizing costs, and reducing operational risks.



Rapid7 VM Scan Engine: Rapid7’s vulnerability management solutions, Nexpose and InsightVM, reduce your organization’s risk by dynamically collecting and analyzing risk across vulnerabilities, configurations, and controls from the endpoint to the cloud.



Recorded Future for Azure Sentinel: Recorded Future reduces security risk by automatically positioning threat intelligence data in your Microsoft Azure environment. Data is delivered to Azure Sentinel to provide context and empower analysts to identify and triage alerts faster, proactively block threats, and more.



Refactr Runner: Refactr helps you jump-start your journey to IT-as-code by introducing the latest automation techniques in DevSecOps. With minimal setup, DevSecOps teams can create repeatable, software-defined, and secure automation pipelines that are executed with a few clicks or through automation triggers.



RemoteSelling: RemoteSelling offers new ways to engage customers during remote or on-site shopping. Visitors connect through Microsoft Teams or another collaborative platform and share the same point of view as the sales representative, who, thanks to Azure Spatial Anchors, can draw on several elements of each product.



Route Guard | IP Hijack Detection & Prevention: Developed based on 15 years of academic research, Route Guard from BGProtect is a comprehensive IP hijack detection and prevention solution that provides IP hijack detection regardless of the hijack technology.



Scappman: Scappman enables you to easily install and update applications on your Microsoft Intune-managed computers, offering an enterprise-grade software and patch management solution for small businesses that want to work securely and remain up to date.



SELMID: SELMID, an IDaaS for B2C operators, is provided in Microsoft Azure Active Directory B2C to enable users to easily and flexibly implement SNS support of existing businesses, SNS cooperation, and identity verification for new services and businesses. This application is available only in Japanese.



Shobdo – Speech Keyword Spotting AI: Shobdo on Microsoft Azure is an AI-powered speech and keyword-spotting solution that provides brands with actionable insights by using machine learning models to recognize specific words and key phrases from audio recordings.



Shopify Power BI Connector: Innovoco’s Shopify Power BI Connector enables users to gain access to a wider range of fields and data from Shopify, integrate the data into the business intelligence ecosystem, and visualize it using Microsoft Power BI. Bridge the gap between the data you need and the dashboards you use.



Smaartpulse Ecommerce: Enixta Innovations’ Smaartpulse helps eliminate consumer confusion and shorten the purchase decision cycle time by generating actionable insights for products and helping consumers find the portions of product reviews they care about the most.



SquaredUp Dashboard Server: SquaredUp Dashboard Server lets you easily deliver real-time answers from any data source to anyone in your business. Connect to, surface, and dashboard any data to provide real-time insights so teams can optimize outcomes and identify issues fast.



Terragon CDP: The Terragon Customer Data Platform (CDP) is a marketing solution that aggregates and organizes customer data across a variety of touchpoints and data platforms to create persistent, unified records of all your customers, their attributes, and their interests.



VIVE Process Intelligence Platform: VIVE’s Process Intelligence Platform is a cross-industry domain application addressing all the usual business processes, including supply chain and logistics, for retail, manufacturing, healthcare, finance, oil/gas/energy, smart cities, government, and more.



Write-Back Tool – Power BI: Innovoco’s Write-Back Tool allows Microsoft Power BI users to update source systems while staying within the context of the Power BI dashboard. The solution extends the BI functionality from traditionally being a read-only tool to a tool that allows users to create, edit, and delete data.



Consulting services



Azure Baseline Managed Services: This managed service from Spikes enables an optimal Microsoft Azure environment with guaranteed 99.95% uptime. Spikes employs Recovery Services vaults to monitor, support, manage, and restore backups, and it uses Azure Site Recovery to ensure availability.



Azure Information Protection: 3-Week Proof of Concept: This consulting offer from CS IT LLC pilots the use of the Microsoft Azure Information Protection data protection and encryption service, removing risks and concerns regarding its implementation. This offer is available only in Russian.



AKA: Azure Cloud Adoption Assessment: 4 Weeks: The experts at AKAVEIL will provide a clear plan on how to use Microsoft Azure to enable the digital transformation of your business. This consulting offer maps your current IT infrastructure, calculates total cost of ownership, and assesses cloud readiness.



App Modernization: 10-Week Proof of Concept: The experts at Canarys will work with your teams to plan, prioritize, and modernize or migrate your systems (identified apps) and deploy on Microsoft Azure App Service, Azure SQL, or to a container-based architecture using Azure Kubernetes Service.



Application Migration to the Cloud: 4-Week Assessment: This consulting engagement with T-Systems is designed to help a customer plan and perform business-to-business infrastructure and application layer migrations to a Microsoft Azure environment. This offer is available only in Hungarian.



Application Modernization: 2-Week Assessment: This consulting engagement with Devoteam assists customers in identifying and prioritizing applications for modernization to Microsoft Azure. Customers are provided a cost estimate, reference architecture design, and modernization plan.



Azure AI/ML Ideation: 8-Hour Workshop: A Rackspace data scientist will show you the possibilities of Azure AI and Azure Machine Learning, then take you on a technical deep dive into machine learning products and the associated tools and services enabling AI/ML frameworks on Azure.



Azure Cognitive Services 8-Week Proof of Concept: This consulting engagement with Persol uses Microsoft Azure Cognitive Search to support the evaluation of knowledge mining using your environment and text and image data. This service is available only in Japanese.



Azure Cost Optimization: 4-Week Assessment: Engineering experts from T-Systems will review your Microsoft Azure configuration and recommend optimizations that will result in predictable future costs to help you avoid overspending. This consulting service is available only in Hungarian.



Azure Synapse Analytics Hands-on Lab: 1-Day Briefing: Datasolution’s consulting services for Microsoft Azure Synapse Analytics provide insights into the efficient utilization of IT resources through analytics and an overall understanding of data warehouse services. This offer is available only in Korean.


Azure Virtual Desktop: 1-Day Implementation: Available only in German from SVA System Vertrieb Alexander, this offering is aimed at customers who need a 24×7 professionally managed Azure Virtual Desktop (formerly Windows Virtual Desktop) environment with an appropriate ITIL operating approach, high stability, and service management.


BaaS Backup as a Service: Quickly and easily protect your Microsoft 365 data with Zones Backup as a Service. This managed service offering on Microsoft Azure enables your organization to back up data and restore it directly with a call or email to a Zones data management expert.


Baseline Analytics: Azure EDW Implementation: This consulting service combines an agile approach with Enfo Sweden’s 20 years of experience to kick-start your analytics and data platform project via a solid delivery methodology and a proven Microsoft Azure reference architecture.


Build Smart Data Platform Hybrid Cloud: 5-Day Implementation: NTT Com will survey your network and server infrastructure, then help you design and implement the Microsoft Azure environment required to securely store your assets. This offer is available only in Japanese.


Cloud Adoption 4-Hour Workshop: This CEO/CTO/CIO-level workshop from CLOUD SERVICES, based on the Microsoft Cloud Adoption Framework, can help organizations seeking the best approaches to technical frameworks, organizational transformation, and staff competency readiness.


Cloud Advisory Services – 3-Week Assessment: Coforge’s Cloud Advisory Services help enterprises identify the need for infrastructure and application modernization, identify drivers of modernization, and develop an overall IT transformation strategy for moving applications to Microsoft Azure.


Cloud Native 1-Day Online Workshop: Available from Cloud Services, this one-day workshop is designed to simplify and accelerate your journey toward modernizing your applications and building new ones using Microsoft Azure Kubernetes Service.


Data Mart-as-a-Service: Analytics DevOps Implementation: Data Mart as a Service unlocks the full potential of a data-driven organization by providing a fully hosted Microsoft Azure Data Warehouse, Microsoft Power BI reports, and the ability to scale when needed.


Dedicated Internet Access: 5-Week Implementation: Lumen’s offering aims to improve connectivity to Microsoft public services such as Microsoft 365, Dynamics 365, Teams, and other SaaS products running on Microsoft Azure. Customer traffic takes the shortest path to the Microsoft network from the nearest edge.


DevOps Consulting: 6-Week Implementation: Canarys Automations offers GitHub and Microsoft Azure DevOps consulting services and implementation, providing expert assistance in the design and development of workflows for your organization’s code build and deployment.


DevOps Toolchain Consultation: 3-Week Assessment: Available only in Hungarian, T-Systems’ DevOps Toolchain consulting service helps your company improve software development and testing and shorten deployment times. Spend less time integrating and more time delivering better quality software faster.


Digital Assistant for Knowledge Workers: 3-Week Proof of Concept: Using its experience in Microsoft Azure services, business processes, and artificial intelligence, Spikes will demonstrate the value of a digital assistant in a proof of concept. Gain hands-on insight into the capabilities of Microsoft Azure Cognitive Services and tailor-made AI services.


Disaster Recovery – 4-Week Implementation: Available only in Portuguese from DataEX, the Disaster Recovery Deployment Service maps key resources, determines which servers and services will be included in the disaster recovery plan, and then helps you create and implement it.


Finchloom Professional Services for Azure – 1-Hour Consultation: Finchloom’s free consultation will help you determine the scope and pricing for your deployment projects. Finchloom’s team will work with your organization’s IT department to identify how to transform your datacenters and develop hybrid cloud solutions or migrations to Microsoft Azure.


IBM Netezza to Azure Synapse Analytics Migration – 2-Hour Workshop: In this free value discovery workshop, you will learn how to reduce the cost and timeline of migrating from IBM Netezza to Microsoft Azure Synapse Analytics via a risk-mitigated approach using Hexaware’s AMAZE re-platforming solution.


Insurhub IFRS 17 Jumpstart: 4-Week Assessment: Built on Microsoft Azure, Insurhub allows insurers to leverage their existing finance, actuarial, policy, and other corporate systems to securely manage data, business rules, calculations, and processes required to meet IFRS 17 regulatory compliance.


IT Technical Support: Ideal for small and midsize legal entities, AJ Santos Comércio de Produtos de Informática e Serviços’ IT Technical Support services cover information security, cloud management, software installation, and more. This offer is available only in Portuguese.


MAaaS (Analytics Service): 8-Week Implementation: Available only in Korean, MAaaS is a Managed Analytics as a Service offering that operates and manages AI predictive models in Microsoft Azure. From developing and applying initial predictive models to diagnosing performance and tuning models, Data Solutions offers analytics services to fit your needs.


Migrate to Azure Virtual Desktop – 2-Hour Workshop: Hexaware invites you to a free value discovery workshop for companies looking to migrate to Azure Virtual Desktop (formerly Windows Virtual Desktop) from Citrix or VMware Horizon. Learn how you can realize centralized management, improved data security, simplified deployment, lower costs, and more.


Move to Cloud: 5-Day Assessment: Orange Business Services will help you define the best migration strategy to Microsoft Azure to improve your organization’s agility, efficiency, and cost optimization in line with your existing IT environment.


NetApp Back Up Cloud: 6-Day Implementation: Available only in French, ALFUN’s six-day implementation hybridizes your organization’s NetApp solutions with Microsoft Azure to meet your backup and migration requirements.


SAP on Azure Migration: 4-Week Assessment: Learn about the requirements for migrating on-premises SAP systems to Microsoft Azure. T-Systems Hungary will map the parameters, interfaces, and dependencies of the source systems, then create target environments that are infrastructurally identical to the source environment.


Secure Landing Zone – 5-Week Implementation: Coforge Limited’s Secure Landing Zone offering brings a unique mix of Microsoft Azure best practices and Coforge’s deep technical expertise from managing cloud estates to help your organization start with a secure cloud foundation.


Teradata to Azure Synapse Analytics Migration – 2-Hour Workshop: In this free value discovery workshop, you will learn how to reduce the cost and timeline of migrating from Teradata to Microsoft Azure Synapse Analytics via a risk-mitigated approach using Hexaware’s AMAZE re-platforming solution.


VDI-Persona Investigation: 4-Week Assessment: Orange Business Services’ consultants will work with you to determine which devices are used in your organization along with your application and data requirements, then create a design-led approach for the deployment of an Azure Virtual Desktop (formerly Windows Virtual Desktop) environment.


Workload Migration to Azure: 10-Week Implementation: The Applied Information Sciences Workload Migration to Azure offering aims to help organizations move on-premises workloads to the Microsoft Azure commercial cloud. The offering targets Microsoft Windows and Linux server workloads running on physical servers or virtualized on VMware or Microsoft Hyper-V.



Enabling hybrid work with Microsoft 365 and collaborative apps

Enabling hybrid work with Microsoft 365 and collaborative apps

This article is contributed. See the original author and article here.

The world around us has dramatically changed. Hybrid, global work requires structural changes to how we build and interact with applications. We need a new class of apps centered on enabling synchronous and asynchronous modes of collaboration with real-time meetings, ad-hoc messaging, document collaboration, and business process automation. Microsoft Teams, together with the…

The post Enabling hybrid work with Microsoft 365 and collaborative apps appeared first on Microsoft 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.