Data Factory adding ORC and SQL MI (preview) for ADF Data Flows and Synapse Data Flows

This article is contributed. See the original author and article here.

Next week, the Microsoft Azure Data Factory team will release two new native data flow connectors that make it even easier for you to transform data directly against your data sources in data flows from Azure Data Factory and Synapse Analytics. Native connectors for data flows in ADF and Synapse make it super easy to build no-code visual ETL processes with cloud-scale data transformations.

 

The documentation links for these connectors will be updated along with the ADF UI to automatically show the availability of Azure SQL MI and ORC in data flows next week:

 

https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-sql-managed-instance

https://docs.microsoft.com/en-us/azure/data-factory/format-orc 

 

Note that SQL MI support is in public preview because we cannot yet leverage private endpoints in Azure SQL MI, which is the mechanism the Azure Integration Runtime uses to provide VNet security for your data sources in ADF. Until that feature ships in Azure SQL MI, data flows will support only public endpoints for SQL MI. Once it is available, we can add private Azure SQL MI connections and move the connector to General Availability.

A flexible print environment with Epson inkjet printers and Universal Print


Epson products and services contribute to the realization of ‘the sustainable office’ to meet the aspirations of our society by enhancing productivity, transforming the work environment, and saving energy and paper resources.

Universal Print is an innovative technology that delivers a flexible printing environment through a cloud-based service that is part of Microsoft 365.

Key contributions of Universal Print are:

  1. Windows endpoints can print without installing printer drivers.
  2. Printers can be managed and maintained centrally, without on-premises servers.
  3. Printing is secure.
  4. Printing is accessible from anywhere.
  5. Organizations can more easily adopt a zero trust network model.

Building a universal, multi-device printing environment is an essential element of realizing ‘the sustainable office’. Epson is excited to support Universal Print, a solution for the new normal at work.

Starting next calendar year, Epson will add native support for Universal Print to our products, especially inkjet printers for office use.

 

Working with Open Source to support MRTK on Oculus Quest


 

In MRTK 2.5.0 we launched support for the Mixed Reality Toolkit on the Oculus Quest and made several improvements to our teleport pointer, including allowing users to teleport via hand gestures (only on the Quest for now). Here we’ll take a closer look at how these improvements were made and how they improve the user’s experience.

 

Quest Support:

First and foremost is support for deploying MRTK on the Oculus Quest. In the past few months, Unity has worked toward consolidating all of its XR tools into the XR Plugin Framework. Though the current Oculus XR plugin provides support for interfacing with the controllers, it does not expose the hand tracking capabilities of the Oculus Quest. In the meantime, one of our open source community members, Eric Provencher, had already developed hand tracking support by leveraging the Oculus Integration package from the Asset Store, which fetches data directly from lower-level Oculus APIs. Working closely with him, we were able to bring those hand tracking integrations into MRTK while still coexisting with the larger XR Plugin Framework. Thanks Eric!

 

Teleport Pointer Improvements: 

The other major changes that came from this collaboration were improvements to the teleport pointer. There are four main improvements we’ve made to the teleport pointer.

 

Improved hand materials:

The materials on the articulated hands for Quest have been adjusted to give a subtle glow when pinching. This visual feedback helps make the pinch action feel more responsive. This improvement is only available on Oculus Quest for the 2.5 release, but we are planning on expanding it to more platforms.

 

Teleporting with articulated hands: 

You can now make a gesture with articulated hands to begin a teleport action! This gives users a whole new way of navigating environments in mixed reality without the need for physical controllers! This improvement is only available on Oculus Quest for the 2.5 release, but we are planning on expanding it to more platforms.

 

Teleport Sounds:

A sound now plays when starting and completing a teleport, making the action feel much more responsive.

 

Teleport Animations:

The teleport cursor now has animations when the user is pointing at a valid surface to teleport to. This improves the overall feel and experience of teleporting and should make the tool a bit more fun to use. There is also a subtle proximity light on the cursor, which helps highlight the location the user wants to teleport to.

 

The cursor’s speed scaling has been adjusted from a linear interpolation to a quadratic one. This makes it much easier to pinpoint a teleport location, as the cursor accelerates less rapidly when moving from closer to farther locations.
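MRTK itself is written in C#, but the idea behind the change can be sketched generically. The function below is an illustrative approximation of quadratic speed scaling, not MRTK’s actual implementation:

```javascript
// Illustrative sketch: scale cursor speed by the square of the normalized
// pointer distance instead of the distance itself. Near zero, t*t grows much
// more slowly than t, so the cursor accelerates gently at close range and
// only speeds up toward the far end of its travel.
function cursorSpeed(normalizedDistance, maxSpeed) {
  const t = Math.min(Math.max(normalizedDistance, 0), 1); // clamp to [0, 1]
  return maxSpeed * t * t; // quadratic curve; the old linear curve was maxSpeed * t
}
```

At the halfway point the quadratic curve yields only a quarter of the maximum speed, rather than half, which is what makes fine placement easier.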

 

Launch of unified Azure Certified Device program


Giving certified IoT devices the ability to stand out from the crowd

 

As we work across the IoT industry, we continue to hear from device builders that you are looking for help connecting with customers who want to find the right device to meet their needs and differentiating your devices by making them solution ready. With over 30 billion active IoT devices in the world and 400 percent growth in devices over the past three years, the industry is moving incredibly fast; the challenge of connecting the right audience with the right product will only become more difficult.

 

To enable you to keep up with this pace, I am pleased to share that a unified and enhanced Azure Certified Device program is now generally available, expanding on previous Microsoft certification offerings.

 

At the heart of this device certification program is a promise—a promise to device builders that they can not only quickly get their devices to market, but also better differentiate and promote their devices. And a promise to buyers that they can easily identify devices that meet their needs and purchase those devices with confidence that they have Microsoft approval. Our promise is to the entire IoT ecosystem: Microsoft is committed to helping a diverse set of partners easily create and find IoT devices built to run on Azure, and we’ll support connecting those devices with the right customers.

 

Visit our Getting Started with the Azure Certified Device program page to learn more.

 

Advantages of certifying IoT devices with Microsoft Azure

At Microsoft, we have been certifying devices for over 20 years, resulting in an ecosystem of over one billion PCs worldwide that you, our partners, helped build. Now we are applying that certification experience to our expertise in the cloud, building an IoT device ecosystem that will be exponentially larger, with tens of millions of devices connected to Azure and tens of thousands of customers using devices built by our rapidly growing IoT device builder community. As we continue to build a thriving IoT ecosystem, we are committed to going even further for IoT builders and buyers through our improved tools and services, as well as the following certification commitments:

 

  • Giving customers confidence: Customers can confidently purchase Azure certified devices that carry the Microsoft promise of meeting specific capabilities.

  • Matchmaking customers with the right devices for them: Device builders can set themselves apart with certification that highlights their unique capabilities. And customers can easily find the products that fit their needs based on certification differentiation.

  • Promoting certified devices: Device builders get increased visibility, contact with customers, and usage of the Microsoft Azure Certified Device brand.

 

Three certifications available today, with more coming

This IoT device certification program offers three specific certifications today (with more on the way!). Certifications currently available include Azure Certified Device, IoT Plug and Play, and Edge Managed. 

 

Azure Certified Device

Azure Certified Device certification validates that a device can connect with Azure IoT Hub and securely provision through the Device Provisioning Service (DPS). This certification reflects a device’s functionality and interoperability, which are a necessary baseline for more advanced certifications.

 

IoT Plug and Play

Announced in August, IoT Plug and Play certification validates Digital Twin Definition Language (DTDL) version 2 and interaction based on your device model. It enables a seamless device-to-cloud integration experience and enables hardware partners to build devices that can easily integrate with cloud solutions based on Azure IoT Central as well as third-party solutions. Additionally, Azure IoT platform services and SDKs for IoT Plug and Play will be generally available by the end of this month. View our developer documentation for more information, and join the companies already beginning to prepare and certify their devices for IoT Plug and Play.
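For a sense of what a device model in DTDL v2 looks like, here is a minimal interface. The thermostat and its identifiers are a hypothetical example for illustration, not taken from a certified device:

```json
{
  "@context": "dtmi:dtdl:context;2",
  "@id": "dtmi:com:example:Thermostat;1",
  "@type": "Interface",
  "displayName": "Thermostat",
  "contents": [
    {
      "@type": "Telemetry",
      "name": "temperature",
      "schema": "double"
    },
    {
      "@type": "Property",
      "name": "targetTemperature",
      "schema": "double",
      "writable": true
    }
  ]
}
```

A device that announces this model lets an IoT Plug and Play-aware solution discover its telemetry and writable properties without custom integration code.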

 

Edge Managed

Edge Managed certification focuses on device management standards for Azure connected devices. Today, this program certification focuses on Edge runtime compatibility for module deployment and management. Informed by conversations with our partners, our Edge Managed certification will continue to grow in the future with additional customer manageability needs.

 

 

Security and edge AI certifications soon in private preview

In addition to the currently available certifications, we are also working on additional security and edge AI certifications, which will soon be in private preview. These programs reflect our continued engagement with customers and partners to address key customer needs and business opportunities in delivering both secure and high-quality AI perception experiences at the edge. Interested partners can contact the Azure Certified Device team for more information.

 

Accelerate business with the Azure Certified Device Catalog

The Azure Certified Device certification program connects a community of device builders with solution builders and buyers through the Azure Certified Device Catalog. Certified devices are searchable based on which devices meet which capabilities, allowing device builders to differentiate their offerings based on the certification program. By certifying their devices to appear in the Azure Certified Device Catalog, device builders gain access to a worldwide audience looking to reliably purchase devices that are built to run on Azure. Meanwhile, buyers can use the catalog as a one-stop-shop they can trust to find and review a wide array of IoT devices.

 

Next steps for pursuing IoT device certification

If you’re a device builder, now is the right time to start thinking about how IoT device certification can benefit your company—elevating your profile and better positioning your devices to reach a broader market. Begin saving valuable time and make your devices stand out from the crowd by taking part in the Azure Certified Device program.

 

Visit our Getting Started with the Azure Certified Device program page to learn more.

 

TodoMVC Full Stack with Azure Static WebApps, Node and Azure SQL #beginners #node #sql #serverless



TodoMVC is a very well known (like ~27K GitHub stars known) application among developers, as it is a really great way to start learning a new Model-View-Something framework. It has plenty of samples built with different frameworks, all implementing exactly the same solution. This makes it very easy to compare them against each other and see which one you prefer. Creating a To-Do app is easy enough, but not too easy, to be the perfect playground for learning a new technology.

 


 

The only issue with the TodoMVC project is that it “only” focuses on front-end solutions. What about having a full-stack implementation of the TodoMVC project, with a back-end API and a database too? Well, it turns out there is an answer for that as well: Todo-Backend. There are more than 100 implementations available! Pretty cool, huh?

If you want to have a test run building a full-stack solution using a new technology stack you want to try, you are pretty much covered.

 

Full Stack with Azure Static Web Apps, Node, Vue and Azure SQL

Lately I was intrigued by the new Azure Static Web Apps, which promises a super-easy Azure deployment experience, integration with Azure Functions and GitHub Actions, and the ability to deploy and manage a full-stack application in just one place. So I wanted to take the chance to create a 100% serverless TodoMVC full-stack implementation using:

  • Vue.js for the frontend, as I find it really cool and powerful;
  • Azure Static Web Apps, as I can manage the full-stack app from one place and deploy just by doing a git push;
  • Node.js for the backend, as I’m learning it and want to keep exercising. Not to mention that it is very common and very scalable;
  • Azure SQL, as I want a database ready for anything I may want to throw at it.

I searched TodoMVC and Todo-Backend but didn’t find this specific stack of technologies…so why not create it myself, I thought? Said and done! Here are some notes I took while building this.

 

Azure Static Web Apps


Azure Static Web Apps is still in preview, but I loved it as soon as I saw it. It is just perfect for a full-stack development experience. In one shot you can deploy front-end and back-end, and make sure they are correctly configured to work together (you know, CORS) and correctly secured.

Deployment is as easy as configuring a GitHub Action, which is actually done automatically for you; you still have full access to it, so you can customize it if needed (for example, to include the database in the CI/CD process).
Azure Static Web Apps will serve as static HTML whatever you specify as the app, and will spin up and deploy an Azure Function using Node.js to run the back-end from whatever you specify as the api.

 


 

My repo contains the front-end in the client folder and the back-end code in the api folder, and the workflow configuration points at those two locations.

 


 

Front-End: Vue.js

As I’m also still learning Vue, I kept the code very simple and actually started from the TodoMVC Vue sample you can find on the Vue website: TodoMVC Example.

I like this sample a lot, as it shows the power of Vue.js using a single file, which is very easy to understand if you have just started learning it. If you are already an experienced Vue user, you’ll be happy to know that Azure Static Web Apps has native support for Vue, so you can build and deploy Vue CLI apps. I’m honestly not that expert yet, so I really like the super-simple approach that Vue also offers. Plus, I think the super-simple approach is perfect for learning, which makes it just great for this post.

 


 

Call a REST API

The original TodoMVC sample uses local storage to persist To-Do data. Thanks to the watchers feature that Vue provides, the JavaScript code you need to write is very simple: any change to a watched list (todos in this case) is automatically persisted locally via the following snippet of code:

 

watch: {
    todos: {
        handler: function(todos) {
            todoStorage.save(todos);
        },
        deep: true
    }
},

 

Of course, to create a real-world full-stack sample, I wanted to send the To-Do list data to a REST API instead of using local storage, to enable more interesting scenarios like collaboration, synchronization across multiple devices, and so on.

Instead of relying on a watcher, which would unfortunately send the entire list to the REST API rather than only the changed item, I decided to go a more manual route and call the REST API from methods bound directly to the UI:

 

methods: {
    addTodo: function () {
        var value = this.newTodo && this.newTodo.trim();
        if (!value) {
            return;
        }
        fetch(API + "/", {headers: HEADERS, method: "POST", body: JSON.stringify({title: value})})
        .then(res => {                  
            if (res.ok) {                                               
                this.newTodo = ''
                return res.json();
            }
        }).then(res => {                        
            this.todos.push(res[0]);
        })
    },

 

Connecting the addTodo method to an HTML element is really simple:

 

<header class="header">
    <h1>todos</h1>
    <input class="new-todo" autofocus autocomplete="off" placeholder="What needs to be done?" v-model="newTodo"
        @keyup.enter="addTodo" />
</header>

 

With these changes done, it’s now time to take a look at the back-end.

 

Back-End: Node

Azure Static Web Apps only supports Node.js as a back-end language today. No big deal: Node.js is a great, fast, and scalable runtime that works perfectly with Azure Functions and Azure SQL, so we’re really good here. If you are not familiar with running Azure Functions with Node.js and Azure SQL, make sure to read this article: Serverless REST API with Azure Functions, Node, JSON and Azure SQL. As Azure Static Web Apps uses Azure Functions behind the scenes, everything you learned about Azure Functions is applicable to Azure Static Web Apps back-ends.

The client sends an HTTP request to the back-end REST API, passing the To-Do payload as JSON. For example, to mark a To-Do as done, this JSON

 

{"completed":true}

 

will be sent via a PUT request:

 

https://xyz.azurestaticapps.net/api/todo/29

 

to set the To-Do with Id 29 as done. If everything is OK, the REST API returns the entire object, so the client always has the freshest data:

 

[{
    "id":29,
    "title":"Write about Vue",
    "completed":1
}]
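On the client side, the matching call can be sketched like this. The helper name is mine, not the sample’s, and API and HEADERS stand in for the constants assumed in the addTodo snippet earlier:

```javascript
// Hypothetical client-side counterpart: build the PUT request that marks a
// To-Do as done. The API base path and headers are assumptions mirroring the
// constants used elsewhere in the client; the sample's actual values may differ.
const API = "/api/todo";
const HEADERS = { "Content-Type": "application/json" };

function buildCompleteRequest(id, completed) {
  return {
    url: `${API}/${id}`, // e.g. /api/todo/29
    options: {
      method: "PUT",
      headers: HEADERS,
      body: JSON.stringify({ completed }) // e.g. {"completed":true}
    }
  };
}

// Usage with fetch, refreshing the local item from the row the API returns:
// const { url, options } = buildCompleteRequest(29, true);
// fetch(url, options).then(r => r.json()).then(rows => { /* rows[0] is the fresh To-Do */ });
```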

 

Thanks to Azure SQL’s support for JSON, the back-end doesn’t have to do much: it just turns an HTTP request into a call over the TDS protocol supported by Azure SQL. The JSON is passed through as is, so what the back-end really has to do is make sure that, depending on the HTTP request method invoked, the correct Azure SQL operation is executed. For example, a PUT request should result in an UPDATE statement. The implementation is very easy:

 

switch(method) {
    case "get":
        payload = req.params.id ? { "id": req.params.id } : null;            
        break;
    case "post":
        payload = req.body;            
        break;
    case "put":
        payload =  { 
            "id": req.params.id,
            "todo": req.body
        };   
        break;
    case "delete":
        payload = { "id": req.params.id };
        break;       
}
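Putting the pieces together, the whole function body can look roughly like the sketch below. This is not the sample repo’s actual code: the per-verb procedure names, the buildPayload helper wrapping the switch above, and the runProc database helper (which would wrap an mssql/tedious call) are assumptions for illustration.

```javascript
// Map each HTTP verb to a stored procedure (illustrative names).
const procForMethod = {
  get: "web.get_todo",
  post: "web.post_todo",
  put: "web.put_todo",
  delete: "web.delete_todo"
};

// Wrap the switch shown above in a helper.
function buildPayload(method, req) {
  switch (method) {
    case "get":    return req.params.id ? { id: req.params.id } : null;
    case "post":   return req.body;
    case "put":    return { id: req.params.id, todo: req.body };
    case "delete": return { id: req.params.id };
    default:       return null;
  }
}

async function handleRequest(req, runProc) {
  const method = req.method.toLowerCase();
  const proc = procForMethod[method];
  if (!proc) return { status: 405, body: "Method Not Allowed" };

  // The payload travels to Azure SQL as a JSON string; the stored
  // procedure parses it server-side with OPENJSON.
  const payload = buildPayload(method, req);
  const rows = await runProc(proc, JSON.stringify(payload));
  return { status: 200, body: rows };
}
```

Injecting runProc keeps the routing logic trivially testable without a database connection.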

 

If you have more complex needs, you may decide to implement one function per HTTP request method, but in this case that would have been overkill. I really try to follow the KISS principle as much as possible: the simpler the better, but not simpler than that! (Of course, if this were production code, I would check that the JSON is actually valid and harmless before passing it to Azure SQL. Never trust user-provided input, you never know!)
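For example, a minimal guard along those lines might look like this. The allowed field list is an assumption based on the To-Do schema, and a production system would likely validate types and lengths as well:

```javascript
// Sketch of a pre-database guard: reject payloads that are not valid JSON or
// that contain unexpected fields. Accepts a single object or an array of them.
const ALLOWED_FIELDS = new Set(["id", "title", "completed"]);

function sanitizeTodoPayload(raw) {
  let parsed;
  try {
    parsed = JSON.parse(raw);
  } catch {
    return null; // not valid JSON: refuse it
  }
  const items = Array.isArray(parsed) ? parsed : [parsed];
  const clean = items.every(item =>
    item && typeof item === "object" &&
    Object.keys(item).every(k => ALLOWED_FIELDS.has(k))
  );
  return clean ? parsed : null; // null means "do not pass this to the database"
}
```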

 

Database: Azure SQL

The Azure SQL database has been created with just one simple table:

 

create table dbo.todos
(
  id int not null primary key 
    default (next value for [global_sequence]),
  todo nvarchar(100) not null,
  completed tinyint not null 
    default (0)
)

 

As a developer, I still prefer to use JSON in the back-end and to send data back and forth to Azure SQL as JSON, so that I can minimize round trips and thus improve performance. So all the stored procedures I’m using have this very simple signature:

 

create or alter procedure [web].[get_todo]
@payload nvarchar(max)

 

Then inside the stored procedure I can use OPENJSON or any of the other JSON functions to manipulate the JSON. This way it becomes really easy to accept “n” To-Dos as the input payload. For example, let’s say I want to delete three To-Dos at once. I can pass something like

 

[{"id":1}, {"id":2}, {"id":8}]

 

and then just by writing this

 

delete t from dbo.todos t 
where exists (
   select p.id 
   from openjson(@payload) with (id int) as p where p.id = t.id
)

 

I can operate on all the selected To-Dos at once. Super cool, and super fast! The ability of Azure SQL to operate with both relational and non-relational features is really a killer feature!

 

Why Azure SQL and not a NoSQL database?

Answering that question could take a book, so let me try to summarize. A NoSQL database is more than enough for a To-Do list app. But I always try to think about future improvements, and I want to make sure that anything I’d like to do in the future will be reasonably well supported by my database. I might need geospatial data, or to aggregate data for analytics; I may want to use graph features, or I may need to build a concurrent system that allows more than one person to work on the same to-do list, which needs a structure without locks. All these things are available in Azure SQL without requiring anything other than a technology I already know, which means I’ll be super productive. I won’t have scalability issues either, as Azure SQL can go up to 100 TB.

 

A To-Do list has a pretty well-defined schema, and the performance you can get out of a properly designed relational database is exceptional and covers a huge spectrum of use cases. With a NoSQL database I might squeeze out a bit more performance for one very specific use case, but at the expense of all the others. I really want to keep the door open to any improvement, so for this use case and my future needs, I think Azure SQL is the best option here.

 

Keep in mind that a well-defined schema doesn’t mean carved in stone. I can have all the flexibility I may want, as I can easily store a To-Do (or just part of it) as JSON in Azure SQL, mixing relational and non-relational features and allowing end users to add custom fields and properties if they want to. Actually, you know what? That looks like a great idea for a post. I’ll definitely write one on this topic, so stay tuned!

 

Conclusion

Creating and deploying a full-stack solution is really easy now, thanks to Azure Static Web Apps. Completely serverless, you can just focus on coding and design while enjoying the simplicity, scalability, and flexibility that a serverless solution offers. Azure SQL will help guarantee that your solution is future-proof, providing scale out and up to 100 TB with all the perks of a modern post-relational database: multi-model support, built-in security, columnstore, lock-free tables, and anything else you may need in your wildest dreams.

 

As usual enjoy the full source code here: https://github.com/Azure-Samples/azure-sql-db-todo-mvc