Support Tip: Install Rosetta 2 on new Apple Silicon (M1) Macs to run apps built for Intel Macs

This article is contributed. See the original author and article here.

Apple recently announced Apple Silicon Macs. These devices run on 64-bit ARM (RISC) CPUs, unlike the previous generation of Macs, which ran on Intel CPUs. Apple also announced a translation layer called Rosetta 2 that allows apps built for Intel Macs to run on the new Apple Silicon Macs.


 


Intune apps on macOS such as Intune Company Portal and the Intune MDM agent depend on the Rosetta 2 translation layer for managing Apple Silicon Macs. If you purchase a new Apple Silicon Mac running macOS 11.x (Big Sur), Rosetta 2 does not come pre-installed and the end-user is prompted by macOS to install it on first launch of an Intel-based application.


 


macOS installation prompt for Rosetta


 


If you are upgrading to macOS 11 on Intel Macs, this is not an issue.


 


Issue: Apple Silicon (M1) Macs fail to run shell scripts when enrolled via Apple Automated Device Enrollment (ADE)


In this scenario, the device gets enrolled into Intune using macOS Setup Assistant. If you have configured shell scripts for these Macs, the Intune MDM agent is automatically installed on the Mac. However, the Intune MDM agent cannot start because Rosetta 2 is not installed. macOS 11 does not prompt the end user to install Rosetta 2 in this case.


 


If you are enrolling your Apple Silicon Macs using Company Portal, you will be prompted to install Rosetta 2 on first launch of Company Portal.


 


Recommendation


Install Rosetta 2 on Apple Silicon Macs to ensure compatibility with Intel-based apps, using one of the following options:




  • Ask users to install Rosetta 2 manually by launching any installed Intel-based app on the Apple Silicon Mac.




  • Ask users to open Terminal and run one of the following commands, or provide users with a script that runs it:



    • /usr/sbin/softwareupdate --install-rosetta (root permission not required)


    • /usr/sbin/softwareupdate --install-rosetta --agree-to-license (root permission required)
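For the scripted option, a minimal sketch of such a script might look like the following. It assumes it runs with root permission (so that --agree-to-license works) and uses a commonly seen heuristic, checking for the Rosetta runtime file, to detect whether Rosetta 2 is already installed:

```shell
#!/bin/sh
# Sketch: install Rosetta 2 only where it is needed.
# Assumes root permission, since --agree-to-license requires it.

is_apple_silicon() {
    # Apple Silicon Macs report arm64; Intel Macs report x86_64.
    [ "$(uname -m)" = "arm64" ]
}

rosetta_installed() {
    # Heuristic: Rosetta 2 places its runtime under /Library/Apple once installed.
    [ -f /Library/Apple/usr/share/rosetta/rosetta ]
}

if [ "$(uname -s)" = "Darwin" ] && is_apple_silicon && ! rosetta_installed; then
    /usr/sbin/softwareupdate --install-rosetta --agree-to-license
fi
```

On Intel Macs (and on non-macOS systems) the script is a no-op, so it is safe to deploy broadly.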





 


Let us know if you have any additional questions on this by replying to this post or tagging @IntuneSuppTeam on Twitter.

How to save up to 50% on your ELT/ETL total cost of ownership (TCO)

This article is contributed. See the original author and article here.

The need for faster data quality


Data validation, data transformation and de-identification can be complex and time-consuming. As data volumes grow and new downstream use cases and applications emerge, expectations of timely delivery of high-quality data rise, making fast and reliable data transformation, validation, de-duplication and error correction more important than ever.


 


How the City of Spokane improved data quality while lowering costs


To abstract their entire ETL process and achieve consistent data through data quality and master data management services, the City of Spokane leveraged DQLabs and Azure Databricks. They merged a variety of data sources, removed duplicate data and curated the data in Azure Data Lake Storage (ADLS).


 


“Transparency and accountability are high priorities for the City of Spokane,” said Eric Finch, Chief Innovation and Technology Officer, City of Spokane. “DQLabs and Azure Databricks enable us to deliver a consistent source of cleansed data to address concerns for high-risk populations and to improve public safety and community planning.”


 



City of Spokane ETL/ELT process with DQLabs and Azure Databricks


 


How DQLabs leverages Azure Databricks to improve data quality


“DQLabs is an augmented data quality platform, helping organizations manage data smarter,” said Raj Joseph, CEO, DQLabs. “With over two decades of experience in data and data science solutions and products, what I find is that organizations struggle a lot in terms of consolidating data from different locations. Data is commonly stored in different forms and locations, such as PDFs, databases, and other file types scattered across a variety of locations such as on-premises systems, cloud APIs, and third-party systems.”


 


Helping customers make sense of their data and answering even simple questions such as, “is it good?” or “is it bad?” is far more complicated than organizations ever anticipated. To solve these challenges, DQLabs built an augmented data quality platform. DQLabs helped the City of Spokane create an automated cloud data architecture using Azure Databricks to process a wide variety of data formats, including JSON and relational databases. They first leveraged Azure Data Factory (ADF) with DQLabs’ built-in data integration tools to connect the various data sources and orchestrate data ingestion at different velocities, for both full and incremental updates.


 


DQLabs uses Azure Databricks to process and de-identify both streaming and batch data in real time for data quality profiling. This data is then staged and curated for machine learning models built with PySpark and MLlib.
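As a toy illustration of the kind of de-identification step such a pipeline applies before staging data, the sketch below pseudonymizes direct identifiers with salted (keyed) hashing. The field names and salt are invented for the example; this is not DQLabs’ actual implementation, which runs on Azure Databricks at scale:

```python
import hashlib
import hmac

# Invented salt for the example; a real pipeline would load this from a secret store.
SALT = b"pipeline-secret-salt"

def deidentify(record: dict) -> dict:
    """Replace direct identifiers with stable pseudonyms; keep analytic fields."""
    out = dict(record)
    for field in ("name", "ssn"):  # invented list of identifying fields
        if field in out:
            digest = hmac.new(SALT, str(out[field]).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]  # stable, irreversible pseudonym
    return out
```

Because the pseudonyms are stable, de-identified records can still be joined and de-duplicated downstream.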


 


Learn more and get started


Continue reading to learn how the City of Spokane improved data quality while lowering their TCO using DQLabs. Then get hands-on with Azure Databricks by attending a Quickstart Lab.

Serverless Architecture and Concepts. What is it?

This article is contributed. See the original author and article here.

I needed to go through this subject this week, so I thought it would be a good opportunity to share SQL Serverless architecture concepts.


 


1) What is the difference between SQLOD and the former SQLDW?



 


Dedicated SQL pool (the former SQL DW)



  • Grow or shrink compute power, within a dedicated SQL pool, without moving data.

  • Pause compute capacity while leaving data intact, so you only pay for storage.

  • Resume compute capacity during operational hours.


With SQL DW you can pause and resume compute, and you can insert, update and delete data. SQL DW holds your data in its own storage and is responsible for it.


 


What is SQLOD?


SQLOD is a query service over the data in your data lake. You do not need to pause or resume it; it is a pay-per-consumption, on-demand service. The service is resilient to failure and elastic.


Note: Serverless SQL pool has no local storage; only metadata objects are stored in databases. Basically, it is for reading only.


 


What does resilient to failure and elastic mean?


It means the engine auto-scales node resources as required while querying your files, and that if there is a failure in any node, it recovers without any user intervention.


 


 


What is supported on Serverless?


Supported T-SQL:



  • Full SELECT surface area is supported, including a majority of SQL functions

  • CETAS – CREATE EXTERNAL TABLE AS SELECT

  • DDL statements related to views and security only
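To make that surface area concrete, here is an illustrative sketch (placeholder storage URLs, schema and object names; the CETAS statement assumes an external data source and file format have already been created):

```sql
-- Ad hoc query over Parquet files in the data lake (placeholder URL).
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://myaccount.dfs.core.windows.net/mycontainer/sales/*.parquet',
    FORMAT = 'PARQUET'
) AS rows;

-- CETAS: materialize the result of a SELECT back into the lake as an external table.
CREATE EXTERNAL TABLE curated.DailySales
WITH (
    LOCATION = 'curated/daily_sales/',
    DATA_SOURCE = MyDataLake,      -- assumed to exist
    FILE_FORMAT = ParquetFormat    -- assumed to exist
)
AS
SELECT CAST(SalesDate AS date) AS SalesDate, SUM(Amount) AS TotalAmount
FROM OPENROWSET(
    BULK 'https://myaccount.dfs.core.windows.net/mycontainer/sales/*.parquet',
    FORMAT = 'PARQUET'
) AS rows
GROUP BY CAST(SalesDate AS date);
```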


How does it work?



  • DQP or Distributed Query Processing.

  • Compute Node


 


The DQP is responsible for optimizing and orchestrating the distributed execution of user queries by splitting them into smaller queries that are executed on Compute nodes.


The Compute nodes execute the tasks created by the DQP. The tasks are essentially the query logic broken into chunks of data to be processed; those chunks of data are the files, organized in data cells. How many data cells and tasks are executed depends on the plan optimization.


 


Plan optimization also depends on statistics. The more the serverless SQL pool knows about your data, the faster it can execute queries against it, and in the end the plan chosen is the one with the lowest cost.  Note: Automatic creation of statistics is turned on for Parquet files. For CSV files, you need to create statistics manually.
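For CSV data, a manual statistics sketch might look like this (placeholder URL and column name; note that string literals inside the statement text are escaped with doubled quotes):

```sql
-- Manual statistics on a CSV column; automatic statistics cover Parquet only.
EXEC sys.sp_create_openrowset_statistics N'
    SELECT country
    FROM OPENROWSET(
        BULK ''https://myaccount.dfs.core.windows.net/mycontainer/population.csv'',
        FORMAT = ''CSV'', PARSER_VERSION = ''2.0'', HEADER_ROW = TRUE
    ) AS rows
';
```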



Ref: Synapse SQL architecture – Azure Synapse Analytics | Microsoft Docs


Serverless SQL pool – Azure Synapse Analytics | Microsoft Docs


https://www.microsoft.com/en-us/research/publication/polaris-the-distributed-sql-engine-in-azure-synapse/


Create and update statistics using Azure Synapse SQL resources – Azure Synapse Analytics | Microsoft Docs


 


That is it!


Liliam Leme


UK Engineer.


 

Cracking down on ticket bots that leave you out in the cold

This article was originally posted by the FTC. See the original article here.

For most of us, it’s been a long time since we’ve been able to attend a live event. Think back, if you can, to the last time you tried to buy tickets online to go to a concert, a game, or a play. Were you shut out because tickets sold out before you got yours? You’re not alone.

So what happened? Sometimes there just aren’t enough tickets available for everyone who wants to attend an event, especially if promoters save tickets for artists and other VIPs. Ticket bots may also be a factor. People may use software to buy tickets quicker than the average consumer. They also might use bots to cheat the ticketing system and bypass ticket limits or to buy tickets using fake names and addresses. Then they resell the tickets for higher prices. Congress passed the Better Online Ticket Sales (BOTS) Act to address these problems.

We (the Federal Trade Commission) settled three cases with companies that violated the BOTS Act. The companies circumvented Ticketmaster’s security measures to buy thousands of tickets that they later re-sold at a profit. The court orders require the companies to stop their illegal ticket-buying practices and impose civil penalties. (Read the business blog post to learn more about the case.)

The next time you’re looking to score tickets to a must-see event:

  • Look for opportunities to buy tickets before they go on sale to the public. Sign up for newsletters or alerts from ticket sellers, artists, or venues, or follow them on social media. And check with your credit card company about promotions.
  • Set up an account with the ticket seller. That way you’ll be ready to buy as soon as tickets go on sale.
  • Check back. The promoters might make more tickets available after the initial release or add another show.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Cloud Design Patterns: The Gatekeeper and Valet Key Patterns

This article is contributed. See the original author and article here.

When designing any solution, we often look for common practices or patterns that can be re-used. Think about this from a software development perspective. You’ve probably heard of the Factory Method, Builder, or Singleton patterns.


 


What about design patterns for the cloud? Fortunately a number of well-known patterns already exist and are documented! You can find those over on the Azure Architecture Center, in the Cloud Design Patterns section.


 


In the following video, Chris Reddington and Peter Piper explore the Gatekeeper Pattern and the Valet Key Pattern.


 


https://www.youtube-nocookie.com/embed/zM3hJBZu2vA


 


What is the Gatekeeper Pattern?


The Gatekeeper Pattern helps you to protect your application and services by exposing your application or service through a dedicated instance. This dedicated instance (the gatekeeper) is a type of Façade layer that decouples clients from your trusted hosts. The gatekeeper may perform tasks like authentication or authorization, or other sanitization steps such as rate limiting or checking for specific metadata in requests.


 


It may be useful in scenarios where you have a distributed application (e.g. a set of microservices), and want to centralize your validation steps for simplicity. Alternatively, if your application has requirements for a high level of protection from malicious threats, then you may want to consider reviewing this pattern.
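As a toy sketch of the division of responsibilities (invented names; a real gatekeeper would typically be a reverse proxy or gateway service in its own environment, not in-process code):

```python
# Toy sketch of the Gatekeeper pattern: validate and sanitize first,
# and only forward vetted requests to the trusted host.

TRUSTED_API_KEYS = {"key-123"}  # invented; keys would be provisioned out of band
MAX_BODY_BYTES = 1024           # example sanitization rule

def trusted_host_handler(body: str) -> str:
    # Stands in for the real application service behind the gatekeeper.
    return f"processed: {body}"

def gatekeeper(api_key: str, body: str) -> str:
    # Authentication / authorization check.
    if api_key not in TRUSTED_API_KEYS:
        return "403 Forbidden"
    # Sanitization check; no application logic lives here.
    if len(body.encode()) > MAX_BODY_BYTES:
        return "413 Payload Too Large"
    return trusted_host_handler(body)
```

Note that the gatekeeper contains no business logic, which keeps it lightweight and decoupled from the services it protects.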


 


What should you consider before implementing the Gatekeeper pattern?



  • The Gatekeeper should be kept lightweight, and typically focuses on validation/sanitization. Try not to fall into the trap of adding processing logic related to your applications, which would introduce coupling between services!

  • As the Gatekeeper is “less trusted” than your trusted hosts, it is typically hosted in a separate environment.

  • As the Gatekeeper is a Façade-based pattern, you are introducing an extra hop in your application’s routing, which may increase latency.

  • Given that the Gatekeeper is a type of Façade, be careful not to introduce a point of failure into your architecture. Implement scaling of your Gatekeeper component as needed.


This is just a brief summary of the pattern, and some key considerations. For the full detail, check out The Gatekeeper Pattern on the Azure Architecture Center.


 


What is the Valet Key Pattern?


The Valet Key pattern could also be considered if security is important. At a high level, the Valet Key Pattern is an approach to prevent direct access to resources and instead uses keys or tokens to restrict access to those resources.


 


Consider an Azure Storage Account with blobs in a private container. You could provide access using the Storage Account Key, but that would grant direct access to the entire storage account and pose a security risk. Instead, you could generate time-bound, permission-restricted access to a set of files in the Storage Account using Shared Access Signatures. A Shared Access Signature is an example implementation of the Valet Key pattern.
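To illustrate the general idea behind such tokens, here is a generic sketch using an HMAC-signed string; it is not Azure’s actual SAS format or implementation:

```python
import base64
import hashlib
import hmac

# Invented server-side secret; held only by the token issuer.
SECRET = b"server-side-secret"

def _sign(msg: bytes) -> str:
    return base64.urlsafe_b64encode(hmac.new(SECRET, msg, hashlib.sha256).digest()).decode()

def issue_key(resource: str, expires_at: int) -> str:
    """Issue a time-bound 'valet key' granting access to a single resource."""
    msg = f"{resource}|{expires_at}"
    return f"{msg}|{_sign(msg.encode())}"

def check_key(token: str, resource: str, now: int) -> bool:
    """Grant access only if the signature, resource and expiry all check out."""
    try:
        res, exp, sig = token.rsplit("|", 2)
    except ValueError:
        return False
    return (hmac.compare_digest(sig, _sign(f"{res}|{exp}".encode()))
            and res == resource
            and now < int(exp))
```

The caller receives only the narrow, expiring token; the long-lived secret (analogous to the Storage Account Key) never leaves the issuer.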


 


What should you consider before implementing the Valet Key pattern?



  • As a token/key is required to provide restricted access, how do you provide that secret material to the user in the first place? Make sure to send it to the user securely.

  • Ensure that you have a key rotation strategy in place ahead of time. Don’t wait until a token is compromised to test your operational process of rotating keys!


 


The Valet Key is a separate architectural pattern in its own right, but it is worth noting that it is commonly used in combination with the Gatekeeper pattern.


 


This is just a brief rundown of the pattern, and some common considerations. For the full detail, check out The Valet Key Pattern on the Azure Architecture Center.


 


Remember, there are many more cloud design patterns that you can use in your own solutions! Check them out on the Azure Architecture Center. If you prefer video/audio content, then take a look at the Architecting in the Cloud, One Pattern at a Time series on Cloud With Chris.