Change the MFA Default Verification Method for a User in AAD

I recently had a user who was required to configure MFA for their account. They had problems with the Microsoft Authenticator application and ended up configuring their mobile phone number as the verification method instead. The user wanted to try setting up MFA with the Microsoft Authenticator application again but didn't know how.

As it's something that doesn't get done very often, I thought it would be useful to document the process.

Firstly, the user will need to authenticate to Azure by going to https://portal.azure.com. Type Users into the search field and select Users in the returned list.

Select your user and then select Authentication Methods from the left-hand menu.

When the profile page for the user is displayed, select Additional security verification on the right-hand side of the screen.

You will now be taken to the Additional security verification page, where you can change your MFA settings and default contact method.
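If you need to change the default method for a number of users, it can also be done from PowerShell. Below is a minimal sketch assuming the legacy MSOnline module and an existing Connect-MsolService session; the user principal name is hypothetical, and the user must already have the Authenticator app registered for PhoneAppNotification to be made the default.

# Sketch only: requires the MSOnline module and Connect-MsolService;
# the UPN below is a hypothetical example
$upn = 'jane.doe@contoso.com'

# List the user's registered verification methods and the current default
$user = Get-MsolUser -UserPrincipalName $upn
$user.StrongAuthenticationMethods | Select-Object MethodType, IsDefault

# Make the Authenticator app notification the default method
# (the method must already be registered on the account)
$methods = $user.StrongAuthenticationMethods
foreach ($m in $methods) {
    $m.IsDefault = ($m.MethodType -eq 'PhoneAppNotification')
}
Set-MsolUser -UserPrincipalName $upn -StrongAuthenticationMethods $methods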

Adding existing devices to Autopilot using PowerShell

Adding devices to Autopilot requires gathering the hardware ID hash of the device. If you are buying new devices from a supplier, they can usually provide the details in a file that can be imported into Autopilot.

If you have existing machines that you want to enable for Autopilot deployment, Microsoft have provided a script that will gather the correct information and create a CSV file that can be imported into Autopilot.

# Create a working folder and allow scripts to run in this session
md C:\HWID
Set-Location C:\HWID
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Unrestricted

# Download the script from the PowerShell Gallery and run it
Install-Script -Name Get-WindowsAutoPilotInfo
Get-WindowsAutoPilotInfo.ps1 -OutputFile AutoPilotHWID.csv

The above script will need to be run on each device that needs to be added to Autopilot.
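Recent versions of the script can also gather the hash from existing machines remotely, which avoids a desk visit for every device. The example below is a sketch assuming remote WMI access is enabled on the targets; the computer names are hypothetical.

# Gather hashes from several machines remotely, appending to one CSV
Get-WindowsAutoPilotInfo.ps1 -Name PC001,PC002,PC003 -OutputFile AutoPilotHWID.csv -Append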

Are your Application Packaging Standards costing you money? 5 mistakes often seen in packaging standards documents

Application Packaging Standards are used by organisations to specify how their desktop applications will be repackaged for deployment to desktop and laptop devices.

This makes good sense, as it provides standards that should be met for all applications, to reduce errors and ensure that deployments work in all required scenarios. These standards are of particular benefit when bringing in external resources for things like desktop transformations, when the new people may have a different idea of what standards to use.

Unfortunately, Packaging Standards often get set in stone and remain unchanged for many years. Being technical documents, they are usually created by one or two very technical people, who may have left the organisation since they were written. The application packaging environment has moved on a lot since the early days, but it is amazing how often I come across documents that have not been updated for years and are still specifying out-of-date requirements. These aged requirements are probably adding to the cost of each application that gets packaged. As many organisations have between 300 and 1,000 applications, this can add up to a very substantial outlay, for no good business reason.

Below are some of the mistakes I come across frequently.

1. Specifying that all applications should be repackaged as MSI

This requirement is very outdated. In the early days of application packaging, many installers did not support command-line installation. That is very rare these days. Why go to the trouble of repackaging to an MSI when the vendor's installer may install silently simply by adding /s to the command line? Also, the moment an application is repackaged, you will get no support from the majority of software vendors. Using the vendor's automation means that vendor support is maintained.

There are times when repackaging as MSI makes sense. Some application vendors write very poor installation routines, and it can sometimes be impossible to automate the install without user intervention. These situations, though, should be rare. Ninety percent or more of applications can be successfully installed using the vendor's built-in automation. Repackaging costs time and money.
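To illustrate how little is usually needed, the sketch below wraps a hypothetical vendor setup.exe in PowerShell and checks the exit code. The installer path and the /s switch are assumptions; the correct silent switch varies by vendor (common ones are /s, /S, /quiet and /qn).

# Sketch only: the path and /s switch are hypothetical examples
$proc = Start-Process -FilePath '\\server\packages\Vendor\setup.exe' -ArgumentList '/s' -Wait -PassThru

# Most installers return 0 on success and 3010 when a reboot is required
if ($proc.ExitCode -notin 0, 3010) {
    Write-Error "Install failed with exit code $($proc.ExitCode)"
}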

2. Very prescriptive and complicated standards

This is often related to mistake 1. Repackaging an application to an MSI allows all sorts of customisation to the installer. Often, though, I struggle to see a valid business reason behind the requirement.

Is there really a proper business need for putting the application shortcuts in a different location from that specified by the vendor? Is removing ALL ICE Errors and Warnings really going to make your package more robust (considering the number of errors and warnings that Microsoft leaves in its products)? Are all those extra Properties really required? Do you really need to remove all those files from an App-V sequence?

Standards are important, but there must be a valid reason for them; otherwise they are simply adding to the packaging costs and keeping application packagers in a job. Remember that you are happy to accept the vendor's manual installation, including all the extra files and registry entries that it adds, so why make major changes when automating that installation?

3. Specifying a single application packaging type

There are many different packaging types out there these days: MSI, EXE, MSIX, App-V, AppX, ThinApp and so on. The moment you specify a single packaging type, you are going to end up spending a lot of time trying to fit some of your square applications into the round packaging-type hole.

Far better to use the correct packaging type for the job, and support many packaging types. You can have preferred options, for example App-V or MSIX, but don’t expect these to work for all applications.

I always specify multiple packaging types, with a default starting point. For example, you might have a default package type of App-V, but limit the amount of time spent making it work. Have a fallback position of "vendor supported", which will be used if your application doesn't work with the default packaging type.

4. Not including deployment in your Application Packaging Standards

Deployment of a packaged application is the other side of the packaging coin. Not covering it in the packaging standards removes options from the packager's toolkit. Things like application upgrades would normally be handled by the deployment tool(s) you use.

A deployment test should be performed by the packager who created the package, so that they can confirm it installs correctly.

5. Not automating manual processes

Many packaging processes rely on manual tasks, for example manually creating applications, collections and deployments in SCCM. It is very straightforward to automate this, both speeding up the process and ensuring that things like naming conventions are implemented properly.
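As a minimal sketch of what that automation can look like, the example below uses the ConfigurationManager PowerShell module to create an application, an MSI deployment type, a collection and a required deployment. The site code P01, source paths and application name are all hypothetical; your own naming convention would supply the strings.

# Sketch only: site code, paths and names are hypothetical
Import-Module "$($ENV:SMS_ADMIN_UI_PATH)\..\ConfigurationManager.psd1"
Set-Location 'P01:'

$appName = '7-Zip 19.00'

# Create the application and an MSI deployment type from the source media
New-CMApplication -Name $appName -Publisher 'Igor Pavlov' -SoftwareVersion '19.00'
Add-CMMsiDeploymentType -ApplicationName $appName -ContentLocation '\\server\packages\7-Zip\7z1900-x64.msi'

# Create an install collection named to the standard, and deploy to it
New-CMDeviceCollection -Name "Install $appName" -LimitingCollectionName 'All Systems'
New-CMApplicationDeployment -Name $appName -CollectionName "Install $appName" -DeployAction Install -DeployPurpose Required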

If you save 15 or 20 minutes per application, that adds up to a large saving when packaging and deploying 300 applications: an identifiable and specific cost saving. There can also be savings from using automation to ensure that things like SCCM collections and media deployments are done correctly. So many times I visit a site and see multiple naming conventions and implementations that do not meet the required standards; automation helps to ensure those standards are always met.

Automation can be introduced a bit at a time, to make adoption easier. People often say it costs too much time to develop, but I have always seen overall cost savings for the modest amount of upfront effort and cost.

Use of the automation should be described in your Packaging Standards.