How to disable Hybrid Azure AD Join

A device is said to be hybrid joined if it has both an AD object and an Azure AD (AAD) object. This allows users of that device to sign in with an AD user account and get access to resources protected by either AD or AAD.

A hybrid joined computer is joined to both AD and AAD, but the AD join is primary because the device initially uses AD authentication. Only Windows devices can be hybrid joined. The benefits of having Hybrid Azure AD Join devices are:

  • The computer has a device object in Azure AD, which enables a variety of capabilities including:
    • Microsoft 365 Apps device licensing is possible
    • Azure AD Conditional Access features based on device conditions are possible
  • There are fewer user sign-in prompts, because a single sign-in gets both a NETID AD token and an AAD token

But what if you want to disable that hybrid join?

You can disable hybrid join by preventing one of the required elements from triggering the hybrid join registration:

  1. Modify the Hybrid Azure AD Join configuration in Azure AD Connect (recommended).
  2. Set the following registry value on the computers you want to block (see the sketch after this list): HKLM\SOFTWARE\Policies\Microsoft\Windows\WorkplaceJoin: "BlockAADWorkplaceJoin"=dword:00000001
  3. Modify the Scheduled Task which triggers AAD device registration. See Task Scheduler > Microsoft > Windows > Workplace Join > Automatic-Device-Join. See the following 3 items for details:
    1. Deleting the Scheduled Task seems to work reliably.
    2. Disabling the Scheduled Task does not work reliably; the disabled task will still run after a user signs in.
    3. Modify both triggers from an Enabled status to a Disabled status; this works reliably.
  4. Add a firewall block for https://enterpriseregistration.windows.net, to prevent the computer from connecting to the Azure AD Device Registration Service (AAD DRS). See the following item for possible side-effects:
    1. This should only affect the ability to AAD join. If you have Office installed on the Windows device, this might have an undesirable impact on AAD device registration (different from AAD device join), which is required per user for Microsoft 365 Apps (formerly Office 365 ProPlus) sign-in.
  5. Add a firewall block for the UW ADFS server, sts.domain.com, to prevent the computer from getting an ADFS token to authenticate to the AAD DRS. See the following item for possible side-effects:
    1. Note: this option will only work for as long as you continue to have federated authentication for AAD, which is planned to be removed. This option may be undesirable if there is any interaction with Azure AD applications like Office 365 from the device; those interactions would be blocked.
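
If you go with option 2, here is a minimal sketch of pushing that registry value with PowerShell (run elevated on each target computer; the key path and value come from the list above):

$key = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\WorkplaceJoin'

# Create the policy key if it does not exist yet
if (-not (Test-Path $key)) {
    New-Item -Path $key -Force | Out-Null
}

# BlockAADWorkplaceJoin = 1 prevents the device from registering with Azure AD
New-ItemProperty -Path $key -Name 'BlockAADWorkplaceJoin' -Value 1 -PropertyType DWord -Force | Out-Null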

OK, that's great, but what if I want to unjoin a hybrid Azure AD joined device? For hybrid Azure AD joined devices, make sure to turn off automatic registration first, so the scheduled task doesn't register the device again.

Next, open a command prompt as an administrator and run dsregcmd.exe /debug /leave. You can also run this command as a script across several devices to unjoin in bulk.
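
As a rough sketch of that bulk variant, assuming PowerShell remoting is enabled and the computer names below are placeholders for your own list:

# Hypothetical list of devices to unjoin; replace with your own names or an AD query
$computers = 'PC01', 'PC02', 'PC03'

Invoke-Command -ComputerName $computers -ScriptBlock {
    # Leaves the Azure AD join on each device (runs in an elevated remote session)
    dsregcmd.exe /debug /leave
}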

AIP AutoApply Label not working as expected

I have been testing auto-apply labels in some scenarios, and what I have discovered is that the auto-apply label is not working when I activate the AutoSave toggle in Word (for example):

I've been testing the same policy under different circumstances (Windows 10 + Office 2101 C2R).

Turned ON AutoSave for synced libraries (Default On):

I created a new Word .docx with AutoSave ON and then added a keyword in the doc to trigger the auto-apply label.

After hitting Save, nothing happens: no label is applied (or suggested).

Turned OFF AutoSave for synced libraries (Default On):

I created a new Word .docx with AutoSave OFF and then added a keyword in the doc to trigger the auto-apply label.

After hitting Save, the auto-apply label suggests that I change the label.

But after digging some more, I noticed that if I use the built-in labeling client in Word, it works with or without AutoSave. Strange, isn't it? I don't know if it's a limitation or what…

I will keep tracking that problem…

Messing around with WVD, AADDS and FSLogix

In a project where WVD was involved, we needed to add AADDS and FSLogix to the scenario. If you take a look at the scenario it seems pretty simple, but it hides some stones that we hit along the road, so I want to explain them in this post 😊

First of all, once you have deployed AADDS, remember to check the DNS settings on the VNet; it is necessary to set the DNS servers from AADDS, otherwise it won't be possible to join VMs to the AADDS domain:
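
A minimal sketch of that DNS change, assuming hypothetical VNet and resource group names; the two IP addresses are placeholders for the ones shown on your AADDS instance:

# Point the VNet to the AADDS domain controllers so VMs can resolve and join the managed domain
$vnet = Get-AzVirtualNetwork -Name 'MyVNet' -ResourceGroupName 'MyRgName'

# Replace with the addresses listed on the AADDS Properties blade
$vnet.DhcpOptions.DnsServers = @('10.0.0.4', '10.0.0.5')

# Save the change back to Azure (VMs may need a restart to pick up the new DNS servers)
Set-AzVirtualNetwork -VirtualNetwork $vnet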

Once the AADDS instance was deployed, it was the golden image's turn. As you probably know, there is no problem installing all the programs and updates, but our stone here was that once we deployed the language pack and the image was prepared, sysprep kept crashing, so we had to dive deep into the logs to solve the problem…

So the deployment began to get fun, but after digging, we were able to solve it by executing…

Remove-AppxPackage -Package Microsoft.LanguageExperiencePackes-ES_19041.17.51.0_neutral__8wekyb3d8bbwe -AllUsers

And then… boom!

You will probably need to change the package name in your case, but it is important to include the -AllUsers parameter.

With the golden image problem solved, it was the host pool's turn, and that deployment was straightforward. Our next stone was the storage account… ☹

Deploying the storage account into AADDS was easy, but the problem was giving NTFS permissions to the users. We were used to doing that in ADDS scenarios, so we knew what to do, but with AADDS the procedure changes a bit…

So my piece of advice would be to follow the instructions given in the docs: Use Azure AD Domain Services to authorize access to file data over SMB | Microsoft Docs

We were using the AAD credentials and we were stuck for a while until we read this in the documentation. Lesson learned: reading the documentation helps.

Once you have connected to the storage account with your storage account key, you are able to give NTFS permissions to the users (please follow the instructions from the docs xD).
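
For reference, a rough sketch of that step (all names are placeholders; the exact group and rights depend on your design, so follow the docs linked above):

# Mount the share once with the storage account key (acts as super-user) so NTFS ACLs can be edited
$storageAccount = 'mystorageaccount'
$shareName      = 'profiles'
$storageKey     = '<storage-account-key>'

net use Z: "\\$storageAccount.file.core.windows.net\$shareName" /user:"Azure\$storageAccount" $storageKey

# Grant the AADDS users modify rights on the share root (placeholder group name)
icacls Z:\ /grant "CONTOSO\WVD Users:(M)"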

Once we solved this, we were in a position to configure FSLogix for profile mobility. For those who do not know it, FSLogix allows you to store both user profiles and applications on a centralized file share. This is extremely useful in virtual desktop environments, as the user's profile does not have to be copied to the machine beforehand. FSLogix mounts those profiles hosted on a file share and makes them appear local.

But again, once we had configured the entry in the VM registry:
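
For reference, a minimal sketch of the values we set (the UNC path is a placeholder for your own Azure Files share):

# Core FSLogix profile container settings
$fslogix = 'HKLM:\SOFTWARE\FSLogix\Profiles'
if (-not (Test-Path $fslogix)) { New-Item -Path $fslogix -Force | Out-Null }

# Enable profile containers and point them at the file share
New-ItemProperty -Path $fslogix -Name 'Enabled' -Value 1 -PropertyType DWord -Force | Out-Null
New-ItemProperty -Path $fslogix -Name 'VHDLocations' -Value @('\\mystorageaccount.file.core.windows.net\profiles') -PropertyType MultiString -Force | Out-Null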

We hit another stone… because when we logged into the WVD remote desktop it didn't create any profile in the storage account. After digging and asking ourselves what was going on, we decided to look at the FSLogix logs, located here: %ProgramData%\FSLogix\Logs. We checked the profile logs and found the following:

Configuration setting not found: SOFTWARE\FSLogix\Profiles\AttachVHDSDDL. Using default:
[17:33:52.257][tid:00000c4c.00000e74][INFO] Session configuration wrote (REG_SZ): SOFTWARE\FSLogix\Profiles\Sessions\S-1-5-21-1901185187-4119977032-3365905087-1004\AttachVHDSDDL = ‘D:AI(A;;GA;;;SY)(A;;GA;;;BA)(A;;GA;;;BU)(A;;GA;;;WD)(A;;GA;;;RC)(A;;GA;;;AC)S:(ML;;NW;;;LW)’
[17:33:52.273][tid:00000c4c.00000e74][INFO] Status set to 0: Success
[17:33:52.273][tid:00000c4c.00000e74][INFO] Reason set to 3: A local profile for this user exists on this system
[17:33:52.273][tid:00000c4c.00000e74][WARN: 00000003] Local profile already exists. Do nothing. (El sistema no puede encontrar la ruta especificada.)

You're probably asking yourself what kind of error that is. It's simple: your local profile is interfering with the network profile being created, so what we had to do was remove the local profile. You can do that by going into the advanced system settings and deleting the profile.
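
If you prefer to script it, a rough sketch using CIM, with the SID from the log above as an example (deleting profiles is destructive, so test first):

# Find the conflicting local profile by SID and remove it so FSLogix can mount the container
$sid = 'S-1-5-21-1901185187-4119977032-3365905087-1004'
Get-CimInstance -ClassName Win32_UserProfile | Where-Object { $_.SID -eq $sid } | Remove-CimInstance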

We did that, and we tried again and booooooom! The profile was created in the storage account:

After doing that, we were in a position to do all the tests in WVD and then do all the steps to create an enterprise environment (optimization, monitoring, a "true" golden image, hiding the power button, etc.).

Till next time!

Swap OS disk to storage account

Quick post to remember what actions have to be taken to swap your OS disk to a VHD disk in a storage account (yes, swapping from a managed disk (MD) to an unmanaged disk (UMD); I know I'm probably crazy, but for golden images it is great).

But imagine that you have a VM running on an MD OS disk and you need to swap that OS disk with a UMD… how can you do that?

# Get the VM 
$vm = Get-AzVM -ResourceGroupName myResourceGroup -Name myVM 

# Make sure the VM is stopped\deallocated
Stop-AzVM -ResourceGroupName myResourceGroup -Name $vm.Name -Force

# Set the VM configuration to point to the new unmanaged (VHD) disk
Set-AzVMOSDisk -VM $vm -Name "osDisk.vhd" -VhdUri "https://mystorageaccount.blob.core.windows.net/disks/osdisk.vhd"

# Update the VM with the new OS disk
Update-AzVM -ResourceGroupName myResourceGroup -VM $vm 

# Start the VM
Start-AzVM -Name $vm.Name -ResourceGroupName myResourceGroup

That's all! Your VM is running with a VHD disk 🙂

Log Analytics Best Practices

Hi! You probably know that I am a fan of Log Analytics, so in this post I want to share my thoughts on best practices for designing and setting up Log Analytics across several deployments. Let's roll!

  • Use as few workspaces as possible: at the beginning I was using several workspaces (one per subscription), but in practice it is more useful to have only one (the only reasons to keep separate workspaces would be cost and retention). And if you want to control cost, use the table-level retention feature (see the sketch after this list)!
  • For long-term retention, move data to a storage account 🙂
  • Use one workspace per region: depending on where you are working and the applicable data-residency laws, it may be advisable to have different workspaces across regions (EMEA, APAC, US…)
  • Use Azure Policy to install the monitoring agents 🙂 it is very useful
  • Define proper RBAC: depending on what information you are ingesting into Log Analytics, it will be important that only certain people have access to certain data.
  • Set up alerting for events: yes, you are collecting a huge amount of data, but… are you creating alerts and monitoring rules for those important services?
  • Control the cost: it is easy to set up Log Analytics, but it is just as easy to ingest verbose data for all those services, so your main goal should be to tweak the data sources and the amount of information that you're ingesting into Log Analytics
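
As a small sketch of the cost/retention tweak mentioned above, with hypothetical workspace and resource group names (table-level retention itself is configured per table through the API or the newer Az cmdlets):

# Cap the default workspace retention to keep retention cost under control
Set-AzOperationalInsightsWorkspace -ResourceGroupName 'MyRgName' -Name 'MyWorkspace' -RetentionInDays 90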

And finally, the last piece of advice… keep an eye on the Log Analytics roadmap. Staying up to date is my daily nightmare, so… be patient with this.

Till next time!

Extending your subnet in Azure with VXLAN

Continuing with networking in Azure, today I want to talk about how to extend subnets into Azure with VXLAN. For those who do not know VXLAN, it creates a virtual network that overlays the LAN used as the base. VXLAN uses layer 3 technology to extend the network, as the following diagram shows, where the same subnet exists on both sides (Azure and on-prem):

Example of subnet extension

So the key here is that you need a tunnel between Azure and on-prem in order to exchange traffic, but also take into account the following:

  • The IP addresses of on-premises hosts are configured as additional IP addresses on the network interface card (NIC) of the Azure VM (using the Azure orchestration system); see the sketch after this list.
  • The IP addresses of Azure hosts are configured as additional IP addresses on the NIC of the on-premises VM.
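
A minimal sketch of the first bullet, with hypothetical NIC and resource group names and an example on-premises address that must also live on the Azure side:

# Add an on-premises host's IP as an additional ipconfig on the extended-network VM's NIC
$nic = Get-AzNetworkInterface -Name 'extnet-vm-nic' -ResourceGroupName 'MyRgName'

Add-AzNetworkInterfaceIpConfig -Name 'onprem-host-1' -NetworkInterface $nic -PrivateIpAddress '10.0.1.20' -SubnetId $nic.IpConfigurations[0].Subnet.Id

# Push the change so the Azure virtual router knows this NIC owns the address
$nic | Set-AzNetworkInterface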

Whenever an on-premises host tries to reach an Azure VM it sends out an ARP request, and the on-premises Extended Network VM replies to it.

Whenever an Azure VM tries to reach an on-premises host, the Azure Virtual Router sends the traffic to the local IP address which is owned by the Azure VM running Extended Network code.

Take care and happy networking!

Forced Tunneling in Azure

I am not an expert on networking, but sometimes while working in Azure, I have to face some different configurations in order to fulfill customer requirements.

In this case, my customer wanted to redirect all the Internet traffic on the VMs from Azure to on-prem, because if you don't configure forced tunneling, Internet-bound traffic from your VMs in Azure always goes from the Azure network infrastructure directly out to the Internet, with no option to inspect or audit the traffic.

And you know… nowadays, unauthorized Internet access can potentially lead to security breaches…

So, to do that, I was thinking of the way I used to do this kind of thing: create a route table, redirect the 0.0.0.0/0 traffic to an NVA (sketched below) and done. But this case was not the same, because I needed to redirect all the traffic.
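
For context, this is the classic UDR approach I'm referring to, as a rough sketch with hypothetical names and an example NVA address:

# Route table that sends default (0.0.0.0/0) traffic to a network virtual appliance
$route = New-AzRouteConfig -Name 'default-to-nva' -AddressPrefix '0.0.0.0/0' -NextHopType VirtualAppliance -NextHopIpAddress '10.0.2.4'

New-AzRouteTable -Name 'rt-to-nva' -ResourceGroupName 'MyRgName' -Location 'westeurope' -Route $route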

So in this case what is needed is the Forced Tunneling:

Diagram showing forced tunneling.

With that configuration, any connection from the mid-tier and back-end subnets is redirected back to on-premises via the S2S VPN, and then the traffic can be inspected or even restricted.

The magic to achieve that scenario is done with PowerShell; there is no option to do it in the UI. You can check the full procedure at the following link: Configure forced tunneling for Site-to-Site connections – Azure VPN Gateway | Microsoft Docs

But pay special attention to the following commands:

$LocalGateway = Get-AzLocalNetworkGateway -Name "DefaultSiteHQ" -ResourceGroupName "ForcedTunneling"
$VirtualGateway = Get-AzVirtualNetworkGateway -Name "Gateway1" -ResourceGroupName "ForcedTunneling"
Set-AzVirtualNetworkGatewayDefaultSite -GatewayDefaultSite $LocalGateway -VirtualNetworkGateway $VirtualGateway

That is where the networking magic happens; the other steps in the article are more or less the same ones I usually follow in my implementations.

So take into account that and happy routing!

How to “backup” resources in Azure

Have you ever wondered how to back up a resource in Azure in order to rebuild it in case it is accidentally deleted? Or imagine that you want to recreate it in another subscription or for another customer.

With PowerShell we can do that; it is pretty straightforward and powerful. In my case, what I want to back up is an Azure Firewall: I have configured the firewall with multiple rules, for example, and I want to reuse them in other subscriptions.

To do that, what we have to do is get the firewall resource ID:

$AzFirewallId = (Get-AzFirewall -Name "MyFirewallName" -ResourceGroupName "MyRgName").id

Then configure a file to export the configuration:

$FileName = ".\MyResourceBackup.json"

and finally… export the JSON to the previous file:

Export-AzResourceGroup -ResourceGroupName "MyRgName" -Resource $AzFirewallId -SkipAllParameterization -Path $FileName

Note that I have used the "SkipAllParameterization" parameter; it allows us to recreate exactly what we have "backed up" in the JSON. If we want to change the resource names, we can omit that parameter. It is also important to note that the JSON contains all the configuration that we have applied to the service, so we are not losing anything.

And now… how do we restore it in Azure? Pretty simple as well:

New-AzResourceGroupDeployment -name "RestoreJob" -ResourceGroupName "MyRgName" -TemplateFile ".\MyResourceBackup.json"

That's all for one resource. But what if I want to "back up" all the resources contained in a resource group? You can do that as well! In this case, you just change some of the parameters of the export:

Export-AzResourceGroup -ResourceGroupName "MyRgName" -SkipAllParameterization -Path $FileName

This time you omit the resource parameter, so you will get a laaaarge JSON with all the parameters and configurations of everything in the resource group. You can then restore the resources with the same method explained earlier.

Also, a good idea would be to configure an Automation account to export all the configuration files and store them in a blob, in order to have a copy of all the resources in the subscription.
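
A rough sketch of what that runbook could look like, with hypothetical storage names and assuming the Automation account's identity has the required permissions:

# Export every resource group to JSON and upload the files to a blob container as a configuration "backup"
$storage   = Get-AzStorageAccount -ResourceGroupName 'MyRgName' -Name 'mybackupstorage'
$container = 'rg-exports'

foreach ($rg in Get-AzResourceGroup) {
    $file = Join-Path $env:TEMP "$($rg.ResourceGroupName).json"
    Export-AzResourceGroup -ResourceGroupName $rg.ResourceGroupName -SkipAllParameterization -Path $file -Force
    Set-AzStorageBlobContent -File $file -Container $container -Blob "$($rg.ResourceGroupName).json" -Context $storage.Context -Force | Out-Null
}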

Authentication methods in AzureAD: Security Questions

Security questions

I've been searching for some information about security questions in Azure AD, and one of the first conclusions I came to is that security questions are only available for users that use SSPR.

Taking this into consideration, you have to be aware of the following when using security questions:

  • Azure stores security questions privately and in a security-enhanced manner on a user object in the directory. Only users can answer the questions and only during registration. An administrator can’t read or change a user’s questions or answers.
  • Azure provides 35 predefined questions, all translated and localized based on the browser locale.
  • You can customize the questions by using the administrative interface; however, Azure displays them in the language entered. The maximum length is 200 characters.

Authentication methods in AzureAD

Have you been struggling with how and what to configure in Azure AD in order to set up your authentication methods? With the following table, you will be able to see the different authentication methods that are available and which services can use them.

Authentication method         | Services
Password                      | Azure AD MFA and SSPR
Security questions            | SSPR
Email address                 | SSPR
Microsoft Authenticator app   | Azure AD MFA and SSPR
OATH hardware token           | Azure AD MFA and SSPR
Text message                  | Azure AD MFA and SSPR
Voice call                    | Azure AD MFA and SSPR
App passwords                 | Azure AD MFA in certain cases