How to “back up” resources in Azure

Have you ever wondered how to back up a resource in Azure so you can rebuild it if it is accidentally deleted? Or imagine that you want to recreate it in another subscription or for another customer.

With PowerShell this is pretty straightforward and powerful. In my case, what I want to back up is an Azure Firewall: I have configured it with multiple rules, and I want to reuse them in other subscriptions.

To do that, the first thing we have to do is get the firewall's resource ID:

$AzFirewallId = (Get-AzFirewall -Name "MyFirewallName" -ResourceGroupName "MyRgName").id

Then define the file to export the configuration to:

$FileName = ".\MyResourceBackup.json"

and finally… export the resource as JSON to that file:

Export-AzResourceGroup -ResourceGroupName "MyRgName" -Resource $AzFirewallId -SkipAllParameterization -Path $FileName

Take note that I have used the “SkipAllParameterization” parameter: it lets us recreate exactly what we have “backed up” in the JSON. If we want to rename the resources on restore, we can omit that parameter. It is also important that the JSON contains every configuration we have applied to the service, so we are not losing anything.

And now… how to restore them in Azure? Pretty simple as well:

New-AzResourceGroupDeployment -name "RestoreJob" -ResourceGroupName "MyRgName" -TemplateFile ".\MyResourceBackup.json"

That’s all for one resource. And what if I want to “back up” all the resources contained in a resource group? You can do that as well! But in this case, you must change some of the parameters when exporting the file:

Export-AzResourceGroup -ResourceGroupName "MyRgName" -SkipAllParameterization -Path $FileName

You no longer point at an individual resource, so you will get a very large JSON with all the parameters and configurations. You can then restore the resources with the same method explained earlier.

Also, a good next step would be to configure an Automation account to export all the configuration files and store them in a blob container, so you have a copy of every resource in the subscription.
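That idea can be sketched as a small runbook script. This is only an illustration, assuming a storage account and container that you have created beforehand (the names "MyBackupRg", "mybackupsa" and "templates" are placeholders):

```powershell
# Sketch: export every resource group's ARM template and upload it to blob storage.
# Assumes an existing storage account and container; adjust names and run it in an
# Automation Account runbook with an identity that can read the subscription.
$ctx = (Get-AzStorageAccount -ResourceGroupName "MyBackupRg" -Name "mybackupsa").Context
foreach ($rg in Get-AzResourceGroup) {
    # One JSON template per resource group, named after the group
    $file = Join-Path $env:TEMP ("{0}.json" -f $rg.ResourceGroupName)
    Export-AzResourceGroup -ResourceGroupName $rg.ResourceGroupName -SkipAllParameterization -Path $file -Force
    Set-AzStorageBlobContent -File $file -Container "templates" -Blob (Split-Path $file -Leaf) -Context $ctx -Force
}
```

Scheduled daily, this gives you a rolling set of templates that you can redeploy at any time with New-AzResourceGroupDeployment.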


Conditional Access extension for Chrome

If you’re implementing conditional access in your company and you’re struggling with Windows 10 devices and Chrome support, you will probably want to read this Docs page: https://docs.microsoft.com/es-es/azure/active-directory/conditional-access/concept-conditional-access-conditions#chrome-support

But in this post, I want to talk about something related. In one of my projects, I have a CA policy that requires one of the following selected controls: Require MFA or Require Hybrid Azure AD joined device.

My device was hybrid joined, so I was fulfilling one of the requirements: when I accessed with IE or Edge, the device info was passed properly and MFA was bypassed for hybrid AAD machines.

But with Chrome, even with the Windows 10 Accounts extension pushed via GPO, I could see in the Azure AD sign-in logs that the device info was blank except for Browser and OS, so the AAD join state was not passed and MFA was triggered. It was very weird and was causing me some problems…

Finally, after hours of troubleshooting, I figured out what was wrong. When the extension is installed automatically, some cookies are not cleared, and Chrome then keeps trying the old sign-in flow. What you need to do is go to chrome://settings/content/all and delete the cookies for login.microsoftonline.com.

After doing that, everything worked perfectly. Keep this in mind!

What really is PIM?

Currently, in every project I’m involved in, I try to apply security best practices, including the use of PIM. Privileged Identity Management is a service available in Azure AD as part of Azure AD Premium P2. It is used for admin-related tasks so that no employee has standing access within the company, reducing the attack surface.

PIM makes it possible to give a user the privilege to elevate his or her access rights for a preset amount of time to a higher role such as User Administrator or SharePoint Administrator.

PIM covers a huge number of roles across Office 365 and Azure resources: a user can be a reader by default and elevate to owner of a resource (group) for a specific amount of time (which is great!).

Enabling a PIM role is done by going to the Azure portal and selecting the role you want to elevate to. You need to do this for every role separately.

For example, imagine members who need to elevate their account daily to be SharePoint and User administrators: they have to activate both roles every day. After enabling them, they need to sign out and sign in again to make sure the roles are activated.
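For completeness, that daily activation can also be scripted. The sketch below assumes the AzureADPreview module and its PIM cmdlets; the tenant, role-definition and user object IDs are placeholders you must look up in your own tenant:

```powershell
# Sketch: self-activate an Azure AD role through PIM from PowerShell.
# Assumes the AzureADPreview module; the IDs below are placeholders
# (find them with Get-AzureADMSPrivilegedRoleDefinition and Get-AzureADUser).
Connect-AzureAD

# Activation window: now until 8 hours from now
$schedule = New-Object Microsoft.Open.MSGraph.Model.AzureADMSPrivilegedSchedule
$schedule.Type = "Once"
$schedule.StartDateTime = (Get-Date).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss.fffZ")
$schedule.EndDateTime = (Get-Date).AddHours(8).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss.fffZ")

Open-AzureADMSPrivilegedRoleAssignmentRequest -ProviderId 'aadRoles' `
    -ResourceId '<tenant id>' -RoleDefinitionId '<role definition id>' `
    -SubjectId '<your user object id>' -Type 'UserAdd' -AssignmentState 'Active' `
    -Schedule $schedule -Reason "Daily elevation"
```

Each role still needs its own request, but at least it is one script run instead of several portal clicks.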

No more giving a role to a user and forgetting which role we gave them…

 

Discover who invited guest users with Log Analytics

If you read my posts, you probably know that I am a big fan of Log Analytics. In this post we will examine the Azure AD logs to discover who invited a specific guest account, because sometimes this can be quite a challenging question to answer…

So first of all, we need to forward the audit logs from AAD to a Log Analytics workspace. Once we have done this, we can execute the following query:

AuditLogs
| where OperationName == 'Invite external user' and Result == 'success'


This query shows some basic information about the invitations, but if you want to find all accepted invitations, you can execute the following:

AuditLogs
| where OperationName == 'Invite external user' and Result == 'success'
| extend InvitationId = tostring(AdditionalDetails[0].value)
| join (
    AuditLogs
    | where OperationName == 'Redeem external user invite'
    | parse kind=regex TargetResources[0].displayName with * "InvitationId: " InvitationId:string ","
) on $left.InvitationId == $right.InvitationId

Once we have done that, we can include this information in our governance plan, or even create Log Analytics alerts to make sure that everything happens under our security umbrella.

Nested Virtualization

Do you need to run a VM that is not supported in Azure, or that gives you problems when you upload it to Azure? Use nested virtualization instead!
First things to take into account:
  • You can do this on either Windows Server or Windows 10.
  • Only v3-series Azure VMs support nested virtualisation.
Deploy a VM; once you have logged into it, run the following commands in an elevated PowerShell session:
Install the Hyper-V role:
Install-WindowsFeature -Name Hyper-V -IncludeManagementTools -Restart
Create a new network switch:
New-VMSwitch -Name "InternalNAT" -SwitchType Internal
Get the interface index number and take note of it for the next step:
Get-NetAdapter
Set an IP address and create the NAT network (replace 13 with your interface index from Get-NetAdapter):
New-NetIPAddress -IPAddress 192.168.0.1 -PrefixLength 24 -InterfaceIndex 13
New-NetNat -Name "InternalNat" -InternalIPInterfaceAddressPrefix 192.168.0.0/24
Once this is complete, open Hyper-V Manager and create your “nested” VM; this is where you create your Linux VM (or whatever you want). Just download the ISO file from the relevant website and create a new VM as normal.
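If you prefer to stay in PowerShell, the nested guest can also be created with the Hyper-V module. A minimal sketch, where the VM name, sizes and ISO path are placeholders:

```powershell
# Sketch: create the "nested" guest with the Hyper-V PowerShell module
# instead of Hyper-V Manager. Names, sizes and ISO path are placeholders.
New-VM -Name "NestedGuest" -Generation 2 -MemoryStartupBytes 4GB `
    -NewVHDPath "C:\VMs\NestedGuest.vhdx" -NewVHDSizeBytes 64GB -SwitchName "InternalNAT"
# For a Linux guest on Generation 2, switch the secure boot template
Set-VMFirmware -VMName "NestedGuest" -SecureBootTemplate "MicrosoftUEFICertificateAuthority"
Add-VMDvdDrive -VMName "NestedGuest" -Path "C:\ISOs\linux.iso"
Start-VM -Name "NestedGuest"
```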
Now you will only be charged for the one VM in Azure, not for two virtual machines.
That’s all!

FIVE KEY QUESTIONS TO CONTROL CLOUD COSTS

Today I’m not talking about Azure specifically, but about the cloud in general. The case here is: you’re ready to modernize your infrastructure to truly benefit from the cloud and avoid sprawl, poor performance, complexity, and sky-high subscriptions. Here are the five key areas you need to plan out.

  1. What problem are you solving? The business case behind your infrastructure must be understood, with input from key stakeholders. How will the cloud help you deliver that value to your users or customers? How will you benchmark and improve on that goal? Common goals for cloud infrastructure are faster service delivery, lower operating costs, agile deployment, simplified user experience, modernization, and resilience.
  2. What base infrastructure components will be used across your stack? Your subscription model, identity management, network, storage, backup, and other key components should work with any given app or VM that might be attached to your subscription.
  3. Where can you automate? Automation and standard procedures are key to realizing the full benefit of the cloud. This helps minimize costs by automatically provisioning and decommissioning and implementing configurations without intervention. Infrastructure as Code will only continue to be more vital to cloud management.
  4. Who is responsible for each piece of the stack? As you embrace more and more components delivered entirely as a Service, you must understand which entity is responsible for their management and potential failure. Obviously the underlying hardware is not yours to command, and a hardware-level failure would fall upon the service provider. But issue-resolution can be a difficult gray area. Be prepared to work with the cloud provider and have processes in place for issue tracking and resolution as well as version control and backup/restore.
  5. How are you managing cloud governance? Depending on your management models, it may be much easier for users within or outside the IT department to provision their own resources. It was much harder to order, install, and configure a physical server than to simply click a few times to have a virtual one ready to go. You must be ready to protect and secure your data from inside and outside risks, while also reining in sprawl and maintaining compliance. Automation tools and code are a good place to start with governance and compliance.

What do I put in front of my web application to secure it?

This is a recurring question from my customers: “I want to secure my application, how can I do it?”

So, this quick post shows the different possibilities that I usually explain to them…

               | Azure Application Gateway        | Azure Traffic Manager               | Azure Front Door                          | Azure CDN
Scope          | Regional                         | Global                              | Global                                    | Global
Protocols      | HTTP                             | Any                                 | HTTP                                      | HTTP
VNet           | Can be created inside a VNet     | Outside                             | Outside                                   | Outside
Health Probe   | HTTP based / response 200 to 399 | HTTP based / custom response        | HTTP based / response 200                 | –
Caching        | –                                | –                                   | Up to 8 MB                                | Designed for caching
Type           | Web traffic load balancer        | DNS based load balancer             | Application delivery network              | CDN
Designed For   | Reverse proxy & WAF              | Scalable entry point                | Scalable & secure entry point             | Serving static content
Backend        | Internal or External             | External                            | External                                  | External
Routing Method | Host header or path based        | Priority, Weight, Performance, Geo… | Priority, Weight, Host header, Path based | –

Monitor Azure AD Smart Lockouts with Log Analytics

Following my last post about Azure AD smart lockout, have you ever wondered how to monitor those events? Of course, we can do it with Log Analytics!

HOW TO MONITOR SMART LOCKOUT?

Integrating the monitoring and alerting of smart lockout is very simple; this post explains how to do it:

  1. In the Azure portal, select Azure Active Directory > Diagnostic settings > Add diagnostic setting. You can also select Export Settings from the Audit Logs or Sign-ins page to get to the diagnostic settings configuration page.
  2. In the Diagnostic settings menu, select the Send to Log Analytics workspace check box, and then select Configure.
  3. Select the Log Analytics workspace you want to send the logs to, or create a new workspace in the provided dialog box.
  4. Select one or both of the following:
    • To send audit logs to the Log Analytics workspace, select the AuditLogs check box
    • To send sign-in logs to the Log Analytics workspace, select the SignInLogs check box
  5. Select Save to save the setting.

Now all we need to do is wait; in about 5-10 minutes, data will start to arrive in the workspace.

The logs we are looking for are in the AuditLogs and SigninLogs tables in the workspace.

SigninLogs
| limit 50

and click Run to list the last 50 sign-ins. As we are interested in the actual smart lockouts (ResultType 50053), we need to add filter criteria to our query, and we probably want to limit it to the last seven days only:

SigninLogs 
| where CreatedDateTime >= ago(7d)
| where ResultType == "50053" 

And because we like to get notified, we can also set an alert rule on this type of query. Click on New Alert Rule, which will bring you to the next blade.

Then we need to reuse an Action Group or create a new one; this is what gets triggered when the signal occurs. Be aware that depending on the frequency of the alert, we will be charged €€€.

Another thing to keep in mind is that there will be a lag between the event and the alert action; it’s not immediate.

That’s all, have fun!

AzureAD Smart Lockout

Have you ever wondered how Microsoft prevents password guessing or brute force attacks on Azure AD? Well, it is basically the same method as you would use on your on-premises user directory! It is just smarter!

Azure AD smart lockout is a feature applied to every sign-in processed by Azure AD, regardless of whether the user has a managed account or a synced account using password hash sync or pass-through authentication.
The smart part is the ability to distinguish valid users from attackers: it locks out the attackers while letting your users continue to access their accounts and be productive.


If you still run ADFS, there is also a feature named Extranet Smart Lockout, but it is not as smart as the one in Azure AD: https://support.microsoft.com/en-us/help/4096478/extranet-smart-lockout-feature-in-windows-server-2016

The default lockout setting kicks in after ten invalid login attempts, locking the account for one minute. The account locks again after each subsequent failed attempt, for one minute at first and for longer periods after further attempts. Smart lockout also tracks the last three bad password hashes to avoid incrementing the lockout counter for the same password: if the same bad password is entered multiple times, it will not cause another lockout.

Note: Tracking of repeated sign-in attempts is only available in the password hash sync scenario, as pass-through password validation happens against your on-premises AD domain controllers.

Smart lockout is always turned on for all Azure AD customers. If you want to modify the default behavior of 10 invalid attempts triggering a one-minute lockout, you need Azure AD P1 or P2 licenses for your users.

Moving a VM disk to a remote subscription

Hey folks! In today’s article I want to show you how to copy managed disks between Azure subscriptions using PowerShell.

The script is very easy to use. The only things you need to provide are the variables at the beginning, like subscription IDs, resource groups, etc. As a result, a .vhd file will be created in the destination container on the storage account.

Remember that the Azure account under which the script runs must have permissions in both subscriptions; otherwise the script will return an error.

Let’s rock!

#Variables
$SourceRG = ''
$SourceSubscrpitionId = ''
$SourceTenantId = ''
$ManagedDiskName = ''
$DestinationRG = ''
$DestinationSubscriptionId = ''
$DestinationTenantId = ''
$DestinationStorageAccount = ''
$containerName = ''
$vhdName = $ManagedDiskName + '.vhd'
#Source: grant temporary read access (SAS) to the managed disk
Select-AzSubscription -Subscription $SourceSubscrpitionId -Tenant $SourceTenantId
$grant = Grant-AzDiskAccess -ResourceGroupName $SourceRG -DiskName $ManagedDiskName -Access Read -DurationInSecond 10800
#Destination: make sure the storage account and container exist
Select-AzSubscription -Tenant $DestinationTenantId -Subscription $DestinationSubscriptionId
$storageAccount = Get-AzStorageAccount -StorageAccountName $DestinationStorageAccount -ResourceGroupName $DestinationRG -ErrorAction SilentlyContinue
if ($null -eq $storageAccount)
{
    $storageAccount = New-AzStorageAccount -StorageAccountName $DestinationStorageAccount -ResourceGroupName $DestinationRG -Location "West Europe" -SkuName "Standard_LRS"
}
$storageAccountKey = Get-AzStorageAccountKey -ResourceGroupName $DestinationRG -Name $DestinationStorageAccount
$storageContext = New-AzStorageContext -StorageAccountName $DestinationStorageAccount -StorageAccountKey $storageAccountKey.Value[0]
$container = Get-AzStorageContainer $containerName -Context $storageContext -ErrorAction Ignore
if ($null -eq $container)
{
    New-AzStorageContainer $containerName -Context $storageContext
}
#Copy the disk (via its SAS URI) to the destination blob
$CopyToBlob = Start-AzStorageBlobCopy -AbsoluteUri $grant.AccessSAS -DestContainer $containerName -DestBlob $vhdName -DestContext $storageContext
#Poll the copy state until it finishes
$State = $CopyToBlob | Get-AzStorageBlobCopyState
while ($State.Status -eq "Pending")
{
    Start-Sleep 30
    $State = $CopyToBlob | Get-AzStorageBlobCopyState
    $PercentCompleted = [Math]::Round((($State.BytesCopied / $State.TotalBytes) * 100))
    Write-Host "$PercentCompleted % completed for managed disk $ManagedDiskName"
}
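As an optional follow-up (a sketch that reuses the variables defined at the top of the script), you could revoke the temporary SAS on the source disk once the copy has finished and, if you need a managed disk again rather than a .vhd blob, import the blob into a new managed disk:

```powershell
# Follow-up sketch: revoke the SAS grant on the source disk and turn the
# copied .vhd back into a managed disk in the destination subscription.
Select-AzSubscription -Subscription $SourceSubscrpitionId -Tenant $SourceTenantId
Revoke-AzDiskAccess -ResourceGroupName $SourceRG -DiskName $ManagedDiskName

Select-AzSubscription -Subscription $DestinationSubscriptionId -Tenant $DestinationTenantId
$storageAccount = Get-AzStorageAccount -ResourceGroupName $DestinationRG -Name $DestinationStorageAccount
$vhdUri = $storageAccount.PrimaryEndpoints.Blob + "$containerName/$vhdName"
$diskConfig = New-AzDiskConfig -Location $storageAccount.Location -CreateOption Import `
    -SourceUri $vhdUri -StorageAccountId $storageAccount.Id
New-AzDisk -ResourceGroupName $DestinationRG -DiskName $ManagedDiskName -Disk $diskConfig
```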