Hi! I will participate in the Spanish track of the GlobalAzure event, where I will talk about security, Azure AD and best practices. If you want to know more, check out the agenda: https://globalazure.es/#schedule

As you have probably seen in my previous posts, security keys are here to stay. They can be used as a standalone authentication method, not just as a second factor. There are multiple manufacturers that can help us on our passwordless journey; for this post, I used an IDMelon security key that also supports FIDO2.
The main difference from other security keys is that IDMelon uses its own authenticator app, which I will review during this post.
How it works
First of all, you need to download the app to your smartphone, which can be done in a few simple steps, then plug in the security key and install the software.
Once you have done this, you can pair your smartphone and the key:
And you’re ready to go:
Once that is done, you're ready to go to https://aka.ms/setupmfa and configure the security key for your user account. I am not going to cover the process because it has been covered in previous posts and is straightforward: PassWordless Authentication with Fido 2 Keys – Albandrod's Memory (albandrodsmemory.com) and PassWordless Authentication with Fido 2 Keys – Part 2 – Albandrod's Memory (albandrodsmemory.com).
At first glance, what caught my attention was the push notification: I was expecting the typical push notification from Microsoft Authenticator, but with the IDMelon security key it is provided by the application you installed on the smartphone earlier.
If you look deeper into the application, you can check the current plan you have with the security key and, most importantly, the activity log of the security key, which I think is great!
Now, if we look into the Azure AD sign-in logs, we are able to review the sign-in information regarding the security key:
To conclude, I found that IDMelon keys are a great product, because they not only provide a passwordless journey for users, but also a simple way to manage the activity of the security token and the sign-in process.
Thanks to IDMelon for providing this token to test out their solution.
Till next time!
This story began with a new dev project: we needed to be included in a DevOps project inside the customer's organization.
We were first invited to the Teams group to collaborate, upload documentation and so on, so our users were first created in the customer's AAD. Up to here, no problem.
But then the customer created the DevOps project and invited us to collaborate. We received the mail, but when we tried to access the project, we got the following error message:
We were pretty sure that we had access to the project; we double-checked it with the customer, and we did have access. We waited some time for the permission change to replicate, but nothing. So where was the problem?
The error page shows that we do not have access, so after digging into the problem for a while, I realized that when I tried to navigate to the organization URL, Edge showed an error message that could lead us somewhere:
So, the problem is that guests are not allowed to access the organization (TF909091). How can we solve that?
Pretty simple: we need to ask the customer to go to the organization settings and modify the security policies:
Also, check that the corresponding setting is allowed in the project policies:
After doing that, we were able to access the DevOps project and start working.
Problem and mystery solved!
Currently, we could say that legacy authentication is one of the most commonly compromised sign-in methods. Luckily for us, older protocols have been replaced by modern authentication services, which have the advantage that Modern Authentication supports MFA, while legacy authentication refers to all protocols that use Basic Authentication and only require one method of authentication.
So, for security reasons, it is important that we disable legacy authentication in our environments. Why? Because enabling MFA isn't effective if legacy protocols are not blocked. For example, protocols such as Authenticated SMTP, Exchange ActiveSync, IMAP4, POP3 and MAPI over HTTP are considered legacy authentication protocols.
How can we monitor the usage of legacy authentication in Azure AD?
Thanks to Log Analytics, Insights and workbooks, we are able to monitor the use of those protocols, for instance:
And check the non-interactive sign-ins (be careful with the AD Connect sync accounts):
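If you prefer raw queries over workbooks, something along these lines against the SigninLogs table surfaces legacy sign-ins. This is just a sketch: the exact ClientAppUsed values worth excluding may vary by tenant, so validate against your own logs first.

```kusto
// Legacy authentication sign-ins over the last 30 days
SigninLogs
| where TimeGenerated > ago(30d)
| where ClientAppUsed !in ("Browser", "Mobile Apps and Desktop clients")
| summarize SignInCount = count() by UserPrincipalName, ClientAppUsed, AppDisplayName
| order by SignInCount desc
```

Running this periodically gives you a shortlist of users and apps to remediate before you enforce any blocking policy.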
What can we do to avoid this?
The best way to block or report legacy authentication for users is to use Conditional Access policies (Does my organization need Azure AD Conditional Access? – Albandrod's Memory (albandrodsmemory.com) & Enabling zero trust security in your environment – Albandrod's Memory (albandrodsmemory.com)).
In short, create a CA policy that targets legacy authentication clients:
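For reference, such a policy can also be created through the Microsoft Graph conditional access API with a body roughly like the sketch below. Note the report-only state: start there to analyse the impact before enforcing.

```json
{
  "displayName": "Block legacy authentication",
  "state": "enabledForReportingButNotEnforced",
  "conditions": {
    "clientAppTypes": [ "exchangeActiveSync", "other" ],
    "applications": { "includeApplications": [ "All" ] },
    "users": { "includeUsers": [ "All" ] }
  },
  "grantControls": {
    "operator": "OR",
    "builtInControls": [ "block" ]
  }
}
```

The clientAppTypes condition is what scopes the policy to legacy clients; remember to exclude break-glass accounts before switching the state to enabled.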
My final advice
Legacy authentication must be disabled to protect our environments, but first, start small and analyse the impact in your organization.
Till next time!
This is the second part of my blog about reviewing FIDO2 keys from Feitian (PassWordless Authentication with Fido 2 Keys – Albandrod's Memory (albandrodsmemory.com)).
In this case, I am testing the K33 and K44 products.
The initial setup of the tenant is covered in my previous post, so I will skip the details of how to do it.
To configure the K33 key you will need to download the app “BioPass FIDO2 Manager” from the Windows Store:
And connect your K33 key via USB to the laptop (otherwise it won't be possible to configure it), then configure your preferred PIN, and finally configure your fingerprints. The process for the K44 is similar, but in this case I am using an iPad, and the app to download is "iePassManager".
Once the two keys are configured, you're ready to set them up in Azure AD MFA (https://aka.ms/setupmfa).
K33 process
I have explained how to initially configure the K33 key, but I strongly recommend following the steps in K33_Microsoft_Services_Guide.pdf (ftsafe.com) to pair the key with your laptop.
Once the key has been paired, the process to configure it is simple. The only thing you must take into account is that even though it is a Bluetooth key, you must configure it as a USB key (but remember, it must be paired with the device first).
Authentication with K33
K44 Registration Process
Once again, you need to set up a PIN for the key. In my case this was done on the iPad, and the registration process is as easy as the video shows:
The sign-in process is very similar to what we've seen before, so I will not cover it again, but as you can observe, the registration and use of FIDO2 keys is pretty simple.
In conclusion, MFA keys, and particularly FIDO2 keys from Feitian, are great!! But here is something you must consider when implementing FIDO2 keys in your environment:
There's no way to enforce a PIN policy in Azure AD: every user can set up their own PIN for their key. There is no centralized way to manage PINs; Windows Hello for Business blocks simple PIN codes by default, but the bad news is that if you add the key directly to your Azure AD account, these settings are overridden ☹
Feitian offers multiple options, so you're sure to find a key that works for you. Among the available choices are USB-A, USB-C, NFC and Bluetooth connectivity, plus PIN and biometric verification.
Biometrics require app installation: you need to download the manufacturer's application that enables fingerprint scanning, which is additional software you must consider installing.
Again, I want to thank Feitian for providing the security keys to test out the use cases.
Till next time!
Have you ever tried to create an Azure AD application to give SSO access to an on-prem application? I had to do it with an SAP application. The process is straightforward, but what about giving permissions to end users?
You can give nominal permissions to each user who needs access to the app, or even use a group, but you must be aware of the group limitations:
Group-based assignment is supported only for security groups. Nested group memberships are not supported for group-based assignment to applications at this time.
https://docs.microsoft.com/en-us/azure/active-directory/users-groups-roles/groups-saasapps
My customer had a complex role-based user permission model, so it was impossible for them to use AD/AAD groups. The workaround is not being granular with user permissions: it is to grant everyone access to the app registration. To do this, we must set the "User assignment required" option to No in the "Properties" blade on the enterprise application side, which allows all logged-in users to have access to the service.
By doing this, we rely on the permissions given to the app to control access.
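The same switch can be flipped programmatically: in Microsoft Graph it corresponds to the appRoleAssignmentRequired property on the service principal of the enterprise application. A sketch of the request (the object ID is a placeholder):

```http
PATCH https://graph.microsoft.com/v1.0/servicePrincipals/{servicePrincipal-object-id}
Content-Type: application/json

{ "appRoleAssignmentRequired": false }
```

This is handy if you need to apply the setting across several applications or keep it under source control.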
Easy solution, problem solved, till next time!
Nobody can doubt that 2021 has been the year of cloud adoption (due to COVID, of course), mostly because most of us needed to work from home. We can say that business has changed and "probably" will never go back to what it was.
Remote work will continue to grow, so in 2022 we will need to protect our assets much better. With that in mind, here are my predictions/concerns for the next year:
For now, I think that's all. Stay tuned to the blog and happy new year!
You're probably asking yourself: what's a jump host? In simple words, it is a virtual host which is not the one you use daily to read e-mail, browse the web or install software, but one used to perform administrative tasks for one or multiple IT infrastructures.
These are some of the recommendations that I follow when I need to deploy a jump host in Azure. The first two are the most important; you have to be sure not to do either of them.
And some other recommendations…
That's all. As always, these are my recommendations; you probably have different ones.
The following are recommendations and thoughts that I extracted from working with several customers. Maybe you will find them obvious, but for other people they could be useful. So, let's begin:
In the identity plane, we could say that there exist 2 categories:
I don't want to go into how to resist or contain attacks, because I have probably covered some of these topics in other blog entries, but for me there is another category: understanding human nature.
This is nothing more than understanding that almost every rule we impose on end users results in a degradation of security. Why? Because we force users to use long passwords with special characters, and in the end users tend to reuse passwords, which makes it easier for malicious actors to guess or crack them.
So, in this post I will summarize some of my experiences as anti-patterns and recommendations:
My tip: use a minimum length requirement of 8 characters, but ban common passwords with Azure AD Password Protection.
My tip for the two previous points: Azure AD Password Protection + Conditional Access based on user identity.
Tip: Look at my first tip 😊
Enabling MFA prevents up to 99.9% of identity attacks, and if we add other controls such as user location, even better.
PRO TIP: Use Conditional access with FIDO2 security key (PassWordless Authentication with Fido 2 Keys – Albandrod’s Memory (albandrodsmemory.com))
EndUser TIP: Consider turning on two-step verification everywhere you can
You will probably have different ones based on your experience, but these are my recommendations. Till next time and stay safe!
SSH File Transfer Protocol (SFTP) is a very common protocol used by many customers for secure file transfer over a secure shell. Microsoft did not have a fully managed SFTP service in Azure, but now it is possible with Azure Blob Storage.
So, you will be able to use an SFTP client to connect to that storage account, manage the objects inside, and even specify permissions for each user.
But before beginning, you will need to register the SFTP feature in your subscription. To do that, type the following:
# Set the Azure context for the desired subscription
az account set --subscription "xxxx-xxxx-xxxx-xxxx"
# Check if the SFTP feature is registered first
az feature show --namespace Microsoft.Storage --name AllowSFTP
# Register the SFTP feature on your subscription
az feature register --namespace Microsoft.Storage --name AllowSFTP
Also, you can check that information in Preview Features option in the Azure Portal:
Once you have that, you will need to enable the hierarchical namespace in the storage account; note that you can't enable it on an existing storage account…
BEFORE
AFTER
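Since the hierarchical namespace can't be enabled afterwards, the account has to be created with it from the start. With the Azure CLI that looks roughly like this (the account and resource group names are placeholders, and the --enable-sftp flag may require a recent CLI version, so treat this as a sketch):

```shell
# Create a StorageV2 account with hierarchical namespace and SFTP enabled
az storage account create \
  --name mysftpstorage \
  --resource-group my-rg \
  --location northeurope \
  --sku Standard_LRS \
  --kind StorageV2 \
  --enable-hierarchical-namespace true \
  --enable-sftp true
```

If your CLI version doesn't have the SFTP flag yet, create the account with the hierarchical namespace only and enable SFTP via the template below.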
At the time of writing, I couldn't create the SFTP service through the Azure Portal or even with a template when selecting West Europe as the destination; I'm sure it will be supported in the future.
Now we can deploy the ARM template to the RG previously created in Azure. But before doing that, you have to decide whether your user will connect with a password or an SSH key. In my case, I decided to implement it with an ARM template with an SSH key, but first you need to generate an SSH key pair:
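Generating the key pair can be done with ssh-keygen; the file name sftpuserkey below is just an example, and I skip the passphrase for brevity.

```shell
# Generate a 4096-bit RSA key pair with no passphrase for the SFTP local user
ssh-keygen -t rsa -b 4096 -f ./sftpuserkey -N "" -C "sftp-local-user"

# Print the public key; this value goes into the template's publicKey parameter
cat ./sftpuserkey.pub
```

Keep the private key (sftpuserkey) safe; only the .pub content is pasted into the template.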
Next, I provide the two ARM templates for both types of implementation.
Template FOR PASSWORD Implementation:
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"storageAccountType": {
"type": "string",
"defaultValue": "Standard_LRS",
"allowedValues": ["Standard_LRS", "Standard_ZRS"],
"metadata": { "description": "Storage Account type" }
},
"location": {
"type": "string",
"defaultValue": "northeurope",
"allowedValues": ["westeurope", "northcentralus", "eastus2", "eastus2euap", "centralus", "canadaeast", "canadacentral", "northeurope", "australiaeast", "switzerlandnorth", "germanywestcentral", "eastasia", "francecentral"],
"metadata": { "description": "Region" }
},
"storageAccountName": {
"type": "string",
"metadata": { "description": "Storage Account Name" }
},
"userName": {
"type": "string",
"metadata": { "description": "Username of primary user" }
},
"homeDirectory": {
"type": "string",
"metadata": { "description": "Home directory of primary user. Should be a container." }
}
},
"resources": [
{
"type": "Microsoft.Storage/storageAccounts",
"apiVersion": "2021-02-01",
"name": "[parameters('storageAccountName')]",
"location": "[parameters('location')]",
"sku": {
"name": "[parameters('storageAccountType')]"
},
"kind": "StorageV2",
"properties": {
"isHnsEnabled": true,
"isSftpEnabled": true
},
"resources": [
{
"type": "blobServices/containers",
"apiVersion": "2021-02-01",
"name": "[concat('default/', parameters('homeDirectory'))]",
"dependsOn": ["[parameters('storageAccountName')]"],
"properties": {
"publicAccess": "None"
}
},
{
"type": "localUsers",
"apiVersion": "2021-02-01",
"name": "[parameters('userName')]",
"properties": {
"permissionScopes": [
{
"permissions": "rcwdl",
"service": "blob",
"resourceName": "[parameters('homeDirectory')]"
}
],
"homeDirectory": "[parameters('homeDirectory')]",
"hasSharedKey": false
},
"dependsOn": ["[parameters('storageAccountName')]"]
}
]
}
],
"outputs": {
"defaultContainer": {
"type": "string",
"value": "[parameters('homeDirectory')]"
},
"user": {
"type": "object",
"value": "[reference(resourceId('Microsoft.Storage/storageAccounts/localUsers', parameters('storageAccountName'), parameters('userName')))]"
}
}
}
Template for SSH implementation:
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"storageAccountType": {
"type": "string",
"defaultValue": "Standard_LRS",
"allowedValues": ["Standard_LRS", "Standard_ZRS"],
"metadata": { "description": "Storage Account type" }
},
"location": {
"type": "string",
"defaultValue": "northeurope",
"allowedValues": ["westeurope", "northcentralus", "eastus2", "eastus2euap", "centralus", "canadaeast", "canadacentral", "northeurope", "australiaeast", "switzerlandnorth", "germanywestcentral", "eastasia", "francecentral"],
"metadata": { "description": "Region" }
},
"storageAccountName": {
"type": "string",
"metadata": { "description": "Storage Account Name" }
},
"userName": {
"type": "string",
"metadata": { "description": "Username of primary user" }
},
"homeDirectory": {
"type": "string",
"metadata": { "description": "Home directory of primary user. Should be a container." }
},
"publicKey": {
"type": "string",
"metadata": { "description": "SSH Public Key for primary user." }
}
},
"resources": [
{
"type": "Microsoft.Storage/storageAccounts",
"apiVersion": "2019-06-01",
"name": "[parameters('storageAccountName')]",
"location": "[parameters('location')]",
"sku": {
"name": "[parameters('storageAccountType')]"
},
"kind": "StorageV2",
"properties": {
"isHnsEnabled": true,
"isLocalUserEnabled": true,
"isSftpEnabled": true
},
"resources": [
{
"type": "blobServices/containers",
"apiVersion": "2019-06-01",
"name": "[concat('default/', parameters('homeDirectory'))]",
"dependsOn": ["[parameters('storageAccountName')]"],
"properties": {
"publicAccess": "None"
}
},
{
"type": "localUsers",
"apiVersion": "2019-06-01",
"name": "[parameters('userName')]",
"properties": {
"permissionScopes": [
{
"permissions": "rcwdl",
"service": "blob",
"resourceName": "[parameters('homeDirectory')]"
}
],
"homeDirectory": "[parameters('homeDirectory')]",
"sshAuthorizedKeys": [
{
"description": "localuser public key",
"key": "[parameters('publicKey')]"
}
],
"hasSharedKey": false
},
"dependsOn": ["[parameters('storageAccountName')]"]
}
]
}
],
"outputs": {
"defaultContainer": {
"type": "string",
"value": "[parameters('homeDirectory')]"
},
"user": {
"type": "object",
"value": "[reference(resourceId('Microsoft.Storage/storageAccounts/localUsers', parameters('storageAccountName'), parameters('userName')))]"
},
"keys": {
"type": "object",
"value": "[listKeys(resourceId('Microsoft.Storage/storageAccounts/localUsers', parameters('storageAccountName'), parameters('userName')), '2019-06-01')]"
}
}
}
Once you have deployed the template, you can go to the portal to configure the user permissions:
Remember to keep the password; without it you won't be able to connect to the SFTP endpoint.
And now you can connect to the SFTP endpoint via PowerShell or any other preferred tool.
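With the standard OpenSSH client, the connection looks roughly like this. All names are placeholders from my earlier examples, and note the username format: since the template sets a home directory for the local user, the short account.user form should work; otherwise the full account.container.user form is needed.

```shell
# Connect using the private key generated earlier (home directory configured)
sftp -i ./sftpuserkey mysftpstorage.sftpuser@mysftpstorage.blob.core.windows.net

# Or, with the full user name including the container
sftp -i ./sftpuserkey mysftpstorage.mycontainer.sftpuser@mysftpstorage.blob.core.windows.net
```

For password-based users, drop the -i flag and enter the password generated in the portal when prompted.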
And play with some of the files:
We can check the blob container itself to review the information about the recent uploads:
As you have seen, you're now able to deploy SFTP for Azure Blob Storage without worrying about container solutions or other weird experiments.
Till next time!