Microsoft released multiple security updates on the last Patch Tuesday. One of them fixes a high-risk vulnerability (CVE-2021-38647), also known as OMIGOD. This vulnerability can be exploited remotely, so exploitation is expected soon.
This flaw doesn’t directly affect Windows at all, because it’s a bug in Microsoft’s open source Open Management Infrastructure (OMI) tool, which is designed for Linux in general, and for Azure-hosted Linux servers in particular. However, a lot of resources in Azure do use it.
A brief overview
Simplified, OMI is Microsoft’s Linux-based answer to WMI, which sysadmins use to manage their Windows networks.
Like WMI, the OMI code runs as a privileged process on your servers so that sysadmins, and system administration software, can query and control what’s going on, such as enumerating processes, kicking off utility programs, and checking up on system configuration settings.
Unfortunately, cybercriminals love WMI/OMI just as much as we sysadmins do.
Sadly, OMIGOD is an OMI bug that, in theory, offers criminals the same sort of distributed power over your Linux servers…
Today I had to renew my Azure Solutions Expert certification, and this was the first time I had to do so. From your certification profile you can take an online exam with just 26 questions covering the numerous things that have changed in the past year.
I have to say this is a nice way of renewing, and made me think and search and update my Azure knowledge.
Some people might have noticed it already: Microsoft has released a new Azure icon in its portal. By changing the icon, Microsoft wants to match the style of its Fluent Design System, making it more familiar to customers.
Recently a customer asked me how to save costs on their Azure SQL database without moving away from the DTU-based purchasing model. This customer knows exactly at what time their database is heavily utilized and when it’s idling, so with a script it’s easy to automate.
In this manual we are going to scale a SQL database down from S4 to S3.
STEP 1: In this first step we are going to add some modules to your Automation Account. Go to Modules, and click on Browse gallery.
From the Gallery search for az.accounts, click on it
Next make sure to Import the module
Now browse the Gallery again, this time search for az.sql and make sure to import this module as well.
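If you prefer scripting over clicking through the portal, the same module imports can be done with the Az.Automation cmdlets. A minimal sketch, assuming placeholder account and resource group names ("MyAutomationAccount", "MyResourceGroup"):

```powershell
# Sketch: import Az.Accounts and Az.Sql into an Automation Account
# straight from the PowerShell Gallery. Names are placeholders.
foreach ($Module in 'Az.Accounts', 'Az.Sql') {
    New-AzAutomationModule `
        -AutomationAccountName 'MyAutomationAccount' `
        -ResourceGroupName 'MyResourceGroup' `
        -Name $Module `
        -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/$Module"
}
```

Note that Az.Accounts must finish importing before Az.Sql, since Az.Sql depends on it.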
STEP 2: This next step is important. If you chose not to create a Run As account during the setup of your Automation Account, you will need to create and assign one now. Go to Run as Accounts, and click on Create Azure Run As Account.
Click on Create
STEP 3: Now we will need to add some variables to your automation account. These variables will need to be filled with information about your Azure SQL Database and Server. Create the following variables, and make sure that you fill them.
Servername (without .database.windows.net)
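The variables can also be created from PowerShell instead of the portal. A sketch with placeholder values; replace them with your own environment:

```powershell
# Sketch: create the Automation variables used by the runbook.
# All values shown are placeholders.
$Common = @{
    AutomationAccountName = 'MyAutomationAccount'
    ResourceGroupName     = 'MyResourceGroup'
}
New-AzAutomationVariable @Common -Name 'Resourcegroup' -Value 'MyResourceGroup' -Encrypted $false
New-AzAutomationVariable @Common -Name 'Servername'    -Value 'mysqlserver'     -Encrypted $false  # without .database.windows.net
New-AzAutomationVariable @Common -Name 'Database'      -Value 'MyDatabase'      -Encrypted $false
```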
STEP 4: Now go to runbooks, and create a new runbook!
Give your runbook a name, as type select PowerShell!
In the newly opened window, copy and paste the code below. Adjust the variables $Edition and $PricingTier to your needs.
$ResourceGroupName = Get-AutomationVariable -Name "Resourcegroup"
$ServerName = Get-AutomationVariable -Name "Servername"
$DatabaseName = Get-AutomationVariable -Name "Database"
$Edition = "Standard"
$PricingTier = "S4"

# Keep track of time
$StartDate = Get-Date

# Log in to Azure with Az (standard code)
Write-Verbose -Message 'Connecting to Azure'

# Name of the Azure Run As connection
$ConnectionName = 'AzureRunAsConnection'
try {
    # Get the connection properties
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName

    'Log in to Azure...'
    $null = Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint
}
catch {
    if (-not $ServicePrincipalConnection) {
        # You forgot to turn on 'Create Azure Run As account'
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else {
        # Something else went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}

# Get the database for testing and logging purposes
$MyAzureSqlDatabase = Get-AzSqlDatabase -ResourceGroupName $ResourceGroupName -ServerName $ServerName -DatabaseName $DatabaseName
if (-not $MyAzureSqlDatabase) {
    Write-Error "$($ServerName)\$($DatabaseName) not found in $($ResourceGroupName)"
    return
}

Write-Output "Current pricing tier of $($ServerName)\$($DatabaseName): $($MyAzureSqlDatabase.Edition) - $($MyAzureSqlDatabase.CurrentServiceObjectiveName)"

# Set the pricing tier of the database
# Check for incompatible actions
if ($MyAzureSqlDatabase.Edition -eq $Edition -and $MyAzureSqlDatabase.CurrentServiceObjectiveName -eq $PricingTier) {
    Write-Error "Cannot change pricing tier of $($ServerName)\$($DatabaseName) because the new pricing tier is equal to the current pricing tier"
}
else {
    Write-Output "Changing pricing tier to $($Edition) - $($PricingTier)"
    $null = Set-AzSqlDatabase -DatabaseName $DatabaseName -ServerName $ServerName -ResourceGroupName $ResourceGroupName -Edition $Edition -RequestedServiceObjectiveName $PricingTier
}

# Show the duration when finished
$Duration = New-TimeSpan -Start $StartDate -End (Get-Date)
Write-Output "Done in $([int]$Duration.TotalMinutes) minute(s) and $([int]$Duration.Seconds) second(s)"
Use the menu to save your runbook, and use the Test pane to review the output of your PowerShell script. When ready, publish your runbook!
STEP 5: The last step is to create a schedule. From your runbook, go to Schedules and click Add a schedule.
Create a new schedule based on your requirements/needs.
Click Create to finalize the process. Now go back to your SQL database. While the change is happening, you should see an update line like below showing that the pricing tier is being updated!
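Creating and linking the schedule can also be scripted. A sketch assuming hypothetical names for the account, resource group, runbook, and schedule:

```powershell
# Sketch: create a daily 19:00 schedule and link it to the runbook.
# All names are placeholders; adjust to your environment.
$Params = @{
    AutomationAccountName = 'MyAutomationAccount'
    ResourceGroupName     = 'MyResourceGroup'
}
New-AzAutomationSchedule @Params -Name 'ScaleDownEvening' `
    -StartTime (Get-Date '19:00').AddDays(1) -DayInterval 1
Register-AzAutomationScheduledRunbook @Params -RunbookName 'Set-SqlPricingTier' `
    -ScheduleName 'ScaleDownEvening'
```

In practice you would create two schedules with two runbooks (or parameters): one to scale up before the busy hours, and one to scale down afterwards.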
When users leave the company, you might want to retain their email for longer than the default 30 days. By enabling litigation hold you can retain mailboxes beyond those 30 days; before you disable a user, you can set the litigation hold to any duration you would like. But at some point you might need the mailbox to be re-enabled for some reason. In this manual I am going to explain how to do that.
STEP 1: Open a new PowerShell window and type the following command:
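The original command did not survive in this extract. As a sketch of the idea, enabling litigation hold before disabling a user, and later restoring a soft-deleted mailbox, could look like this (the mailbox address and the 7-year duration are example values):

```powershell
# Sketch: connect to Exchange Online and put a mailbox on litigation hold.
# 'jdoe@contoso.com' and the duration (in days) are example values.
Connect-ExchangeOnline
Set-Mailbox -Identity 'jdoe@contoso.com' -LitigationHoldEnabled $true -LitigationHoldDuration 2555

# Later, restore the soft-deleted mailbox when it is needed again
Undo-SoftDeletedMailbox -SoftDeletedObject 'jdoe@contoso.com' -Password (Read-Host -AsSecureString)
```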
How cool would it be to automate your daily SQL tasks using Azure Automation? Really cool, of course! So let’s start using Azure Automation! If you don’t have an automation account yet, create one by going to Automation Accounts.
Give your automation account a name, choose a subscription, resource group and location, and hit the Create button!
Microsoft has released its Data Loss Prevention tools for endpoint clients. Customers with Microsoft 365 subscriptions can now protect data on physical devices alongside online services and apps.
With this new feature, Microsoft 365 policies that have been configured for apps can be made active on computers as well. This is an extra service of Data Loss Prevention. It enables IT administrators to control what users may do with sensitive data, and what they may share. For example, IT administrators can block copying sensitive files to an external USB drive, or printing them.
With the modern workplace making its way into more and more businesses, you might want to verify whether your devices have been joined to both your local on-premises AD and Azure AD. Just one simple command is all you need to verify the status.
On the (hybrid) domain joined device open up a command prompt as administrator, and run the following command:
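The command itself appears to have been lost in this extract; the standard way to check the device registration state from a command prompt is:

```cmd
dsregcmd /status
```

In the output, look for AzureAdJoined and DomainJoined under "Device State": both showing YES indicates a hybrid Azure AD joined device.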
This should give you a result like the one below. The explanation for each value follows.
When you migrate to Azure SQL, you might think that Azure does all SQL maintenance, including the maintenance of your database… But the truth is, you will need to set up some maintenance for your databases yourself. Microsoft doesn’t know what is best for your application or database. With this manual you should be able to set up basic database maintenance on Azure SQL.
STEP 1: First we will need to install Azure AD Connect. Run the setup wizard and follow the steps; this is an easy process. After installation the configuration wizard starts, and this is where it gets interesting.
STEP 2: Let’s go through the wizard, first agree with the license terms and click Continue. Feel free to actually read the license terms 🙂
Microsoft has announced the limited preview of Azure Shared Disks. With this announcement it becomes possible to migrate clustered environments running Windows Server to Azure. This capability is designed to support SQL Server, Scale-Out File Server, RDS User Profile Disks and SAP ASCS/SCS servers running on Windows. Linux-based clustered file systems like GFS2 are supported as well.
The diagram above shows a 2-node cluster with a single shared disk. Just one node receives write access; the other node only receives read access. In case Azure Virtual Machine 1 goes down, write access is transferred to Azure Virtual Machine 2. This scenario can be extended to more than 2 machines, and multiple shared disks can be attached as well, making it ideal for running parallel jobs or other multi-machine tasks.
Azure Shared Disks are only available on Premium SSD disks, and only for sizes greater than P15 (256 GiB). Microsoft has announced that Azure Ultra Disk support will be released soon. The number of nodes that can attach to a disk needs to be preset before mounting the disk to any node. Each disk type has its own limit for this number; the IOPS and bandwidth limits are not affected by it. I would recommend setting this value as high as possible when deploying, because if a shared disk later needs resizing to expand the number of nodes, it is required to unmount the disk from all nodes first.
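As a sketch, creating a shared Premium SSD with the Az PowerShell module might look like this; the -MaxSharesCount parameter presets the number of nodes, and the names, region and size are placeholder values:

```powershell
# Sketch: create a P30 (1 TiB) Premium SSD that up to 2 nodes can attach.
# Resource group, disk name, and location are placeholders.
$DiskConfig = New-AzDiskConfig `
    -Location 'westeurope' `
    -DiskSizeGB 1024 `
    -SkuName 'Premium_LRS' `
    -CreateOption 'Empty' `
    -MaxSharesCount 2
New-AzDisk -ResourceGroupName 'MyResourceGroup' -DiskName 'MySharedDisk' -Disk $DiskConfig
```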
Today I had the honor of giving another Ethical Hacking workshop together with Erik Loef. It is always good to share your knowledge, and help other people with their work, now and in the future. I hope that these students will embrace what they have learned, and that they will apply this newly obtained knowledge at their (future) employers.
When you want to migrate an older environment to Office 365 and OneDrive, you might miss the OneDrive GPO settings. Unfortunately, Microsoft hasn’t released the ADMX files as a download. You will need to grab them manually from a recent Windows 10 machine, and import them in the right location.
Since I like to simplify things, I thought it might be convenient to create a prepared ADMX ZIP file with all necessary files, ready for extraction. So here is a link to download OneDrive ADMX files. Just simply extract the proper folders to the following location:
Local Domain Controller store: C:\Windows\PolicyDefinitions\
Central Active Directory store: \\<your domain>\sysvol\<your domain>\Policies\PolicyDefinitions\
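If you’d rather grab the files straight from a Windows 10 machine yourself, a sketch of the copy (the OneDrive client ships its ADMX/ADML files in a versioned folder, so the exact path varies per machine):

```powershell
# Sketch: copy OneDrive.admx/.adml from the locally installed OneDrive client
# into the local PolicyDefinitions store. Run elevated; paths may vary.
$OneDriveAdm = Get-ChildItem "$env:LOCALAPPDATA\Microsoft\OneDrive\*\adm" | Select-Object -First 1
Copy-Item "$($OneDriveAdm.FullName)\OneDrive.admx" 'C:\Windows\PolicyDefinitions\'
Copy-Item "$($OneDriveAdm.FullName)\OneDrive.adml" 'C:\Windows\PolicyDefinitions\en-US\'
```

For the central store, copy the same files to the sysvol PolicyDefinitions path instead.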
Today I noticed a new checkbox in the Azure portal. Microsoft has released IPv6 for Azure VNets in public preview. Virtual machines will be equipped with dual-stack IP connectivity, meaning both IPv4 and IPv6 will be available. With IPv4 addresses running out, IPv6 is becoming mandatory for everybody.
From the Azure portal you can now add an IPv6 address space to the address scope at the VNet level.
The following diagram shows how IPv6 works as a dual stack alongside IPv4.
If you have a large on-premises environment, you might want to automate the assignment of Office 365 licenses by using (dynamic) security groups in Azure AD. With this simple manual you should be able to set up automatic license assignment based on a security group.
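As an illustration of the dynamic part, a dynamic membership rule for such a group could look like the line below (the department value is an example; any user attribute supported by Azure AD dynamic groups can be used):

```
(user.accountEnabled -eq true) -and (user.department -eq "Sales")
```

Every enabled user in that department is then added to the group automatically, and the licenses assigned to the group follow.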
By default, everyone may create a new team in Microsoft Teams. As an organization admin you might want to control this, or relax it at some point. With this manual you should be able to lock down team creation to users that are members of an Azure AD security group.
STEP 1: First we will need to install the preview version of the Azure Active Directory PowerShell module for Graph. Open a PowerShell window with administrator privileges and run the following 2 commands:
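The two commands did not survive in this extract; based on the description, they would be along these lines (installing and connecting with the preview module):

```powershell
# Sketch: install the preview AzureAD module and sign in.
Install-Module -Name AzureADPreview -Force
Connect-AzureAD
```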
The result of the script should give you the updated settings; on the last line you should see EnableGroupCreation. If you want to reverse this setting, simply change the following line to True and run the entire script again:
$AllowGroupCreation = "True"
If you want to use another security group, rerun the script with the new group name.
So you want to clean up unused (shared) mailboxes in your Exchange (Online) environment. How do you find out which mailboxes have been inactive for a long time? The answer is simple once again: with a cool PowerShell script.
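The script itself is missing from this extract, but a minimal sketch of the idea, using the LastLogonTime mailbox statistic (the 180-day threshold is an example value):

```powershell
# Sketch: list mailboxes whose last logon is more than 180 days ago, or never.
$Threshold = (Get-Date).AddDays(-180)
Get-Mailbox -ResultSize Unlimited |
    Get-MailboxStatistics |
    Where-Object { -not $_.LastLogonTime -or $_.LastLogonTime -lt $Threshold } |
    Select-Object DisplayName, LastLogonTime
```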