I am still available for music performance through my company, but the IT side of the business has been closed as I am on a full-time project and cannot devote any outside time to IT consulting. Thanks to everyone for supporting me over the years. My blog will still remain current, so please check here often for the latest updates on Exchange, M365, Security, Compliance, and Windows!
Category: Active Directory
Error: Policy Is Missing when trying to load and run an AIP/MIP UL on-premises content scan
WORKAROUND UPDATE!
SEE AT END OF THIS POST!
There is a current BUG that has been filed with Microsoft relating to the AIP/MIP Scanner and running a Unified Labeling content scan on premises. The main issue is with the Security and Compliance Center and its replication of the Policies that you create for your Sensitivity Labels in your M365 Tenant.
Since these Policies will not replicate, your content scans will fail and you will see the following error within the Azure Portal under Azure Information Protection:

You will be able to verify that the Policies are present in the Security and Compliance Center under the Information Protection page and the Policies Page in Azure Information Protection:


NOTE: I had created my labels and label policies in Azure AIP and migrated them to Security and Compliance Center via the following LINK
What you will also notice is that if you create a policy in SCC, it will NOT replicate to Azure.
Next, I checked to see if the AIP Scanner service account has the policies applied to it as a member recipient of the policies. It needs to be, so that the account can apply labels on premises through the policy.
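You can sanity-check this from a Security & Compliance Center PowerShell session (connection commands appear later in this post). A sketch only; the location properties shown are assumptions and can vary by tenant:

```powershell
# Sketch: list each label policy and where it is published.
# Assumes an existing SCC PowerShell session. "All" in the location
# fields means the policy applies to every user, including the
# scanner service account.
Get-LabelPolicy | Select-Object Name, ExchangeLocation, ModernGroupLocation | Format-List
```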

Let’s continue troubleshooting…
The AIP Scanner account is a member of the defined policies in the Security & Compliance Center, but you are still having issues:
- Is the AIP Scanner service started?
- If the answer is no, start it
- From PowerShell run the following:
Get-AIPScannerStatus | fl

It says it is scanning, but you are not getting results AND you have that Error: Policy is missing statement in the Nodes Tab of AIP.
The next thing to verify is whether or not the policy is replicating from SCC to Azure. This is done through PowerShell by running the following:
Connect to SCC PowerShell
$UserCredential = Get-Credential
Write-Host "Connecting to your Security and Compliance Center PowerShell Console"
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.compliance.protection.outlook.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $Session -DisableNameChecking
Check the policy replication status
Get-LabelPolicy | Select-Object Name,DistributionStatus,WhenCreated,WhenChanged | fl

Distribution Status is Pending
Normal replication takes up to 24 hours for a change or policy addition. So, if your WhenChanged or WhenCreated values are more than 24 hours old, the policies are NOT replicating. You can further verify this by running the following:
Get-LabelPolicy -Identity "Name of your Policy" | fl DistributionResults,LastStatusUpdateTime

Replication Taking Longer Than Expected Error
What do I do next?
If you have this error, it would be best to log a support call with Microsoft and explain that you have the AIP UL Policy Replication Error. My sources say this is a known issue with the SCC and Azure that will be remediated by the end of October.
So, in the meantime, I guess we will wait!
WORKAROUND
After troubleshooting this issue with some of my Microsoft colleagues, I was able to get the scanner to start scanning properly without the error being listed in the Azure Portal. Here are the steps.
On the scanner node, right-click a file or folder and choose to protect it:

Next, within the AIP Application, choose Help and Feedback

Next, choose Reset Settings

Click Continue

Once completed, click Close, then exit the AIP application. This clears all the registry settings within the scanner node.
Now you will want to reset all the local files for the scanner
First, stop the scanner services for the scanner and network discovery
Stop-Service AIPScanner
Stop-Service AIPNetworkDiscovery
Next, navigate to the following folder for the local account that is used for AIP scanner. Example – C:\Users\AIPScanner\AppData\Local\Microsoft\MSIP
Rename or Delete the MIP folder in that MSIP directory.
(I renamed my folder to mip-old2)
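The local reset steps above can be combined into one elevated PowerShell session. A sketch, assuming the scanner runs under the local AIPScanner account from the example path:

```powershell
# Sketch of the full local reset (path and account come from the example above).
Stop-Service AIPScanner
Stop-Service AIPNetworkDiscovery

# Rename the mip folder so the scanner rebuilds its local state on restart
Rename-Item "C:\Users\AIPScanner\AppData\Local\Microsoft\MSIP\mip" "mip-old"

Start-Service AIPScanner
Start-Service AIPNetworkDiscovery
```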

Restart the services you stopped
Start-Service AIPScanner
Start-Service AIPNetworkDiscovery
You should now see the scanner as Running and Working within the Azure Portal. No more errors should be listed.

Thanks to Angel Marroquin at Microsoft for the assistance on this workaround!
THANKS FOR VIEWING!
KEEP THE COMMENTS FLOWING!
REFERENCES:
Migrate AIP Policies
AIP FAQ
Installation and Configuration of Azure Information Protection Unified Labels Scanner
With the release of Unified Labeling in Azure and M365, there is now a way to protect your data and label your data appropriately for confidentiality and encryption for your files shares and files on your on premises devices. The following shows how to install the latest AIP_UL client and configure it in Azure to apply those Unified Labeling Policies.
This is a detailed process and I had some issues myself with getting the process simplified. I will do my best here to make this as smooth as possible with as many reference documents as I can include. Always feel free to comment, as this data is ever changing and updating as Microsoft updates the offering.
Prerequisites
Please refer to this document for a full list of pre-requisites before deploying the scanner:
https://docs.microsoft.com/en-us/azure/information-protection/deploy-aip-scanner-prereqs
For the basics we have the following:
The prerequisites below are still required for successful AIP scanner installation.
- A Windows Server 2012 R2 or greater Server to run the service
- Minimum 4 CPU and 4GB RAM physical or virtual.
NOTE: More RAM is better. The scanner will allocate RAM equal to 2.5-3 times the size of all files being scanned in parallel. Thus, if you scan 40 files that are 20MB each at the same time, it should take about 2.5 x 20MB x 40 = 2GB of RAM. However, if you have one big 1GB file it can take 3GB of RAM just for that file.
- Internet connectivity necessary for Azure Information Protection
- A SQL Server 2012+ local or remote instance (Any version from Express or better is supported)
- Sysadmin role needed to install scanner service (the user running Install-AIPScanner, not the service account)
NOTE: If using SQL Server Express, the SQL Instance name is ServerName\SQLExpress.
NOTE: At this time, a different SQL instance is needed for each AIP Scanner node.
- Service account created in On Premises AD (I will call this account AIPScanner in this document).
- Service requires Log on locally right and Log on as a service right (the second will be given during scanner service install).
- Service account requires Read permissions to each repository for discovery and Read/Write permissions for classification/protection.
- AzInfoProtection_UL.exe is available on the Microsoft Download Center (The scanner bits are included with the AIP Client)
- The Azure AD Preview PowerShell module. From the machine you’re installing AIP Scanner on, run the following from an Administrator PowerShell:
Install-Module AzureADPreview
Configure the scanner in the Azure portal
Before you install the scanner, or upgrade it from an older general availability version, configure or verify your scanner settings in the Azure Information Protection area of the Azure portal.
To configure your scanner:
- Sign in to the Azure portal with one of the following roles:
- Compliance administrator
- Compliance data administrator
- Security administrator
- Global administrator
Then, navigate to the Azure Information Protection pane. For example, in the search box for resources, services, and docs, start typing Information and select Azure Information Protection.
- Create a scanner cluster. This cluster defines your scanner and is used to identify the scanner instance, such as during installation, upgrades, and other processes.
- Scan your network for risky repositories. Create a network scan job to scan a specified IP address or range, and provide a list of risky repositories that may contain sensitive content you’ll want to secure. Run your network scan job and then analyze any risky repositories found.
- Create a content scan job to define the repositories you want to scan.
Create a scanner cluster
- From the Scanner menu on the left, select Clusters.
- On the Azure Information Protection – Clusters pane, select Add.
- On the Add a new cluster pane, enter a meaningful name for the scanner, and an optional description. The cluster name is used to identify the scanner’s configurations and repositories. For example, you might enter Europe to identify the geographical locations of the data repositories you want to scan. You’ll use this name later on to identify where you want to install or upgrade your scanner.
- Select Save to save your changes.
Create a network scan job (public preview)
Starting in version 2.8.85.0, you can scan your network for risky repositories. Add one or more of the repositories found to a content scan job to scan them for sensitive content.
Note: The network discovery interface is currently in gradual deployment and will be available in all regions by September 15, 2020.
Network discovery prerequisites
Prerequisite | Description |
---|---|
Install the Network Discovery service | If you’ve recently upgraded your scanner, you may still need to install the Network Discovery service. Run the Install-MIPNetworkDiscovery cmdlet to enable network scan jobs. |
Azure Information Protection analytics | Make sure that you have Azure Information Protection analytics enabled. In the Azure portal, go to Azure Information Protection > Manage > Configure analytics (Preview). For more information, see Central reporting for Azure Information Protection (public preview). |
Creating a network scan job
- Log in to the Azure portal, and go to Azure Information Protection. Under the Scanner menu on the left, select Network scan jobs (Preview).
- On the Azure Information Protection – Network scan jobs pane, select Add.
- On the Add a new network scan job page, define the following settings:
Network scan job name: Enter a meaningful name for this job. This field is required.
Description: Enter a meaningful description.
Select the cluster: From the dropdown, select the cluster you want to use to scan the configured network locations.
Tip: When selecting a cluster, make sure that the nodes in the cluster you assign can access the configured IP ranges via SMB.
Configure IP ranges to discover: Click to define an IP address or range.
In the Choose IP ranges pane, enter an optional name, and then a start IP address and end IP address for your range.
Tip: To scan a specific IP address only, enter the identical IP address in both the Start IP and End IP fields.
Set schedule: Define how often you want this network scan job to run.
If you select Weekly, the Run network scan job on setting appears. Select the days of the week where you want the network scan job to run.
Set start time (UTC): Define the date and time that you want this network scan job to start running. If you’ve selected to run the job daily, weekly, or monthly, the job will run at the defined time, at the recurrence you’ve selected.
Note: Be careful when setting the date to any days at the end of the month. If you select 31, the network scan job will not run in any month that has 30 days or fewer.
- Select Save to save your changes.
Tip: If you want to run the same network scan using a different scanner, change the cluster defined in the network scan job. Return to the Network scan jobs pane, and select Assign to cluster to select a different cluster now, or Unassign cluster to make additional changes later.
Analyze risky repositories found
Repositories found, either by a network scan job, a content scan job, or by user access detected in log files, are aggregated and listed on the Scanner > Repositories pane.
If you’ve defined a network scan job and have set it to run at a specific date and time, wait until it’s finished running to check for results. You can also return here after running a content scan job to view updated data.
- Under the Scanner menu on the left, select Repositories.
The repositories found are shown as follows:
- The Repositories by status graph shows how many repositories are already configured for a content scan job, and how many are not.
- The Top 10 unmanaged repositories by access graph lists the top 10 repositories that are not currently assigned to a content scan job, as well as details about their access levels. Access levels can indicate how risky your repositories are.
- Do any of the following:
- Select Columns to change the table columns displayed.
- If your scanner has recently run a network scan, select Refresh to refresh the page with its results.
- Select one or more repositories listed in the table, and then select Assign Selected Items to assign them to a content scan job.
- Filter: The filter row shows any filtering criteria currently applied. Select any of the criteria shown to modify its settings, or select Add Filter to add new filtering criteria. Select Filter to apply your changes and refresh the table with the updated filter.
- In the top-right corner of the unmanaged repositories graph, click the Log Analytics icon to jump to Log Analytics data for these repositories.
Repositories with public access
Repositories where Public access is found to have read or read/write capabilities may have sensitive content that must be secured. If Public access is false, the repository is not accessible by the public at all.
Public access to a repository is only reported if you’ve set a weak account in the StandardDomainsUserAccount parameter of the Install-MIPNetworkDiscovery or Set-MIPNetworkDiscovery cmdlets.
- The accounts defined in these parameters are used to simulate the access of a weak user to the repository. If the weak user defined there can access the repository, this means that the repository can be accessed publicly.
- To ensure that public access is reported correctly, make sure that the user specified in these parameters is a member of the Domain Users group only.
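For reference, here is a hedged sketch of installing the Network Discovery service with such a weak account. The server, cluster, and account names are placeholders; check Get-Help Install-MIPNetworkDiscovery for your version’s exact parameters:

```powershell
# Sketch only; all names below are placeholders.
# StandardDomainsUserAccount should be a low-privileged account that is a
# member of Domain Users only, so that public access is reported correctly.
$shareAdmin = Get-Credential DOMAIN\AIPScanner   # account with admin access to the shares
$weakUser   = Get-Credential DOMAIN\WeakUser     # member of Domain Users only

Install-MIPNetworkDiscovery -SqlServerInstance "SQLSERVER\SQLEXPRESS" -Cluster "Europe" `
    -ShareAdminUserAccount $shareAdmin -StandardDomainsUserAccount $weakUser
```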
Create a content scan job
Deep dive into your content to scan specific repositories for sensitive content.
You may want to do this only after running a network scan job to analyze the repositories in your network, but can also define your repositories yourself.
- Under the Scanner menu on the left, select Content scan jobs.
- On the Azure Information Protection – Content scan jobs pane, select Add.
- For this initial configuration, configure the following settings, and then select Save but do not close the pane.
Content scan job settings
– Schedule: Keep the default of Manual
– Info types to be discovered: Change to Policy only
– Configure repositories: Do not configure at this time because the content scan job must first be saved.
Policy enforcement
– Enforce: Select Off
– Label files based on content: Keep the default of On
– Default label: Keep the default of Policy default
– Relabel files: Keep the default of Off
Configure file settings
– Preserve “Date modified”, “Last modified” and “Modified by”: Keep the default of On
– File types to scan: Keep the default file types for Exclude
– Default owner: Keep the default of Scanner Account
- Now that the content scan job is created and saved, you’re ready to return to the Configure repositories option to specify the data stores to be scanned. Specify UNC paths, and SharePoint Server URLs for SharePoint on-premises document libraries and folders.
Note: SharePoint Server 2019, SharePoint Server 2016, and SharePoint Server 2013 are supported for SharePoint. SharePoint Server 2010 is also supported when you have extended support for this version of SharePoint.
To add your first data store, while on the Add a new content scan job pane, select Configure repositories to open the Repositories pane:
- On the Repositories pane, select Add:
- On the Repository pane, specify the path for the data repository, and then select Save.
- For a network share, use \\Server\Folder.
- For a SharePoint library, use http://sharepoint.contoso.com/Shared%20Documents/Folder.
- For a local path, use C:\Folder.
- For a UNC path, use \\Server\Folder.
Note: Wildcards are not supported and WebDav locations are not supported.
- If you add a SharePoint path for Shared Documents:
- Specify Shared Documents in the path when you want to scan all documents and all folders from Shared Documents.
For example: http://sp2013/SharedDocuments
- Specify Documents in the path when you want to scan all documents and all folders from a subfolder under Shared Documents.
For example: http://sp2013/Documents/SalesReports
- Or, specify only the FQDN of your SharePoint, for example http://sp2013, to discover and scan all SharePoint sites and subsites under that URL. Grant the scanner Site Collection Auditor rights to enable this.
- For the remaining settings on this pane, do not change them for this initial configuration, but keep them as Content scan job default. The default setting means that the data repository inherits the settings from the content scan job.
Use the following syntax when adding SharePoint paths:
Root path: http://<SharePoint server name>
Scans all sites, including any site collections allowed for the scanner user. Requires additional permissions to automatically discover root content.
Specific SharePoint subsite or collection, one of the following:
– http://<SharePoint server name>/<subsite name>
– http://<SharePoint server name>/<site collection name>/<site name>
Requires additional permissions to automatically discover site collection content.
Specific SharePoint library, one of the following:
– http://<SharePoint server name>/<library name>
– http://<SharePoint server name>/.../<library name>
Specific SharePoint folder: http://<SharePoint server name>/.../<folder name>
- Repeat the previous steps to add as many repositories as needed.
- When you’re done, close both the Repositories and Content scan job panes.
Back on the Azure Information Protection – Content scan job pane, your content scan name is displayed, together with the SCHEDULE column showing Manual and the ENFORCE column blank.
You’re now ready to install the scanner with the content scanner job that you’ve created. Continue with Scanner Installation.
Scanner Installation
Now that we have verified all prerequisites and configured AIP in Azure, we can go through the basic scanner install.
- Log onto the server where you will install the AIP Scanner service using an account that is a local administrator of the server and has permission to write to the SQL Server master database. (more restrictive scenarios are documented in the official documentation)
- Run AzInfoProtection_UL.exe on the server and step through the client install (this also drops the AIP Scanner bits).
WARNING: This blog is based on the current version of the AIP Client. If you want to update to the Preview client, please install the GA first and then install the preview client and use Update-AIPScanner after installation.
- Next, open an Administrative PowerShell prompt.
- At the PowerShell prompt, type the following command and press Enter:
Install-AIPScanner -SqlServerInstance "name" -Profile "cluster name"
Create Cloud Service Account
If you are not using Azure AD Sync for your Service account, you will need to create a service account in the cloud tenant to use for AIP authentication. If you have synced your on premises service account, you can skip this task.
- Run the command below to connect to Azure AD.
Connect-AzureAD
- When prompted, provide tenant Global Admin credentials.
- To create an account in the cloud, you must first define a password profile object. Run the commands below to define this object.
$PasswordProfile = New-Object -TypeName Microsoft.Open.AzureAD.Model.PasswordProfile
$PasswordProfile.ForceChangePasswordNextLogin = $false
$Password = Read-Host -AsSecureString "Please enter password for cloud service account"
$Password = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto([System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($Password))
$PasswordProfile.Password = $Password
- When prompted, enter a password for the cloud service account.
- To create the account, run the commands below.
$Tenant = Read-Host "Please enter tenant name for UserPrincipalName (e.g. contoso.com)"
New-AzureADUser -AccountEnabled $True -DisplayName "AIP Scanner Cloud Service" -PasswordProfile $PasswordProfile -MailNickName "AIPScannerCloud" -UserPrincipalName "AIPScannerCloud@$Tenant"
- When prompted, enter the tenant name you want to use for the UserPrincipalName for the cloud service account (e.g. tenant.onmicrosoft.com).
Creating the Azure AD Application in Azure
Next, we will configure the App Registration for the Web App that is required to run the Set-AIPAuthentication command that will be used to get the authentication token. We will also assign the necessary Oauth2Permissions for the Web App to have delegated rights to the App.
- Run the commands below to create the Web App, associated Service Principal, and key password.
New-AzureADApplication -DisplayName AIPOnBehalfOf -ReplyUrls http://localhost
$WebApp = Get-AzureADApplication -Filter "DisplayName eq 'AIPOnBehalfOf'"
New-AzureADServicePrincipal -AppId $WebApp.AppId
$WebAppKey = New-Guid
$Date = Get-Date
New-AzureADApplicationPasswordCredential -ObjectId $WebApp.ObjectID -startDate $Date -endDate $Date.AddYears(1) -Value $WebAppKey.Guid -CustomKeyIdentifier "AIPClient"
- Next, we need to run some commands to build the RequiredResourceAccess object that is needed to automate delegation of permissions for the native application.
$AIPServicePrincipal = Get-AzureADServicePrincipal -All $true | ? {$_.DisplayName -eq 'AIPOnBehalfOf'}
$AIPPermissions = $AIPServicePrincipal | select -expand Oauth2Permissions
$Scope = New-Object -TypeName "Microsoft.Open.AzureAD.Model.ResourceAccess" -ArgumentList $AIPPermissions.Id,"Scope"
$Access = New-Object -TypeName "Microsoft.Open.AzureAD.Model.RequiredResourceAccess"
$Access.ResourceAppId = $WebApp.AppId
$Access.ResourceAccess = $Scope
- Now we can create the App and associated Service Principal using the commands below.
New-AzureADApplication -DisplayName AIPClient -ReplyURLs http://localhost -RequiredResourceAccess $Access -PublicClient $true
$NativeApp = Get-AzureADApplication -Filter "DisplayName eq 'AIPClient'"
New-AzureADServicePrincipal -AppId $NativeApp.AppId
Authenticating as the AIP Scanner Service
In this task, we will use the command created previously to authenticate the AIP Scanner to the AIP Service.
- Open PowerShell using Run as a different user with the on-premises scanner service account, which should have Run as Administrator rights.
- Run the commands in the following PowerShell session with the Run as Administrator option, which is required for the OnBehalfOf parameter.
- The first command creates a PSCredential object and stores the specified Windows user name and password in the $pscreds variable. When you run this command, you are prompted for the password for the user name that you specified.
- The second command acquires an access token that is combined with the application so that the token becomes valid for 1 year, 2 years, or never expires, according to your configuration of the registered app in Azure AD. The user name of scanner@contoso.com sets the user context to download labels and label policies from your labeling management center, such as the Office 365 Security & Compliance Center.
$pscreds = Get-Credential DOMAIN\scanner
Set-AIPAuthentication -AppId "Web App ID" -AppSecret "Password Generated from previous cmd" -DelegatedUser AIPScannerCloud@tenant.onmicrosoft.com -TenantId "Your M365 Tenant ID" -OnBehalfOf $pscreds
Successful OUTPUT:
Acquired application access token on behalf of DOMAIN\scanner
- Last Step is to Restart the AIP Scanner Service
Restart-Service AIPScanner
Look to these reference documents for further details:
Set-AIPAuthentication
Get Azure AD Token for the AIP Scanner
Configure the scanner to apply classification and protection
The default settings configure the scanner to run once, and in reporting-only mode.
To change these settings, edit the content scan job:
- In the Azure portal, on the Azure Information Protection – Content scan jobs pane, select the cluster and content scan job to edit it.
- On the Content scan job pane, change the following, and then select Save:
- From the Content scan job section: Change the Schedule to Always
- From the Policy enforcement section: Change Enforce to On
Tip: You may want to change other settings on this pane, such as whether file attributes are changed and whether the scanner can relabel files. Use the information popup help to learn more information about each configuration setting.
- Make a note of the current time and start the scanner again from the Azure Information Protection – Content scan jobs pane:
Alternatively, run the following command in your PowerShell session:
Start-AIPScan
The scanner is now scheduled to run continuously. When the scanner works its way through all configured files, it automatically starts a new cycle so that any new and changed files are discovered.
MUCH MORE TO COME! CHECK OFTEN AND SEND COMMENTS!
REFERENCES:
AIP Prerequisites
Install and Configure AIP UL Application
AIP UL Client Download
AIP Classic Client Express Installation
Changing the M365 Tenant Security Defaults through Azure
Microsoft has put out a new standard for security defaults in a tenant that harden default settings in the org. Security defaults make it easier to help protect your organization from these attacks with preconfigured security settings:
- Requiring all users to register for Azure Multi-Factor Authentication.
- Requiring administrators to perform multi-factor authentication.
- Blocking legacy authentication protocols.
- Requiring users to perform multi-factor authentication when necessary.
- Protecting privileged activities like access to the Azure portal.
Now, there might be many reasons why you would not want these defaults enabled in your tenant; just remember that you will need to set up these protections manually should you change the security defaults setting.
How to change security defaults in Azure/M365
- Log into https://portal.azure.com with your Global Admin account.
- Click on Azure Active Directory to navigate to that pane.
- In the list to the left, click Properties.
- Scroll to the bottom of the screen on the right and click Manage Security Defaults
- Make the appropriate change: YES/NO
- (IMPORTANT) Save the changes by clicking the Save button
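If you prefer to script this change, the same toggle can be flipped through Microsoft Graph. Here is a sketch using the Microsoft Graph PowerShell SDK; the module install and admin consent are assumptions not covered in this post:

```powershell
# Sketch: requires the Microsoft.Graph PowerShell module and an admin
# account that can consent to the listed scope.
Connect-MgGraph -Scopes "Policy.ReadWrite.ConditionalAccess"

# $false disables security defaults; use $true to re-enable them
Update-MgPolicyIdentitySecurityDefaultEnforcementPolicy -IsEnabled:$false
```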

This should set the defaults for your O365 tenant as you wish to have them. Please refer to the references below for more information and detail into each of the security defaults.
MORE POSTS TO COME ON SECURITY AND COMPLIANCE!
HAVE A WONDERFUL DAY!
REFERENCES:
What Are Security Defaults?
Setup Multi-Factor Authentication
Security Defaults
Grant an External User Guest Access to your M365 Tenant
Microsoft365 allows the tenant administrators to grant external users access to content in their tenant by setting them up as a guest in their M365 Tenant. Microsoft365 provides a guest access feature that you can use to grant content access to contractors, partners or others who need access to certain content.
However, the process of setting up a guest user works differently from that of setting up a normal, licensed user from within your organization.
By default, Microsoft365 Admin Center contains a Guest Users screen. You will also notice, however, that this screen does not contain an option to create a guest user. In fact, the only things that you can do are search for a user or delete a user.

Since the Guest Users screen doesn’t give you a way to create a guest user, you will need to either delve into PowerShell or perform the task within Azure Active Directory. I prefer using PowerShell, and will write a post about how to perform this via PowerShell, but unless you need to create a large number of guest users, it is usually easier to use the GUI. Below is how to create a guest user via Azure AD.
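For completeness, here is a minimal sketch of the PowerShell route using the AzureAD module’s New-AzureADMSInvitation cmdlet; the e-mail address and redirect URL below are placeholders:

```powershell
# Sketch: invite a single external user. The address is a placeholder.
Connect-AzureAD
New-AzureADMSInvitation -InvitedUserEmailAddress "partner@example.com" `
    -InvitedUserDisplayName "Partner User" `
    -SendInvitationMessage $true `
    -InviteRedirectUrl "https://myapps.microsoft.com"
```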
To create a guest user, expand the Admin Centers container and then click on Azure Active Directory. When the Azure Active Directory Admin Center opens, click on the Users container. You can see that just to the right of the New User option, there is an option to create a New Guest User.

NOTE: Creating a guest user account isn’t like creating a normal user account. Rather than providing the account details and clicking a Create button, you will instead need to send an invitation to the user.
Make Sure You Verify Their E-Mail Address Beforehand!!!
Choose Invite User > Enter the Identity Information

Next Enter A Personal Message (optional) > Choose their Group Membership > Update any AAD or M365 Permissions under Roles > Update their Sign In Settings > Click Invite to send the invitation

After a few minutes, the specified user will receive an e-mail invitation that looks something like the one shown below. The recipient will need to click the Accept Invitation button and accept the terms of use.

When the guest user completes the registration process, they are logged into Microsoft365; however, there are no applications initially available to the user. This is because, unlike a standard user, external users do not automatically get access to applications.

If you go back to the Guest Users screen, you will see the newly created guest user listed (you may have to refresh the screen). As previously noted, you can’t do much from this screen. You can, however, click on the user to see a few extra details now. Example is below.

The way that you grant an external user access to data is to add the user to a group that has access to the data. Let’s suppose, for example, that for whatever reason, you need to add an external user to a Teams Group named Microsoft Exchange Guys. To do so, you would go to the Groups folder within the Microsoft 365 Admin Center, click on the Microsoft Exchange Guys group, and then edit the Membership list, as shown below.
After clicking the Edit button, click on Add Members and then select the external user that you wish to add. Click Save to complete the process.
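The same membership change can also be scripted; a sketch with the AzureAD module, where the group name and guest address are placeholders:

```powershell
# Sketch: add an existing guest to a Microsoft 365 group. Placeholder names.
Connect-AzureAD
$group = Get-AzureADGroup -SearchString "Microsoft Exchange Guys"
$guest = Get-AzureADUser -Filter "userType eq 'Guest'" |
    Where-Object { $_.Mail -eq "partner@example.com" }
Add-AzureADGroupMember -ObjectId $group.ObjectId -RefObjectId $guest.ObjectId
```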

If you now go back to the Group’s membership, you are able to see the Microsoft Exchange Guys group membership showing the new guest user as a member.

Granting access in this way does not provide the external user with blanket access to the Teams Group. However, another group member is now able to e-mail the external user a link to the Teams Group. The external user can use this link to access the Group within the Teams app.

NOTE: Keep in mind that I am only using the Teams Group as an example. You can use somewhat similar techniques to provide access to a variety of Microsoft 365 AND Azure AD content.
MORE M365 CONTENT TO COME!
POSITIVE ATTITUDE = POSITIVE RESULTS
REFERENCES:
How To Enable Guest Access for Office 365
Exchange System Mailboxes not being configured cause Exchange Setup to fail
My continuation of the “Installation from HELL” proceeded onward today, with our team attempting to install Exchange on another server in the test environment and watching it fail at the Mailbox Role portion of the installation.
The error kept saying that the installation was failing due to a “Database is mandatory on UserMailbox”. We had been having many issues with the Schema and RBAC roles, which were resolved in my other post by adding the Role Assignments to the schema. I did mention that the environment started settling down and that the system mailboxes (Arbitration) along with the Health Mailboxes started functioning. That was actually not the case for the Arbitration mailboxes. I glanced at the following article to see how to manually recreate the Arbitration mailboxes.
I ran “Get-Mailbox -Arbitration | fl Name” in Exchange PowerShell (similar to the screenshot below) to see if the mailboxes had in fact been created. They had not, and the cmdlet returned the error “Database is mandatory for the UserMailbox.”

So, I tried to do what the original article said and enable the mailboxes one by one, but I kept getting errors when trying to create them. I began to search the internet for another way to remediate this without having to get too deep into the system.
I found the following article explaining the exact error I was getting during the installation of Exchange. In the article, it said to look at the attributes of the account associated with the Arbitration mailbox to see if the homeMDB attribute had no value:

Now, since I was NOT having good luck with either Exchange Setup or PowerShell, I had to figure out a way to populate the attribute value so that the mailbox would be usable. What I did was this:
- I opened a User in ADUC with a working mailbox on the needed database.
- I went to the Attributes Tab and looked up the homeMDB attribute for that user then chose Edit.
- I copied the entire value from the screen and closed it.
- I then went to the Arbitration mailbox account in question and opened its homeMDB attribute.
- I pasted the value into the Value box and saved it.
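If you have several accounts to fix, the manual ADUC steps above can be scripted with the ActiveDirectory module (a sketch only; "WorkingUser" and "ArbitrationAccount" are placeholder identities for a known-good mailbox user and the arbitration account being repaired):

```powershell
# Read homeMDB from a user whose mailbox lives on the desired database
$goodMDB = (Get-ADUser -Identity "WorkingUser" -Properties homeMDB).homeMDB

# Stamp the empty homeMDB attribute on the arbitration mailbox account
Set-ADUser -Identity "ArbitrationAccount" -Replace @{ homeMDB = $goodMDB }

# Verify the value landed
Get-ADUser -Identity "ArbitrationAccount" -Properties homeMDB | fl Name, homeMDB
```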

Once I finished remediating the attribute for all the Arbitration mailbox accounts missing the value, I re-ran the cmdlet to verify that the error was gone for every arbitration mailbox:

I then uninstalled and re-installed Exchange using setup on the failing server and the installation completed successfully.
This has been an excellent week of training on the Exchange setup process and on how the system accounts and their attribute values relate to a properly functioning Exchange environment.
A POSITIVE OUTLOOK WILL YIELD POSITIVE RESULTS ULTIMATELY!
REFERENCES:
Exchange Install Error Database is mandatory on UserMailbox
Recreate missing arbitration mailboxes
RBAC Role Assignments NOT installed during Exchange Directory Preparation
I had a very interesting installation issue recently when installing Exchange 2019 into a new environment. We ran through all the Exchange Preparation for the root and child domains in the forest as described HERE. The results of those installation procedures showed SUCCESS, but when we started installing Exchange, we ran into issues with the System Mailboxes not being available to complete the Mailbox Role part of the installation. Most of the articles that I found said to re-run the Domain Prep (/preparealldomains) and the AD Prep (/pad). So we did, and managed to get the first server installed somehow.
The reason I said somehow is because when we tried to log on to the EAC, we would get a 400 Bad Request error and could not log on to the console. Next, we tried PowerShell; it loaded, but I noticed that only ~100 cmdlets were imported. I thought that maybe we had to re-create the account mailbox to get it working properly. The problem was, one of the cmdlets that would not load was Disable-Mailbox, along with others like Enable-Mailbox and New-Mailbox. It was as if the admin account we were using had no rights to administer Exchange in any way.
Next, we opened the mailbox in OWA. The mailbox came up okay, so I told the admin to change the URL to /ecp to try and get into the admin center. What happened was that the normal user control panel opened instead, showing again that the account did not have permissions.
We checked replication to the child domain and made sure there were not any apparent AD issues present. There were none. I next started reviewing how Exchange uses RBAC (Role Based Access Control) Groups and Role Assignments to grant users access to Exchange Admin Functionality. I read the following article located HERE.
Something told me to go and check the Schema again, so I went to ADSIEdit > Configuration Container > Services > Microsoft Exchange > (Organization Name) > RBAC > Role Assignments
I looked at the list of role assignments in the window as follows:

From the picture, you can see that the list is small, which in my experience is not correct. I verified this by going into my own 2019 environment and comparing the number of objects in that folder:

As you can see, this list is MUCH longer and has many more objects in the container. So, how did Exchange Setup miss this during preparation? That I will find out later, but first I had to remediate the problem.
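As a quick sanity check, you can count the objects in the Role Assignments container with the ActiveDirectory module instead of eyeballing ADSIEdit (a sketch; replace "YourOrg" with your Exchange organization name):

```powershell
# Build the DN of the RBAC Role Assignments container in the Configuration partition
$configNC = (Get-ADRootDSE).configurationNamingContext
$base = "CN=Role Assignments,CN=RBAC,CN=YourOrg,CN=Microsoft Exchange,CN=Services,$configNC"

# A healthy organization has far more entries here than the short list shown above
(Get-ADObject -SearchBase $base -SearchScope OneLevel -Filter *).Count
```

Comparing this count against a known-good environment of the same Exchange version makes the discrepancy obvious without scrolling through ADSIEdit.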
CAUSE:
If the RBAC role assignments that grant an account administrative privileges in Exchange are not installed, then you cannot administer Exchange to even make the necessary changes! Especially so if you’ve only installed ONE server in the environment!
REMEDIATION:
Manually repair the installation by running the script that creates these Objects in the Schema during setup.
******DISCLAIMER: Running the following commands in these instructions, running ADSIEdit, and/or making changes to your Schema and Exchange Installation outside the normal setup process is NOT recommended! Microsoft, LDLNET LLC, nor I (Lance Lingerfelt) are responsible for any issues or errors that may arise from using these instructions, period!******
That said, perform the following to regenerate the objects in the Schema:
1) Open Windows PowerShell (not the Exchange Management Shell) on the server that you installed Exchange Server on with the same account you used to install Exchange.
a. If you have UAC enabled, right click Windows PowerShell and click Run as administrator.
2) Run Start-Transcript c:\RBAC.txt and press Enter
a. This will start logging all commands and output you type to a text file.
3) Run Add-PSSnapin *setup and press Enter
a. This adds the setup snap-in which contains the setup cmdlets used by Exchange during install. You may see errors about loading a format data file. You can ignore those errors.
NOTE: DO NOT run any other cmdlets in this snap-in. Doing so could irreparably damage your Exchange installation.
4) Run Install-CannedRbacRoleAssignments -InvocationMode Install -Verbose and press Enter.
a. This cmdlet should create the required role assignments between the role groups and roles that should have been created during setup.
b. Be sure you run with the Verbose switch so we can capture what the cmdlet does.
5) Run Remove-PSSnapin *setup and press Enter
6) Run $Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://servername/PowerShell/ -Authentication Kerberos and press Enter
a. Be sure to replace SERVERNAME with the FQDN of your server.
7) Run Import-PSSession $Session and press Enter
a. You should notice that the normal number of cmdlets load (~700)
8) Run Get-ManagementRoleAssignment and press Enter. If you are able to run the cmdlet, then the remediation worked.
9) Run Stop-Transcript and press Enter
The final check is to return to ADSIEdit and check the container and see if all the objects are there. We also were now able to get into EAC as well as saw that the Arbitration mailboxes were populating along with the Health Mailboxes as needed per the installation.
It was very neat to see how running the Add-PSSnapin cmdlet exposed the scripts from Exchange Setup and allowed me to manually fix the installation problem by running the cmdlet that performed the task setup missed or refused to run.
POST MORTEM REVIEW
I am going to look over the installation logs and see where the installation failed and try to find out why it did not run on the subsequent re-installations of the AD Prep and Domain Prep. I will post those findings in this article when I have them available.
Thanks again to my Microsoft and Trimax teammates for your assistance with this. It has helped the customer in more ways than one!
HAPPY TROUBLESHOOTING!
POSITIVE ATTITUDE YIELDS POSITIVE RESULTS!
The Windows Time Service, Hyper-V Hosts, and DCs that are VMs.
The sheer craziness of it all! I noticed that the clocks on my servers were off by FOUR minutes. I had originally configured group policy so that the PDC Emulator for my domain, a VM on one of my Hyper-V hosts, would get its time from the public NTP hosts. I then configured a group policy to have all the other machines get their time from the PDC Emulator.
This was working great for me until I realized that my Hyper-V hosts were actually controlling the time of the VMs. They were also configured to get the time from the PDC Emulator, but essentially, due to how Hyper-V is configured, the PDC Emulator VM was getting the time from the Host. So, once the time got thrown off, everything went wacky on me!
I’d read through a couple of articles and found the configuration flaw of Hyper-V and the need for those servers to get their time from the external NTP hosts as well as be configured as NTP servers themselves. This totally went against my Group Policy configuration which caused the issue!
Luckily, I had a standalone server, a tertiary DC in the domain not running Hyper-V. I was able to get my time synced properly again after performing the following configuration.
- I had to move the FSMO roles to the tertiary DC with the following cmdlet:
Move-ADDirectoryServerOperationMasterRole -Identity backupdc01 -OperationMasterRole 0,1,2,3,4 -Confirm:$False
- I then made sure the tertiary DC was syncing time correctly by running the following on that server:
Stop and Start the time service:
net stop w32time
net start w32time

Resync with the NTP servers:
w32tm /resync /nowait

Query the Source NTP server for the server you are running:
w32tm /query /source
(Should return your Public NTP server)
2.us.pool.ntp.org,0x1
- I then removed the Group Policy Object for syncing the time source to the DC that I had linked to my Hyper-V Servers OU in Active Directory
- Ran a gpupdate /force on the Hyper-V host to remove the policy there
- I then had to reconfigure the Hyper-V hosts to be NTP Servers and clients that got their time from a public NTP server:
Cmdlets in order:

Unregister the Time Service:
w32tm /unregister

Stop the Time Service:
net stop w32time

Register the Time Service:
w32tm /register

Start the service:
net start w32time

Query the Source NTP server for the server you are running:
w32tm /query /source
(Should return the default public NTP server)
time.windows.com,0x1

Configure the NTP time servers you want to use:
w32tm /config /syncfromflags:manual /manualpeerlist:"0.pool.ntp.org, 1.pool.ntp.org, 2.pool.ntp.org" /update /reliable:yes

Resync with the NTP servers:
w32tm /resync /nowait

Verify the Source NTP Server being used:
w32tm /query /source
(Should return the Public NTP server you set)
1.pool.ntp.org,0x1

Time should be correct on the Hyper-V host now.
The one problem Hyper-V host that was syncing with the DC VM would not accept the settings change via Group Policy or through the w32tm command. I even went into the registry and tried to modify the following keys to make the changes stick:
Path: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\

Keys - Value (Decimal):
w32time\Config\AnnounceFlags - 10
w32time\Parameters\NtpServer - 0.us.pool.ntp.org,0x1 1.us.pool.ntp.org,0x1 2.us.pool.ntp.org,0x1
w32time\TimeProviders\NtpClient\SpecialPollInterval - 900 (15 minutes)
w32time\TimeProviders\NtpServer\Enabled - 1
The values would just not change, most likely due to the time not being synchronized. I had to reboot the server and then run through the process again in order for the changes to stick.
I did look at another article that said to do the following on the DC VM in order for time NOT to sync with the Hyper-V Host:
Go into Hyper-V console on the host machine, right-click on the client VM AD server, and select Settings. Once in here, on the left look under:
Management –> Integration Services
Untick Time Synchronization
Click Apply/OK
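If you would rather script it, the same Integration Services change can be made on the Hyper-V host with PowerShell (a sketch; "DC01" is a placeholder VM name):

```powershell
# Turn off time sync between the Hyper-V host and the DC guest VM
Disable-VMIntegrationService -VMName "DC01" -Name "Time Synchronization"

# Verify the service now shows Enabled = False
Get-VMIntegrationService -VMName "DC01" |
    Where-Object { $_.Name -eq "Time Synchronization" }
```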

Things are running smoothly now. Please view the references at the bottom of the post; there are a couple of great articles about the Time Synchronization process with Hyper-V and why it needs to be set up the way I have it now. I wish I had read them before I originally set this up. I will post the article about getting group policy to handle the time sync process. Just remember: if your PDC Emulator is a VM, don’t sync it to a public NTP server. Sync it to your Hyper-V host and have the host sync publicly.
In the long run, I think it is a better design to have your Hyper-V hosts sync time with the public NTP servers than to have to remember to configure each VM DC you create NOT to time sync with the host. To each their own, though, and one thing I learned from working at Microsoft is that there are multiple technically sound ways to reach the same goal.
THANKS FOR READING!
PLEASE COMMENT!
REFERENCES:
Setup of NTP on Hyper-V servers
Time Synchronization in Hyper-V
“It’s Simple!” – Time Configuration in Active Directory
NTP Circular Time Sync – Windows Server 2012 R2 / Hyper-V
Error 801c0003 when joining computer to Azure AD
I just received my new laptop for my current project and was setting up Windows 10 to join the company Azure AD domain. When I got to the part where you join, I received the following error:

Turns out that my account was unable to join a device to the tenant. This is easily solved, though. Have your tenant admin perform the following:
Go to Azure Active Directory -> Devices
Check the device settings, in particular the options:
Users may join devices
Maximal number of devices

Now, in my case, I did not have access as I am NOT a tenant admin:

So, I am currently waiting for my IT department to resolve the access issue and grant me access to join the device to the domain. Just be sure to look at this if you’re having issues setting up your Windows 10 device to join your Azure tenant!
HAPPY TROUBLESHOOTING!
POSITIVE ENERGY!
References:
Issue Joining A Device To An Azure AD Tenant Domain
Importing User Photos to Office 365 in bulk for your company.
In a previous post, I showed how you could update one user’s photo for their Outlook and AD profiles via PowerShell. In this post, we will explore how to do this for your entire organization via PowerShell against Office 365.
NOTE: I have not tested the scripts as I do not have enough mailboxes in my O365 tenant along with not using a ‘.’ in my alias. If the scripts are incorrect, please inform me with the correction and I will update accordingly.
Please make sure that your photos are reviewed before posting, and try to keep the file size of the photos to a minimum. Office 365 limits user photos to 10 KB in size, but I will show you how to get around that limitation.
Having a user photo for each of your users is very beneficial, as it personalizes each account to a face in the company. The user photos can be viewed in the below locations:
- Outlook Web Access
- Contact Card
- Thumbnail in emails
- Outlook Client
- Yammer
- Lync Client
- SharePoint (People Search / Newsfeed)
Steps to take:
- Remove the 10KB photo size limitation in Exchange Online
- Prepare a folder with all users photos
- Update the profile photos via a PowerShell cmdlet.
Connect to Exchange Online with the RPS proxy method to remove the 10KB size limitation
$UserCredential = Get-Credential
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/?proxyMethod=RPS -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $Session
NOTE: In the PowerShell above, we connected using a different proxy method (/?proxyMethod=RPS). Connecting to Office 365 this way overrides the 10KB limitation when uploading images.
Prepare a folder locally and place all the photos in that folder
Create a folder named C:\UserPics and name each photo file after the username of that particular user (i.e. llingerfelt.png).
The below script should be able to account for aliases that have a ‘.’ in the id as well. (i.e. lance.lingerfelt)
NOTE: From my research, there is no set photo type that is required for the photo. My suggestion would be to keep the photos .png for size constraints while maintaining picture clarity.
Update the profile pictures via PowerShell
Create the following script and name it Photos-Update.ps1
$path = 'C:\UserPics\'
$Images = Get-ChildItem $path
$Images | Foreach-Object {
    $Identity = ($_.Name.ToString() -replace '.png','')
    $PictureData = $path + $_.Name
    Set-UserPhoto -Identity $Identity -PictureData ([System.IO.File]::ReadAllBytes($PictureData)) -Confirm:$false
}
Run Photos-Update.ps1 and the script should upload the photos to Office 365 and apply each photo to the corresponding user.
NOTE: If you’re still having some issues with the alias having a ‘.’ in the name, you can also configure the Photos-Update.ps1 script in this manner to get that working properly:
$path = 'C:\UserPics\'
$Images = Get-ChildItem $path
$Images | Foreach-Object {
    $Identity = ($_.Name.ToString() -split "\.")
    $count = $Identity.count
    If ($count -gt 2) {
        for ($i = 0; $i -le $count - 2; $i++) {
            $username = $username + $Identity[$i] + "."
        }
        $Id = $username.TrimEnd(".")
    }
    Else {
        $Id = $Identity[0]
    }
    $PictureData = $path + $_.name
    Set-UserPhoto -Identity $Id -PictureData ([System.IO.File]::ReadAllBytes($PictureData)) -Confirm:$false
    $username = $null
}
HAPPY SCRIPTING!
PLEASE COMMENT!

REFERENCES:
How to import Office365 User photos over 10KB & without CSV in bulk
Unable to open settings from the Settings App in Windows Server 2016/2019
In Windows Server 2016/2019, you have been upgraded to the Windows 10 Desktop Experience GUI, so in the new versions you are directed to use the Settings app to get to your settings. Within Settings, I would choose a setting that calls on the control.exe file to open a Control Panel applet, and I would get the following error when attempting that function:

I immediately thought it was a permissions issue, so I went to validate the permissions so that I could change them. It turns out that, because it is a Windows system directory, I couldn’t modify the permissions without compromising directory security with NTFS permissions:

Now, if I opened Control Panel, Network and Sharing Center, etc. directly, I was able to access the applets with no issues. This was just happening in the Settings (gear box) application. So, I started looking around and found that there is a registry value that needs to be modified so that your Administrator account can open these applets through the Settings application:
1) Launch the Registry Editor (regedit.exe)
2) Navigate to:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System
3) Change the value of FilterAdministratorToken (REG_DWORD) from 0 to 1. (If you don’t see that value, you can create it by right-clicking any empty space in the right panel, selecting New > DWORD value, typing the name, and setting the value to 1.)
4) Reboot the computer and then it will be working fine.
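For a one-off server, the same registry change can be made from an elevated PowerShell prompt (a sketch; Set-ItemProperty creates the value if it does not already exist):

```powershell
# Set FilterAdministratorToken to 1 so Settings can launch Control Panel applets
# for the built-in Administrator account
$path = "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System"
Set-ItemProperty -Path $path -Name "FilterAdministratorToken" -Value 1 -Type DWord

# Confirm the value, then reboot for the change to take effect
Get-ItemProperty -Path $path -Name "FilterAdministratorToken"
Restart-Computer -Confirm
```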
I decided to create a Group Policy in AD to add this registry key so that it would propagate to all my 2016/2019 Servers:
1) Launch the Group Policy Manager
2) Create a new GPO and Link it to your Domain
3) Go to Computer Configuration > Preferences > Windows Settings > Registry > New Registry Key (DWORD)
4) Set the Action to “Replace”
5) Set the path as:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System
6) Set the Key as FilterAdministratorToken
7) Set the Value as 1 (Decimal Format) and Save
8) Run gpupdate /force on your servers.
9) Schedule a Reboot of those servers for the change to truly take effect.

After the reboot of the server, all the apps launched correctly from the Settings Application within Windows. I am going to research a little more to see why this is like that. If you have a comment, or more information, please feel free to post!
HAPPY TROUBLESHOOTING!
PLEASE COMMENT!
Hyper-V 2019 will NOT mount ISO from a network share.
Like most IT guys, I have a repository of ISO images saved on a network share so that I can mount an ISO on multiple machines if needed. I recently switched to Hyper-V and have been having an issue with creating VMs and using ISOs from my network share to do so.
Hyper-V Manager, available through RSAT, doesn’t have an option to mount an ISO or capture a drive from the machine on which it is running. Instead, it gives you the drives of the Hyper-V host, which would of course require you to have the ISO or the disc itself present on the host. I didn’t want to do that; I would rather have my repository share available for that purpose and leave all the drive space on the Hyper-V host free.
So, I would map a network drive with my ISOs. The mapping would succeed, but the mapped drive letter would not be visible in Hyper-V Manager when trying to mount an ISO. Next, I tried mounting from the UNC share directly, but that would also fail, with the message:
“‘VM’ failed to add device ‘Virtual CD/DVD Disk’” & “User account does not have permission required to open attachment”.

It goes back to the constrained delegation requirement for the Hyper-V host accounts to perform functions such as this. This has been a pain, to say the least, as I have also had issues with live migration since my machines are not clustered due to different hardware.
So, in researching, I found this blog post. It has helped me through this issue with mapping the shared folder with the ISOs.
The cause of the problem is that Hyper-V is intended to run with a VMM Library Server and mount files from it, not from any random share. To remediate this:
- You need to assign full NTFS and share permissions to the computer account of the Hyper-V host on the shared folder with the ISOs you want to mount.
- In AD, on the computer account of the Hyper-V machine, delegate the specific service ‘cifs’ to the machine you want your ISOs mounted from. Microsoft calls this constrained delegation.
Here is step by step procedure for the constrained delegation:
- Go to Active Directory Users and Computers
- Find the Hyper-V server computer account and open up its properties.
- Go to Delegation tab.
- Select Trust this computer for delegation to the specified services only radio button.
- Click the Add button.
- Click the Users or Computers… button.
- In the Add Services window, click Users or Computers and enter the computer account that will act as a library server and click OK.
- Select the cifs Service Type and click OK.
The resulting setup should look something like this:
I added both the server that contained the ISO images and the server that I run my RSAT tools from just to be safe. I next rebooted the Hyper-V host (that is a requirement).
When the host rebooted, I was able to successfully create the VM.
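The delegation steps above can also be sketched with the ActiveDirectory module (an assumption-laden sketch; "HV01" stands in for the Hyper-V host's computer account and "FILES01" for the server holding the ISO share):

```powershell
# Allow the Hyper-V host to delegate to the cifs service on the file server.
# Run as a domain admin; replace the placeholder names and domain suffix.
Set-ADComputer -Identity "HV01" -Add @{
    'msDS-AllowedToDelegateTo' = @('cifs/FILES01', 'cifs/FILES01.yourdomain.local')
}

# Confirm what was written before rebooting the host:
Get-ADComputer -Identity "HV01" -Properties msDS-AllowedToDelegateTo |
    Select-Object -ExpandProperty msDS-AllowedToDelegateTo
```

As with the GUI method, the Hyper-V host still needs a reboot before the delegation takes effect.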
Hopefully, this will also solve my issue with live migration between my hosts. I will have to test that again and will inform everyone here if that succeeds as well!
PLEASE COMMENT!
THANKS FOR READING!
References:
Hyper-V Server 2012 won’t mount ISO from a network share
Hyper-V authentication in Windows Server 2016 for managing remote Hyper-V servers through RSAT
Constrained Delegation
How to transfer FSMO Roles using PowerShell
A rare weekend post for me! HA! I am currently migrating my server environment from VMWare 6.7 to Server 2019 Hyper-V. I have a separate standalone box that I use for my VM backups and as a tertiary DC. Since I had to shut down my VMs in order to convert them, I needed to quickly move my FSMO roles from the DC Virtual Machine to the Standalone box so things would stay running.
I found this great article on how to do that quickly through PowerShell, since it is a pain to go into ADUC, AD Domains and Trusts, and set up an MMC for the Schema snap-in.
When you create a domain, all FSMO roles are assigned to the first domain controller in the forest by default. You can transfer FSMO roles from one DC to another using both the Active Directory graphical snap-ins and the PowerShell command line. Moving FSMO roles using the AD PowerShell module has the following benefits:
- You do not need to connect with an MMC snap-in to the future role owner;
- Transferring or seizing FSMO roles does not require a connection to the current or future role owner. You can run AD PowerShell module cmdlets on a Windows client or server running the RSAT tools;
- To seize a FSMO role (if the current owner is not available), it suffices to add the -Force parameter.
Import the Active Directory Module Into PowerShell:
Import-Module ActiveDirectory
To get the current forest level FSMO role owners (Domain Naming Master and Schema Master roles) you can use the following PowerShell cmdlet:
Get-ADForest ldlnet.net | ft DomainNamingMaster, SchemaMaster -a -wr
To view domain-wide FSMO roles (Infrastructure Master, PDC Emulator and Relative Identifier Master roles):
Get-ADDomain ldlnet.net | ft InfrastructureMaster, PDCEmulator, RIDMaster -a -wr

Transfer FSMO Roles using PowerShell
To transfer FSMO roles between Active Directory domain controllers, we use the PowerShell cmdlet:
Move-ADDirectoryServerOperationMasterRole
To use the Move-ADDirectoryServerOperationMasterRole cmdlet, you must meet the following requirements:
- There must be at least one DC with a version of Windows Server 2008 R2 or higher
- PowerShell version 3.0 or newer
- Active Directory module (2.0 or newer)
NOTE: Unlike the Ntdsutil.exe utility, the Move-ADDirectoryServerOperationMasterRole cmdlet can be run from any domain computer to migrate the Operations Master roles if you have the appropriate rights (Domain Admins and Enterprise Admins).
Import the AD Module:
Import-Module ActiveDirectory
I needed to move all the roles from one server to the other, so, I ran the following to do so:
Move-ADDirectoryServerOperationMasterRole -Identity "servername" -OperationMasterRole DomainNamingMaster,PDCEmulator,RIDMaster,SchemaMaster,InfrastructureMaster -Confirm:$False
NOTE: To simplify the command, you can replace the names of roles with numbers from 0 to 4. The correspondence of names and numbers is given in the table:
PDCEmulator | 0
RIDMaster | 1
InfrastructureMaster | 2
SchemaMaster | 3
DomainNamingMaster | 4
So, by having knowledge of these numbers, you can simplify your cmdlet:
Move-ADDirectoryServerOperationMasterRole -Identity "servername" -OperationMasterRole 0,1,2,3,4 -Confirm:$False
NOTE: In the event that the current owner of one or all of the FSMO roles fails, the forced transfer of FSMO roles is performed with the same command, but with the -Force option. Also, after the FSMO roles have been seized, the domain controller from which the roles were seized should never be reconnected to the domain. You will need to perform a metadata cleanup of Active Directory before even thinking about putting that failed server back into production.
Once completed, I ran the previous cmdlets of Get-ADForest and Get-ADDomain to verify that the FSMO roles moved to the destination server.
As of now, my conversion to Hyper-V is going smoothly, although it takes quite a bit of time to convert the hard disks. Thanks again!
HAPPY TROUBLESHOOTING! KEEP SCRIPTING!
PLEASE COMMENT!
Connect to all PowerShell Modules in O365 with one script
Let’s say you’re an admin who needs to connect to Office 365 via PowerShell often. There are many different websites and blogs that will show you how to connect to each service via PowerShell, but that can cause a headache: you end up with five different PowerShell sessions running in five different windows, entering a username and password each time, which becomes time consuming.
I want to show you here how to combine all those sessions into one script where, if the security on your computer is tight enough, you don’t even have to enter credentials. This way, you can click one icon and pull up all the O365 PowerShell commands you’ll need to manage your organization.
First, you need to download the following PowerShell module installation files so that PowerShell will have the correct modules installed:
Microsoft Online Service Sign-in Assistant for IT Professionals RTW
Windows Azure Active Directory Module for Windows PowerShell v2
SharePoint Online Management Shell
Skype for Business Online, Windows PowerShell Module
Next, we want to set up the CLI (Command Line Interface) to be too cool for school. I have learned that it helps to know how to customize the CLI window. You can do all of this in PowerShell ISE or Notepad, whichever you prefer. Here are the commands for the script that I use to set up the CLI:
#Set the PowerShell CLI. Make sure you have a directory named "C:\PowerShell"
Set-Location c:\PowerShell
$a = (Get-Host).UI.RawUI
$a.BackgroundColor = "black"
$a.ForegroundColor = "yellow"
$a.WindowTitle = "(Your Company Name) PowerShell for ALL O365 PowerShell Services"
$curUser = (Get-ChildItem Env:\USERNAME).Value
function prompt {"(Your Company Name) O365 PS: $(get-date -f "hh:mm:ss tt")>"}
$host.UI.RawUI.WindowTitle = "(Your Company Name) O365 PowerShell >> User: $curUser >> Current Directory: $((Get-Location).Path)"
Next, you want to set your Execution Policy and put in your credentials so that you won’t be prompted to enter the user credentials when you run the script.
NOTE: MAKE SURE YOU KEEP YOUR SCRIPT SAFE AS THE CREDENTIALS ARE VISIBLE WITHIN THE SCRIPT IN PLAIN TEXT!
You can, alternatively, set your script to prompt for credentials every time by using the following:
$LiveCred = Get-Credential
Here is that part of the script:
#Setup Execution Policy and Credentials using your Tenant Admin Credentials
Set-ExecutionPolicy Unrestricted
$user = "tenantadmin@companyname.onmicrosoft.com"
$pass = "adminpassword"
$secpass = $pass | ConvertTo-SecureString -AsPlainText -Force
$LiveCred = New-Object System.Management.Automation.PSCredential -ArgumentList $user, $secpass
Now we get into the importing of the modules for each O365 service:
Get the MSOnline Module:
#Set up the powershell cmdlets for Office365
Write-Host "Getting MSOnline Module" -ForegroundColor Green
Get-Module MSOnline
Connect to the MSOnline Service:
#Connect the MS Online Service
Write-Host "Connecting to the MSOnline Service" -ForegroundColor Green
Connect-MSOLService -Credential $LiveCred
Connect to Azure AD PowerShell:
#Connect to Azure AD PowerShell
Write-Host "Connecting to Azure AD PowerShell" -ForegroundColor Green
Connect-AzureAD -Credential $LiveCred
Connect to SharePoint Online PowerShell:
NOTE – MAKE SURE YOU CHANGE companyname IN THE URL TO YOUR COMPANY NAME!!
#Connect to SharePoint Online PowerShell
Write-Host "Connecting to SharePoint Online through PowerShell" -ForegroundColor Green
Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking
Connect-SPOService -Url https://companyname-admin.sharepoint.com -Credential $LiveCred
Connect to Exchange Online PowerShell:
#Connect to Exchange Powershell
Write-Host "Connecting to Exchange Online through PowerShell" -ForegroundColor Green
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.outlook.com/powershell/ -Credential $LiveCred -Authentication Basic -AllowRedirection
Import-PSSession $Session
Connect to Skype For Business Online PowerShell:
#Connect the Skype For Business Online Powershell Module
Write-Host "Connecting to Skype For Business Online through PowerShell" -ForegroundColor Green
Import-Module SkypeOnlineConnector
$sfboSession = New-CsOnlineSession -Credential $LiveCred
Import-PSSession $sfboSession
Connect to the Security & Compliance PowerShell:
NOTE – I still get "Access Denied" when trying to connect to this one. I have looked for an answer to that issue but have not found one. Please comment with a link if you have an answer so that I can update this script!
#Connect the Security & Compliance Center PowerShell Module
Write-Host "Connecting to O365 Security & Compliance Online through PowerShell" -ForegroundColor Green
$SccSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.compliance.protection.outlook.com/powershell-liveid -Credential $LiveCred -Verbose -Authentication Basic -AllowRedirection
Import-PSSession $SccSession -Prefix cc
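One possible cause of the "Access Denied" error is that basic authentication against the powershell-liveid endpoint is blocked for the tenant, since Microsoft has been retiring basic authentication for remote PowerShell. A hedged alternative, assuming the ExchangeOnlineManagement module is installed, is its dedicated Security & Compliance cmdlet; I have not verified that this resolves the error above:

```powershell
#Requires the ExchangeOnlineManagement module: Install-Module ExchangeOnlineManagement (run once, elevated)
Import-Module ExchangeOnlineManagement
#Modern-auth connection to the Security & Compliance Center endpoint
Connect-IPPSSession -UserPrincipalName tenantadmin@companyname.onmicrosoft.com
```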
Lastly, put in a note to show that the PS load is completed:
Write-Host "Be sure to check for any connectivity errors!" -ForegroundColor Green
Write-Host "Also, Remember to run 'Get-PSSession | Remove-PSSession' before closing your PowerShell Window!" -ForegroundColor Green
Write-Host "Successfully connected to all O365 PowerShell Services for (CompanyName)!" -ForegroundColor Green
Here is the final script in its entirety:
#Set the PowerShell CLI. Make sure you have a directory named "C:\PowerShell"
Set-Location C:\PowerShell
$a = (Get-Host).UI.RawUI
$a.BackgroundColor = "black"
$a.ForegroundColor = "yellow"
$a.WindowTitle = "(Your Company Name) PowerShell for ALL O365 PowerShell Services"
$curUser = (Get-ChildItem Env:\USERNAME).Value
function prompt {"(Your Company Name) O365 PS: $(Get-Date -f "hh:mm:ss tt")>"}
$host.UI.RawUI.WindowTitle = "(Your Company Name) O365 PowerShell >> User: $curUser >> Current Directory: $((Get-Location).Path)"

#Setup Execution Policy and Credentials using your Tenant Admin Credentials
Set-ExecutionPolicy Unrestricted
$user = "tenantadmin@companyname.onmicrosoft.com"
$pass = "adminpassword"
$secpass = $pass | ConvertTo-SecureString -AsPlainText -Force
$LiveCred = New-Object System.Management.Automation.PSCredential -ArgumentList $user, $secpass

#Set up the powershell cmdlets for Office365
Write-Host "Getting MSOnline Module" -ForegroundColor Green
Import-Module MSOnline

#Connect the MS Online Service
Write-Host "Connecting to the MSOnline Service" -ForegroundColor Green
Connect-MSOLService -Credential $LiveCred

#Connect to Azure AD PowerShell
Write-Host "Connecting to Azure AD PowerShell" -ForegroundColor Green
Connect-AzureAD -Credential $LiveCred

#Connect to SharePoint Online PowerShell
Write-Host "Connecting to SharePoint Online through PowerShell" -ForegroundColor Green
Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking
Connect-SPOService -Url https://companyname-admin.sharepoint.com -Credential $LiveCred

#Connect to Exchange Powershell
Write-Host "Connecting to Exchange Online through PowerShell" -ForegroundColor Green
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.outlook.com/powershell/ -Credential $LiveCred -Authentication Basic -AllowRedirection
Import-PSSession $Session

#Connect the Skype For Business Online Powershell Module
Write-Host "Connecting to Skype For Business Online through PowerShell" -ForegroundColor Green
Import-Module SkypeOnlineConnector
$sfboSession = New-CsOnlineSession -Credential $LiveCred
Import-PSSession $sfboSession

#Connect the Security & Compliance Center PowerShell Module
Write-Host "Connecting to O365 Security & Compliance Online through PowerShell" -ForegroundColor Green
$SccSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.compliance.protection.outlook.com/powershell-liveid -Credential $LiveCred -Verbose -Authentication Basic -AllowRedirection
Import-PSSession $SccSession -Prefix cc

Write-Host "Be sure to check for any connectivity errors!" -ForegroundColor Green
Write-Host "Also, Remember to run 'Get-PSSession | Remove-PSSession' before closing your PowerShell Window!" -ForegroundColor Green
Write-Host "Successfully connected to all O365 PowerShell Services for (CompanyName)!" -ForegroundColor Green
Now you can create a desktop shortcut so that you can easily run the script. I would save the script to your Scripts directory.
That will usually be C:\Users\username\Documents\WindowsPowerShell\Scripts, or whichever directory you choose.
To start, right click the desktop and choose New > Shortcut
In the Target Field, enter the following for your PowerShell Shortcut, pointing to the path of your script:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -noexit -ExecutionPolicy Unrestricted -File "C:\Users\username\Documents\WindowsPowerShell\Scripts\ConnectO365All.ps1"
Click on the Advanced button and check the box: Run As Administrator
Under the General Tab, name your shortcut: (CompanyName) O365 All PowerShell
Click OK to save the shortcut to your desktop.
LAST BUT NOT LEAST, RUN THE FOLLOWING COMMAND BEFORE EXITING OR CLOSING YOUR POWERSHELL WINDOW. THIS WILL REMOVE ALL THE SESSIONS YOU’VE CONNECTED TO:
Get-PSSession | Remove-PSSession
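If you want that cleanup to run even when one of the connection steps fails partway through, a sketch of a try/finally wrapper around the body of the script would be:

```powershell
try {
    #...all of the connection commands from the script above...
}
finally {
    #Runs whether the try block succeeded or threw: tear down every remote session
    Get-PSSession | Remove-PSSession
}
```

The finally block executes on success, on error, and even on Ctrl+C in most cases, so orphaned Exchange and Security & Compliance sessions are far less likely to pile up against the server-side session limit.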
HAPPY SCRIPTING!
LEARN, DO, LIVE!
References:
Connect to all O365 Services in one PowerShell Window
How to connect to all O365 Services through PowerShell
Connecting to Office 365 “Everything” via PowerShell
PowerDNS Script
I was compiling some scripts to modify DNS records for my previous post. While browsing through different scripts in the TechNet Gallery, I came across the following script, which provides a menu, options, and different settings that make it a great tool if you do a lot of DNS modification and want to do it through PowerShell.
Here is the link to the original script page. I have updated and modified the script so that it can also add and remove DNS zones.
<# Welcome to PowerDNS V1.0!
   This script was prepared by Subhro Majumder.
   Modified by Lance Lingerfelt - LDLNET LLC. #>
<# The script has been prepared and tested in PowerShell 4.0, so when you run this script,
   please make sure that the PowerShell version is 4.0 or above. #>
<# Using this script, you can look up, create, and delete A, PTR, and CNAME records.
   You can also look up, create, and remove DNS zones. #>
<# This script is suitable for single record entry as well as bulk entry. For every operation
   there are two options, Single and Bulk. For bulk entry you need to create an input file in
   CSV format. Sample input files have been provided along with this script. #>
<# While deleting records, you will get a confirmation box for each record, to ensure that
   you are going to delete the correct record. #>
<# While this script has been tested multiple times, I recommend that you test it in a test
   environment before using it in a production environment. #>
<# While every attempt has been made to make this script error free, I will not take any
   responsibility for any consequence that would happen while running the script. #>

function Show-Menu
{
    param (
        [string]$Title = 'Welcome to the PowerDNS! This is a one stop solution for all DNS related work'
    )
    cls
    Write-Host "===== $Title =====" -ForegroundColor Yellow
    Write-Host " "
    Write-Host "Using this script, you can Look-Up, Create and Delete A, PTR and CNAME Records. You can also Look-Up, Create, and Remove DNS Zones." -ForegroundColor Yellow
    Write-Host " "
    Write-Host "The script has been prepared and tested in PowerShell 4.0. So when you run this script, please make sure that the PowerShell version is 4.0 or above." -ForegroundColor Yellow
    Write-Host " "
    Write-Host "This script is suitable for single record entry as well as bulk entry. For every operation there are two options, Single and Bulk." -ForegroundColor Yellow
    Write-Host " "
    Write-Host "For bulk entry you need to create an input file in CSV format. Sample input files have been provided along with this script." -ForegroundColor Yellow
    Write-Host " "
    Write-Host "While deleting records/zones, you will get a confirmation box for each record/zone, to ensure that you are going to delete the correct record/zone." -ForegroundColor Yellow
    Write-Host " "
    Write-Host "While this script has been tested multiple times, I recommend that you test it in a test environment before using it in a production environment." -ForegroundColor Yellow
    Write-Host " "
    Write-Host "While every attempt has been made to make this script error free, I will not take any responsibility for any consequence that would happen while running the script." -ForegroundColor Yellow
    Write-Host " "
    Write-Host "--------------MENU----------------" -ForegroundColor Yellow
    Write-Host " "
    Write-Host "1: Press 1 to Lookup DNS Records." -ForegroundColor DarkYellow
    Write-Host " "
    Write-Host "2: Press 2 to Create Host Records." -ForegroundColor Green
    Write-Host " "
    Write-Host "3: Press 3 to Create PTR Records." -ForegroundColor Green
    Write-Host " "
    Write-Host "4: Press 4 to Create CNAME Records." -ForegroundColor Green
    Write-Host " "
    Write-Host "5: Press 5 to Delete Host Records." -ForegroundColor Red
    Write-Host " "
    Write-Host "6: Press 6 to Delete PTR Records." -ForegroundColor Red
    Write-Host " "
    Write-Host "7: Press 7 to Delete CNAME Records." -ForegroundColor Red
    Write-Host " "
    Write-Host "8: Press 8 to Create DNS Primary Zones." -ForegroundColor DarkGreen
    Write-Host " "
    Write-Host "9: Press 9 to Create DNS Secondary Zones." -ForegroundColor DarkGreen
    Write-Host " "
    Write-Host "10: Press 10 to Create DNS Stub Zones." -ForegroundColor DarkGreen
    Write-Host " "
    Write-Host "11: Press 11 to Remove DNS Zones." -ForegroundColor Magenta
    Write-Host " "
    Write-Host "Q: Press 'Q' to quit this Program." -ForegroundColor Yellow
    Write-Host "----------------------------------" -ForegroundColor Yellow
    Write-Host " "
}

function Lookup-SingleDNS
{
    cls
    'You chose Single DNS Lookup'
    #$lookupValue is used instead of $input, which is a reserved automatic variable in PowerShell
    $lookupValue = Read-Host "Please enter the value you want to look up"
    Resolve-DnsName -Name $lookupValue -NoHostsFile -Type All | ft Name,Type,TTL,IPAddress,NameExchange,Preference,Namehost,Strings -AutoSize
    " "
    " "
}

function Lookup-BulkDNS
{
    cls
    'You chose Bulk DNS Lookup'
    $inputfile = Read-Host "Please enter the location of the input file"
    $output1 = Read-Host "Please enter the location of the output file"
    $records = Get-Content $inputfile
    foreach ($record in $records)
    {
        $output = Resolve-DnsName -Name $record -NoHostsFile -Type ALL
        $output | ft Name,Type,TTL,IPAddress,NameExchange,Preference,Namehost,Strings | Out-File -FilePath $output1 -Append -NoClobber
    }
}

function Create-SingleHostEntry
{
    Write-Host "Please Note: While creating the Host Record, the corresponding PTR Record will also be created." -ForegroundColor Yellow
    $dnsserver = Read-Host "Please enter the DNS Server Name where you want to create the Record"
    $zone = Read-Host "Please enter the Zone Name"
    $name = Read-Host "Please enter the Record value (Without FQDN)"
    $IP = Read-Host "Please enter the IP Address"
    Add-DnsServerResourceRecordA -ComputerName $dnsserver -Name $name -ZoneName $zone -IPv4Address $IP -TimeToLive 05:00:00 -CreatePtr -Confirm
}

function Create-BulkHostEntry
{
    Write-Host "Please Note: While creating the Host Records, the corresponding PTR Records will also be created." -ForegroundColor Yellow
    $dnsserver = Read-Host "Please enter the DNS Server Name where you want to create the Record"
    $zone = Read-Host "Please enter the Zone Name"
    $inputfile = Read-Host "Please enter the location of the input file. Input file format is CSV. Include these headers: name,IP"
    Import-Csv -Path $inputfile | ForEach-Object { Add-DnsServerResourceRecordA -ComputerName $dnsserver -Name $_.name -ZoneName $zone -IPv4Address $_.IP -TimeToLive 05:00:00 -CreatePtr -Confirm }
}

function Create-SinglePTREntry
{
    $dnsserver = Read-Host "Please enter the DNS Server Name where you want to create the Record"
    $zone = Read-Host "Please enter the Reverse Zone Name"
    $FQDN = Read-Host "Please enter the record value (With FQDN)"
    $IP = Read-Host "Please enter the Host octets of the IP Address. Please do not include Network octets."
    Add-DnsServerResourceRecord -ComputerName $dnsserver -Name $IP -Ptr -ZoneName $zone -PtrDomainName $FQDN -TimeToLive 05:00:00 -Confirm
}

function Create-BulkPTREntry
{
    $dnsserver = Read-Host "Please enter the DNS Server Name where you want to create the Record"
    $zone = Read-Host "Please enter the Reverse Zone Name"
    $inputfile = Read-Host "Please enter the location of the input file. Input file format is CSV. Include these headers: IP,FQDN"
    Import-Csv -Path $inputfile | ForEach-Object { Add-DnsServerResourceRecord -ComputerName $dnsserver -Name $_.IP -Ptr -ZoneName $zone -PtrDomainName $_.FQDN -TimeToLive 05:00:00 -Confirm }
}

function Create-SingleCNAMEEntry
{
    $dnsserver = Read-Host "Please enter the DNS Server Name where you want to create the Record"
    $alias = Read-Host "Please enter the Alias Name"
    $zonename = Read-Host "Please enter the Zone Name"
    $FQDN = Read-Host "Please enter the FQDN for the alias"
    Add-DnsServerResourceRecord -CName -ComputerName $dnsserver -Name $alias -HostNameAlias $FQDN -ZoneName $zonename -TimeToLive 05:00:00 -Confirm
}

function Create-BulkCNAMEEntry
{
    $dnsserver = Read-Host "Please enter the DNS Server Name where you want to create the Record"
    $zonename = Read-Host "Please enter the Zone Name"
    $inputfile = Read-Host "Please enter the location of the input file. Input file format is CSV. Include these headers: alias,FQDN"
    Import-Csv -Path $inputfile | ForEach-Object { Add-DnsServerResourceRecord -CName -ComputerName $dnsserver -Name $_.alias -HostNameAlias $_.FQDN -ZoneName $zonename -TimeToLive 05:00:00 -Confirm }
}

function Delete-SingleHostEntry
{
    Write-Host "There can be multiple Host Records (IPs) for a given Host Name. Only the one Record matching the IP address will be deleted." -ForegroundColor Yellow
    Write-Host "For Example: In the contoso.com zone there are two records test.contoso.com > 192.168.1.23 and test.contoso.com > 192.168.1.24. This action removes only one of the entries of test.contoso.com, matching the IP address." -ForegroundColor Yellow
    $dnsserver = Read-Host "Please enter the DNS Server Name from where you want to delete the Record"
    $zonename = Read-Host "Please enter the Zone Name"
    $name = Read-Host "Please enter the Host Record Name (Without FQDN)"
    $IP = Read-Host "Please enter the IP address of the Host Record which you want to delete"
    Remove-DnsServerResourceRecord -ComputerName $dnsserver -ZoneName $zonename -RRType "A" -Name $name -RecordData $IP -Confirm
}

function Delete-BulkHostEntry
{
    Write-Host "There can be multiple Host Records (IPs) for a given Host Name. Only the one Record matching the IP address will be deleted." -ForegroundColor Yellow
    Write-Host "For Example: In the contoso.com zone there are two records test.contoso.com > 192.168.1.23 and test.contoso.com > 192.168.1.24. This action removes only one of the entries of test.contoso.com, matching the IP address." -ForegroundColor Yellow
    Write-Host " "
    $dnsserver = Read-Host "Please enter the DNS Server Name from where you want to delete the record"
    $zonename = Read-Host "Please enter the Zone Name where the Record is located"
    $inputfile = Read-Host "Please enter the location of the input file. Input file format is CSV. Include these headers: name,IP"
    Import-Csv -Path $inputfile | ForEach-Object { Remove-DnsServerResourceRecord -ComputerName $dnsserver -ZoneName $zonename -RRType "A" -Name $_.name -RecordData $_.IP -Confirm }
}

function Delete-SinglePTREntry
{
    $dnsserver = Read-Host "Please enter the DNS Server Name from where you want to delete the record"
    $zonename = Read-Host "Please enter the Reverse Zone Name"
    $IP = Read-Host "Please enter the IP address of the PTR Record. Only the Host octet is required."
    Remove-DnsServerResourceRecord -ComputerName $dnsserver -ZoneName $zonename -RRType "PTR" -Name $IP -Confirm
}

function Delete-BulkPTREntry
{
    $dnsserver = Read-Host "Please enter the DNS Server Name from where you want to delete the record"
    $zonename = Read-Host "Please enter the Reverse Zone Name"
    $inputfile = Read-Host "Please enter the location of the input file. Input file format is CSV. Include the header: IP"
    Import-Csv -Path $inputfile | ForEach-Object { Remove-DnsServerResourceRecord -ComputerName $dnsserver -ZoneName $zonename -RRType "PTR" -Name $_.IP -Confirm }
}

function Delete-SingleCNAMEEntry
{
    $dnsserver = Read-Host "Please enter the DNS Server Name from where you want to delete the record"
    $zonename = Read-Host "Please enter the Zone Name where the Alias Record is located."
    $name = Read-Host "Please enter the Alias Name (Without FQDN)"
    Remove-DnsServerResourceRecord -ComputerName $dnsserver -ZoneName $zonename -RRType "CNAME" -Name $name -Confirm
}

function Delete-BulkCNAMEEntry
{
    $dnsserver = Read-Host "Please enter the DNS Server Name from where you want to delete the record"
    $zonename = Read-Host "Please enter the Zone Name where the Alias Record is located."
    $inputfile = Read-Host "Please enter the location of the input file. Input file format is CSV. Include the header: alias"
    Import-Csv -Path $inputfile | ForEach-Object { Remove-DnsServerResourceRecord -ComputerName $dnsserver -ZoneName $zonename -RRType "CNAME" -Name $_.alias -Confirm }
}

function Add-PrimaryDNSZoneEntry
{
    $dnsserver = Read-Host "Please enter the DNS Server Name where you want to create the DNS zone"
    $zonename = Read-Host "Please enter the Primary DNS Zone Name you want to create. (i.e. ldlnet.local)"
    $replicationScope = Read-Host "Please enter the AD replication scope for the zone. (i.e. Forest, Domain, or Legacy)"
    $dynamicupdate = Read-Host "Please specify how the Zone accepts dynamic updates. (i.e. None, Secure, NonSecureAndSecure)"
    Add-DnsServerPrimaryZone -ComputerName $dnsserver -Name $zonename -ReplicationScope $replicationScope -DynamicUpdate $dynamicupdate -PassThru -Confirm:$true
}

function Add-PrimaryBulkDNSZoneEntry
{
    $dnsserver = Read-Host "Please enter the DNS Server Name where you want to create the DNS zones"
    $replicationScope = Read-Host "Please enter the AD replication scope for the zones. (i.e. Forest, Domain, or Legacy)"
    $dynamicupdate = Read-Host "Please specify how the Zones accept dynamic updates. (i.e. None, Secure, NonSecureAndSecure)"
    $inputfile = Read-Host "Please enter the full path of the input file. Input file format is CSV. Include the header: zone"
    Import-Csv -Path $inputfile | ForEach-Object { Add-DnsServerPrimaryZone -ComputerName $dnsserver -Name $_.zone -ReplicationScope $replicationScope -DynamicUpdate $dynamicupdate -PassThru -Confirm:$true }
}

function Add-SecondaryDNSZoneEntry
{
    #FIX: Add-DnsServerSecondaryZone does not accept -ReplicationScope or -DynamicUpdate;
    #secondary zones are file-backed and transfer from a master server instead.
    $dnsserver = Read-Host "Please enter the DNS Server Name where you want to create the DNS zone"
    $zonename = Read-Host "Please enter the Secondary DNS Zone Name you want to create. (i.e. ldlnet.local)"
    $ms = Read-Host "Please enter the IP of the master server the secondary zone transfers from. (i.e. 192.168.3.107)"
    Add-DnsServerSecondaryZone -ComputerName $dnsserver -Name $zonename -ZoneFile "$zonename.dns" -MasterServers $ms -PassThru -Confirm:$true
}

function Add-SecondaryBulkDNSZoneEntry
{
    #FIX: same parameter correction as Add-SecondaryDNSZoneEntry above.
    $dnsserver = Read-Host "Please enter the DNS Server Name where you want to create the DNS zones"
    $ms = Read-Host "Please enter the IP of the master server the secondary zones transfer from. (i.e. 192.168.3.107)"
    $inputfile = Read-Host "Please enter the full path of the input file. Input file format is CSV. Include the header: zone"
    Import-Csv -Path $inputfile | ForEach-Object { Add-DnsServerSecondaryZone -ComputerName $dnsserver -Name $_.zone -ZoneFile "$($_.zone).dns" -MasterServers $ms -PassThru -Confirm:$true }
}

function Add-StubDNSZoneEntry
{
    $dnsserver = Read-Host "Please enter the DNS Server Name where you want to create the DNS zone"
    $zonename = Read-Host "Please enter the Stub DNS Zone Name you want to create. (i.e. ldlnet.local)"
    $replicationScope = Read-Host "Please enter the AD replication scope for the zone. (i.e. Forest, Domain, or Legacy)"
    $ms = Read-Host "Please specify the IP of the server that holds the authoritative records for the zone. (i.e. 192.168.3.107)"
    Add-DnsServerStubZone -ComputerName $dnsserver -Name $zonename -ReplicationScope $replicationScope -MasterServers $ms -PassThru -Confirm:$true
}

function Add-StubBulkDNSZoneEntry
{
    $dnsserver = Read-Host "Please enter the DNS Server Name where you want to create the DNS zones"
    $replicationScope = Read-Host "Please enter the AD replication scope for the zones. (i.e. Forest, Domain, or Legacy)"
    $ms = Read-Host "Please specify the IP of the server that holds the authoritative records for the zones. (i.e. 192.168.3.107)"
    $inputfile = Read-Host "Please enter the full path of the input file. Input file format is CSV. Include the header: zone"
    Import-Csv -Path $inputfile | ForEach-Object { Add-DnsServerStubZone -ComputerName $dnsserver -Name $_.zone -ReplicationScope $replicationScope -MasterServers $ms -PassThru -Confirm:$true }
}

function Remove-DNSZoneEntry
{
    $dnsserver = Read-Host "Please enter the DNS Server Name from where you want to remove the DNS zone"
    $zonename = Read-Host "Please enter the DNS Zone Name you want to remove. (i.e. ldlnet.local)"
    Remove-DnsServerZone -ComputerName $dnsserver -Name $zonename -PassThru -Confirm:$true -Verbose
}

function Remove-BulkDNSZoneEntry
{
    $dnsserver = Read-Host "Please enter the DNS Server Name from where you want to remove the DNS zones"
    $inputfile = Read-Host "Please enter the full path of the input file. Input file format is CSV. Include the header: zone"
    Import-Csv -Path $inputfile | ForEach-Object { Remove-DnsServerZone -ComputerName $dnsserver -Name $_.zone -PassThru -Confirm:$true -Verbose }
}

#Main menu loop. $choice is used instead of $input, which is a reserved automatic variable.
do
{
    Show-Menu
    $choice = Read-Host "Please make a selection"
    switch ($choice)
    {
        '1' {
            cls
            'You have selected option #1'
            'Press 1 for Single DNS Lookup'
            'Press 2 for Bulk DNS Lookup'
            'Press Q to Return to the Main Menu'
            $subChoice = Read-Host "Please make a selection"
            switch ($subChoice) { 1 { Lookup-SingleDNS } 2 { Lookup-BulkDNS } Q { Show-Menu } }
        }
        '2' {
            cls
            'You have selected option #2'
            'Press 1 for Single Host Entry'
            'Press 2 for Bulk Host Entries'
            'Press Q to Return to the Main Menu'
            $subChoice = Read-Host "Please make a selection"
            switch ($subChoice) { 1 { Create-SingleHostEntry } 2 { Create-BulkHostEntry } Q { Show-Menu } }
        }
        '3' {
            cls
            'You have selected option #3'
            'Press 1 for Single PTR Record'
            'Press 2 for Bulk PTR Records'
            'Press Q to Return to the Main Menu'
            $subChoice = Read-Host "Please make a selection"
            switch ($subChoice) { 1 { Create-SinglePTREntry } 2 { Create-BulkPTREntry } Q { Show-Menu } }
        }
        '4' {
            cls
            'You have selected option #4'
            'Press 1 for Single CNAME Record'
            'Press 2 for Bulk CNAME Records'
            'Press Q to Return to the Main Menu'
            $subChoice = Read-Host "Please make a selection"
            switch ($subChoice) { 1 { Create-SingleCNAMEEntry } 2 { Create-BulkCNAMEEntry } Q { Show-Menu } }
        }
        '5' {
            cls
            'You have selected option #5'
            'Press 1 to Delete Single Host Record'
            'Press 2 to Delete multiple Host Records'
            'Press Q to Return to the Main Menu'
            $subChoice = Read-Host "Please make a selection"
            switch ($subChoice) { 1 { Delete-SingleHostEntry } 2 { Delete-BulkHostEntry } Q { Show-Menu } }
        }
        '6' {
            cls
            'You have selected option #6'
            'Press 1 to Delete Single PTR Record'
            'Press 2 to Delete multiple PTR Records'
            'Press Q to Return to the Main Menu'
            $subChoice = Read-Host "Please make a selection"
            switch ($subChoice) { 1 { Delete-SinglePTREntry } 2 { Delete-BulkPTREntry } Q { Show-Menu } }
        }
        '7' {
            cls
            'You have selected option #7'
            'Press 1 to Delete Single CNAME Record'
            'Press 2 to Delete multiple CNAME Records'
            'Press Q to Return to the Main Menu'
            $subChoice = Read-Host "Please make a selection"
            switch ($subChoice) { 1 { Delete-SingleCNAMEEntry } 2 { Delete-BulkCNAMEEntry } Q { Show-Menu } }
        }
        '8' {
            cls
            'You have selected option #8'
            'Press 1 to Create a Single Primary DNS Zone'
            'Press 2 to Create multiple Primary DNS Zones'
            'Press Q to Return to the Main Menu'
            $subChoice = Read-Host "Please make a selection"
            switch ($subChoice) { 1 { Add-PrimaryDNSZoneEntry } 2 { Add-PrimaryBulkDNSZoneEntry } Q { Show-Menu } }
        }
        '9' {
            cls
            'You have selected option #9'
            'Press 1 to Create a Single Secondary DNS Zone'
            'Press 2 to Create multiple Secondary DNS Zones'
            'Press Q to Return to the Main Menu'
            $subChoice = Read-Host "Please make a selection"
            switch ($subChoice) { 1 { Add-SecondaryDNSZoneEntry } 2 { Add-SecondaryBulkDNSZoneEntry } Q { Show-Menu } }
        }
        '10' {
            cls
            'You have selected option #10'
            'Press 1 to Create a Single Stub DNS Zone'
            'Press 2 to Create multiple Stub DNS Zones'
            'Press Q to Return to the Main Menu'
            $subChoice = Read-Host "Please make a selection"
            switch ($subChoice) { 1 { Add-StubDNSZoneEntry } 2 { Add-StubBulkDNSZoneEntry } Q { Show-Menu } }
        }
        '11' {
            cls
            'You have selected option #11'
            'Press 1 to Remove a Single DNS Zone'
            'Press 2 to Remove multiple DNS Zones'
            'Press Q to Return to the Main Menu'
            $subChoice = Read-Host "Please make a selection"
            switch ($subChoice) { 1 { Remove-DNSZoneEntry } 2 { Remove-BulkDNSZoneEntry } Q { Show-Menu } }
        }
        'q' { return }
    }
    pause
}
until ($choice -eq 'q')
The DNS zone functions have not been tested yet. I still have to get on my server farm at home and run this. It will save time, though, by sparing me from switching servers when adding a bulk list of DNS zones for my website farm. Play with the script and let me know what you think!
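If you want to smoke-test the zone cmdlets outside of the menu, a minimal hedged check could look like the following. The server name and zone name are placeholders for your own lab, not values from the script:

```powershell
#Create a throwaway AD-integrated primary zone, verify it exists, then remove it
Add-DnsServerPrimaryZone -ComputerName "DC01" -Name "zonetest.local" -ReplicationScope Domain -DynamicUpdate Secure -PassThru
Get-DnsServerZone -ComputerName "DC01" -Name "zonetest.local"
Remove-DnsServerZone -ComputerName "DC01" -Name "zonetest.local" -Force
```

Running this round trip once per zone type is a quick way to confirm the DnsServer module cmdlets behave before trusting the bulk CSV paths.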
Removing a DNS Record through Powershell
In most environments, an admin usually just jumps on the server that they need to work from and does their work from there. An example of this would be an admin working on an IIS Web server and needing to remove a DNS A record from DNS without having to logon to the DNS server itself so that they can quickly make their changes in IIS.
A quick way to do this is to run the following .ps1 script in PowerShell to remove the record:
$NodeToDelete = Read-Host "Please Input the Name of the A Record you want to delete. NO FQDN"
$DNSServer = Read-Host "Please Input your DNS Server FQDN"
$ZoneName = Read-Host "Please Input the DNS Zone the A Record is residing in"
$NodeDNS = $null
$NodeDNS = Get-DnsServerResourceRecord -ZoneName $ZoneName -ComputerName $DNSServer -Name $NodeToDelete -RRType A -ErrorAction SilentlyContinue
if ($NodeDNS -eq $null) {
    Write-Host "The DNS A Record You Were Looking For Was Not Found" -ForegroundColor Red
}
else {
    Remove-DnsServerResourceRecord -ZoneName $ZoneName -ComputerName $DNSServer -InputObject $NodeDNS -Force
    Write-Host "Your DNS A Record $NodeToDelete Has Been Removed" -ForegroundColor Green
}

Now, this works for a single DNS A record. If there are multiple IPs for the same DNS record (for example, test.ldlnet.local points to both 192.168.1.23 and 192.168.1.24), then you need the following script to keep the removal from failing with an error. I have also expanded the prompts to make the input more specific:
Write-Host "This script will remove a DNS Record based on the information provided" -ForegroundColor Yellow
Write-Host "There can be multiple Host Records (IPs) for a given Host Name. Only the one Record matching the IP address that is input will be deleted" -ForegroundColor Red
Write-Host "For Example: In the ldlnet.org zone there are two records test.ldlnet.org > 192.168.1.23 and test.ldlnet.org > 192.168.1.24. This action removes only one of the entries of test.ldlnet.org, matching the IP address that was input." -ForegroundColor Red
$NodeToDelete = Read-Host "Please Input the Name of the DNS Record you want to delete. (NO FQDN)"
$DNSServer = Read-Host "Please Input your DNS Server FQDN"
$ZoneName = Read-Host "Please Input the DNS Zone the DNS Record is residing in"
$RecordType = Read-Host "Please Input the Type of DNS Record It Is (A, CNAME, TXT, etc...)"
$IP = Read-Host "Please Input the IP Address of the Associated DNS Record"
$NodeDNS = $null
$NodeDNS = Get-DnsServerResourceRecord -ZoneName $ZoneName -ComputerName $DNSServer -Name $NodeToDelete -RRType $RecordType -ErrorAction SilentlyContinue
if ($NodeDNS -eq $null) {
    Write-Host "The DNS Record You Were Looking For Was Not Found" -ForegroundColor Red
}
else {
    Remove-DnsServerResourceRecord -ZoneName $ZoneName -ComputerName $DNSServer -RecordData $IP -Name $NodeToDelete -RRType $RecordType -Force -ErrorAction Stop
    Write-Host "Your DNS Record $NodeToDelete Has Been Removed" -ForegroundColor Green
}

I have found some other good scripts that I will post to the blog to help manage DNS records through PowerShell. This should get things started for now. Happy Troubleshooting!
MaxConcurrentAPI Script for Netlogon Issues
I get incidents from time to time that deal with Netlogon service issues, for example: Semaphore Waiters, Semaphore Timeouts, Semaphore Acquires, etc.
Here is a script I got from the Microsoft Gallery.
In some enterprise environments, the sheer volume of NTLM authentication can create performance bottlenecks on servers. This PowerShell script was written to make that problem easier to detect.
```powershell
PARAM ([Switch]$CheckMaxConcurrentApi,
       [switch]$GetNetlogonInstances,
       [string]$Computer = "Localhost",
       [string]$Instance = "_Total",
       [bool]$CalcMCA = $False)
#************************************************
# CheckMaxConcurrentApiScript.ps1
# Version 1.0
# Date: 3/22/2013
# Author: Tim Springston [MSFT]
# Description: Uses .NET methods and WMI to query
# a remote or local computer for MaxConcurrentApi
# problems and return details back in a PSObject.
#************************************************
function CheckMaxConcurrentApi ([string]$InstanceName = "_Total", [string]$ComputerName = "localhost", [bool]$Calc = $false)
{
    #This function takes three optional parameters: the Netlogon instance to select (can be obtained
    #using the sister function GetNetlogonInstances), the computer to run against, and whether to run
    #the MaxConcurrentApi calculation, which takes longer.
    #It returns details about the computer, whether the problem is detected, and a suggested MaxConcurrentApi value.
    $ProblemDetected = $false
    $Date = Get-Date

    #Get role, OS version, and hotfix data.
    $cs = gwmi -Namespace "root\cimv2" -Class Win32_ComputerSystem -Impersonation 3 -ComputerName $ComputerName
    $DomainRole = $cs.DomainRole
    $OSVersion = gwmi -Namespace "root\cimv2" -Class Win32_OperatingSystem -Impersonation 3 -ComputerName $ComputerName
    $bn = $OSVersion.BuildNumber
    $150Hotfix = $false

    #Determine how long the computer has been running since the last reboot.
    $wmi = Get-WmiObject -Class Win32_OperatingSystem -ComputerName $ComputerName
    $LocalDateTime = $wmi.LocalDateTime
    $Uptime = $wmi.ConvertToDateTime($wmi.LocalDateTime) - $wmi.ConvertToDateTime($wmi.LastBootUpTime)
    $Days = $Uptime.Days.ToString()
    $Hours = $Uptime.Hours.ToString()
    $UpTimeStatement = $Days + " days " + $Hours + " hours"

    #Get SystemRoot so that we can map the right drive for checking file versions.
    $objReg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine',$ComputerName)
    $objRegKey = $objReg.OpenSubKey("SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion")
    $SystemRoot = $objRegKey.GetValue("SystemRoot")

    #Parse the drive letter for the remote systemroot and then map the network drive. First, though,
    #make sure the drive you will map to doesn't exist already.
    $Drives = gwmi -Namespace "root\cimv2" -ComputerName $ComputerName -Impersonation Impersonate -Class Win32_LogicalDisk
    $DriveLetters = @()
    foreach ($Drive in $Drives)
    {
        $Caption = $Drive.Caption
        $DriveLetters += $Caption
        $Caption = $null
    }
    if ($ComputerName -ne "localhost")
    {
        $PossibleLetters = [char[]]"DEFGHIJKLMNOPQRTUVWXY"
        $devices = Get-WmiObject Win32_LogicalDisk | Select -Expand DeviceID
        $PossibleLetters | Where {$DriveLetters -notcontains "$($_):"} | Select -First 1 | % { $AvailableDriveLetter = $_ }
        $DriveToMap = $SystemRoot
        $DriveToMap = $DriveToMap.Replace(":\Windows","")
        $DriveToMap = "\\" + $ComputerName + "\" + $DriveToMap + "$"
        $AvailableDriveLetter = $AvailableDriveLetter + ":"
        $NETUSEReturn = net use $AvailableDriveLetter $DriveToMap
        $RemoteSystem32Folder = $AvailableDriveLetter + "\Windows\System32"
        $NetlogonDll = $RemoteSystem32Folder + "\Netlogon.dll"
    }
    else
    {
        $NetlogonDll = $SystemRoot + "\System32\Netlogon.dll"
    }

    #Check the file versions for the hotfixes.
    $FileVer = [System.Diagnostics.FileVersionInfo]::GetVersionInfo($NetlogonDll).FileVersion
    switch -exact ($bn)
    {
        "6002" {
            #Hotfix check for MCA to 150: KB975363 http://support.microsoft.com/kb/975363
            $6002HotfixVer = "6.0.6002.22289"
            if ($6002HotfixVer -eq $FileVer) {$150Hotfix = $true}
        }
        "7600" {
            #Hotfix check for MCA to 150: KB975363 http://support.microsoft.com/kb/975363
            $7600HotfixVer = "6.1.7600.20576"
            if ($7600HotfixVer -eq $FileVer) {$150Hotfix = $true}
        }
        "7601" {
            #Hotfix check for MCA to 150: KB975363 http://support.microsoft.com/kb/975363
            $150Hotfix = $true
        }
    }
    if ($ComputerName -ne "localhost")
    {
        $NETUSEReturn = net use $AvailableDriveLetter /d
    }

    #Determine the effective MaxConcurrentApi setting based on OS, hotfix presence, role, and registry setting.
    $objReg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $ComputerName)
    $objRegKey = $objReg.OpenSubKey("SYSTEM\\CurrentControlSet\\services\\Netlogon\\Parameters")
    $MCARegVal = $objRegKey.GetValue('MaxConcurrentApi')
    $CurrentMCA = 0
    if ($DomainRole -gt 1)
    {
        if ($bn -ge 9200)
        {
            if (($MCARegVal -lt 10) -or ($MCARegVal -eq $null) -or ($MCARegVal -gt 9999)) {$CurrentMCA = 10}
            elseif (($MCARegVal -gt 10) -and ($MCARegVal -lt 9999)) {$CurrentMCA = $MCARegVal}
        }
        else
        {
            if (($MCARegVal -gt 10) -and ($150Hotfix -eq $true)) {$CurrentMCA = $MCARegVal}
            elseif (($MCARegVal -gt 10) -and ($150Hotfix -eq $false)) {$CurrentMCA = 2}
            elseif (($MCARegVal -gt 2) -and ($MCARegVal -le 10)) {$CurrentMCA = $MCARegVal}
            elseif ($MCARegVal -lt 2) {$CurrentMCA = 2}
            elseif ($MCARegVal -eq $null) {$CurrentMCA = 2}
        }
    }

    #Get a sample of the counters.
    $Category = "Netlogon"
    $CounterASHT = "Average Semaphore Hold Time"
    $CounterST = "Semaphore Timeouts"
    $CounterSA = "Semaphore Acquires"
    $CounterSH = "Semaphore Holders"
    $CounterSW = "Semaphore Waiters"

    #Query the computer for the counters.
    $NetlogonRemoteASHT = New-Object System.Diagnostics.PerformanceCounter($Category,$CounterASHT,$InstanceName,$ComputerName)
    $NetlogonRemoteST = New-Object System.Diagnostics.PerformanceCounter($Category,$CounterST,$InstanceName,$ComputerName)
    $NetlogonRemoteSA = New-Object System.Diagnostics.PerformanceCounter($Category,$CounterSA,$InstanceName,$ComputerName)
    $NetlogonRemoteSW = New-Object System.Diagnostics.PerformanceCounter($Category,$CounterSW,$InstanceName,$ComputerName)
    $NetlogonRemoteSH = New-Object System.Diagnostics.PerformanceCounter($Category,$CounterSH,$InstanceName,$ComputerName)

    #Cook values.
    $CookedASHT = $NetlogonRemoteASHT.NextValue()
    $CookedST = $NetlogonRemoteST.NextValue()
    $CookedSA = $NetlogonRemoteSA.NextValue()
    $CookedSW = $NetlogonRemoteSW.NextValue()
    $CookedSH = $NetlogonRemoteSH.NextValue()
    if ((($CookedSW -gt 0) -and (-not($CookedSW -gt 4GB))) -or ($CookedSH -eq $CurrentMCA) -or ((($CookedST -gt 0) -and (-not($CookedST -gt 4GB))) -and (($CookedSW -gt 0) -and (-not($CookedSW -gt 4GB)))))
        {$ProblemDetected = $true}

    #Do a second data sample and compare results in order to run the "suggested MCA" math.
    if (($ProblemDetected -eq $true) -and ($Calc -eq $true))
    {
        Start-Sleep -Seconds 60
        $NetlogonRemoteASHT = New-Object System.Diagnostics.PerformanceCounter($Category,$CounterASHT,$InstanceName,$ComputerName)
        $NetlogonRemoteST = New-Object System.Diagnostics.PerformanceCounter($Category,$CounterST,$InstanceName,$ComputerName)
        $NetlogonRemoteSA = New-Object System.Diagnostics.PerformanceCounter($Category,$CounterSA,$InstanceName,$ComputerName)
        $NetlogonRemoteSW = New-Object System.Diagnostics.PerformanceCounter($Category,$CounterSW,$InstanceName,$ComputerName)
        $NetlogonRemoteSH = New-Object System.Diagnostics.PerformanceCounter($Category,$CounterSH,$InstanceName,$ComputerName)

        #Cook values.
        $SecondCookedASHT = $NetlogonRemoteASHT.NextValue()
        $SecondCookedST = $NetlogonRemoteST.NextValue()
        $SecondCookedSA = $NetlogonRemoteSA.NextValue()
        $SecondCookedSW = $NetlogonRemoteSW.NextValue()
        $SecondCookedSH = $NetlogonRemoteSH.NextValue()

        #Next, calculate the suggested MCA using the formula from http://support.microsoft.com/kb/2688798:
        #(semaphore_acquires + semaphore_timeouts) * average_semaphore_hold_time / time_collection_length =< New_MaxConcurrentApi_setting
        #Subtract Sample1SA from Sample2SA to get SampleSADelta.
        $SampleSADelta = ($SecondCookedSA - $CookedSA)
        $SampleSTDelta = ($SecondCookedST - $CookedST)
        $ASHT = ($SecondCookedASHT + $CookedASHT)
        $SampleASHTDelta = ($ASHT / 2)
        $SamplesDeltaSAST = ($SampleSADelta + $SampleSTDelta)
        $AllSampleDeltas = ($SampleASHTDelta * $SamplesDeltaSAST)
        $AllSampleDeltas /= 90
        $SuggestedMCA = $AllSampleDeltas
        $SuggestedMCA = "{0:N0}" -f $SuggestedMCA
        if ($SuggestedMCA -le 2) {$SuggestedMCA = $CurrentMCA}
    }

    #Create a PSObject for the returned data.
    $ReturnedData = New-Object PSObject
    Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Detection Time" -Value $Date
    Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Problem Detected" -Value $ProblemDetected
    Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Server Name" -Value $cs.Name
    if ($cs.DomainRole -le 1) {Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Server Role" -Value "Client"}
    if (($cs.DomainRole -eq 3) -or ($cs.DomainRole -eq 2)) {Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Server Role" -Value "Member Server"}
    if ($cs.DomainRole -ge 4) {Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Server Role" -Value "Domain Controller"}
    Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Domain Name" -Value $cs.Domain
    Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Operating System" -Value $OSVersion.Caption
    Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Time Since Last Reboot" -Value $UpTimeStatement
    Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Current Effective MaxConcurrentApi Setting" -Value $CurrentMCA
    if ($SuggestedMCA -eq $null)
        {Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Suggested MaxConcurrentApi Setting (may be same as current)" -Value $CurrentMCA}
    else
        {Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Suggested MaxConcurrentApi Setting (may be same as current)" -Value $SuggestedMCA}
    Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Current Threads in Use (Semaphore Holders)" -Value $CookedSH
    Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Clients Currently Waiting (Semaphore Waiters)" -Value $CookedSW
    Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Cumulative Client Timeouts (Semaphore Timeouts)" -Value $CookedST
    Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Cumulative MaxConcurrentApi Thread Uses (Semaphore Acquires)" -Value $CookedSA
    Add-Member -InputObject $ReturnedData -MemberType NoteProperty -Name "Duration of Calls (Avg Semaphore Hold Time)" -Value $CookedASHT
    return $ReturnedData
}

function GetNetlogonInstances ([string]$RemoteComputerName = "localhost")
{
    #This function takes a computer name as input (defaults to the local computer) and returns
    #the instances (analogous to secure channels) the computer has.
    #The format returned is \\hostname.domainname.com.
    if ($RemoteComputerName -eq $null)
    {
        $LocalNetlogon = New-Object System.Diagnostics.PerformanceCounterCategory("Netlogon",$RemoteComputerName)
        $LocalInstances = $LocalNetlogon.GetInstanceNames()
        $AllLocalInstances = @()
        foreach ($LocalInstance in $LocalInstances)
        {
            if ($LocalInstance -ne "_total") {$AllLocalInstances += $LocalInstance}
        }
        if ($AllLocalInstances -eq $null)
        {
            #The local computer was missing its DC perf instance, so get the DC name from WMI.
            Write-Host "The local computer was missing its DC perf instance so getting DC name from WMI."
            $Query = "select * from win32_ntdomain where description = '" + $env:userdomain + "'"
            $v2 = Get-WmiObject -Query $Query
            $DCName = $v2.DomainControllerName
            $AllLocalInstances += $DCName
            Write-Host "DCName is $AllLocalInstances"
        }
        return $AllLocalInstances
    }
    else
    {
        $RemoteNetlogon = New-Object System.Diagnostics.PerformanceCounterCategory("Netlogon",$RemoteComputerName)
        $RemoteInstances = $RemoteNetlogon.GetInstanceNames()
        $AllRemoteInstances = @()
        foreach ($RemoteInstance in $RemoteInstances)
        {
            if ($RemoteInstance -ne "_Total") {$AllRemoteInstances += $RemoteInstance}
        }
        if ($AllRemoteInstances -eq $null)
        {
            #The remote computer was missing its DC perf instance, so get the DC name from WMI.
            $Query = "select * from win32_ntdomain where description = '" + $env:userdomain + "'"
            $v2 = Get-WmiObject -Query $Query
            $DCName = $v2.DomainControllerName
            $AllRemoteInstances += $DCName
        }
        return $AllRemoteInstances
    }
}

#Dispatch based on the switches and parameters that were supplied.
if (($CheckMaxConcurrentApi) -and ($Instance -ne "_Total") -and ($Computer -ne "Localhost") -and ($CalcMCA -eq $true))
    {CheckMaxConcurrentApi -InstanceName $Instance -ComputerName $Computer -Calc $CalcMCA | FL}
elseif (($CheckMaxConcurrentApi) -and ($Instance -ne "_Total") -and ($Computer -ne "Localhost"))
    {CheckMaxConcurrentApi -InstanceName $Instance -ComputerName $Computer | FL}
elseif (($CheckMaxConcurrentApi) -and ($Instance -ne "_Total"))
    {CheckMaxConcurrentApi -InstanceName $Instance | FL}
elseif (($CheckMaxConcurrentApi) -and ($Computer -ne "Localhost"))
    {CheckMaxConcurrentApi -ComputerName $Computer | FL}
elseif (($CheckMaxConcurrentApi) -and ($CalcMCA -eq $true))
    {CheckMaxConcurrentApi -Calc $CalcMCA | FL}
elseif ($CheckMaxConcurrentApi)
    {CheckMaxConcurrentApi | FL}
if (($GetNetlogonInstances) -and ($Computer -ne "Localhost"))
    {GetNetlogonInstances -RemoteComputerName $Computer | FL}
elseif ($GetNetlogonInstances)
    {GetNetlogonInstances | FL}
```
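The "suggested MCA" math in the script comes from the KB 2688798 formula. Here is a worked sketch with hypothetical sample numbers (not from a real capture) so you can see how the counters combine:

```powershell
# KB 2688798 formula:
# (semaphore_acquires + semaphore_timeouts) * average_semaphore_hold_time / time_collection_length <= New_MaxConcurrentApi_setting
# All of the values below are made-up sample numbers for illustration.
$SemaphoreAcquiresDelta = 6000   # acquires observed during the collection window
$SemaphoreTimeoutsDelta = 300    # timeouts observed during the same window
$AvgSemaphoreHoldTime   = 0.1    # average seconds each call holds the semaphore
$CollectionLengthSec    = 60     # length of the collection window in seconds

$SuggestedMCA = ($SemaphoreAcquiresDelta + $SemaphoreTimeoutsDelta) * $AvgSemaphoreHoldTime / $CollectionLengthSec
[math]::Ceiling($SuggestedMCA)   # round up to a whole number of threads
```

With these sample numbers, the raw value is 10.5, which rounds up to a suggested setting of 11.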
Execution:
Now, I modified this script, taking out the clear-screen call so that it can be run against multiple servers. Place the script in your Scripts directory and name it CheckMaxConcurrentApiScript.ps1.
First, in PowerShell, gather your list of servers:
```powershell
$DCList = Get-ADDomainController -Filter * | Sort-Object Name | Select-Object Name
```
Or
```powershell
$EXList = Get-ExchangeServer | Sort-Object Name | Select-Object Name
```
Next, run the command to run the ps1 against those servers:
```powershell
foreach ($DC in $DCList) {
    $DC.Name
    Invoke-Command -Command { C:\Scripts\CheckMaxConcurrentApiScript.ps1 -CheckMaxConcurrentApi -Computer $DC.Name -CalcMCA $true }
}
```
Or
```powershell
foreach ($EX in $EXList) {
    $EX.Name
    Invoke-Command -Command { C:\Scripts\CheckMaxConcurrentApiScript.ps1 -CheckMaxConcurrentApi -Computer $EX.Name -CalcMCA $true }
}
```
Sample Output:
DC03
Detection Time : 12/13/2018 7:56:16 PM
Problem Detected : False
Server Name : DC03
Server Role : Domain Controller
Domain Name : ldlnet.org
Operating System : Microsoft Windows Server 2008 R2 Enterprise
Time Since Last Reboot : 4 days 22 hours
Current Effective MaxConcurrentApi Setting : 10
Suggested MaxConcurrentApi Setting (may be same as current) : 10
Current Threads in Use (Semaphore Holders) : 0
Clients Currently Waiting (Semaphore Waiters) : 0
Cumulative Client Timeouts (Semaphore Timeouts) : 17
Cumulative MaxConcurrentApi Thread Uses (Semaphore Acquires) : 3493999
Duration of Calls (Avg Semaphore Hold Time) : 0
EXCH02
Detection Time : 12/13/2018 8:00:53 PM
Problem Detected : False
Server Name : EXCH02
Server Role : Member Server
Domain Name : ldlnet.org
Operating System : Microsoft Windows Server 2008 R2 Standard
Time Since Last Reboot : 4 days 23 hours
Current Effective MaxConcurrentApi Setting : 10
Suggested MaxConcurrentApi Setting (may be same as current) : 10
Current Threads in Use (Semaphore Holders) : 0
Clients Currently Waiting (Semaphore Waiters) : 0
Cumulative Client Timeouts (Semaphore Timeouts) : 570
Cumulative MaxConcurrentApi Thread Uses (Semaphore Acquires) : 1682257
Duration of Calls (Avg Semaphore Hold Time) : 0
Hopefully, this script will help you gather the information you need to balance the Netlogon load between the servers in your environment.
HAPPY TROUBLESHOOTING!
Protected AD Groups and the problems they can cause accounts
I have run into this issue over the years: accounts that are members of the Domain Admins group have problems running PowerShell cmdlets, and cannot connect to ActiveSync from a mobile device.
These issues are caused by the AdminSDHolder template in AD and the SDProp process that runs every 60 minutes.
This is explained in fantastic detail through the following Microsoft article: Protected Accounts & Groups In Active Directory
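If you want to see which accounts and groups in your domain SDProp has already stamped, the adminCount attribute is a quick indicator. A minimal sketch, assuming the ActiveDirectory PowerShell module and a domain-joined session:

```powershell
# Objects that SDProp has processed carry adminCount = 1.
Import-Module ActiveDirectory
Get-ADObject -LDAPFilter "(adminCount=1)" -Properties adminCount |
    Select-Object Name, DistinguishedName
```

Note that adminCount is not cleared automatically when an account leaves a protected group, so this list can include formerly protected accounts as well.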
Here is an example of an issue that occurred in one of the environments that I was managing. A user was trying to run the following AD cmdlet in PowerShell on DC01:
```powershell
Set-ADUser lancel -Server dc01.ldlnet.org -Replace @{title="Senior Operations Engineer"}
```
The user got the following error when the cmdlet was executed:
Set-ADUser : Insufficient access rights to perform the operation
At line:1 char:1
+ Set-ADUser lancel -Server dc01.ldlnet.org -Replace @{title="Senior O ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (lancel:ADUser) [Set-ADUser], ADException
    + FullyQualifiedErrorId : ActiveDirectoryServer:8344,Microsoft.ActiveDirectory.Management.Commands.SetADUser
The issue was that the admin account used to run the cmdlet was a member of the Domain Admins group and, per the AdminSDHolder template applied to the account, was not inheriting permissions:
I checked to see that the admin account was in a protected group:

I next went to the Security Tab > Advanced Button and saw that the Enable Inheritance button was visible:

This verifies that the account is protected due to being in the Domain Admins group. Now, there are two workarounds for this particular error that we were experiencing.
- Click the Enable Inheritance button. This will cause the permissions to be inherited temporarily. When SDProp is cycled again, the account will lose any inherited permissions and will be essentially “broken” again. This is not good if you’re going to be running cmdlets regularly to modify AD Accounts.
- The preferred method to work around this issue is to set the -Server parameter to point to a different DC than the one you are on. So, essentially, we tell the cmdlet to execute on DC02 when running the cmdlet from DC01.
```powershell
Set-ADUser lancel -Server dc02.ldlnet.org -Replace @{title="Senior Operations Engineer"}
```
Either method will allow the cmdlet to execute successfully and modify the object. You would think that Microsoft would have noticed this issue with running an admin cmdlet for Active Directory, but they have not fixed it as of yet, nor do I think they plan to. I would just go with workaround number two and remain sane.
Another example of this Protected Group issue comes with an account in a Protected Group that has a mailbox not being able to connect to Exchange ActiveSync when setting up their mobile device.
- You usually get a 500 error on the device stating that it cannot connect.
- You will also see event 1053 in Event Viewer alluding to not having sufficient access to create the container for the user in AD.
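You can confirm that the mailbox owner's AD object is affected by checking whether inheritance is blocked on it. A hedged sketch, assuming the ActiveDirectory module (which provides the AD: drive) and using the sample account from the earlier example:

```powershell
Import-Module ActiveDirectory
$dn = (Get-ADUser lancel).DistinguishedName
# True means inheritance is disabled on the object, i.e. the AdminSDHolder template has been applied.
(Get-Acl -Path "AD:\$dn").AreAccessRulesProtected
```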
Read this page for more information: Exchange ActiveSync Permissions Issue with Protected Groups
So, in your endeavors admins, keep this in mind when running into these types of problems. Happy Troubleshooting!