There is a current bug, filed with Microsoft, that relates to the AIP/MIP Scanner running a Unified Labeling content scan on premises. The main issue is that the Security and Compliance Center is not replicating the policies that you create for your Sensitivity Labels in your M365 tenant.
Since these Policies will not replicate, your content scans will fail and you will see the following error within the Azure Portal under Azure Information Protection:
Error: Policy is missing in AIP
You will be able to verify that the Policies are present in the Security and Compliance Center under the Information Protection page and the Policies Page in Azure Information Protection:
NOTE: I had created my labels and label policies in Azure AIP and migrated them to Security and Compliance Center via the following LINK
What you will also notice, if you create a policy in SCC, it will NOT replicate to Azure.
Next, I checked to see if the AIP Scanner service account has the policies applied to it as a member recipient of the policies. It needs to be a member so that the account can apply labels to on-premises content through the policy.
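As a sketch of that check, you can inspect a policy's user scope from an SCC PowerShell session. The policy name below is an example; ExchangeLocation lists the users/recipients the policy is scoped to.

```powershell
# List who a label policy applies to (run in an SCC PowerShell session)
# "Scanner Policy" is an example name
Get-LabelPolicy -Identity "Scanner Policy" | Format-List Name, ExchangeLocation, ModernGroupLocation
```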
Let’s continue troubleshooting…
So the AIP Scanner account is a member of the defined policies in the Security & Compliance Center and you are still having issues:
Is the AIP Scanner service started?
If the answer is no, start it
From PowerShell run the following:
Get-AIPScannerStatus | fl
Sample Output
It says it is scanning, but you are not getting results AND you have that Error: Policy is missing statement in the Nodes Tab of AIP.
The next thing to verify is whether or not the policy is replicating from SCC to Azure. This is done through PowerShell by running the following:
Connect to SCC PowerShell
$userCredential = Get-Credential
Write-Host "Connecting to your Security and Compliance Center PowerShell Console"
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.compliance.protection.outlook.com/powershell-liveid/ -Credential $userCredential -Authentication Basic -AllowRedirection
Import-PSSession $Session -DisableNameChecking
Normal replication is up to 24 hours for a change or policy addition. So, if your WhenChanged or WhenCreated values are more than 24 hours old, then they are NOT replicating. You can further verify this by running the following:
Get Label Policy Detail Replication Status
Get-LabelPolicy -Identity "Name of your Policy" | fl DistributionResults,LastStatusUpdateTime
Sample output: Replication Taking Longer Than Expected error
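The 24-hour replication check can also be scripted across all policies. This is a sketch that assumes an active SCC PowerShell session:

```powershell
# Flag label policies whose last change is older than 24 hours (possible replication stall)
Get-LabelPolicy | Where-Object { $_.WhenChanged -lt (Get-Date).AddHours(-24) } |
    Format-Table Name, WhenCreated, WhenChanged
```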
What do I do next?
If you have this error, it would be best to log a support call with Microsoft and explain that you have the AIP UL policy replication error. My sources say this is a known issue between SCC and Azure that will be remediated by the end of October.
So, in the meantime, I guess we will wait!
WORKAROUND
After troubleshooting this issue with some of my Microsoft colleagues, I was able to get the scanner to start scanning properly without the error being listed in the Azure portal. Here are the steps.
On the scanner node, right-click a file or folder and choose to protect it:
Next, within the AIP Application, choose Help and Feedback
Next, choose Reset Settings
Click Continue
Once completed, click Close, then exit the AIP application. This clears all the registry settings within the scanner node.
Now you will want to reset all the local files for the scanner
First, stop the scanner services for the scanner and network discovery
Stop AIP Scanner and Related Services
Stop-Service AIPScanner
Stop-Service AIPNetworkDiscovery
Next, navigate to the following folder for the local account that is used for the AIP scanner. Example: C:\Users\AIPScanner\AppData\Local\Microsoft\MSIP. Rename or delete the mip folder in that MSIP directory. (I renamed my folder to mip-old2)
Rename or Delete the mip folder
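As a sketch, the rename can be done from PowerShell. The path assumes the AIPScanner profile from the example above:

```powershell
# Rename the mip folder so the scanner rebuilds its local state on next start
Rename-Item -Path "C:\Users\AIPScanner\AppData\Local\Microsoft\MSIP\mip" -NewName "mip-old2"
```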
Restart the services you stopped
Start AIP Scanner and Related Services
Start-Service AIPScanner
Start-Service AIPNetworkDiscovery
You should now see the scanner as Running and Working within the Azure Portal. No more errors should be listed.
Thanks to Angel Marroquin at Microsoft for the assistance on this workaround!
With the release of Unified Labeling in Azure and M365, there is now a way to protect your data and label it appropriately for confidentiality and encryption, covering file shares and files on your on-premises devices. The following shows how to install the latest AIP UL client and configure it in Azure to apply those Unified Labeling policies.
This is a detailed process and I had some issues myself with simplifying it. I will do my best here to make this as smooth as possible, with as many reference documents as I can include. Always feel free to comment, as this data is ever changing as Microsoft updates the offering.
Prerequisites
Please refer to this document for a full list of pre-requisites before deploying the scanner:
The prerequisites below are still required for successful AIP scanner installation.
A Windows Server 2012 R2 or greater Server to run the service
Minimum 4 CPU and 4GB RAM, physical or virtual. NOTE: More RAM is better. The scanner will allocate RAM at 2.5-3 times the size of all files being scanned in parallel. Thus, if you scan 40 files that are 20MB each at the same time, it should take about 2.5 x 20MB x 40 = 2GB of RAM. However, if you have one big 1GB file, it can take 3GB of RAM just for that file.
Internet connectivity necessary for Azure Information Protection
A SQL Server 2012+ local or remote instance (Any version from Express or better is supported)
Sysadmin role needed to install scanner service (the user running Install-AIPScanner, not the service account)
NOTE: If using SQL Server Express, the SQL Instance name is ServerName\SQLExpress.
NOTE: At this time, a different SQL instance is needed for each AIP Scanner node.
Service account created in On Premises AD (I will call this account AIPScanner in this document).
Service requires Log on locally right and Log on as a service right (the second will be given during scanner service install).
Service account requires Read permissions to each repository for discovery and Read/Write permissions for classification/protection.
AzInfoProtection_UL.exe is available on the Microsoft Download Center (The scanner bits are included with the AIP Client)
The Azure AD Preview PowerShell module. From the machine you’re installing AIP Scanner on, run the following from an Administrator PowerShell:
Install-Module AzureADPreview
Configure the scanner in the Azure portal
Before you install the scanner, or upgrade it from an older general availability version, configure or verify your scanner settings in the Azure Information Protection area of the Azure portal.
To configure your scanner:
Sign in to the Azure portal with one of the following roles:
Compliance administrator
Compliance data administrator
Security administrator
Global administrator
Then, navigate to the Azure Information Protection pane. For example, in the search box for resources, services, and docs, start typing Information and select Azure Information Protection.
Create a scanner cluster. This cluster defines your scanner and is used to identify the scanner instance, such as during installation, upgrades, and other processes.
From the Scanner menu on the left, select Clusters.
On the Azure Information Protection – Clusters pane, select Add.
On the Add a new cluster pane, enter a meaningful name for the scanner, and an optional description. The cluster name is used to identify the scanner’s configurations and repositories. For example, you might enter Europe to identify the geographical locations of the data repositories you want to scan. You’ll use this name later on to identify where you want to install or upgrade your scanner.
Select Save to save your changes.
Create a network scan job (public preview)
Starting in version 2.8.85.0, you can scan your network for risky repositories. Add one or more of the repositories found to a content scan job to scan them for sensitive content.
Note: The network discovery interface is currently in gradual deployment and will be available in all regions by September 15, 2020.
Log in to the Azure portal, and go to Azure Information Protection. Under the Scanner menu on the left, select Network scan jobs (Preview).
On the Azure Information Protection – Network scan jobs pane, select Add.
On the Add a new network scan job page, define the following settings:
Network scan job name: Enter a meaningful name for this job. This field is required.
Description: Enter a meaningful description.
Select the cluster: From the dropdown, select the cluster you want to use to scan the configured network locations.
Tip: When selecting a cluster, make sure that the nodes in the cluster you assign can access the configured IP ranges via SMB.
Configure IP ranges to discover: Click to define an IP address or range.
In the Choose IP ranges pane, enter an optional name, and then a start IP address and end IP address for your range.
Tip: To scan a specific IP address only, enter the identical IP address in both the Start IP and End IP fields.
Set schedule: Define how often you want this network scan job to run.
If you select Weekly, the Run network scan job on setting appears. Select the days of the week on which you want the network scan job to run.
Set start time (UTC): Define the date and time that you want this network scan job to start running. If you’ve selected to run the job daily, weekly, or monthly, the job will run at the defined time, at the recurrence you’ve selected.
Note: Be careful when setting the date to any days at the end of the month. If you select 31, the network scan job will not run in any month that has 30 days or fewer.
Select Save to save your changes.
Tip: If you want to run the same network scan using a different scanner, change the cluster defined in the network scan job. Return to the Network scan jobs pane, and select Assign to cluster to select a different cluster now, or Unassign cluster to make additional changes later.
Analyze risky repositories found
Repositories found, either by a network scan job, a content scan job, or by user access detected in log files, are aggregated and listed on the Scanner > Repositories pane.
If you’ve defined a network scan job and have set it to run at a specific date and time, wait until it’s finished running to check for results. You can also return here after running a content scan job to view updated data.
Under the Scanner menu on the left, select Repositories. The repositories found are shown as follows:
The Repositories by status graph shows how many repositories are already configured for a content scan job, and how many are not.
The Top 10 unmanaged repositories by access graph lists the top 10 repositories that are not currently assigned to a content scan job, as well as details about their access levels. Access levels can indicate how risky your repositories are.
Do any of the following:
Select Columns to change the table columns displayed.
If your scanner has recently run, select Refresh to refresh the page with the network scan results.
Select one or more repositories listed in the table, and then select Assign Selected Items to assign them to a content scan job.
The filter row shows any filtering criteria currently applied. Select any of the criteria shown to modify its settings, or select Add Filter to add new filtering criteria. Select Filter to apply your changes and refresh the table with the updated filter.
In the top-right corner of the unmanaged repositories graph, click the Log Analytics icon to jump to Log Analytics data for these repositories.
Repositories with public access
Repositories where Public access is found to have read or read/write capabilities may have sensitive content that must be secured. If Public access is false, the repository is not accessible by the public at all.
The accounts defined in these parameters are used to simulate the access of a weak user to the repository. If the weak user defined there can access the repository, this means that the repository can be accessed publicly.
To ensure that public access is reported correctly, make sure that the user specified in these parameters is a member of the Domain Users group only.
Create a content scan job
Deep dive into your content to scan specific repositories for sensitive content.
You may want to do this only after running a network scan job to analyze the repositories in your network, but can also define your repositories yourself.
Under the Scanner menu on the left, select Content scan jobs.
On the Azure Information Protection – Content scan jobs pane, select Add.
For this initial configuration, configure the following settings, and then select Save but do not close the pane.
Content scan job settings:
– Schedule: Keep the default of Manual
– Info types to be discovered: Change to Policy only
– Configure repositories: Do not configure at this time, because the content scan job must first be saved.
Policy enforcement:
– Enforce: Select Off
– Label files based on content: Keep the default of On
– Default label: Keep the default of Policy default
– Relabel files: Keep the default of Off
Configure file settings:
– Preserve “Date modified”, “Last modified” and “Modified by”: Keep the default of On
– File types to scan: Keep the default file types for Exclude
– Default owner: Keep the default of Scanner Account
Now that the content scan job is created and saved, you’re ready to return to the Configure repositories option to specify the data stores to be scanned. Specify UNC paths, and SharePoint Server URLs for SharePoint on-premises document libraries and folders.
Note: SharePoint Server 2019, SharePoint Server 2016, and SharePoint Server 2013 are supported for SharePoint. SharePoint Server 2010 is also supported when you have extended support for this version of SharePoint.
To add your first data store, while on the Add a new content scan job pane, select Configure repositories to open the Repositories pane:
On the Repositories pane, select Add:
On the Repository pane, specify the path for the data repository, and then select Save.
For a network share, use \\Server\Folder.
For a SharePoint library, use http://sharepoint.contoso.com/Shared%20Documents/Folder.
For a local path: C:\Folder
For a UNC path: \\Server\Folder
Note: Wildcards are not supported and WebDav locations are not supported.
If you add a SharePoint path for Shared Documents:
Specify Shared Documents in the path when you want to scan all documents and all folders from Shared Documents. For example:http://sp2013/SharedDocuments
Specify Documents in the path when you want to scan all documents and all folders from a subfolder under Shared Documents. For example: http://sp2013/Documents/SalesReports. Or, specify only the FQDN of your SharePoint, for example http://sp2013, to discover and scan all SharePoint sites and subsites under this URL.
Grant the scanner Site Collection Auditor rights to enable this.
For the remaining settings on this pane, do not change them for this initial configuration, but keep them as Content scan job default. The default setting means that the data repository inherits the settings from the content scan job.
Use the following syntax when adding SharePoint paths: Root path: http://<SharePoint server name>
Scans all sites, including any site collections allowed for the scanner user. Requires additional permissions to automatically discover root content
Specific SharePoint subsite or collection: One of the following: – http://<SharePoint server name>/<subsite name> – http://<SharePoint server name>/<site collection name>/<site name>
Specific SharePoint library: One of the following: – http://<SharePoint server name>/<library name> – http://<SharePoint server name>/.../<library name>
Specific SharePoint folder: http://<SharePoint server name>/.../<folder name>
Repeat the previous steps to add as many repositories as needed.
When you’re done, close both the Repositories and Content scan job panes.
Back on the Azure Information Protection – Content scan job pane, your content scan name is displayed, together with the SCHEDULE column showing Manual and the ENFORCE column is blank.
You’re now ready to install the scanner with the content scanner job that you’ve created. Continue with Scanner Installation.
Scanner Installation
Now that we have verified all prerequisites and configured AIP in Azure, we can go through the basic scanner install.
Log onto the server where you will install the AIP Scanner service using an account that is a local administrator of the server and has permission to write to the SQL Server master database. (more restrictive scenarios are documented in the official documentation)
Run AzInfoProtection_UL.exe on the server and step through the client install (this also drops the AIP Scanner bits). WARNING: This blog is based on the current version of the AIP Client. If you want to update to the Preview client, please install the GA first and then install the preview client and use Update-AIPScanner after installation.
Next, open an Administrative PowerShell prompt.
At the PowerShell prompt, type the following command and press Enter:
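The install command itself was not captured above. As a sketch, a typical scanner install looks like the following; the SQL instance and cluster names are examples, and the cluster name should match the one you created in the Azure portal:

```powershell
# Install the scanner service against a SQL instance, joining the cluster defined in Azure
Install-AIPScanner -SqlServerInstance "AIPSCANNER\SQLEXPRESS" -Cluster "Europe"
```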
If you are not using Azure AD Sync for your Service account, you will need to create a service account in the cloud tenant to use for AIP authentication. If you have synced your on premises service account, you can skip this task.
Run the command below to connect to Azure AD.
Connect-AzureAD
When prompted, provide tenant Global Admin credentials.
To create an account in the cloud, you must first define a password profile object. Run the commands below to define this object.
When prompted, enter the tenant name you want to use for the UserPrincipalName for the cloud service account (e.g. tenant.onmicrosoft.com).
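The account-creation commands were not captured above. Here is a minimal sketch with example names, assuming you are connected via the AzureAD module:

```powershell
# Define a password profile object for the new cloud account
$PasswordProfile = New-Object -TypeName Microsoft.Open.AzureAD.Model.PasswordProfile
$PasswordProfile.Password = Read-Host "Password for the scanner cloud account"
$PasswordProfile.ForceChangePasswordNextLogin = $false

# Create the cloud service account (tenant name is an example)
$Tenant = Read-Host "Tenant name (e.g. tenant.onmicrosoft.com)"
New-AzureADUser -DisplayName "AIP Scanner Service" -MailNickName "AIPScanner" `
    -UserPrincipalName "AIPScanner@$Tenant" -AccountEnabled $true -PasswordProfile $PasswordProfile
```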
Creating the Azure AD Application in Azure
Next, we will configure the App Registration for the Web App that is required to run the Set-AIPAuthentication command that will be used to get the authentication token. We will also assign the necessary Oauth2Permissions for the Web App to have delegated rights to the App.
Run the commands below to create the Web App, associated Service Principal, and key password.
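The commands themselves were not captured here. As a sketch (the display name, reply URL, and key identifier are example values), app registration with the AzureAD module looks roughly like this:

```powershell
# Register an app, give it a client secret, and create its service principal
$WebApp = New-AzureADApplication -DisplayName "AIPOnBehalfOf" -ReplyUrls "https://localhost"
$WebAppKey = New-AzureADApplicationPasswordCredential -ObjectId $WebApp.ObjectId `
    -CustomKeyIdentifier "AIPClient" -EndDate (Get-Date).AddYears(1)
New-AzureADServicePrincipal -AppId $WebApp.AppId
```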
Next, we need to run some commands to build the RequiredResourceAccess object that is needed to automate delegation of permissions for the native application.
In this task, we will use the command created previously to authenticate the AIP Scanner to the AIP Service.
Open PowerShell using Run as a different user, using the on-premises scanner service account, which should have Run as Administrator rights.
Run the commands in the following PowerShell session with the Run as Administrator option, which is required for the OnBehalfOf parameter.
The first command creates a PSCredential object and stores the specified Windows user name and password in the $pscreds variable. When you run this command, you are prompted for the password for the user name that you specified.
The second command acquires an access token that is combined with the application so that the token becomes valid for 1 year, 2 years, or never expires, according to your configuration of the registered app in Azure AD. The user name of scanner@contoso.com sets the user context to download labels and label policies from your labeling management center, such as the Office 365 Security & Compliance Center.
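The two commands described read roughly like the sketch below. The GUIDs and secret are placeholders for your registered app's values, and scanner@contoso.com is the example account from the text:

```powershell
# Create the PSCredential for the on-premises scanner service account
$pscreds = Get-Credential CONTOSO\scanner

# Acquire and cache the token for the scanner (run as administrator)
Set-AIPAuthentication -AppId "aaaaaaaa-1111-2222-3333-bbbbbbbbbbbb" `
    -AppSecret "<your app secret>" -TenantId "cccccccc-4444-5555-6666-dddddddddddd" `
    -DelegatedUser scanner@contoso.com -OnBehalfOf $pscreds
```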
Configure the scanner to apply classification and protection
The default settings configure the scanner to run once, and in reporting-only mode.
To change these settings, edit the content scan job:
In the Azure portal, on the Azure Information Protection – Content scan jobs pane, select the cluster and content scan job to edit it.
On the Content scan job pane, change the following, and then select Save:
From the Content scan job section: Change the Schedule to Always
From the Policy enforcement section: Change Enforce to On
Tip: You may want to change other settings on this pane, such as whether file attributes are changed and whether the scanner can relabel files. Use the information popup help to learn more information about each configuration setting.
Make a note of the current time and start the scanner again from the Azure Information Protection – Content scan jobs pane:
Alternatively, run the following command in your PowerShell session:
Start-AIPScan
The scanner is now scheduled to run continuously. When the scanner works its way through all configured files, it automatically starts a new cycle so that any new and changed files are discovered.
Microsoft has put out a new standard, security defaults, that hardens the default settings in a tenant. Security defaults make it easier to help protect your organization from common attacks with preconfigured security settings:
Requiring all users to register for Azure Multi-Factor Authentication.
Requiring administrators to perform multi-factor authentication.
Blocking legacy authentication protocols.
Requiring users to perform multi-factor authentication when necessary.
Protecting privileged activities like access to the Azure portal.
Now, there might be many reasons why you would not want these defaults enabled in your tenant; just remember that you will need to set up these protections manually should you change the security defaults setting.
How to change security defaults in Azure/M365
Log into https://portal.azure.com with your Global Admin account.
Click on Azure Active Directory to navigate to that pane.
In the list to the left, click Properties.
Scroll to the bottom of the screen on the right and click Manage Security Defaults
Make the appropriate change: YES/NO
(IMPORTANT) Save the changes by clicking the Save button
Manage Security Defaults for Tenant AAD
This should set the defaults for your O365 tenant as you wish to have them. Please refer to the references below for more information and detail into each of the security defaults.
MORE POSTS TO COME ON SECURITY AND COMPLIANCE! HAVE A WONDERFUL DAY!
I have a VM that is joined to my Azure AD test tenant domain. I was having issues using RDP to access the box with my Azure AD credentials (username@tenant.onmicrosoft.com). I kept getting the following when trying to connect:
Azure AD Credentials would not work to RDP to the client.
So I started researching and found that this was a common issue that many have started to face with their Azure AD joined machines. Unfortunately, at this time it isn’t quite as easy as “open up a new RDP connection, type in the computer, type my email, and connect”. Here are the steps to connect a session to an Azure AD joined computer.
Steps to connect RDP to an Azure AD joined computer.
First, open Remote Desktop as if you were going to connect to any other computer. Type in the computer name or IP address and expand the Show Options section. Next, click the Save As button to save the RDP file to your computer. At this point you can close the Remote Desktop Connection window, as it isn’t needed any longer.
Next, open Notepad. Click File -> Open -> location your RDP file that was saved in the previous step.
Go to the very bottom of the list of parameters and add the following two lines:
enablecredsspsupport:i:0
authentication level:i:2
Save the changes to the .rdp file
NOTE: You can also add your username that will be used to connect to the session in the file as well:
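A sketch of the finished .rdp additions; the username value is an example, and the AzureAD\ prefix tells the client this is an Azure AD identity:

```
enablecredsspsupport:i:0
authentication level:i:2
username:s:AzureAD\username@tenant.onmicrosoft.com
```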
As many of you know I have an existing Azure/M365 Tenant that I use with my company as many of you all do as well. When you get an MCT Certification, you get access to a monthly credit in Azure.
So I clicked on Activate and when activating the subscription, it created a new Azure tenant that was linked to my MCP ID and not my LDLNET ID. The problem here was that my Microsoft Identity was a different email address (@live.com) from my tenant Microsoft Identity (@ldlnet.net).
So now I have a new tenant that I really don’t need. I started thinking: could I transfer the subscription to my LDLNET tenant and keep the monthly credit? The answer is yes AND no.
Here is what I had to do to transfer the subscription to my LDLNET Directory
First, I created a guest account in my LDLNET tenant for the @live.com email address and temporarily gave the account Global Admin privileges. This was so that I could access the subscription when transferred, and ensure that the accounts needing subsequent access to the subscription get owner permissions, by logging on with the @live.com account in the LDLNET tenant. I then activated the account in LDLNET.
NOTE: This is probably NOT the most secure option to start, but I will update as I find the articles that define least privilege for setting this up. I’ve seen a couple of articles, but none matched this exact scenario. The thing here is that the billing cannot be transferred, since it is being handled by Microsoft directly with the credits. So, I have to keep the @live.com account active in the LDLNET tenant so that it bills correctly.
Next, I go to the setup tenant and look at the subscription Overview:
At the top of the screen, I choose Change Directory. Since my @Live.com account was an admin for the LDLNET Directory now, I could choose the directory on the following screen:
Change Directory to the destination tenant.
NOTE: I couldn’t change the billing on the setup tenant nor transfer it since it was through Microsoft, but why would I want to anyway since it’s my credit that was given to me monthly. Also, on my visual studio subscription, I made sure that my @ldlnet.net address was an alternate access account on the subscription. I want to make sure the credit stays after this month!!
So, I received the email asking to accept the transfer and clicked Accept The Transfer:
Accept The Transfer E-Mail
Once the subscription was accepted and transferred to LDLNET, I logged into LDLNET tenant with my @Live.com account. I then went to the subscription and made sure to add all necessary accounts to the subscription so that they would get access:
Once completed, I logged in with my Original Global Admin account and changed the @Live.com account permissions to a Global Reader in my LDLNET tenant, then gave them Owner Access permissions to the Subscription specifically.
And that has completed the transfer. Hopefully, the subscription will continue with the monthly credit as per my MCT Certification allows. I will update if something changes. If you have a better way to do this, please comment and I will be happy to verify it and post!
Microsoft365 allows the tenant administrators to grant external users access to content in their tenant by setting them up as a guest in their M365 Tenant. Microsoft365 provides a guest access feature that you can use to grant content access to contractors, partners or others who need access to certain content.
However, the process of setting up a guest user works differently from that of setting up a normal, licensed user from within your organization.
By default, Microsoft365 Admin Center contains a Guest Users screen. You will also notice, however, that this screen does not contain an option to create a guest user. In fact, the only things that you can do are search for a user or delete a user.
Limited Access to Administrate Guest Users in M365
Being that the Guest Users screen doesn’t give you a way to create a guest user, you will need to either delve into PowerShell or perform the task within Azure Active Directory. I prefer using PowerShell, and will write a post about how to perform this via PowerShell, but unless you need to create a large number of guest users, it is usually going to be easier to use the GUI. Below is how to create a guest user via Azure AD.
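For reference, the PowerShell route is essentially a single cmdlet. This is a sketch using the AzureAD module; the email address, display name, and redirect URL are example values:

```powershell
# Send a guest invitation; the redirect URL is where the guest lands after accepting
New-AzureADMSInvitation -InvitedUserEmailAddress "partner@example.com" `
    -InvitedUserDisplayName "Partner User" -SendInvitationMessage $true `
    -InviteRedirectUrl "https://myapps.microsoft.com"
```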
To create a guest user, expand the Admin Centers container and then click on Azure Active Directory. When the Azure Active Directory Admin Center opens, click on the Users container. You can see that just to the right of the New User option, there is an option to create a New Guest User.
Create New Guest User
NOTE: Creating a guest user account isn’t like creating a normal user account. Rather than providing the account details and clicking a Create button, you will instead need to send an invitation to the user.
Make Sure You Verify Their E-Mail Address Beforehand!!!
Choose Invite User > Enter the Identity Information
Initial Data Entry
Next Enter A Personal Message (optional) > Choose their Group Membership > Update any AAD or M365 Permissions under Roles > Update their Sign In Settings > Click Invite to send the invitation
Enter Data and Settings Then Click Invite Button
After a few minutes, the specified user will receive an e-mail invitation that looks something like the one shown below. The recipient will need to click the Accept Invitation button and accept the terms of use.
Example of Email Generated Invitation
When the guest user completes the registration process, they are logged into Microsoft365; however, there are no applications initially available to the user. This is because, unlike a standard user, external users do not automatically get access to applications.
User Has Verified Access and Accepted the Invitation
If you go back to the Guest Users screen, you will see the newly created guest user listed (you may have to refresh the screen). As previously noted, you can’t do much from this screen. You can, however, click on the user to see a few extra details now. Example is below.
More Details Available
The way that you grant an external user access to data is to add the user to a group that has access to the data. Let’s suppose, for example, that for whatever reason, you need to add an external user to a Teams Group named Microsoft Exchange Guys. To do so, you would go to the Groups folder within the Microsoft 365 Admin Center, click on the Microsoft Exchange Guys group, and then edit the Membership list, as shown below.
After clicking the Edit button, click on Add Members and then select the external user that you wish to add. Click Save to complete the process.
The New Guest User Will Show When Searching To Add Users To The Group
If you now go back to the Group’s membership, you are able to see the Microsoft Exchange Guys group membership showing the new guest user as a member.
Guest User Has Been Added To The Group
Granting access in this way does not provide the external user with blanket access to the Teams Group. However, another group member is now able to e-mail the external user a link to the Teams Group. The external user can use this link to access the Group within the Teams app.
User is now in Teams Group
NOTE: Keep in mind that I am only using the Teams Group as an example. You can use somewhat similar techniques to provide access to a variety of Microsoft365 AND Azure AD content.
MORE M365 CONTENT TO COME! POSITIVE ATTITUDE = POSITIVE RESULTS
I was going through my LinkedIn feed as I do daily and found a post with the following document. Great post and document. I wanted to add this here to my blog for reference and to share with all of you!
The document includes the following topics:
Overview
Azure Active Directory Identity Protection
Azure Advanced Threat Protection
Azure Information Protection
Office 365 Advanced Threat Protection
Office 365 Cloud App Security
Microsoft Cloud App Security
Office 365 Advanced Data Governance
Office 365 Advanced eDiscovery
Office 365 Customer Key
Office 365 Customer Lockbox
Privileged Access Management in Office 365
Data Loss Prevention for Exchange Online, SharePoint Online, and OneDrive for Business
Data Loss Prevention for Teams chat and channel conversations
Information barriers
Advanced Message Encryption
I just received my new laptop for my current project and was setting up Windows 10 to join the company Azure AD domain. When I got to the part where you join, I received the following error:
Error Joining Computer to Azure AD
It turns out that my account did not have permission to join a device to the tenant. This is easily solved, though. Have your tenant admin perform the following:
Go to Azure Active Directory -> Devices and check the device settings, in particular these options:
Users may join devices to Azure AD
Maximum number of devices per user
Azure AD Settings Page
Now, in my case, I did not have access as I am NOT a tenant admin:
So, I am currently waiting for my IT department to resolve the access issue and grant me permission to join the device to the domain. Just be sure to look at these settings if you're having issues joining your Windows 10 device to your Azure AD tenant!
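As a quick diagnostic, you can also check the device's current join state from the device itself with the built-in dsregcmd tool (run from an elevated prompt). This only reports status and is not part of the fix:

```powershell
# Show the device's Azure AD join status; look for "AzureAdJoined : YES" in the output
dsregcmd /status
```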
Let’s say you’re an admin who needs to connect to Office 365 via PowerShell often. There are many different websites and blogs that will show you how to connect to each service via PowerShell. That can cause a headache, since you can end up with five different PowerShell sessions running in five different windows, entering a username and password each time, which becomes time consuming.
I want to show you here how to combine all those sessions into one script where, if your security is tight enough on your computer, you don't even have to enter credentials. This way, you can click on one icon and pull up all the O365 PowerShell commands that you'll need to manage your organization.
First, you need to download the following PowerShell module installation files so that your system will have the correct modules installed:
Next, we want to set up the CLI (Command Line Interface) to be too cool for school. I have learned it helps to know how to customize the CLI window. You can do all of this in PowerShell ISE or Notepad, whichever you prefer. Here are the commands for the script that I use to set up the CLI:
Setup the CLI for PowerShell
#Set the PowerShell CLI. Make sure you have a directory named "C:\PowerShell"
Set-Location C:\PowerShell
$a = (Get-Host).UI.RawUI
$a.BackgroundColor = "black"
$a.ForegroundColor = "yellow"
$a.WindowTitle = "(Your Company Name) PowerShell for ALL O365 PowerShell Services"
$curUser = (Get-ChildItem Env:\USERNAME).Value
function prompt { "(Your Company Name) O365 PS: $(Get-Date -f 'hh:mm:ss tt')>" }
Next, you want to set your Execution Policy and store your credentials so that you won't be prompted for them when you run the script.
NOTE: MAKE SURE YOU KEEP YOUR SCRIPT SAFE AS THE CREDENTIALS ARE VISIBLE WITHIN THE SCRIPT IN PLAIN TEXT!
You can, alternatively, set your script to prompt for credentials every time by using the following:
$LiveCred = Get-Credential
Here is that part of the script:
Setup Execution Policy and Credentials
#Setup Execution Policy and Credentials using your Tenant Admin Credentials
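The listing above only kept the comment line, so here is a sketch of what that section of the script might look like. The username and password are placeholders, and per the warning above, the password sits in the file in plain text:

```powershell
# Allow locally created scripts to run for the current user
Set-ExecutionPolicy RemoteSigned -Scope CurrentUser -Force

# Build a credential object without prompting.
# WARNING: the password is stored in plain text -- keep this script locked down!
$AdminUser = "admin@yourtenant.onmicrosoft.com"   # placeholder tenant admin UPN
$AdminPass = ConvertTo-SecureString "YourPasswordHere" -AsPlainText -Force
$LiveCred  = New-Object System.Management.Automation.PSCredential ($AdminUser, $AdminPass)
```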
Connect to the Security & Compliance PowerShell: NOTE – This is the one where I still get “Access Denied” when trying to connect. I have looked for an answer to that issue, but have not found one. Please comment with a link if you have an answer so that I can update this script!
Connect to the Compliance and Security Center through PowerShell
#Connect the Security & Compliance Center PowerShell Module
Write-Host "Connecting to O365 Security & Compliance Online through PowerShell" -ForegroundColor Green
Write-Host "Be sure to check for any connectivity errors!" -ForegroundColor Green
Write-Host "Also, remember to run 'Get-PSSession | Remove-PSSession' before closing your PowerShell Window!" -ForegroundColor Green
Write-Host "Successfully connected to all O365 PowerShell Services for (CompanyName)!" -ForegroundColor Green
Now you can create your icon for your desktop so that you can easily access the script. I would save the script to your Scripts directory.
That will usually be C:\Users\'username'\Documents\WindowsPowerShell\Scripts, or whichever directory you choose.
To start, right-click the desktop and choose New > Shortcut. In the Target field, enter the following for your PowerShell shortcut, pointing to the path of your script:
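A Target value along these lines would work; the script file name and path here are examples, so point them at wherever you actually saved your script:

```powershell
powershell.exe -NoExit -File "C:\Users\username\Documents\WindowsPowerShell\Scripts\O365AllPowerShell.ps1"
```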
Click on the Advanced button and check the box Run as administrator. Under the General tab, name your shortcut: (CompanyName) O365 All PowerShell. Click OK to save the shortcut to your desktop.
LAST BUT NOT LEAST, RUN THE FOLLOWING COMMAND BEFORE EXITING OR CLOSING YOUR POWERSHELL WINDOW. THIS WILL REMOVE ALL THE SESSIONS YOU’VE CONNECTED TO:
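That cleanup command, the same one the script's Write-Host reminder mentions, is:

```powershell
# Tear down every remote PowerShell session opened during this console session
Get-PSSession | Remove-PSSession
```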
I had received a weird alert for a DB volume on a DAG member being below threshold. This was odd to me because there are four DAG members and we only received an alert for one. I went into Azure Log Analytics and ran a query to render a graph of the past 14 days showing the percent free space of that volume for all the DAG members.
The reason I can run the query this way is that the DAG was designed correctly and the DB folder paths are identical on all DAG members. The query rendered the following chart:
As you can see, the green DAG member is way below the other DAG members.
I next went to an Exchange Server in the DAG and got the volume data for all the members in the DAG:
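The exact command I used isn't shown here, but one way to pull volume data from every DAG member in a single pass (the server names and volume label filter are examples, and the approach assumes WinRM access to the servers) is:

```powershell
# Query mount-point volumes on each DAG member and report free space.
# Server names and the 'DAG1DB*' label filter are examples; adjust for your DAG.
$servers = 'EX01','EX02','EX03','EX04'
Get-CimInstance -ClassName Win32_Volume -ComputerName $servers |
    Where-Object { $_.Label -like 'DAG1DB*' } |
    Select-Object PSComputerName, Label,
        @{ N = 'FreeGB';  E = { [math]::Round($_.FreeSpace / 1GB, 1) } },
        @{ N = 'PctFree'; E = { [math]::Round(100 * $_.FreeSpace / $_.Capacity, 1) } } |
    Sort-Object Label, PSComputerName
```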
EX02’s volume free space is far below the other DAG members
I went on EX02 and found that there was a subfolder named “Restore” that was not present on the other servers. I ran the following script to get the size of that folder in GB:
Get folder size in GB
Write-Host; $Folder = "{0:N2}" -f ((Get-ChildItem '\\EX02\DAG1DB001\DB\Restore' -Recurse -Force | Measure-Object -Property Length -Sum).Sum / 1GB); Write-Host "Folder Size Is: $Folder GB" -ForegroundColor Green
The folder size was 185 GB. Removing that folder, along with all subfolders and files, would bring the free space back in line with the other DAG members. I ran the following cmdlet to remove the folder and all subfolders/files:
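The removal is a one-liner along these lines; double-check the path before running it, because -Recurse -Force deletes everything beneath the folder without prompting:

```powershell
# Delete the leftover Restore folder and all subfolders/files beneath it
Remove-Item '\\EX02\DAG1DB001\DB\Restore' -Recurse -Force
```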
This remediated the alert and balanced the drive space across all DAG members.
POST YOUR COMMENTS OR QUESTIONS! HAPPY TROUBLESHOOTING!