{"id":201,"date":"2019-02-06T00:11:43","date_gmt":"2019-02-06T05:11:43","guid":{"rendered":"http:\/\/itblog.ldlnet.net\/?p=201"},"modified":"2019-02-06T00:11:43","modified_gmt":"2019-02-06T05:11:43","slug":"checking-drive-space-volumes-for-dag-db-members-through-powershell","status":"publish","type":"post","link":"https:\/\/itblog.ldlnet.net\/index.php\/2019\/02\/06\/checking-drive-space-volumes-for-dag-db-members-through-powershell\/","title":{"rendered":"Checking Drive Space Volumes for DAG DB members through PowerShell"},"content":{"rendered":"\n<p>I had received a weird alert for a DB volume for a DAG member being below threshold. This was odd to me due to the fact that there were four DAG members and we only received an alert for one. I went into Azure Log Analytics and ran the following query to render a graph for the past 14 days showing the percent free space of the volume for all the DAG members. <\/p>\n\n\n\n<p style=\"text-align:center\"><em>Thanks\u00a0Georges\u00a0Moua\u00a0for\u00a0the\u00a0query script!<\/em><\/p>\n\n\n<pre class=\"lang:Azure nums:false\" title=\"Azure Logs Query Script\">let PerfCutoff = ago(14d);\nlet inst =\u00a0\"C:\\\\ExchangeDB\\\\DAG1DB001\\\\DB\";\nsearch\u00a0*\n|\u00a0where\u00a0TimeGenerated > PerfCutoff\n|\u00a0where\u00a0Name_s ==\u00a0\"PercentFreeSpace\"\n|\u00a0where\u00a0InstanceName_s\u00a0contains\u00a0inst\n|\u00a0sort\u00a0by\u00a0TimeGenerated desc\n|\u00a0project\u00a0Resource,Company_s,InstanceName_s,PercentFreeSpace=Value_d,TimeGenerated\n|\u00a0render\u00a0timechart <\/pre>\n\n\n\n<p>Now the reason I can run the query this way is due to the fact that the Design of the DAG was correctly done and the DB folders are identical on all DAG members. 
The query rendered the following chart:<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"447\" src=\"http:\/\/itblog.ldlnet.net\/wp-content\/uploads\/2019\/02\/AzureQueryPercentFreeSpaceDAG-1024x447.png\" alt=\"\" class=\"wp-image-202\" srcset=\"https:\/\/itblog.ldlnet.net\/wp-content\/uploads\/2019\/02\/AzureQueryPercentFreeSpaceDAG-1024x447.png 1024w, https:\/\/itblog.ldlnet.net\/wp-content\/uploads\/2019\/02\/AzureQueryPercentFreeSpaceDAG-300x131.png 300w, https:\/\/itblog.ldlnet.net\/wp-content\/uploads\/2019\/02\/AzureQueryPercentFreeSpaceDAG-768x335.png 768w, https:\/\/itblog.ldlnet.net\/wp-content\/uploads\/2019\/02\/AzureQueryPercentFreeSpaceDAG.png 1228w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption>As you can see, the <em>Green<\/em> DAG member is well below the other DAG members.<\/figcaption><\/figure>\n\n\n\n<p>Next, from an Exchange server in the DAG, I pulled the volume data for all the DAG members (note that the script reuses the BlockSize property as a convenient place to store the calculated percent free space):<\/p>\n\n\n<pre class=\"lang:PowerShell\" title=\"Drive Volume Space for DAG Members\">$Svrs = Get-MailboxDatabaseCopyStatus DAG1DB001 | Select-Object MailboxServer | Sort-Object MailboxServer \n\nforeach ($Svr in $Svrs) { $Svr.MailboxServer ; Get-WmiObject -ComputerName $Svr.MailboxServer Win32_Volume | Where-Object 
{$_.Name -like \"*DAG1DB001\\DB*\"} | Select-Object Name,FileSystem,FreeSpace,BlockSize,Capacity | ForEach-Object {$_.BlockSize=(($_.FreeSpace)\/($_.Capacity))*100;$_.FreeSpace=($_.FreeSpace\/1GB);$_.Capacity=($_.Capacity\/1GB);$_} | Sort-Object Name | Format-Table Name,@{n='Free,GB';e={'{0:N2}' -f $_.FreeSpace}},@{n='Free,%';e={'{0:N2}' -f $_.BlockSize}},@{n='Capacity,GB';e={'{0:N3}' -f $_.Capacity}},@{n='FS';e={$_.FileSystem}} -AutoSize }<\/pre>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"489\" height=\"475\" src=\"http:\/\/itblog.ldlnet.net\/wp-content\/uploads\/2019\/02\/DAGDBVolumeOutput.png\" alt=\"\" class=\"wp-image-203\" srcset=\"https:\/\/itblog.ldlnet.net\/wp-content\/uploads\/2019\/02\/DAGDBVolumeOutput.png 489w, https:\/\/itblog.ldlnet.net\/wp-content\/uploads\/2019\/02\/DAGDBVolumeOutput-300x291.png 300w\" sizes=\"auto, (max-width: 489px) 100vw, 489px\" \/><figcaption>EX02&#8217;s volume free space is far below that of the other DAG members<\/figcaption><\/figure>\n\n\n\n<p>I logged on to EX02 and found a subfolder named &#8220;Restore&#8221; that was not present on the other servers. I ran the following script to get the size of that folder in GB:<\/p>\n\n\n<pre class=\"lang:PowerShell nums:false\" title=\"Get folder size in GB\">Write-Host ; $Folder = \"{0:N2}\" -f ( ( Get-ChildItem '\\\\EX02\\DAG1DB001\\DB\\Restore' -Recurse -Force | Measure-Object -Property Length -Sum ).Sum \/ 1GB ) ; Write-Host \"Folder Size Is: $Folder GB\" -ForegroundColor Green<\/pre>\n\n\n\n<p>The folder size was 185 GB. Removing that folder, along with all of its subfolders and files, would bring the free space back in line with the other DAG members. 
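<\/p>\n\n\n\n<p>Before deleting anything on a production mailbox server, it is worth previewing exactly what would be removed. As a cautious sketch against the same path, PowerShell&#8217;s -WhatIf switch makes Remove-Item report the items it would delete without actually touching them:<\/p>\n\n\n<pre class=\"lang:PowerShell nums:false\" title=\"Preview the deletion with -WhatIf\"># -WhatIf reports each item Remove-Item would delete, but deletes nothing\nRemove-Item '\\\\EX02\\DAG1DB001\\DB\\Restore' -Recurse -Force -WhatIf<\/pre>\n\n\n\n<p>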
I ran the following cmdlet to remove the folder and all subfolders\/files:<\/p>\n\n\n<pre class=\"lang:PowerShell nums:false\" title=\"Remove folder entirely via PowerShell\">Remove-Item '\\\\EX02\\DAG1DB001\\DB\\Restore' -Recurse -Force<\/pre>\n\n\n\n<p style=\"text-align:center\">This remediated the alert and balanced the drive space across all DAG members. <\/p>\n\n\n\n<p style=\"text-align:center\" class=\"has-large-font-size\"><strong>POST YOUR COMMENTS OR QUESTIONS! <br>HAPPY TROUBLESHOOTING!<\/strong><\/p>\n","protected":false},"excerpt":{"rendered":"<p>I received an odd alert that a database volume on one DAG member had dropped below its free-space threshold. That seemed strange<\/p>\n<p class=\"link-more\"><a class=\"myButt \" href=\"https:\/\/itblog.ldlnet.net\/index.php\/2019\/02\/06\/checking-drive-space-volumes-for-dag-db-members-through-powershell\/\">Read More<\/a><\/p>\n","protected":false},"author":1,"featured_media":161,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[4,2,3,16],"tags":[90,9,91,92,10,26,8,89,13],"class_list":["post-201","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-exchange","category-general","category-powershell","category-windows","tag-azure","tag-exchange","tag-files","tag-folders","tag-free-space","tag-logging","tag-powershell","tag-query","tag-script","odd"],"_links":{"self":[{"href":"https:\/\/itblog.ldlnet.net\/index.php\/wp-json\/wp\/v2\/posts\/201","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/itblog.ldlnet.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/itblog.ldlnet.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/itblog.ldlnet.net\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/itblog.ldlnet.net\/index.php\/wp-json\/wp\/v2\/comments?post=201"}],"version-history":[{"count":2,"href":"https:
\/\/itblog.ldlnet.net\/index.php\/wp-json\/wp\/v2\/posts\/201\/revisions"}],"predecessor-version":[{"id":205,"href":"https:\/\/itblog.ldlnet.net\/index.php\/wp-json\/wp\/v2\/posts\/201\/revisions\/205"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/itblog.ldlnet.net\/index.php\/wp-json\/wp\/v2\/media\/161"}],"wp:attachment":[{"href":"https:\/\/itblog.ldlnet.net\/index.php\/wp-json\/wp\/v2\/media?parent=201"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/itblog.ldlnet.net\/index.php\/wp-json\/wp\/v2\/categories?post=201"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/itblog.ldlnet.net\/index.php\/wp-json\/wp\/v2\/tags?post=201"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}