Find and list inactive Exchange mailboxes with PowerShell

So, I tinkered with this a while back and found the following commands very useful to complete the routine task most system admins have of disabling inactive mailboxes. In this case, I needed to list inactive mailboxes on our Exchange 2007 mailbox servers. Then, I needed commands or tasks to run against this list. Here’s my script:

1. Find mailboxes with no logon in over 90 days and output the list to a txt file. Edit the mailbox database name and the txt file output path to match your environment (-HideTableHeaders keeps the column header out of the file, so step 3 gets only mailbox names):
Get-MailboxStatistics | where {$_.Database -eq "Server1\StorageGroup1\MailboxDatabase1" -and $_.LastLogonTime -lt (get-date).addDays(-90) -and $_.ObjectClass -eq "Mailbox"} | sort-object lastlogontime | ft DisplayName -HideTableHeaders > C:\ExchangeCleanup\NoLogin90Days.txt
2. Use ".TrimEnd" to clean up trailing spaces and rewrite the cleaned-up txt file:
$list1 = get-content C:\ExchangeCleanup\NoLogin90Days.txt
$list1 | foreach {$_.TrimEnd()} > C:\ExchangeCleanup\NoLogin90Days.txt
3. Disable the mailboxes listed in the NoLogin90Days.txt file:
get-content "C:\ExchangeCleanup\NoLogin90Days.txt" | disable-mailbox
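If you don't need the manual review step, the three steps above can be collapsed into a single pipeline. This is just a sketch, assuming the same example database name as above; the -WhatIf switch makes Disable-Mailbox report what it would do without actually disabling anything, so only remove it once you're happy with the results:

```powershell
# Sketch: find and disable inactive mailboxes in one pipeline.
# The database name is an example - substitute your own.
# -WhatIf previews the action; remove it to actually disable the mailboxes.
Get-MailboxStatistics |
  where { $_.Database -eq "Server1\StorageGroup1\MailboxDatabase1" -and
          $_.LastLogonTime -lt (Get-Date).AddDays(-90) -and
          $_.ObjectClass -eq "Mailbox" } |
  sort-object LastLogonTime |
  foreach { Disable-Mailbox -Identity $_.DisplayName -WhatIf }
```

This relies on the display name being unambiguous as a mailbox identity, which is the same assumption the txt-file approach makes.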

Now, you’re correct that I could just take the output from step 1 and go straight into step 3. HOWEVER, I’d like to be able to review the list of inactive users, and edit the list if needed to avoid disabling any system or service mailboxes that don’t actually have users logging into them.
Another task I use this for is to move the inactive accounts to mailbox servers with more space. I usually run this before step 3 above:
get-content "C:\ExchangeCleanup\NoLogin90Days.txt" | move-mailbox -targetdatabase "Server2\StorageGroup2\MailboxDatabase2"
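After the moves finish, a quick spot-check confirms each mailbox landed on the new database. A sketch, assuming the same cleaned-up list file from step 2:

```powershell
# Sketch: verify each mailbox from the list now lives on the target database.
Get-Content "C:\ExchangeCleanup\NoLogin90Days.txt" |
  foreach { Get-Mailbox -Identity $_ | Select-Object Name, Database }
```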

I strongly recommend that you test these commands in a test environment before running them on your production systems. I've tested functionality on Exchange 2007 and 2010, but your environment may differ.
Rick Estrada
Pastebin for reference:

Automating a DPM Recovery

OK, so we're using Microsoft Data Protection Manager 2007 (DPM) as our backup solution for several of our SQL DB and Windows file servers. It's pretty straightforward stuff, but like most new software, documentation for DPM is limited and googling for answers yields little help. Let me describe one problem we faced, then give you an overview of the solution I engineered.
The DB system runs on two clustered Live servers, and is required (by the vendor) to be replicated to a Report server for resource consuming queries. This is a SQL2005 implementation.
The vendor put a backup job in place which basically does a DB dump to a .bak file every night at 1AM, and hourly log dumps after that. We have implemented DPM with VSS to backup that .bak file nightly, as soon as the dump finishes.
Now, because the DB must be replicated to the report server nightly, we use that requirement to validate our backups by restoring them nightly as well.
So what we needed to do, looks like this:
1. SQL dump @ 1 AM
2. DPM backup of .bak file @ 2 AM
3. DPM restore of .bak to RPT @ 3 AM
4. SQL restore on RPT @ 4 AM
The problem is, there is no way to automate restores in DPM's GUI; it must be done through PowerShell. Here's the script that did it.
$from = Get-Date -f "dd-MMMM-yyyy 00:01:00"
$to = Get-Date -f "dd-MMMM-yyyy HH:mm:ss"
–>>these two lines define the time parameters for the "searchoption"
$pg = Get-ProtectionGroup DPMGROUPNAME
$ds = Get-Datasource $pg[8]
–>>this sets the $pg variable, and $pg[8] uses the ninth item from its output to set the $ds variable
–>>it gets a little complex below this line
$so = New-SearchOption -SearchString db_dump_file.BAK -FromRecoveryPoint "$from" -ToRecoveryPoint "$to" -SearchDetail filesfolders -SearchType contains -Recursive -Location "F:\"
–>>with the search option variable set, you can now choose which item to recover
$ri = Get-RecoverableItem -Datasource $ds[2] -SearchOption $so
$ro = New-RecoveryOption -TargetServer RPTSERVERNAME -RecoveryLocation copytofolder -FileSystem -AlternateLocation "F:\MSSQL\DailyLiveRestore" -OverwriteType overwrite -RecoveryType Restore
–>>now comes the actual recovery command; much simpler, only two options
Recover-RecoverableItem -RecoverableItem $ri -RecoveryOption $ro
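To make the time window concrete, here is what those two format strings produce (the run time is an example; "00:01:00" passes through as a literal, while "HH:mm:ss" picks up the current time):

```powershell
# Example: if the script fires at 03:00:00 on 5 March 2010, then
#   $from = "05-March-2010 00:01:00"   (just past midnight today)
#   $to   = "05-March-2010 03:00:00"   (the moment the script runs)
# so the search below only sees recovery points created earlier the same day.
```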
Here are all the commands without my comments:
$from = Get-Date -f "dd-MMMM-yyyy 00:01:00"
$to = Get-Date -f "dd-MMMM-yyyy HH:mm:ss"
$pg = Get-ProtectionGroup DPMGROUPNAME
$ds = Get-Datasource $pg[8]
$so = New-SearchOption -SearchString db_dump_file.BAK -FromRecoveryPoint "$from" -ToRecoveryPoint "$to" -SearchDetail filesfolders -SearchType contains -Recursive -Location "F:\"
$ri = Get-RecoverableItem -Datasource $ds[2] -SearchOption $so
$ro = New-RecoveryOption -TargetServer RPTSERVERNAME -RecoveryLocation copytofolder -FileSystem -AlternateLocation "F:\MSSQL\DailyLiveRestore" -OverwriteType overwrite -RecoveryType Restore
Recover-RecoverableItem -RecoverableItem $ri -RecoveryOption $ro
Working backwards, it should become clear what parameters are required before the next cmdlet will work. Remember that in a list of items, the first item is 0, not 1. Run commands like "Get-ProtectionGroup DPMGROUPNAME" interactively so you will know what to use for $pg[#] when setting a datasource. I used the command "Get-Date -f "dd-MMMM-yyyy HH:mm:ss"" because this runs at 3 AM, and I needed the SearchOption to limit itself to recovery points between 1 AM and 3 AM on the current day. As you can see, PowerShell experience is a must.
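When you're working out which index numbers to use, it helps to print each item next to its zero-based index. A sketch, run interactively in the DPM Management Shell, using the same placeholder group name as above:

```powershell
# Sketch: list protection groups and datasources with their zero-based
# indexes, so you know what to plug into $pg[#] and $ds[#].
$pg = Get-ProtectionGroup DPMGROUPNAME
$i = 0
$pg | ForEach-Object { "{0}: {1}" -f $i, $_.FriendlyName; $i++ }

$ds = Get-Datasource $pg[8]   # ninth group (index 8), as in the script
$i = 0
$ds | ForEach-Object { "{0}: {1}" -f $i, $_.Name; $i++ }
```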
OK, on to the automation side of things: PowerShell scripts should be saved with the file extension .ps1. Now, you can't just double-click the file and have it execute; you have to tell Windows PowerShell to load the DPM cmdlets so it can interpret the commands in the .ps1 file. Only then can you make this a scheduled task.
Here's how I did it. First, I customized the commands above for my environment and saved them in a .ps1 file (C:\Scripts\AutoRecover\SQL-DB-RECOVER.ps1). I then saved a batch file with the following command:
C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe -PSConsoleFile "C:\Program Files\Microsoft DPM\DPM\bin\dpmshell.psc1" -command ".'C:\Scripts\AutoRecover\SQL-DB-RECOVER.ps1'"
Then I created the scheduled task to kick off the batch file, which in turn runs the PowerShell script.
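For reference, the scheduled task itself can also be created from the command line. A sketch, assuming the batch file was saved as C:\Scripts\AutoRecover\run-restore.bat (that name is my own example) and using the 3 AM slot from the schedule above:

```shell
:: Sketch: create a daily 3 AM task that runs the batch file.
:: Run from an elevated cmd prompt; task and file names are examples.
:: (Older Windows versions may want the time as HH:MM:SS.)
schtasks /create /tn "DPM-AutoRestore" /tr "C:\Scripts\AutoRecover\run-restore.bat" /sc daily /st 03:00
```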
Because of the "-OverwriteType overwrite" option, the recovered .bak file will replace the previous one each night on the report server, and then SQL on the report server will recover it with its own scheduled job. NOTE: a DPM agent must be installed on the target server for DPM to recover to it.
And there it is. I know my post became a little vague towards the end, but I’m getting a little sleepy here. Please, feel free to email me if you need help getting this to work in your environment, or if you just have any questions.
If you took the time to read this, I really hope this has helped you out in your implementation of DPM.
Finally, I’d like to thank my good friend, Sassan K. for getting me started with Microsoft DPM last year.
-Rick Estrada

Live Mesh service is unavailable – Error during install

So I've started playing around with Microsoft's Live Mesh application and am loving it!

I have in the past used sync tools like SyncToy and online storage like Google Docs, but this is on a whole other level, not to mention the remote desktop-like PC control.
This morning, I decided to install it on my Thinkpad X40, which is running Vista Business, but I kept getting an error:

Live Mesh service is unavailable.
Please retry your installation later.
[80072F78] The server returned an invalid or unrecognized response

I rebooted over and over, but had no success. Later I realized my AV software was still running. As soon as I disabled it, Live Mesh finished installing and started up with no problem. After a reboot, the AV came up and so did Live Mesh, and it has been working OK since.
Just fyi, I’m running Avast! 4.8 Home edition.
Hope this helps, because I found no help googling this error.
-Rick Estrada