Find and list inactive Exchange mailboxes with PowerShell

So, I tinkered with this a while back and found the following commands very useful to complete the routine task most system admins have of disabling inactive mailboxes. In this case, I needed to list inactive mailboxes on our Exchange 2007 mailbox servers. Then, I needed commands or tasks to run against this list. Here’s my script:

1. Find mailboxes with no login in over 90 days and output the list to a txt file. Edit the path to your mailbox database and the path to the txt file output:
Get-MailboxStatistics | where {$_.Database -eq "Server1\StorageGroup1\MailboxDatabase1" -and $_.LastLogonTime -lt (Get-Date).AddDays(-90) -and $_.ObjectClass -eq "Mailbox"} | sort-object LastLogonTime | ft DisplayName > C:\ExchangeCleanup\NoLogin90Days.txt
2. Use ".TrimEnd" to clean up trailing spaces and rewrite the cleaned-up txt file:
$list1 = get-content C:\ExchangeCleanup\NoLogin90Days.txt
$list1 | foreach {$_.TrimEnd()} > C:\ExchangeCleanup\NoLogin90Days.txt
3. Disable the mailboxes that were in the NoLogin90Days.txt file:
get-content "C:\ExchangeCleanup\NoLogin90Days.txt" | disable-mailbox
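One thing to watch for here: Format-Table writes a "DisplayName" header row and a dashed underline into the txt file, and pads each name with trailing spaces (which is why step 2 is needed). If you'd rather skip the cleanup, here's a sketch of a variation on step 1 that writes just the bare names, using the same database and file names as above:
# Writes one display name per line, with no table header, underline, or padding
Get-MailboxStatistics | where {$_.Database -eq "Server1\StorageGroup1\MailboxDatabase1" -and $_.LastLogonTime -lt (Get-Date).AddDays(-90) -and $_.ObjectClass -eq "Mailbox"} | sort-object LastLogonTime | foreach {$_.DisplayName} > C:\ExchangeCleanup\NoLogin90Days.txt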

Now, you’re correct that I could just take the output from step 1 and go straight into step 3. HOWEVER, I’d like to be able to review the list of inactive users, and edit the list if needed to avoid disabling any system or service mailboxes that don’t actually have users logging into them.
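For what it's worth, here's roughly what that one-pipeline version would look like. This is just a sketch, and I've added the -WhatIf switch so it only previews what would be disabled; remove it once you're happy with the list:
# Preview only; remove -WhatIf to actually disable the mailboxes
Get-MailboxStatistics | where {$_.Database -eq "Server1\StorageGroup1\MailboxDatabase1" -and $_.LastLogonTime -lt (Get-Date).AddDays(-90) -and $_.ObjectClass -eq "Mailbox"} | foreach {Disable-Mailbox -Identity $_.DisplayName -WhatIf}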
Another task I use this for is to move the inactive accounts to mailbox servers with more space. I usually run this before step 3 above:
get-content "C:\ExchangeCleanup\NoLogin90Days.txt" | move-mailbox -targetdatabase "Server2\StorageGroup2\MailboxDatabase2"

I strongly recommend that you test these commands in a test environment before running them on your production systems. I've tested functionality on Exchange 2007 and 2010, but your environment may differ.
Rick Estrada

Automate backups of a SQL Express database

We have a work order tracking system that runs on SQL Server 2005 Express Edition. One of the many limitations of SQL Express is its lack of the SQL Server Agent found in the full editions of SQL Server, which lets you schedule automated tasks, one being scheduled backups.

So, to work around this limitation, I created a scheduled task in Windows that runs the following commands from a batch file to automatically back up the database:
osql -E -S SERVER\SQLINSTANCE -Q "BACKUP DATABASE [DATABASENAME] TO [BACKUP_DEVICE] WITH RETAINDAYS = 7, NOFORMAT, NOINIT, NAME = N'DATABASENAME-Full Database Backup', SKIP, NOREWIND, NOUNLOAD, STATS = 10"
osql -E -S SERVER\SQLINSTANCE -Q "BACKUP LOG DATABASENAME WITH TRUNCATE_ONLY"
osql -E -S SERVER\SQLINSTANCE -Q "DBCC SHRINKDATABASE (DATABASENAME, TRUNCATEONLY)"
OK, one thing you need to do before you go out and use this: create a "Backup Device," which you can do with SQL Server Management Studio Express. If you need a copy, just google it; it's a free download from Microsoft. The second and third commands are basic DB maintenance; they aren't required for the backup itself, but it doesn't hurt to run them at the same time.
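If you'd rather not use Management Studio Express for that step, the backup device can also be created from the command line with sp_addumpdevice. This is just a sketch; the logical device name and file path are placeholders you'd swap for your own:
osql -E -S SERVER\SQLINSTANCE -Q "EXEC sp_addumpdevice 'disk', 'BACKUP_DEVICE', 'C:\Backups\DATABASENAME.bak'"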
That's about it: just edit the commands above, set your retention period (RETAINDAYS = x), put them in a batch file, and create a scheduled task with your time requirements.
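If you prefer the command line for that last part too, here's a sketch of creating the task with schtasks. The task name and batch file path are placeholders, and depending on your Windows version the start time format may be HH:MM or HH:MM:SS:
schtasks /create /tn "SQLExpressBackup" /tr "C:\Scripts\sqlbackup.bat" /sc daily /st 01:00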
Email me if you have any questions or suggestions.
–Rick Estrada

Automating a DPM Recovery

OK, so we're using Microsoft Data Protection Manager 2007 (DPM) as our backup solution for several of our SQL DB and Windows file servers. It's pretty straightforward stuff, but like most new software, documentation for DPM is sparse and googling for answers yields little help. Let me describe one problem we faced, then give you an overview of the solution I engineered.
The DB system runs on two clustered Live servers and is required (by the vendor) to be replicated to a Report server for resource-consuming queries. This is a SQL Server 2005 implementation.
The vendor put a backup job in place that basically does a DB dump to a .bak file every night at 1AM, with hourly log dumps after that. We have implemented DPM with VSS to back up that .bak file nightly, as soon as the dump finishes.
Now, because the DB must be replicated to the Report server nightly anyway, we use that nightly restore to validate our backups as well.
So what we needed to do looks like this:
1. SQL Dump @ 1AM
2. DPM Backup of .bak file @ 2AM
3. DPM Restore .bak to RPT @ 3AM
4. SQL Restore on RPT @ 4AM
The problem is that there is no way to automate restores in DPM's GUI; it must be done through PowerShell. Here's the script that did it.
$from = Get-Date -f "dd-MMMM-yyyy 00:01:00"
$to = Get-Date -f "dd-MMMM-yyyy HH:mm:ss"
-->> these two lines define the time parameters for the SearchOption
$pg = Get-ProtectionGroup DPMSERVERNAME
$ds = Get-Datasource $pg[8]
-->> this sets the $pg variable, and $pg[8] uses the ninth item in its output to set the $ds variable
-->> it gets a little complex below this line
$so = New-SearchOption -SearchString db_dump_file.BAK -FromRecoveryPoint "$from" -ToRecoveryPoint "$to" -SearchDetail filesfolders -SearchType contains -Recursive -Location "F:\"
-->> with the SearchOption variable set, you can now choose which item to recover
$ri = Get-RecoverableItem -Datasource $ds[2] -SearchOption $so
$ro = New-RecoveryOption -TargetServer rptserver.domain.com -RecoveryLocation copytofolder -FileSystem -AlternateLocation "F:\MSSQL\DailyLiveRestore" -OverwriteType overwrite -RecoveryType Restore
-->> now comes the actual recovery command; it's much simpler, with only two options
Recover-RecoverableItem -RecoverableItem $ri -RecoveryOption $ro
Here are all the commands without my comments:
$from = Get-Date -f "dd-MMMM-yyyy 00:01:00"
$to = Get-Date -f "dd-MMMM-yyyy HH:mm:ss"
$pg = Get-ProtectionGroup DPMSERVERNAME
$ds = Get-Datasource $pg[8]
$so = New-SearchOption -SearchString db_dump_file.BAK -FromRecoveryPoint "$from" -ToRecoveryPoint "$to" -SearchDetail filesfolders -SearchType contains -Recursive -Location "F:\"
$ri = Get-RecoverableItem -Datasource $ds[2] -SearchOption $so
$ro = New-RecoveryOption -TargetServer rptserver.domain.com -RecoveryLocation copytofolder -FileSystem -AlternateLocation "F:\MSSQL\DailyLiveRestore" -OverwriteType overwrite -RecoveryType Restore
Recover-RecoverableItem -RecoverableItem $ri -RecoveryOption $ro
Working backwards, it should become clear what parameters are required before the next cmdlet will work. Remember that in a list of items, the first item is 0, not 1. Run commands like "Get-ProtectionGroup DPMSERVERNAME" interactively so you will know what to use for $pg[#] when setting a datasource. I used "Get-Date -f "dd-MMMM-yyyy HH:mm:ss"" because this runs at 3AM, and I needed the SearchOption to limit itself to recovery points between 1AM and 3AM on the current day. As you can see, PowerShell experience is a must.
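If you want to see the zero-based indexes before you commit to a $pg[#] or $ds[#], a little loop helps. This is a sketch, and it assumes the protection group objects expose a FriendlyName property (if yours don't, just output the objects themselves):
# Print each protection group with its zero-based index
$pg = Get-ProtectionGroup DPMSERVERNAME
$i = 0
$pg | foreach { "{0}: {1}" -f $i, $_.FriendlyName; $i++ }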
OK, on to the automation side of things: PowerShell scripts should be saved with the file extension .ps1. Now, you can't just double-click the file to execute it; you have to tell Windows PowerShell to load the DPM cmdlets so it can interpret the commands in the .ps1 file. Only then can you make this a scheduled task.
Here's how I did it. First, I customized the commands above for my environment and saved them in a .ps1 file (C:\Scripts\AutoRecover\SQL-DB-RECOVER.ps1). I then saved a batch file with the following command:
C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe -PSConsoleFile "C:\Program Files\Microsoft DPM\DPM\bin\dpmshell.psc1" -command ".'C:\Scripts\AutoRecover\SQL-DB-RECOVER.ps1'"
THEN I created the scheduled task to kick off the batch file, which kicks off the PowerShell script.
Because of the "-OverwriteType overwrite" option, the recovered .bak file will replace the previous one each night on the report server, and then SQL on the report server will recover it with its own scheduled job. NOTE: a DPM agent must be installed on the target server for DPM to recover to it.
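For completeness, the report server's scheduled job runs something along these lines. Again, just a sketch; the database name and restore path are placeholders matching the recovery folder above, and WITH REPLACE is what lets it overwrite the existing database each night:
RESTORE DATABASE [DATABASENAME] FROM DISK = N'F:\MSSQL\DailyLiveRestore\db_dump_file.BAK' WITH REPLACE, STATS = 10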
And there it is. I know my post became a little vague towards the end, but I'm getting a little sleepy here. Please feel free to email me if you need help getting this to work in your environment, or if you just have any questions.
If you took the time to read this, I really hope this has helped you out in your implementation of DPM.
Finally, I’d like to thank my good friend, Sassan K. for getting me started with Microsoft DPM last year.
THANKS SASSAN!
-Rick Estrada