Tuesday, February 23, 2016

Office 365 and Azure Powershell Automation for Education or Business

Deploying Office 365 in a large-scale environment will almost always require you to leverage PowerShell. For a deployment I recently worked on, there were two areas where PowerShell saved a lot of time: setting the default User Principal Name (UPN), and licensing and activation. If you are syncing your entire Active Directory, running a script as a scheduled task can save you a lot of time over manually assigning licenses. This works especially well for education customers, where many Microsoft features are given away for free. Your UPN is what Office 365 uses to identify your login name. If your accounts are defaulting to an onmicrosoft.com address instead of your domain, the PowerShell cmdlets will save you a lot of time. Setting the default UPN is as simple as the following command:
Set-MsolUserPrincipalName -UserPrincipalName $upn -NewUserPrincipalName $email 
Setting licenses was more complex. To assign licenses and disable certain plans, the following cmdlets were used.
Set-MsolUser -UserPrincipalName $upn -UsageLocation US
$lic = New-MsolLicenseOptions -AccountSkuId isd622org:STANDARDWOFFPACK_IW_FACULTY -DisabledPlans EXCHANGE_S_STANDARD
Set-MsolUserLicense -UserPrincipalName $upn -AddLicenses isd622org:STANDARDWOFFPACK_IW_FACULTY
Set-MsolUserLicense -UserPrincipalName $upn -LicenseOptions $lic
These commands set the user's location to the United States, applied a faculty license to the user, and disabled Exchange (email and calendaring functionality). To see what licenses are available in your Office 365 environment you can use Get-MsolAccountSku. To see what services are available to a particular user after a license is assigned, you can use Get-MsolUser as shown below.
#Show the provisioning status of each licensed service for a user
(Get-MsolUser -UserPrincipalName "user@contoso.com").Licenses.ServiceStatus
In order to run the completed PowerShell script as a scheduled task, the username and password were passed to the Connect-MsolService cmdlet as shown.
$User = "user@user.com"
$Pass = "password"
$Cred = New-Object System.Management.Automation.PsCredential($User,(ConvertTo-SecureString $Pass -AsPlainText -Force))
Import-Module MSOnline
Connect-MsolService -Credential $Cred
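Hard-coding the password works, but it leaves the credential readable to anyone who can open the script. A hedged alternative (assuming the scheduled task runs as the same Windows account that saved the file, since Export-Clixml protects the password with DPAPI for that user; the path is just an example) is:

```powershell
# Run once, interactively, as the account the scheduled task will use
Get-Credential | Export-Clixml -Path C:\Scripts\o365cred.xml

# Then, in the scheduled script:
Import-Module MSOnline
$Cred = Import-Clixml -Path C:\Scripts\o365cred.xml
Connect-MsolService -Credential $Cred
```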
A completed script may look something like this. You would obviously want to make tweaks based on your environment.
$User = "user@contoso.com"
$Pass = "password"
$Cred = New-Object System.Management.Automation.PsCredential($User,(ConvertTo-SecureString $Pass -AsPlainText -Force))
Import-Module MSOnline
Connect-MsolService -Credential $Cred

$users=Get-MsolUser -All
foreach ($user in $users){

    $upn = $user.UserPrincipalName

    #Get rid of *.onmicrosoft.com
    if ($upn -like "*.onmicrosoft.com"){
        # Build the new UPN on your own domain (replace contoso.com with yours)
        $email = $upn -replace "@.+\.onmicrosoft\.com$", "@contoso.com"
        Set-MsolUserPrincipalName -UserPrincipalName $upn -NewUserPrincipalName $email
    }

    if (-Not $user.isLicensed){
        Set-MsolUser -UserPrincipalName $upn -UsageLocation US
        $lic = New-MsolLicenseOptions -AccountSkuId contosocom:STANDARDWOFFPACK_IW_FACULTY -DisabledPlans EXCHANGE_S_STANDARD
        Set-MsolUserLicense -UserPrincipalName $upn -AddLicenses contosocom:STANDARDWOFFPACK_IW_FACULTY
        Set-MsolUserLicense -UserPrincipalName $upn -LicenseOptions $lic
    }
}
You may consider changing the line $users=Get-MsolUser -All to $users=Get-MsolUser -UnlicensedUsersOnly to speed up execution in future runs; just note that the UPN cleanup above will then skip already-licensed users.
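To actually run the completed script on a schedule, one approach is the built-in schtasks tool (the task name, start time, and script path here are just examples):

```bat
schtasks /Create /TN "Office365Licensing" /SC DAILY /ST 02:00 ^
 /RU SYSTEM /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\AssignLicenses.ps1"
```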

Tuesday, July 1, 2014

Emails Displaying as www-data for Google Apps for Education

Recently I was working with Google Apps for Education and was transitioning an old PHP web application to deliver email using Google's servers. When the mail was delivered, the sender name displayed was www-data. To change this I had to edit the sendmail_path in the php.ini file, which was located at /etc/php5/apache2/php.ini:
sendmail_path = '/usr/sbin/sendmail -t -i -fnoreply@joshwesley.com -Fnoreply'
Once the sendmail path was modified, a quick restart of Apache was done and emails were delivered with the proper name, not www-data, in the Google Apps for Education environment. The -f flag sets the envelope sender address and -F sets the display name.

Sunday, March 3, 2013

VMWare ESXi 5.1 RAID Email Alerts

So I bought myself a new 3ware 9650SE-4LPML RAID controller for my ESXi 5.1 server and ran into a few issues regarding email alerts. I got it installed just fine, but there wasn't any software that would automatically check the status of the array and email me if something was wrong. After a little searching, I found a command-line utility called tw_cli that can check the status of the RAID array. You should be able to download it from the 3ware support site.


The real issue arose when I wanted to send an email using Gmail as my SMTP server; figuring out the syntax was a bunch of headaches. I finally got it working using openssl, which can be seen in the code below. The trick was making the input sleep so Gmail's servers had time to respond. Without the sleep commands, openssl would just hang after about two lines of input.

VMware makes you jump through a bunch of hoops to schedule tasks and to keep files from being erased after a reboot. For ESXi 5.1 you have to edit /etc/rc.local.d/local.sh and add the following lines:
/bin/kill $(cat /var/run/crond.pid)
/bin/echo "*/5 * * * * /vmfs/volumes/vol/emailalerts/tw_diskcheck" >> /var/spool/cron/crontabs/root
/usr/lib/vmware/busybox/bin/busybox crond

This will set up a cron job so the script runs every 5 minutes. You also have to make sure your files are stored on a datastore under /vmfs/volumes so they don't get erased after a reboot.

Hopefully this script can also help someone who wants to use openssl to send email using gmail. That was a real pain to get working.

UPDATE! Thanks to Paul Atherton for creating a modified version of the script. Apparently there were some issues with the way the date was parsed outside of the USA. Paul also added some enhancements, including much better comments, to the script which is now below. My old script is still running on my ESX system and working fine. In case anyone wants to reference the old version it is located here. Thanks Paul!
# To setup to run every 5 mins via cron, edit /etc/rc.local and add the following lines:
# /bin/kill $(cat /var/run/crond.pid)
# /bin/echo "*/5  *    *   *   *   /vmfs/volumes/Datastore/3Ware/tw_diskcheck" >> /var/spool/cron/crontabs/root
# /bin/crond

# To set this up instantly (before reboot), write these lines to a script, prefix these lines with:
# chmod u+w /var/spool/cron/crontabs/root
# save the script, make it executable (chmod 755 script_name), and run this script directly (./script_name)
# if all is working, after 5 mins, a lol.log file should appear in /vmfs/volumes/Datastore/3Ware/ and you should receive your first status email.

# User defined variables
USERNAME=myemail@gmail.com              # your SMTP username
PASSWORD=mypassword                     # your SMTP password
ADDRESS=smtp.gmail.com                  # your SMTP server FQDN
PORT=465                                # your SMTP server port number
TO=toemail@mymaildomain.com             # your destination e-mail address
FROM=senderemail@mydomain.com           # the sending e-mail address
PROG_PATH=/vmfs/volumes/Datastore/3Ware # the server path of this script (and tw_cli)
TWCLI=$PROG_PATH/tw_cli                 # path to the tw_cli binary, referenced below
SLEEP=3                                 # seconds to pause for SMTP responses (value assumed)
LOCALHOST=`hostname`                    # name used in the SMTP EHLO greeting


# Create log file if it doesn't exist - used to record changes in unit status
if [ ! -f $PROG_PATH/lol.log ]; then
  echo `date`" START OF FILE" > $PROG_PATH/lol.log
fi

# Create Firewall Exception file and restart service to apply - runs only if not already present
# (a restart will lose the exception and file, so first run of this script will re-create it)

if [ ! -f /etc/vmware/firewall/email.xml ]; then
  echo "<ConfigRoot>" > /etc/vmware/firewall/email.xml
  echo "    <service>" >> /etc/vmware/firewall/email.xml
  echo "        <id>email</id>" >> /etc/vmware/firewall/email.xml
  echo "        <rule id='0000'>" >> /etc/vmware/firewall/email.xml
  echo "            <direction>outbound</direction>" >> /etc/vmware/firewall/email.xml
  echo "            <protocol>tcp</protocol>" >> /etc/vmware/firewall/email.xml
  echo "            <porttype>dst</porttype>" >> /etc/vmware/firewall/email.xml
  echo "            <port>$PORT</port>" >> /etc/vmware/firewall/email.xml
  echo "        </rule>" >> /etc/vmware/firewall/email.xml
  echo "        <enabled>true</enabled>" >> /etc/vmware/firewall/email.xml
  echo "        <required>false</required>" >> /etc/vmware/firewall/email.xml
  echo "    </service>" >> /etc/vmware/firewall/email.xml
  echo "</ConfigRoot>" >> /etc/vmware/firewall/email.xml
  esxcli network firewall refresh
fi

# Test up to 3 times to see if firewall rule is present
for i in 1 2 3
do
  WORKING_EMAIL=`esxcli network firewall ruleset list | grep email | awk '{print $2}'`
  echo "Checking Firewall rule exists - attempt: "$i
  if [ "$WORKING_EMAIL" = true ]; then
    echo "Firewall rule checked out OK on attempt: "$i
    break
  fi
done

if [ "$WORKING_EMAIL" != true ]; then
  echo `date`" After 3 attempts the firewall rule could not be detected. Aborting." # >> $PROG_PATH/lol.log
  exit 1
fi

ENC_PASS=`echo -ne "\0"$USERNAME"\0"$PASSWORD | openssl base64` #encode username and password
CTL_NAME=`$TWCLI info|grep -E "^c"|awk '{print $1}'` #get controller name

# Get day name for use below in Sunday status update
DAY=`date|awk '{print $1}'`

# Build time as a serial - i.e. remove colons - used as time source for Sunday status update
TIME=`date|awk '{print $4}'`
HH=`echo $TIME | awk -F\: '{print $1}'`
MM=`echo $TIME | awk -F\: '{print $2}'`
SS=`echo $TIME | awk -F\: '{print $3}'`
TIME=$HH$MM$SS  # recombine as HHMMSS so the Sunday check can compare numerically

# Get unit status for each unit - all on one line - each unit status separated by a space
UNITSTATUS=`$TWCLI info $CTL_NAME unitstatus|grep -E "^u"|awk '{printf "%s ",$3}'|sed 's/ *$//'`

# Get the last unit status report from the log file
LAST_STATUS=`tail -1 $PROG_PATH/lol.log`

# Write status to screen
echo "Previous Unit Status   (from log): "$LAST_STATUS
echo "Current Unit Status (from tw_cli): "$UNITSTATUS

# If the unit status has changed since the last log report then...
if [ "$UNITSTATUS" != "$LAST_STATUS" ]; then
  # Compose and send the e-mail
  (echo -e "EHLO $LOCALHOST";echo -e "AUTH PLAIN $ENC_PASS";echo -e "MAIL FROM: <$FROM>";sleep $SLEEP;echo -e "RCPT TO: <$TO>";sleep $SLEEP;echo -e 'DATA';sleep $SLEEP;echo -e "SUBJECT: `hostname` DISK STATUS: $UNITSTATUS";sleep $SLEEP;$TWCLI info $CTL_NAME;sleep $SLEEP;echo -e '.';sleep $SLEEP;echo -e 'quit')|openssl s_client -pause -connect $ADDRESS:$PORT -ign_eof -crlf
  # then write the new status update to the log
  echo `date` >> $PROG_PATH/lol.log
  echo $UNITSTATUS >> $PROG_PATH/lol.log
fi

# Email once on Sunday around 10am. Lets me know the script is still running.
if [ "$DAY" = "Sun" ] && [ "$TIME" -gt "100000" ] && [ "$TIME" -lt "101010" ]; then
  (echo -e "EHLO $LOCALHOST";echo -e "AUTH PLAIN $ENC_PASS";echo -e "MAIL FROM: <$FROM>";sleep $SLEEP;echo -e "RCPT TO: <$TO>";sleep $SLEEP;echo -e 'DATA';sleep $SLEEP;echo -e "SUBJECT: `hostname` WEEKLY DISK CHECK: $UNITSTATUS";sleep $SLEEP;$TWCLI info $CTL_NAME;sleep $SLEEP;echo -e '.';sleep $SLEEP;echo -e 'quit')|openssl s_client -pause -connect $ADDRESS:$PORT -ign_eof -crlf
  echo `date` >> $PROG_PATH/lol.log
  echo " $UNITSTATUS" >> $PROG_PATH/lol.log
fi
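For reference, the ENC_PASS line in the script above builds an SMTP AUTH PLAIN token, which is simply the NUL-delimited username and password, base64-encoded (RFC 4616). A standalone sketch, using printf instead of echo -ne for portability:

```shell
# Build an SMTP AUTH PLAIN token: "\0username\0password", base64-encoded
USERNAME=myemail@gmail.com
PASSWORD=mypassword
ENC_PASS=$(printf '\000%s\000%s' "$USERNAME" "$PASSWORD" | openssl base64)
echo "$ENC_PASS"
```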

Tuesday, February 26, 2013

Remote Desktop Connection Manager 2.2 Clear Text Gateway Passwords?

So if you are like me and want to save your RDCM RDG file in a central location, you will have to store your passwords in clear text. I do this so I can share the RDG file across multiple machines. By default, gateway passwords are encrypted in your RDG file, preventing them from being reused when the file is opened on another machine. However, if you edit the RDG file in a text editor, you can make some simple changes to disable the encryption.

Find the line <password storeAsClearText="False"> and change it to <password storeAsClearText="True">. Replace the encrypted characters between the <password> tags with your password, save the file, and you should be good to go.
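For context, once edited, the credentials section of a server entry ends up looking something like this (the user name, domain, and password are placeholders):

```xml
<logonCredentials inherit="None">
    <userName>administrator</userName>
    <domain>CONTOSO</domain>
    <password storeAsClearText="True">MyP@ssw0rd</password>
</logonCredentials>
```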

You'll also have to store your login credential passwords in clear text; there is a check box built into RDCM that allows this.

Take special care to keep your RDG file in a secure location, as the passwords to your RDP hosts will be visible in plain text.

SCCM 2012 SP1 Operating System Installing to the D: Drive

When deploying a Windows 7 image using SCCM 2012 SP1, I noticed that it was defaulting to installing on the D: drive rather than C:. After a little head scratching, I found a new variable added in SP1 called OSDPreserveDriveLetter.

Here is the official word from Microsoft as to what OSDPreserveDriveLetter is used for:

OSDPreserveDriveLetter: This variable determines whether the task sequence uses the drive letter on the operating system image WIM file. In Configuration Manager with no service pack, the drive letter on the WIM file was used when it applied the operating system image WIM file. In Configuration Manager SP1, you can set the value for this variable to False to use the drive letter that you specify in the task sequence.

Here is how I fixed my issue. I had to manually set the OSDPreserveDriveLetter variable to false by adding a Set Task Sequence Variable step as shown below.

Next I set a drive letter variable for the sequence.

This variable can then be used in the Apply Operating System task.
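Summarized as text, the three task sequence edits look roughly like this (the OSDisk variable name is just an example; any name works as long as the steps match):

```
Set Task Sequence Variable:  OSDPreserveDriveLetter = False
Set Task Sequence Variable:  OSDisk = C:
Apply Operating System:      Destination = "Logical drive letter stored in a variable",
                             Variable name = OSDisk
```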

Now when you run your OSD, Windows 7 should install to the C: drive instead of D:.