Greg's Tech blog

My technical journal where I record my challenges with Linux, open source SW, Tiki, PowerShell, Brewing beer, AD, LDAP and more...

Sorting AD Objects by OU

Wednesday 30 of September, 2020

Today I was trying to take a list of AD computer objects and group them by OU.  The obvious approach was to sort by distinguishedName.  But that sort is simply alphabetical, and since the computer name is listed first in the DN, it amounts to a sort by computer name.


Then I tried canonicalName and got results I could use.


Here is the PowerShell I used:

Get-ADComputer -Filter * -Properties operatingSystem,canonicalName | Select-Object name,operatingSystem,canonicalName,distinguishedName | Sort-Object canonicalName | Format-Table

Note: You must specify "-Properties canonicalName" to force the server to return the canonicalName property so you can use it.
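To see why canonicalName sorts the way I wanted, compare the two formats for a hypothetical computer (the names here are made up).  The DN leads with the CN, so an alphabetical sort groups by computer name; the canonical name leads with the domain and OU path, so the same sort groups by OU:

```
distinguishedName: CN=PC042,OU=Sales,DC=corp,DC=example,DC=com
canonicalName:     corp.example.com/Sales/PC042
```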


Converting Audio with ffmpeg

Monday 08 of April, 2019

Needed to convert a couple of WMA files to MP3 because their stream layout wouldn't play in Subsonic without a special conversion command. This took care of it:

for file in *.wma; do ffmpeg -i "$file" -map 0:1? -b:a 320k -v 32 -f mp3 "/$(basename -s .wma "$file").mp3"; done   

The tricky part was using basename to cut off the extension so I could rename the file.
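The idiom is worth remembering: basename strips the directory part, and -s additionally strips a given suffix, which makes rebuilding the name with a new extension trivial. A quick illustration (the path is made up):

```shell
src="/music/incoming/track one.wma"
# basename drops the directory; -s .wma also drops the extension
stem=$(basename -s .wma "$src")
echo "$stem"        # track one
echo "$stem.mp3"    # track one.mp3
```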


Manipulating Native Excel Files in PowerShell

Wednesday 03 of October, 2018

The first time I used Import-CSV to access data in a CSV file, I knew I was leaving VBScript behind for PowerShell.  That was probably more than 10 years ago and I have had a lot of fun with PowerShell since then.  Today, I had reason to manipulate native Excel files (OOXML) and found the process as simple as CSV.  This all starts with a module from the PowerShell Gallery called ImportExcel, and it can be installed by running Install-Module ImportExcel in a pwsh window.

Once installed, accessing an Excel file is as simple as accessing a CSV file.  Execute:

$Data = Import-Excel C:\temp\somefile.xlsx

As with Import-Csv, you can now access the various columns of the worksheet using the header name, and the rows by the array index of the $Data object.

EX: $Data[1].Name will display the value of the Name column in the second row (the array is zero-based).


Managing a TP-Link WiFi Smart Plug with PowerShell

Thursday 14 of June, 2018

I was given a TP-Link HS100 WiFi Smart Plug the other day.  It works with TP-Link's Kasa service and from your smart phone.  Kasa does support remote control of the devices, and this is done through a RESTful API service.  As I have a SmartThings hub, I went looking for a way to integrate it with that.  Integration isn't directly available, but I found this useful article from the ITNerd Space blog that shows you how to use JSON posts to a REST service to manage the device.

My scripting language of choice is PowerShell (now available on Linux) so I wrote a couple scripts to turn the plug on & off and query the state.

The scripts handle all the setup; you need only provide your Kasa service credentials and the alias for the smart plug.



Building a BrewPi

Tuesday 23 of January, 2018

I was a technology geek for years prior to brewing. I have a degree in Computer Science and have worked as a Systems Engineer and Engineering manager for 30 years.  Since 2012, brewing gave me a hobby different in many ways from my work.  At the same time it feeds my need to learn new things. 
For Christmas this year my wife gave me a Tilt Hydrometer.  It's a great device that measures specific gravity by floating in the wort and reports it via Bluetooth to a smart phone app.

Of course, the first question I asked was - what if I want to log this data? A bit of research led me to the BrewPi software and then to Fermentrack. Fermentrack is derived from BrewPi and has the ability to receive the Bluetooth data from the Tilt and graph it over time.  Here's my experience building the device.

First, I bought a CanaKit from Amazon.



High Resolution screen, Remote Desktop and VirtualBox

Wednesday 27 of September, 2017

I bought a 2016 Yoga laptop with a hi-res (3200 x 1800) screen. I'm running the Windows Insider builds of Windows 10. Running a hi-res screen turns up several issues with apps that aren't prepared for all that resolution. One area I had an issue with was Remote Desktop, which I've now fixed.

The first thing to do is change the Windows desktop scaling factor. Windows recommends a 250% scale factor for my machine and I'm using that. The next thing to do is read a great reference from Scott Hanselman on living the hi-res lifestyle.

What I experienced with Remote Desktop to another Win 10 machine was a small window, unreadable to my 55-and-over eyesight.

Doing more research, I came across this article from Falafel on Remote Desktop and hi-res.  The tip from Falafel is to make use of Remote Desktop Connection Manager and configure the display settings to use Full Screen. This will scale the remote desktop window to match your local screen, and it solved my problem.

The last issue was VirtualBox.  One of my remote PCs has a VirtualBox VM running Slackware. After scaling the remote desktop I opened the VM and it had not scaled.  After saying "hmmm", I went poking around the display settings for the VM and found the Scale Factor setting. Setting this to 200% gave me a usable VM in a remote desktop session.

Powershell on Linux

Monday 18 of September, 2017

I've been learning a lot about Microsoft's Linux initiatives over the past couple weeks.  I've started using the Windows Subsystem for Linux in lieu of putty for connecting to my Linux machines and recently started playing with the PowerShell implementation on Linux.  Last week I had a need to do some scripting on Linux and wanted to re-use some code I had on hand.

PowerShell can be installed from the repository on most machines.  The PowerShell github page has the details on how to configure your package manager to draw directly from the repository.

For my challenge, I wanted to profile the download speed of a particular website I help manage.  I already had a PS script that does most of what I wanted.  It was a simple task of reconfiguring it and testing to be sure all the features were available in the current Linux PS beta.  Here's the script.

$url = "http://files.myfakeco.com/downloads/filedownload/d3974543.zip"
$timer = Measure-Command {
    $req = [System.Net.WebRequest]::Create($url)
    try {
        $res = $req.GetResponse()
        $requestStream = $res.GetResponseStream()
        $readStream = New-Object System.IO.StreamReader $requestStream
        # Read the body so the timer reflects the full download
        $null = $readStream.ReadToEnd()
    }
    catch [System.Net.WebException] {
        $res = $_.Exception.Response
    }
}
$sec = $timer.TotalMilliseconds / 1000
$size = $res.ContentLength
$kbs = "{0:N2}" -f ($size / $sec / 1000)
$ssec = "{0:N2}" -f $sec
echo "$size, $ssec, $kbs"
"$(Get-Date -f "MM-dd-yyyy hh:mm tt"), $($res.StatusCode), $size, $ssec, $kbs `r`n" | Out-File -Append /mnt/samba/Docs/dllog.txt
The script makes use of the .Net WebRequest API. It downloads the file and reports status and stats derived from timing the download using Measure-Command.

But the best part of this is that the exact same code runs on Windows PowerShell.  I only modified the code to meet my specific needs for this report.

Fun with WSL (Ubuntu on Windows)

Tuesday 15 of August, 2017

I'm running Windows 10 1703 and have been toying with the Windows Subsystem for Linux (WSL). This version is based on Ubuntu.  There is some fun in making it useful.

SSH into WSL

I want to use PuTTY from anywhere to access the shell. SSH requires a few things to make it useful.  Start the bash shell and edit /etc/ssh/sshd_config (sudo nano /etc/ssh/sshd_config):

  • Change the listener.
    • port 2222
  • Turn on Password Authentication (I'll discuss key auth in a bit)
  • Turn off Privilege separation. Rumor has it that it isn't implemented
  • Allow TCP port 2222 in the Windows Firewall
  • Generate host key
    • sudo ssh-keygen -A
  • Restart ssh service
    • sudo service ssh --full-restart

You should be able to ssh into the host.
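For reference, the first three bullets correspond to lines like these in /etc/ssh/sshd_config (option names from stock OpenSSH; the values are the ones chosen above):

```
Port 2222
PasswordAuthentication yes
UsePrivilegeSeparation no
```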






Using Powershell to post data to IFTTT WebHooks service

Monday 07 of August, 2017

IFTTT has many useful triggers and I like Webhooks because it can enable so many fun interactions.  My goal today is sending JSON key:value pairs to WebHooks from Powershell (my preferred scripting language and now available on Linux!).  

WebHooks will accept three named parameters via JSON (also form data and URL parameters) that can be referenced within the Action of your applet.  The parameters are named value1, value2 and value3, so the JSON should look like this:

{
    "value1":  "Good Morning",
    "value3":  "That is all.",
    "value2":  "Greg"
}

PowerShell has two cmdlets for posting this to a URL: Invoke-WebRequest and Invoke-RestMethod.  The latter is apparently a wrapper around the former and returns only the string output from the POST. Because of the possible error-checking needs, I'll focus on Invoke-WebRequest.

Here is the code that made this work:

$BaseURL = "https://maker.ifttt.com/trigger/GMhit/with/key/enteryourkeyhere"
# Note: The key (the last part of the URL) is unique to each user.
# The trigger here is GMhit and is unique to me. You would declare your own in the IFTTT service.

$body = @{ value1="Good Morning"; value2="Greg"; value3="That is all." }

# Either cmdlet works; Invoke-WebRequest also returns the status code:
# Invoke-RestMethod -URI $BaseURL -Body (ConvertTo-Json $body) -Method Post -ContentType application/json

Invoke-WebRequest -URI $BaseURL -Body (ConvertTo-Json $body) -Method Post -ContentType application/json


  • Setting the ContentType to `application/json` is important here.  This call didn't work until this was set correctly.
  • The value names are fixed and cannot be customized.

Recovering from a Bad Drive in a Greyhole storage pool

Monday 13 of February, 2017

I run an Amahi home server which hosts a number of web apps (including this blog) as well as a large pool of storage for my home.  Amahi uses greyhole (see here and here) to pool disparate disks into a single storage pool. Samba shares are then added to the pool and greyhole handles distributing data across the pool to use up free space in a controlled manner.  Share data can be made redundant by choosing to make 1, 2 or max copies of the data (where max means a copy on every disk).

The benefit over, say, RAID 5 is that 1) different size disks may be used; 2) each disk has its own complete file system which does not depend on disk grouping; 3) each file system is mounted (and can be unmounted) separately or on a different machine.

So right before the holidays, the 3TB disk on my server (paired with a 1 TB disk) started to go bad.  Reads were succeeding but took a long time.  Eventually we could no longer watch the video files we store on the server through WDTV.  Here is how I went about recovering service and the data (including the mistakes I made).

  • Bought a new 3TB drive and formatted it with ext4 and mounted it (using an external drive dock) and added it to the pool as Drive6.
  • Told greyhole the old disk was going away (drive4)
    greyhole --going=/var/hda/files/drives/drive4/gh

    Greyhole will look to copy any data off the drive that is not copied elsewhere in the pool. It has no effect on the data on the `going` disk (nothing is deleted), though the extra reads could cause further damage to a failing drive. The command ran for several days and, due to disk errors, didn't accomplish much, so I killed the process and took a new tack.


I decided to remove the disk from the pool and attempt an alternate method for recovering the data. 


  This took about two weeks to accomplish due to drive errors.  And because I was making a disk image, I eventually ran out of space on the new disk before it completed.

  • Told greyhole the drive was gone.
    greyhole --gone=/var/hda/files/drives/drive4/gh 
    Greyhole will no longer look for the disk or the data on it.  It has no effect on the data on disk. 

  • Used safecopy to make a drive image of the old disk to a file on the new disk. (If you haven't used safecopy, check it out: it will run different levels of data extraction, can be stopped and restarted using the same command, and will resume where it left off.)
    safecopy --stage1 /dev/sdd1 /var/hda/files/drives/Drive6/d1 -I /dev/null

rsync -av "/run/media/user/5685259e-b425-477b-9055-626364ac095e/gh/Video"  "/mnt/samba/"


  • Bought a 4TB drive and mounted it using an external dock as drive7; copied over and then deleted the drive image from Drive6.
  • Marked the 1TB drive (drive5) as going (see command above) and gone. This moved any good data off the 1TB drive to drive7 but left plenty of room to complete the drive image.

  • Swapped drive5 (1TB) and drive7 (4TB) in the server chassis. Retired the 1TB drive.

  • Mounted the bad 3TB drive in the external dock and resumed the safecopy using:
    safecopy --stage1 /dev/sdd1 /var/hda/files/drives/Drive7/d1 -I /dev/null
  • Mounted the drive image. The base OS for the server is Fedora 23. The drive tool includes a menu item to mount a drive image.  It worked pretty simply, mounting the image at /run/media/username/someGUID.

  • Used rsync to copy the data from the image to the data share.  I use a service script called mount_shares_locally, as the preferred method for putting data into the greyhole pool is by copying it to the samba share.  The one caveat here is that greyhole must stage the data while it copies it to the permanent location. That staging area is on the / partition under /var/hda.  I had about 300GB free on that partition, so I had to monitor the copy and kill the rsync every couple hours. Fortunately, rsync handles this gracefully, which is why I chose it over a straight copy.


A couple observations.  First, because of the way I had my greyhole shares set up, I had safe copies of the critical data. All my docs, photos and music had a safe second copy.  The data on the failed disk was disposable.  I undertook the whole process because I wanted to see if it would work; whatever I recovered would only be a plus.

This took some time and a bit of finesse on my part to get the data back.  But I like how well greyhole performed and how having the independent filesystems gave me the option to recover data on my own schedule. Finding safecopy simplified this a lot and added a new weapon to my recovery toolkit!


Reset a Whirlpool Duet washer

Monday 06 of April, 2015

We accidentally started the washer with hot water feed turned off. When the washer tried to fill, it couldn't and generated F08 E01 error codes. After clearing the codes and restarting, we eventually got to a point where the panel wouldn't light up at all. Unplugging and re-plugging the power would do nothing except start the pump.
It was obvious it needed to be cleared. After too much searching, I found this link on forum.appliancepartspros.com (cache).

It tells you to press "Wash Temp", "Spin", "Soil" three times in a row to reset the washer. Once it resets, the original error will display. Press power once to clear it. After that - all was well (of course I turned on the water first)

Adjusting Brewing Water

Tuesday 10 of March, 2015

I recently got hold of (well, asked for and received) a water analysis from the Perkasie Borough Authority and have been staring at it for more than a month wondering what to do with it. I've read the section on water in Palmer's How To Brew and some of his Water book. These are both excellent resources and while I have a science background, they are quite technical and I've been unable to turn all the details into an action to take, if any, on my brewing water.

The March 2015 issue of Zymurgy (cache) has an article by Martin Brungard on Calcium and Magnesium that has helped me turn knowledge into action. At the risk of oversimplifying the guidance, I want to draw some conclusions for my use.

Some of Martin's conclusions

  • A level of 50mg/L Calcium is a nice minimum for many beer styles
  • You may want less than 50mg/L for lagers (wish I knew that a week ago) but not lower than 20
  • A range of 10-20mg/L Magnesium is a safe bet for most beers
  • Yeast starters need magnesium at the high end of that range to facilitate yeast growth

The Water Authority rep who gave me the report said two wells feed water to our part of the borough. Looking at the two wells, the Ca and Mg values are similar averaging 85 mg/L and 25mg/L respectively.

This leaves my water right in the sweet spot for average beer styles. What about some of the edge cases like IPAs and lagers?

  • For lagers, next time I'll dilute the mash water with 50% reverse osmosis (RO) water to reduce the Ca to about 40. I may want to supplement the Mg to bring it back to 20.
  • For IPAs, I may want to add Mg to bring it up near 40 mg/L.

Building a Temperature Controller from an STC-1000

Sunday 04 of August, 2013

My son & I have been brewing beer together for 8 months now. We've been very intentional about moving slowly into the process of building both our knowledge and our brew system. As I'm already a tech geek, it is real easy for me to become a brewing geek as well and to go broke in the process. When we started collecting brewing equipment, I agreed to try to buy everything half price. Home Brew Finds has been invaluable when looking for the cheapest way to solve a brewing problem.

With the summer months and the need to lager a Dopplebock, I converted a 20 yr old dorm fridge into a fermentation fridge using 1.5" foam insulation.
Fermentation Fridge
Fermentation Fridge
And while this allowed me to lager, in this configuration it doesn't actually control the temperature so I went looking for a way to do that.

I settled on the Elitech STC-1000 as it is a cheap alternative to the Johnson Controls controller (cache). Of course, the latter controller is a full package with power cord and power connectors for the cooling and heating units. The STC-1000 by contrast consists only of the controller and control panel. Oh, and it is Celsius-only so you need to convert, but Google makes that easy ("convert 68 to celsius"). My unit cost $35 to make while the Johnson is about $70.
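The conversion Google does is just arithmetic; from a shell it's one line of integer math (68°F happens to divide evenly):

```shell
f=68
# Celsius = (Fahrenheit - 32) * 5/9
c=$(( (f - 32) * 5 / 9 ))
echo "${f}F is ${c}C"   # 68F is 20C
```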

To make use of the STC-1000, I had to build it into a package that allows for convenient use. Here's how I did it.

When I looked at the size of the STC-1000, it appeared to be the right size to fit in a standard outlet box (in the US), and it was real close in size to the GFCI cutout. I bought a plastic cover to hold the GFCI and duplex outlet. I then modified it to make the GFCI opening maybe 1/4" longer.

faceplate mod Mounted STC-1000

Next step was to mount the duplex outlet and wire it up. Keep in mind we need to control heating and cooling, so we need to power the outlets individually. To do this, you have to break the copper tab on the black-wire side of the outlet. I didn't take a before picture, but here it is after the mod.
Outlet Modification

Now we can run a wire from the heating side to one outlet and from the cooling side to the other.

The other tricky piece is understanding that the STC-1000 only provides a relay service for activating the heating and cooling circuits - it doesn't actually supply power. I dealt with that by tapping the in-bound hot lead (black wire) to both the heating and cooling connectors. This is seen here with the first loop coming from the power in going to heating and the second loop from heating to cooling:

Outlet Modification

To power the outlets, I took a short wire from the heating connector to one outlet and from the cooling connector to the other. The white wiring is pretty straightforward. Simply tap the in-bound white wire and connect it to one of the white lugs. No need for separate connections as the common wire is, um, common.
Finally, we add a tension-reliever to the box, run the temperature sensor through it, mount the outlet and buckle it up.

tensioner Mounted STC-1000

- I used a new orange 25' extension cord for the power side. I cut it in half and used wire from the unused half to do the wiring. I then added a new plug to the remaining cord so I had a usable cord.
- The STC-1000 was $19, the extension cord - $10, the box and cover $6. So this controller cost $35 plus two hours labor.

- Here is a wiring diagram
Wiring Diagram

Using getElementByID in PowerShell

Thursday 07 of March, 2013

I was asked to pull a piece of information from a web page using Powershell. Fortunately for me, the item is tagged by an html "ID" element. While testing I discovered the following code worked when I stepped through it line by line, but failed when run as a script.
(Note 1: The following is a public example, not my actual issue. This snippet returns the number vote total for the question)

$ie = new-object -com InternetExplorer.Application
$ie.Navigate("http://example.com/some-page")   # illustrative URL; the original page isn't named in the post
$ie.Document.getElementByID('post-832-score')|select innertext

The code is straightforward. It creates a new COM object that runs Internet Explorer, navigates to a specific page, then looks for a specific "id" tag in the html and outputs the value. The problem we saw was that when we attempted to run the getElementByID command, we received an error saying it could not be run on a $null object.
The question was asked during the Philly PowerShell meeting whether the script needed to wait for the $ie.Navigate command to complete before moving on. Indeed, this appears to be an asynchronous command. That is, PowerShell executes it but doesn't wait for it to complete before moving on to the next command.

The solution was the addition of a single line of code:

while ($ie.Busy -eq $true) { Start-Sleep 1 }

It simply loops until $ie is no longer busy.

The revised script looks like this:

$ie = new-object -com InternetExplorer.Application
$ie.Navigate("http://example.com/some-page")   # illustrative URL
while ($ie.Busy -eq $true) { Start-Sleep 1 }
$ie.Document.getElementByID('post-832-score')|select innertext

Adding XCache for PHP on Fedora

Tuesday 26 of February, 2013

I want to run XCache on my Amahi server to help speed up some php apps. Details on adding it are found here (cache)

  • yum install php-devel
  • yum groupinstall 'Development Tools' (44 packages - yikes)
  • yum groupinstall 'Development Libraries' (78 packages)

cd /tmp
wget http://xcache.lighttpd.net/pub/Releases/3.0.1/xcache-3.0.1.tar.gz
tar xvfz xcache-3.0.1.tar.gz
cd xcache-3.0.1

phpize          # assumed step for building a PHP extension; requires php-devel
./configure --enable-xcache
make
make install
(Note: I ran make test and one test failed "xcache_set/get test")
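After make install, the extension still needs to be enabled in PHP's configuration. A minimal sketch - the file location and module path below are assumptions for a Fedora-style layout, so check your actual extension dir (php-config --extension-dir) before using:

```
; /etc/php.d/xcache.ini (path is an assumption)
zend_extension = /usr/lib64/php/modules/xcache.so
xcache.size = 64M
xcache.cacher = On
```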

Our First Brew

Friday 28 of December, 2012

Priscilla, Kyle & I have been enjoying local craft beer recently. There are several local breweries (Free Will, Round Guys, Prism) and many pubs are now serving craft beer along with the mass market beer. It's no surprise then that I received a home brew kit for Christmas this year. Kyle & I brewed our first batch today. He likes to call it Frozen Water Brewing.

Here are a few prep notes:

Brew notes:

  • We used an old aluminum pot for brewing - seemed to work well.
  • Warm the extract early so it pours easily
  • When first boil happens, be ready for the over-boil. It happens fast. Pull the pot off the heat quickly
  • Sanitize, sanitize, sanitize
  • The bottle filler tube from the True Brew kit works well to pull a sample out through the airlock hole. Sanitize tube before dipping
  • First specific gravity test, 4 hours after starting fermentation, read 1.045 - right on target
  • Fermentation should be done when SG reaches 1.009 - 1.003

We celebrated by running back to Keystone for bottles in the snowstorm then stopping by Prism Brewing in North Wales to try their beer.

Now we wait.

Hiding WordPress Login page

Saturday 08 of December, 2012

Our security guy showed me how to harvest editor names from WordPress. This, combined with the known location of the login page, makes the site susceptible to script kiddies plying their wares. A simple way to combat this is to create a redirect page somewhere and then restrict access to wp-login.php to visits coming from that page. I borrowed this idea from here. To implement this, I created my redirect page and added the following to the .htaccess file for the site.

# protect wp-login.php
<Files wp-login.php>
   Order deny,allow 
   RewriteEngine on 
   RewriteCond %{HTTP_REFERER} !^http://www.mywebplace.com/wp-content/uploads/anoddname.html$ [NC] 
   RewriteRule .* - [F] 
</Files>

These lines are interpreted like this:

  •  for all files called wp-login.php
    • default to deny
    • If the HTTP_Referrer is not anoddname.html
    • don't rewrite the page, but return Forbidden HTTP code

I then created 'anoddname.html' and added a meta-redirect like this:

<meta http-equiv="refresh" content="0;URL=http://www.mywebplace.com/wp-login.php">

These changes worked as expected. The site was fine, but to log in you have to visit the site by hitting anoddname.html first.  There is one problem: you cannot log out from the site.  That's because to log out you call wp-login.php again with ?action=logout appended to the URL. Since you are on a page other than anoddname.html at the time, you are forbidden from getting to wp-login.php.

To fix this, I added two more lines to the .htaccess file

.htaccess more
RewriteCond %{QUERY_STRING} ^action=logout [NC]
RewriteRule .* - [L]

With these lines added, .htaccess now checks first to see if you are calling with "?action=logout" Query_String. If so, it does not rewrite and stops. The complete .htaccess section is now:

Complete .htaccess
# protect wp-login.php
<Files wp-login.php>
    Order deny,allow
    RewriteEngine on
    RewriteCond %{QUERY_STRING} ^action=logout [NC]
    RewriteRule .* - [L]
    RewriteCond %{HTTP_REFERER} !^http://www.mywebplace.com/wp-content/uploads/tbirdsarego.html$ [NC]
    RewriteRule .* - [F]
</Files>

Processing Object Properties in the PowerShell Pipeline

Sunday 21 of October, 2012

I was running a quick AD query via PowerShell today and needed to export the results to a csv. The results contained two fields (lastLogonTimestamp and pwdLastSet) that are not human readable, but I needed them to be. There is a quick transform you can make on these fields to convert them to a datetime value. It is simply:

Convert system time to datetime
[datetime]::FromFileTimeUTC($value)

That part is easy, but how do we do that in the pipeline so all the objects get converted before the export? To do so, we can create a calculated field using a specially crafted hash-table to describe the members to Select-Object. So this:

Select 'before'
select name, description, distinguishedname,pwdlastset,lastLogonTimestamp


Select after
select name, description, distinguishedname,@{LABEL='pwdlastset';EXPRESSION={[datetime]::FromFileTimeUTC($_.pwdlastset)}},@{LABEL='lastLogonTimestamp';EXPRESSION={[datetime]::FromFileTimeUTC($_.lastLogonTimestamp)}}

Looking a bit more in-depth, the pwdLastSet entry was modified to specify the column LABEL and the EXPRESSION in a hash-table, with the EXPRESSION set to the new, calculated value.


In case you are wondering, my full one-liner is a report on all disabled accounts in our directory:

Get-Disabled Accounts
Get-ADUser -LDAPFilter {(useraccountcontrol:1.2.840.113556.1.4.803:=2)} -properties pwdlastset,lastLogonTimestamp,description| select name, description, distinguishedname,@{LABEL='pwdlastset';EXPRESSION={[datetime]::FromFileTimeUTC($_.pwdlastset)}},@{LABEL='lastLogonTimestamp';EXPRESSION={[datetime]::FromFileTimeUTC($_.lastLogonTimestamp)}}|export-csv d:\data\disabled_accts.csv -notypeinformation

Using Windows Server Web Platform Installer

Saturday 20 of October, 2012

Quick instructions for installing using the Windows Web Platform Installer tool in Windows Server 2008 & Win 7. Note - if you are installing from Microsoft's Web Gallery, the process is much more automatic. The following procedure is necessary only if you need to install from the stand-alone package.

  • Open Server Manager and navigate down Roles/Web Server to the Default Web Site. Select the Default Web Site.
  • Using the Features View, in the Action Pane (Right), select Import Application(option missing? - see below)*
  • Browse to the zip file, select it and click Open, then Next.
  • The Application package will open with all steps selected. Click Next.
  • Select your database options (MySQL, and Create New are defaults) - Click Next.
  • Enter a new application path and database server info if you don't like the defaults. Scroll down to enter the sa password and the tiki db user password.
  • Click Next to run the install
  • After the install completes, open up a browser and point it at the site. (Default URL is http://localhost/tiki).
  • At this point the Tiki install will begin

(*Note: if this option is missing, download and install the Web Platform Installer v4 from here http://www.microsoft.com/web/downloads/platform.aspx. You will also need to install MySQL from the Database category of Web PI as well as PHP from the Frameworks category)

Posting large zip files to Tiki image galleries

Thursday 23 of August, 2012

I have a bunch of photos from Africa I want to upload to my tiki-based site. See the Zambia blog for more about that! Now Tiki is great, but who wants to post photos one at a time? Fortunately, tiki offers a solution - it will unzip zipped files and post the photos. So I tried it and it failed. I had to make four changes to make this work. Here are the errors and the associated fixes:

  • First error: PHP Warning: upload_max_filesize of 2097152 bytes exceeded. Fixed with a change to the upload_max_filesize entry in php.ini. The default is 2MB; I upped it to 40M.
  • Second error: PHP Fatal error: Allowed memory size of 25165824 bytes exhausted. Fixed with a change to memory_limit in php.ini. I upped it to 48MB.
  • Third error: POST Content-Length of 14402734 bytes exceeds the limit of 8388608 bytes. Fixed with a change to post_max_size in php.ini. I upped it to 40M.
  • Fourth error: MAX_FILE_SIZE of 10000000 bytes exceeded. This required editing tiki/templates/tiki-upload_image.tpl to change the MAX_FILE_UPLOAD parameter for the upload page.
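For reference, the three php.ini changes collected in one place (the fourth fix lives in the Tiki template, not php.ini):

```
upload_max_filesize = 40M
memory_limit = 48M
post_max_size = 40M
```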

Nagios, NSClient++ and Powershell

Monday 13 of August, 2012

I did a talk recently at FOSSCON (cache) on how to monitor anything with Nagios. The focus of the talk was how to write NRPE checks to check obscure things in your IT environment. One of the checks I presented used a PowerShell script to check the files in a web site to ensure they hadn't changed. During development of and testing for my demo, I discovered that while my script was returning the correct errorlevel (in my case a 2), NSClient++ would only return 0 or 1.

The zero or one indicated to me that Powershell was merely reporting whether it exited normally or not.

Here is the first configuration I tried in NSClient.

[NRPE Handlers]
check_sitesign=cmd /c echo check_sitesign.ps1|powershell.exe -command -

This config calls cmd.exe with the /c switch (run & exit), echoes the script name, and pipes it to PowerShell on STDOUT.
PowerShell is executed with the -command parameter (execute the command and exit) and is told via the '-' to read that command from STDIN.

After some investigation and testing from the command line it became clear the -command was an issue. The script I was passing it was seen as a single command and it either succeeded or failed hence the 0 or 1.

Next config line we tried was

[NRPE Handlers]
check_sitesign=cmd /c echo check_sitesign.ps1|powershell.exe -file -

This looked promising and tested fine on the command line but still only returned 0 or 1.

Finally, I stumbled upon this idea

[NRPE Handlers]
check_sitesign=cmd /c echo check_sitesign.ps1; exit($lastexitcode)|powershell.exe -command -

This tells PowerShell to run two commands. The first runs the script, and the second tells PowerShell to exit with the value of the last exit code.

Counting a collection in Powershell

Wednesday 04 of July, 2012

I'm writing a script for AD that will look for old computer accounts and disable or delete them per our policy. I ran into an interesting issue that I want to document so I remember it.

The query I ran to find old accounts sometimes returns 0 objects, sometimes 1, and most times more than 1. While testing the code, I noticed that it was having an issue when the return set was zero, so I added this code to test for that case:

$oldComps = get-adcomputer -filter ...
if ($oldComps -eq $null) {
    $count = 0
} else {
    $count = ($oldComps).count
}
write-host "Processing $count accounts over $daysToDisable for disable"

When this runs I get a nice message saying how many accounts will get processed - unless there is one item returned by the query - in which case $count is blank. Hmmmm.

I went nuts over this for 15 minutes then thought to ask the internet using this query:
"powershell AD collection count one item"
I found an article that suggested the issue was that when one item is returned from the PowerShell command, it is returned as a scalar (single object as everything in Powershell is an object) and not as a collection. The simple fix for this is to wrap the command to force it to return an array
@( ... )

So my code from earlier becomes:

$oldComps = @(get-adcomputer -filter ...<snip>)
if ($oldComps -eq $null) {
    $count = 0
} else {
    $count = ($oldComps).count
}
write-host "Processing $count accounts over $daysToDisable for disable"

We can credit this as today's thing learned.

Sending Freecycle posts to Twitter

Sunday 01 of July, 2012

I've been a member of the Upper Bucks Freecycle list for years. I recently realized I missed most of the postings to the list and, as a frequent user of Twitter, thought it would be handy if the postings were there as well. I had just looked at the If This Then That service ifttt.com and realized I could solve this for the group.

There are a couple simple steps for this. First was to activate two channels, GMail and Twitter. Activation is simple and generally involves authorizing these applications to access accounts on your behalf. Ifttt makes use of OAuth when possible so this is generally secure (but use a strong password on your ifttt account, OK?).

For this use, I used my personal GMail account since I already had a FreeCycle subscription, but I set up a twitter account for this specific case (@UBFC_post (cache)).

Ifttt uses the concept of triggers (events) and actions. The trigger for this recipe was a GMail label of UBFC. (I'd already defined a GMail rule to file my UBFC messages.)

The action is "Post a new tweet". The contents of the What's happening box are a GMail ingredient {{Subject}} and some extra text to take the reader to the freecycle list ("More Here: http://groups.yahoo.com... #freecycle #ubfc" )

Once you save the recipe, it runs regularly against your mailbox looking for changes and posts a tweet if the action filter matches.

Things I'd like to fix.
- The Freecycle list is closed so I cannot link the reader directly to the message. You must join to see the details.
- I really should create another Gmail account for this specific use. Currently, if I post something to the list, all of my responses go to Twitter as well since they get the UBFC label.

Fixing product activation in Windows XP

Saturday 30 of June, 2012

If the key is being marked as "Incorrect", use the Product Key Update Tool (PKUT) to change the key. Download and run PKUT from here (cache) and enter the key from your COA when asked.

Once the system reboots, try activating again using this command-line (paste into run box)

oobe/msoobe /a