
Greg's Tech blog

My technical journal where I record my challenges with Linux, open source SW, Tiki, PowerShell, Brewing beer, AD, LDAP and more...

High Resolution screen, Remote Desktop and VirtualBox

Wednesday 27 of September, 2017

I bought a 2016 Yoga laptop with a hi-res (3200 x 1800) screen. I'm running the Windows Insider builds of Windows 10. Running a hi-res screen turns up several issues with apps that aren't prepared for that much resolution. One area where I had an issue was Remote Desktop, which I've now fixed.


The first thing to do is change the Windows desktop scaling factor. Windows recommends a 250% scale factor for my machine and I'm using that. The next thing to do is read a great reference from Scott Hanselman on living the hi-res lifestyle.


What I experienced with Remote Desktop to another Windows 10 machine was a small window, unreadable to my 55-and-over eyesight.



Doing more research, I came across this article from Falafel on Remote Desktop and hi-res. The tip from Falafel is to use Remote Desktop Connection Manager and configure the display settings to use Full Screen. This scales the remote desktop window to match your local screen, and it solved my problem.


The last issue was VirtualBox. One of my remote PCs has a VirtualBox VM running Slackware. After scaling the remote desktop I opened the VM and it had not scaled. After saying "hmmm", I went poking around the display settings for the VM and found the Scale Factor setting. Setting it to 200% gave me a usable VM in a remote desktop session.


Powershell on Linux

Monday 18 of September, 2017

I've been learning a lot about Microsoft's Linux initiatives over the past couple weeks. I've started using the Windows Subsystem for Linux in lieu of PuTTY for connecting to my Linux machines and recently started playing with their PowerShell implementation on Linux. Last week I had a need to do some scripting on Linux and wanted to re-use some code I had on hand.


PowerShell can be installed from the repository on most machines. The PowerShell GitHub page has the details on how to configure your package manager to pull directly from the repository.
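Once it is installed, you can confirm what you're running from inside PowerShell itself. A quick check (the PSEdition property is specific to the newer builds, so treat its availability as an assumption for your version):

$PSVersionTable.PSVersion
$PSVersionTable.PSEdition    # reports 'Core' on the Linux builds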


For my challenge, I wanted to profile the download speed of a particular website I help manage. I already had a PS script that did most of what I wanted. It was a simple task of reconfiguring it and testing to be sure all the features were available in the current Linux PS beta. Here's the script.



$url = "http://files.myfakeco.com/downloads/filedownload/d3974543.zip"

$timer = measure-command {
    $req = [System.Net.WebRequest]::Create($url)
    try {
        $res = $req.GetResponse()
        $requestStream = $res.GetResponseStream()
        $readStream = New-Object System.IO.StreamReader $requestStream
        $data = $readStream.ReadToEnd()
    }
    catch [System.Net.WebException] {
        $res = $_.Exception.Response
    }
}

# Derive seconds, size and KB/s from the timing data
$sec  = $timer.TotalMilliseconds / 1000
$size = $res.ContentLength
$kbs  = "{0:N2}" -f ($size / $sec / 1000)
$ssec = "{0:N2}" -f $sec

echo "$size, $ssec, $kbs"
"$(get-date -f "MM-dd-yyyy hh:mm tt"), $($res.StatusCode), $size, $ssec, $kbs `r`n" | out-file -append /mnt/samba/Docs/dllog.txt


The script makes use of the .NET WebRequest API. It downloads the file and reports status and stats derived from timing the download with Measure-Command.
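As a side note, Measure-Command returns a TimeSpan object, which is where TotalMilliseconds comes from. A quick illustration (the Start-Sleep is just a stand-in for real work):

$t = Measure-Command { Start-Sleep -Milliseconds 250 }
"{0:N2} seconds" -f ($t.TotalMilliseconds / 1000)    # prints roughly 0.25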


But the best part of this is that the exact same code runs on Windows PowerShell. I only modified the code to meet my specific needs for this report.


Fun with WSL (Ubuntu on Windows)

Tuesday 15 of August, 2017

I'm running Windows 10 1703 and have been toying with the Windows Subsystem for Linux (WSL). This version is based on Ubuntu. There is some fun in making it useful.



SSH into WSL



I want to use PuTTY from anywhere to access the shell. SSH requires a few things to make it useful. Start the bash shell and edit /etc/ssh/sshd_config (sudo nano /etc/ssh/sshd_config):



  • Change the listener port.
    • Port 2222


  • Turn on Password Authentication (I'll discuss key auth in a bit)

  • Turn off privilege separation. Rumor has it that it isn't implemented.

  • Allow TCP port 2222 in the Windows Firewall (a PowerShell one-liner for this is shown below)

  • Generate host key
    • sudo ssh-keygen -A


  • Restart ssh service
    • sudo service ssh --full-restart


You should be able to ssh into the host.
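For the firewall step above, the rule can also be created from an elevated PowerShell prompt instead of the GUI. A minimal sketch (the rule name is my own choice):

New-NetFirewallRule -DisplayName "WSL SSH (2222)" -Direction Inbound -Protocol TCP -LocalPort 2222 -Action Allow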

Using Powershell to post data to IFTTT WebHooks service

Monday 07 of August, 2017

IFTTT has many useful triggers and I like Webhooks because it can enable so many fun interactions.  My goal today is sending JSON key:value pairs to WebHooks from Powershell (my preferred scripting language and now available on Linux!).  


WebHooks will accept three named parameters via JSON (also form data and URL parameters) that can be referenced within the Action of your applet. The parameters are named value1, value2 and value3, so the JSON should look like this:


{
    "value1":  "Good Morning",
    "value3":  "That is all.",
    "value2":  "Greg"
}


PowerShell has two cmdlets for posting this to a URL: Invoke-WebRequest and Invoke-RestMethod. The latter is apparently a wrapper around the former and returns only the string output from the POST. Because of possible error-checking needs, I'll focus on Invoke-WebRequest.


Here is the code that made this work:


$BaseURL = "https://maker.ifttt.com/trigger/GMhit/with/key/enteryourkeyhere"
# Note: The key (the last part of the URL) is unique to each user
# The trigger here is GMhit and unique to me. You would declare your own in the IFTTT service

$body = @{ value1="Good Morning"; value2="Greg"; value3="That is all." }

# Either works; Invoke-WebRequest also returns the status code
# Invoke-RestMethod -URI $BaseURL -Body (ConvertTo-Json $body) -Method Post -ContentType 'application/json'
Invoke-WebRequest -URI $BaseURL -Body (ConvertTo-Json $body) -Method Post -ContentType 'application/json'
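Since Invoke-WebRequest returns a response object rather than just the body, you can also capture it and check the HTTP status code to confirm the webhook fired. A minimal sketch reusing $BaseURL and $body from above (treating anything other than 200 as a failure is my own convention, not part of the IFTTT API):

$response = Invoke-WebRequest -URI $BaseURL -Body (ConvertTo-Json $body) -Method Post -ContentType 'application/json'
if ($response.StatusCode -eq 200) {
    Write-Output "Webhook accepted: $($response.Content)"
} else {
    Write-Warning "Unexpected status: $($response.StatusCode)"
}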


Notes:



  • Setting the ContentType to `application/json` is important here.  This call didn't work until this was set correctly.

  • The value names are fixed and cannot be customized.


Recovering from a Bad Drive in a Greyhole storage pool

Monday 13 of February, 2017

I run an Amahi home server which hosts a number of web apps (including this blog) as well as a large pool of storage for my home. Amahi uses greyhole (see here and here) to pool disparate disks into a single storage pool. Samba shares are then added to the pool and greyhole handles distributing data across the pool to use up free space in a controlled manner. Share data can be made redundant by choosing to make 1, 2 or max copies of the data (where max means a copy on every disk).


The benefits over, say, RAID 5 are that 1) different-size disks may be used; 2) each disk has its own complete file system which does not depend on disk grouping; and 3) each file system can be mounted (and unmounted) separately or on a different machine.


So right before the holidays, the 3TB disk on my server (paired with a 1TB disk) started to go bad. Reads were succeeding but took a long time. Eventually we could no longer watch the video files we store on the server through WDTV. Here is how I went about recovering service and the data (including the mistakes I made).



  • Bought a new 3TB drive, formatted it with ext4, mounted it (using an external drive dock) and added it to the pool as Drive6.

  • Told greyhole the old disk was going away (drive4)
    greyhole --going=/var/hda/files/drives/drive4/gh

    Greyhole will try to copy any data off the drive that is not already copied elsewhere in the pool. It has no effect on the data on the `going` disk (nothing is deleted), except that the reads could cause further damage. The command ran for several days and, due to disk errors, didn't accomplish much, so I killed the process and took a new tack.

I decided to remove the disk from the pool and attempt an alternate method for recovering the data. 


  • Told greyhole the drive was gone.
    greyhole --gone=/var/hda/files/drives/drive4/gh 
    Greyhole will no longer look for the disk or the data on it.  It has no effect on the data on disk. 


  • Used safecopy to make a drive image of the old disk to a file on the new disk. (If you have not used safecopy, check it out: it will run different levels of data extraction, can be stopped and restarted using the same command, and will resume where it left off.)
    safecopy --stage1 /dev/sdd1 /var/hda/files/drives/Drive6/d1 -I /dev/null


This took about two weeks to accomplish due to drive errors.  And because I was making a disk image, I eventually ran out of space on the new disk before it completed.



  • Bought a 4TB drive and mounted it using an external dock as drive7; copied over and then deleted the drive image from Drive6.


  • Marked the 1TB drive (drive5) as going (see command above) and gone. This moved any good data off the 1TB drive to drive7 but left plenty of room to complete the drive image.




  • Swapped drive5 (1TB) and drive7 (4TB) in the server chassis. Retired the 1TB drive.




  • Mounted the bad 3TB drive in the external dock and resumed the safecopy using:
    safecopy --stage1 /dev/sdd1 /var/hda/files/drives/Drive7/d1 -I /dev/null



  • Mounted the drive image. The base OS for the server is Fedora 23. The drive tool includes a menu item to mount a drive image. It worked pretty simply, mounting the image at /run/media/username/someGUID.




  • Used rsync to copy the data from the image to the data share. I use a service script called mount_shares_locally, as the preferred method for putting data into the greyhole pool is to copy it to the samba share. The one caveat here is that greyhole must stage the data while it copies it to the permanent location. That staging area is on the / partition under /var/hda. I have about 300GB free on that partition, so I had to monitor the copy and kill the rsync every couple hours. Fortunately, rsync handles this gracefully, which is why I chose it over a straight copy.



rsync -av "/run/media/user/5685259e-b425-477b-9055-626364ac095e/gh/Video"  "/mnt/samba/"





A couple observations. First, because of the way I had my greyhole shares set up, I had safe copies of the critical data. All my docs, photos and music had a safe second copy. The data on the failed disk was disposable. I undertook the whole process because I wanted to see if it would work, and whatever I recovered would only be a plus.


This took some time and a bit of finesse on my part to get the data back. But I like how well greyhole performed and how having independent filesystems gave me the option to recover data on my own time. Finding safecopy simplified this a lot and added a new weapon to my recovery toolkit!


Reset a Whirlpool Duet washer

Monday 06 of April, 2015
We accidentally started the washer with hot water feed turned off. When the washer tried to fill, it couldn't and generated F08 E01 error codes. After clearing the codes and restarting, we eventually got to a point where the panel wouldn't light up at all. Unplugging and re-plugging the power would do nothing except start the pump.
It was obvious it needed to be cleared. After too much searching, I found this link on forum.appliancepartspros.com (cache).

It tells you to press "Wash Temp", "Spin", "Soil" three times in a row to reset the washer. Once it resets, the original error will display. Press Power once to clear it. After that, all was well (of course, I turned on the water first).


Adjusting Brewing Water

Tuesday 10 of March, 2015
I recently got hold of (well, asked for and received) a water analysis from the Perkasie Borough Authority and have been staring at it for more than a month wondering what to do with it. I've read the section on water in Palmer's How To Brew and some of his Water book. These are both excellent resources and while I have a science background, they are quite technical and I've been unable to turn all the details into an action to take, if any, on my brewing water.

The March 2015 issue of Zymurgy (cache) has an article by Martin Brungard on calcium and magnesium that has helped me turn knowledge into action. At the risk of oversimplifying the guidance, I want to draw some conclusions for my own use.

Some of Martin's conclusions
  • A level of 50mg/L Calcium is a nice minimum for many beer styles
  • You may want less than 50mg/L for lagers (wish I had known that a week ago) but not lower than 20
  • A range of 10-20mg/L Magnesium is a safe bet for most beers
  • Yeast starters need magnesium at the high end of that range to facilitate yeast growth

The Water Authority rep who gave me the report said two wells feed water to our part of the borough. Looking at the two wells, the Ca and Mg values are similar averaging 85 mg/L and 25mg/L respectively.

This leaves my water right in the sweet spot for average beer styles. What about some of the edge cases like IPAs and lagers?

  • For lagers, next time I'll dilute the mash water with 50% reverse osmosis (RO) water to reduce the Ca to about 40. I may want to supplement the Mg to bring it back to 20.
  • For IPAs, I may want to add Mg to bring it up near 40 mg/L.


Building a Temperature Controller from an STC-1000

Sunday 04 of August, 2013
My son & I have been brewing beer together for 8 months now. We've been very intentional about moving slowly into the process of building both our knowledge and our brew system. As I'm already a tech geek, it is real easy for me to become a brewing geek as well and go broke in the process. When we started collecting brewing equipment, I agreed to try to buy everything at half price. Home Brew Finds has been invaluable when looking for the cheapest way to solve a brewing problem.
With the summer months and the need to lager a Doppelbock, I converted a 20-year-old dorm fridge into a fermentation fridge using 1.5" foam insulation.
Fermentation Fridge
And while this allowed me to lager, in this configuration it doesn't actually control the temperature so I went looking for a way to do that.
I settled on the Elitech STC-1000 as it is a cheap alternative to the Johnson Controls controller (cache). Of course, the latter controller is a full package with a power cord and power connectors for the cooling and heating units. The STC-1000 unit by contrast consists only of the controller and control panel. Oh, and it is Celsius-only, so you need to convert; but Google makes that easy ("convert 68 to celsius"). My unit cost $35 to make while the Johnson is about $70.

To make use of the STC-1000, I had to build it into a package that allows for convenient use. Here's how I did it.

When I looked at the size of the STC-1000, it appeared to be the right size to fit in a standard outlet box (in the US), and it was real close in size to the GFCI cutout. I bought a plastic cover to hold the GFCI and a duplex outlet. I then modified it to make the GFCI opening maybe 1/4" longer.

faceplate mod Mounted STC-1000


Next step was to mount the duplex outlet and wire it up. Keep in mind we need to control heating and cooling, so we need to power the outlets individually. To do this, you have to break the copper tab on the black wire side of the outlet. I didn't take a before picture, but here it is after the mod.
Outlet Modification

Now we can run a wire from the heating side to one outlet and from the cooling side to the other.

The other tricky piece is understanding that the STC-1000 only provides a relay for activating the heating and cooling circuits - it doesn't actually supply power. I dealt with that by tapping the in-bound hot lead (black wire) to both the heating and cooling connectors. This is seen here with the first loop coming from the power in to heating and the second loop from heating to cooling:
Outlet Modification

To power the outlets, I took a short wire from the heating relay to one outlet and from the cooling relay to the other outlet. The white wiring is pretty straightforward: simply tap the in-bound white wire and connect it to one of the white lugs. No need for separate connections as the common wire is, um, common.
Finally, we add a tension reliever to the box, run the temperature sensor through it, mount the outlet and buckle it up.
tensioner Mounted STC-1000

Notes:
- I used a new orange 25' extension cord for the power side. I cut it in half and used wire from the unused half to do the wiring. I then added a new plug to the remaining cord so I had a usable cord.
- The STC-1000 was $19, the extension cord - $10, the box and cover $6. So this controller cost $35 plus two hours labor.
- Here is a wiring diagram
Wiring Diagram


Using getElementByID in Powershell

Thursday 07 of March, 2013
I was asked to pull a piece of information from a web page using PowerShell. Fortunately for me, the item is tagged with an HTML "id" attribute. While testing, I discovered the following code worked when I stepped through it line by line, but failed when run as a script.
(Note 1: The following is a public example, not my actual issue. This snippet returns the vote total for the question.)

$ie = new-object -com InternetExplorer.Application
$ie.Navigate('http://linuxexchange.org/questions/832/programming-logic-question')
$ie.Document.getElementByID('post-832-score')|select innertext


The code is straightforward. It creates a new COM object that runs Internet Explorer, navigates to a specific page, then looks for a specific "id" tag in the HTML and outputs the value. The problem we saw was that when we attempted to run the $ie.Document.getElementByID command, we received an error saying it could not be run on a $null object.
At the Philly PowerShell meeting, the question was asked whether the script needed to wait for the $ie.Navigate command to complete before moving on. And indeed this appears to be an asynchronous command; that is, PowerShell executes it but doesn't wait for it to complete before moving on to the next command.

The solution was the addition of a single line of code:
while ($ie.Busy -eq $true) { Start-Sleep 1 }

It simply loops until $ie is no longer busy.

The revised script looks like this:
$ie = new-object -com InternetExplorer.Application
$ie.Navigate('http://linuxexchange.org/questions/832/programming-logic-question')
while ($ie.Busy -eq $true) { Start-Sleep 1 }
$ie.Document.getElementByID('post-832-score')|select innertext
$ie.quit()
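One caveat: if the page never finishes loading, that loop will spin forever. A variation I would suggest, with a timeout (the 30-second cap is an arbitrary choice of mine):

$timeout = 30
while ($ie.Busy -eq $true -and $timeout -gt 0) {
    Start-Sleep 1
    $timeout--
}
if ($ie.Busy) { Write-Warning "Page did not finish loading within 30 seconds" }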

Adding XCache for PHP on Fedora

Tuesday 26 of February, 2013
I want to run XCache on my Amahi server to help speed up some php apps. Details on adding it are found here (cache)
  • yum install php-devel
  • yum groupinstall 'Development Tools' (44 packages - yikes)
  • yum groupinstall 'Development Libraries' (78 packages)

cd /tmp
wget http://xcache.lighttpd.net/pub/Releases/3.0.1/xcache-3.0.1.tar.gz
tar xvfz xcache-3.0.1.tar.gz
cd xcache-3.0.1

phpize
./configure --enable-xcache
make
make install
(Note: I ran make test and one test failed "xcache_set/get test")

Our First Brew

Friday 28 of December, 2012
Priscilla, Kyle & I have been enjoying local craft beer recently. There are several local breweries (Free Will, Round Guys, Prism) and many pubs are now serving craft beer along with the mass market beer. It's no surprise then that I received a home brew kit for Christmas this year. Kyle & I brewed our first batch today. He likes to call it Frozen Water Brewing.

Here are a few prep notes:

Brew notes:
  • We used an old aluminum pot for brewing - seemed to work well.
  • Warm the extract early so it pours easily
  • When first boil happens, be ready for the over-boil. It happens fast. Pull the pot off the heat quickly
  • Sanitize, sanitize, sanitize
  • The bottle filler tube from the True Brew kit works well to pull a sample out through the airlock hole. Sanitize tube before dipping
  • First specific gravity test, 4 hours after starting fermentation, read 1.045 - right on target
  • Fermentation should be done when SG reaches 1.009 - 1.003

We celebrated by running back to Keystone for bottles in the snowstorm then stopping by Prism Brewing in North Wales to try their beer.

Now we wait.

Hiding WordPress Login page

Saturday 08 of December, 2012
Our security guy showed me how to harvest editor names from WordPress. This, combined with the known location of the login page, makes the site susceptible to script kiddies plying their wares. A simple way to combat this is to create a redirect page somewhere and then restrict access to wp-login.php to visits coming from that page. I borrowed this idea from here. To implement it, I created my redirect page and added the following to the .htaccess file for the site.
.htaccess
# protect wp-login.php
<Files wp-login.php>
   Order deny,allow
   RewriteEngine on
   RewriteCond %{HTTP_REFERER} !^http://www.mywebplace.com/wp-content/uploads/anoddname.html$ [NC]
   RewriteRule .* - [F]
</Files>

These lines are interpreted like this:
  •  for all files called wp-login.php
    • default to deny
    • If the HTTP_Referrer is not anoddname.html
    • don't rewrite the page, but return Forbidden HTTP code
I then created 'anoddname.html' and added a meta-redirect like this:
AnOddname.html
<meta http-equiv="refresh" content="0;URL=http://www.mywebplace.com/wp-login.php">

These changes worked as expected. The site was fine, but to log in you have to visit the site by hitting anoddname.html first. There is one problem: you cannot log out from the site. That's because to log out you call wp-login.php again with ?action=logout appended to the URL. Since you are on a page other than anoddname.html at the time, you are forbidden from getting to wp-login.php.

To fix this, I added two more lines to the .htaccess file

.htaccess more
RewriteCond %{QUERY_STRING} ^action=logout [NC]
RewriteRule .* - [L]

With these lines added, .htaccess now checks first to see if you are calling with the "?action=logout" query string. If so, it does not rewrite and stops processing. The complete .htaccess section is now:
Complete .htaccess
# protect wp-login.php
<Files wp-login.php>
    Order deny,allow
    RewriteEngine on
    RewriteCond %{QUERY_STRING} ^action=logout [NC]
    RewriteRule .* - [L]
    RewriteCond %{HTTP_REFERER} !^http://www.mywebplace.com/wp-content/uploads/anoddname.html$ [NC]
    RewriteRule .* - [F]
</Files>

Processing Object Properties in the PowerShell Pipeline

Sunday 21 of October, 2012
I was running a quick AD query via PowerShell today and needed to export the results to a CSV. The results contained two fields (lastLogonTimestamp and pwdLastSet) that are not human readable, but I needed them to be. There is a quick transform you can make on these fields to convert them to a datetime value. It is simply:
Convert system time to datetime
[datetime]::FromFileTimeUTC($_.lastLogonTimestamp)
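For reference, these attributes are stored as 64-bit FILETIME values (100-nanosecond ticks since January 1, 1601 UTC), which is why the conversion is needed. A quick illustration with a made-up value:

$raw = 130000000000000000            # hypothetical pwdLastSet value
[datetime]::FromFileTimeUTC($raw)    # returns a readable UTC DateTime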


That part is easy, but how do we do that in the pipeline so all the objects get converted before the export? To do so, we can create a calculated field using a specially crafted hash-table to describe the members to Select-Object. So this:

Select 'before'
select name, description, distinguishedname,pwdlastset,lastLogonTimestamp


becomes:
Select after
select name, description, distinguishedname,@{LABEL='pwdlastset';EXPRESSION={[datetime]::FromFileTimeUTC($_.pwdlastset)}},@{LABEL='lastLogonTimestamp';EXPRESSION={[datetime]::FromFileTimeUTC($_.lastLogonTimestamp)}}


Looking a bit more in-depth, the pwdLastSet property was modified to specify the column LABEL and the EXPRESSION in a hash table, with the EXPRESSION now set to the new, calculated value.
@{LABEL='pwdlastset';EXPRESSION={[datetime]::FromFileTimeUTC($_.pwdlastset)}}

In case you are wondering, my full one-liner is a report on all disabled accounts in our directory:

Get-Disabled Accounts
Get-ADUser -LDAPFilter {(useraccountcontrol:1.2.840.113556.1.4.803:=2)} -properties pwdlastset,lastLogonTimestamp,description| select name, description, distinguishedname,@{LABEL='pwdlastset';EXPRESSION={[datetime]::FromFileTimeUTC($_.pwdlastset)}},@{LABEL='lastLogonTimestamp';EXPRESSION={[datetime]::FromFileTimeUTC($_.lastLogonTimestamp)}}|export-csv d:\data\disabled_accts.csv -notypeinformation

Using Windows Server Web Platform Installer

Saturday 20 of October, 2012
Quick instructions for installing using the Windows Web Platform Installer tool on Windows Server 2008 & Windows 7. Note - if you are installing from Microsoft's Web Gallery, the process is much more automatic. The following procedure is necessary only if you need to install from the stand-alone package.


  • Open Server Manager and navigate down Roles/Web Server to the Default Web Site. Select the Default Web Site.
  • Using the Features View, in the Action Pane (right), select Import Application (option missing? - see the note below)*
  • Browse to the zip file, select it and click Open, then Next.
  • The Application package will open with all steps selected. Click Next.
  • Select your database options (MySQL, and Create New are defaults) - Click Next.
  • Enter a new application path and database server info if you don't like the defaults. Scroll down to enter the sa password and the tiki db user password.
  • Click Next to run the install
  • After the install completes, open up a browser and point it at the site. (Default URL is http://localhost/tiki).
  • At this point the Tiki install will begin

(*Note: if this option is missing, download and install the Web Platform Installer v4 from here: http://www.microsoft.com/web/downloads/platform.aspx. You will also need to install MySQL from the Database category of Web PI as well as PHP from the Frameworks category.)

Posting large zip files to Tiki image galleries

Thursday 23 of August, 2012

I have a bunch of photos from Africa I want to upload to my tiki-based site. See the Zambia blog for more about that! Now Tiki is great, but who wants to post photos one at a time? Fortunately, tiki offers a solution - it will unzip zipped files and post the photos. So I tried it and it failed. I had to make 4 changes to make this work. Here are the errors and the associated fixes:

  • First error: PHP Warning: upload_max_filesize of 2097152 bytes exceeded. This was fixed with a change to the _upload_max_filesize_ entry in php.ini. The default is 2MB; I upped it to 40M.
  • Second error: PHP Fatal error: Allowed memory size of 25165824 bytes exhausted. This was fixed with a change to _memory_limit_ in php.ini. I upped it to 48MB.
  • Third error: POST Content-Length of 14402734 bytes exceeds the limit of 8388608 bytes. This was fixed with a change to _post_max_size_ in php.ini. I upped it to 40M.
  • Fourth error: MAX_FILE_SIZE of 10000000 bytes exceeded. This required editing tiki/templates/tiki-upload_image.tpl to change the MAX_FILE_UPLOAD parameter for the upload page.


Nagios, NSClient++ and Powershell

Monday 13 of August, 2012
I did a talk recently at FOSSCON (cache) on how to monitor anything with Nagios. The focus of the talk was how to write NRPE checks for obscure things in your IT environment. One of the checks I presented used a PowerShell script to check the files in a web site to ensure they hadn't changed. During development and testing for my demo, I discovered that while my script was returning the correct errorlevel (in my case a 2), NSClient++ would only ever return 0 or 1.

The zero or one indicated to me that Powershell was merely reporting whether it exited normally or not.

Here is the first configuration I tried in NSClient.
[NRPE Handlers]
check_sitesign=cmd /c echo check_sitesign.ps1|powershell.exe -command -

This config calls cmd.exe with the /c switch (run & exit), echoes the script name and pipes it to powershell.exe on STDOUT.
PowerShell is executed with the -command parameter (execute the command and exit) and is told via the '-' to read that command from STDIN.

After some investigation and testing from the command line, it became clear that -command was the issue. The script I was passing was seen as a single command, and it either succeeded or failed, hence the 0 or 1.

Next config line we tried was
[NRPE Handlers]
check_sitesign=cmd /c echo check_sitesign.ps1|powershell.exe -file -

This looked promising and tested fine on the command line but still only returned 0 or 1.

Finally, I stumbled upon this idea
[NRPE Handlers]
check_sitesign=cmd /c echo check_sitesign.ps1; exit($lastexitcode)|powershell.exe -command -

This tells Powershell to run two commands. The first runs the script and the second tells Powershell to exit with the value of the last exit code.
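For context, here is the shape of the exit-code convention the check script itself needs to follow. This is a hypothetical skeleton of check_sitesign.ps1 (the path and stored signature are made up), not the actual script from the talk:

# Nagios/NRPE convention: 0 = OK, 1 = WARNING, 2 = CRITICAL, 3 = UNKNOWN
$expected = 'D41D8CD98F00B204E9800998ECF8427E'                             # hypothetical stored signature
$bytes  = [System.IO.File]::ReadAllBytes('C:\inetpub\wwwroot\index.html')  # hypothetical file to watch
$md5    = [System.Security.Cryptography.MD5]::Create()
$actual = -join ($md5.ComputeHash($bytes) | ForEach-Object { $_.ToString('X2') })
if ($actual -eq $expected) {
    Write-Output "OK - site file matches signature"
    exit 0
} else {
    Write-Output "CRITICAL - site file has changed"
    exit 2
}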


Counting a collection in Powershell

Wednesday 04 of July, 2012
I'm writing a script for AD that will look for old computer accounts and disable or delete them per our policy. I ran into an interesting issue that I want to document so I remember it.

The query I run to find old accounts sometimes returns 0 objects, sometimes 1, and most times more than 1. While testing the code, I noticed that it was having an issue when the return set was zero, so I added this code to test for that case:

$oldComps = get-adcomputer -filter ...
if ($oldComps -eq $null) {
    $count = 0
} else {
    $count = ($oldComps).count
}
write-host "Processing $count accounts over $daysToDisable for disable"

When this runs I get a nice message saying how many accounts will get processed - unless there is exactly one item returned by the query, in which case $count is blank. Hmmmm.

I went nuts over this for 15 minutes then thought to ask the internet using this query:
"powershell AD collection count one item"

I found an article that suggested the issue is that when one item is returned from a PowerShell command, it is returned as a scalar (a single object, since everything in PowerShell is an object) and not as a collection. The simple fix for this is to wrap the command in the array subexpression operator to force it to return an array:
@( ... )

So my code from earlier becomes:
$oldComps = @(get-adcomputer -filter ...<snip>)
if ($oldComps -eq $null) {
    $count = 0
} else {
    $count = ($oldComps).count
}
write-host "Processing $count accounts over $daysToDisable for disable"
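A quick way to see the difference interactively; Get-Item here is just a stand-in for any command that returns a single object (PowerShell 3.0 and later add a synthetic Count property to scalars, so the blank result is most visible on v2):

$one = Get-Item C:\Windows\notepad.exe          # a single object (scalar)
$one.Count                                      # blank on PowerShell 2.0 - no Count property
$wrapped = @(Get-Item C:\Windows\notepad.exe)   # forced into an array
$wrapped.Count                                  # always 1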

We can credit this as today's thing learned.

Sending Freecycle posts to Twitter

Sunday 01 of July, 2012
I've been a member of the Upper Bucks Freecycle list for years. I recently realized I miss most of the postings to the list, and as a frequent user of Twitter I thought it would be handy if the postings were there as well. I had just looked at the If This Then That service, ifttt.com, and realized I could solve this for the group.

There are a couple simple steps for this. First was to activate two channels, GMail and Twitter. Activation is simple and generally involves authorizing these applications to access accounts on your behalf. Ifttt makes use of OAuth when possible, so this is generally secure (but use a strong password on your ifttt account, OK?).

For this use, I used my personal GMail account since I already had a FreeCycle subscription, but I set up a twitter account for this specific case (@UBFC_post (cache)).

Ifttt uses the concept of triggers(events) and actions. The trigger for this recipe was a GMail label of UBFC. (I'd already defined a GMail rule to file my UBFC messages).

The action is "Post a new tweet". The contents of the What's happening box are a GMail ingredient {{Subject}} and some extra text to take the reader to the freecycle list ("More Here: http://groups.yahoo.com... #freecycle #ubfc" )

Once you save the recipe, it runs regularly against your mailbox looking for changes and posts a tweet if the action filter matches.

Things I'd like to fix.
- The Freecycle list is closed so I cannot link the reader directly to the message. You must join to see the details
- I really should create another GMail account for this specific use. Currently, if I post something to the list, all of my responses go to Twitter as well since they get the UBFC label.




Fixing product activation in Windows XP

Saturday 30 of June, 2012
If the key is being marked as "Incorrect", use the Product Key Update Tool (PKUT) to change the key. Download and run PKUT from here (cache) and enter the key from your COA when asked.

Once the system reboots, try activating again using this command-line (paste into run box)
oobe/msoobe /a


Free WiFi for Small Business

Tuesday 05 of June, 2012

I had an interesting problem to solve recently. A small business I sometimes work for asked if I could implement free WiFi for their customers. They wanted to keep the implementation cheap. I took a look at their network and found it to be rather insecure; while that needs to be addressed, it was outside the scope of this project.

So I started thinking about it, tested a few scenarios (including dd-wrt) and came up with the following. NetGear makes a great little wireless router, the WGPS606. Besides broadband routing, it offers two wireless SSIDs, including a guest wireless. On the guest wireless, access can be restricted to only the WAN port of the router. This was key to implementing this securely.

The business is using a Comcast connection and the Comcast router has very little configuration available. I uplinked the NetGear router into the Comcast router and, most importantly, moved all other LAN connections into the new NetGear router so the only connection to the Comcast router was the NetGear. With this config, traffic for the PCs on the LAN is segregated from traffic on the customer wireless and everything works well.

\\Greg

"#3 pencils and quadrille pads." - Seymour Cray (1925-1996), when asked what CAD tools he used to design the Cray I; he also recommended using the back side of the pages so that the lines were not so dominant.


Powershell for Password Expiration notices

Wednesday 30 of May, 2012
We have a group of folks who only ever remote into our environment and because of that often don't receive password expiration notices from Windows. As luck would have it, their passwords often expire over the weekend and they're locked out until Monday morning. We devised this script to send email notifications in advance of expiration.

First I went spelunking because I'm lazy and I know I'm not the first to need to do this. I found an excellent module in the TechNet Script Repository called Search-ADUserWithExpiringPasswords (cache). It's contributed by Steve Blossom and I thank him for doing the heavy lifting for this project!

The script has one shortcoming. It doesn't allow you to restrict which OUs to search. I've posted a script modification in the Q&A for Steve's upload.

The next step was to wrap some code around the module to do what we needed it to do.

The script starts by setting a few constants and including the Search-ADUserWithExpiringPasswords module.

$smtpserver = "mail.myco.com" 
  $emailFrom = "HelpDesk@myco.com" 
  $HelpDeskTo = "HelpDesk@myco.com"
  $DaysToNotify = "7"
  $SendEmpEmail = $True
  
. C:\NetAdmin\Notify-PasswordExpiration\Search-ADUSerWithExpiringPwd.ps1
Function CPwdLastSet ($pls)
{
	[datetime]::FromFileTimeUTC($pls) 
}


Next, we reset some counters and do the actual search
$logtxt = ""
$count = 0
Search-ADUserWithExpiringPasswords -searchbase "ou=home_employee,ou=user accounts, `
ou=ourOffice,dc=myco,dc=com" -TimeSpan $DaysToNotify `
-Properties mail,PwdLastSet,givenName,sn|`


This command from Steve's module searches the specified OU for passwords expiring within $DaysToNotify days (in this case 7) and returns the necessary attributes. Notice that the search command is not terminated but is the beginning of a pipeline to the remainder of the script. The next part of that pipeline processes each returned user object and sends email.

ForEach-Object {
  $today = Get-Date 
  $logdate = Get-Date -format yyyyMMdd 
  $samaccountname = $_.samAccountName 
  $FName = $_.givenName
  $Lname = $_.sn
  $count += 1
  $emailTo = $_.mail  
  
  $passwordLast = cPwdLastSet($_.pwdLastSet) #this is a date now
	$maxAge = (new-object System.TimeSpan((Get-ADObject (Get-ADRootDSE).defaultNamingContext -properties maxPwdAge).maxPwdAge))
	$passwordexpirydate =  $passwordLast.subtract($maxAge)
  $daystoexpiry = ($passwordexpirydate - $today).Days
  $expirationDate = $passwordexpirydate.ToString("D")


This part of the foreach loop calculates the number of days to password expiration, and the expiration date so we can use them in the email message.

$subject = "Your network password will expire soon."     
  $body = "$FName $LName, `n`n" 
  $body += " Your password will expire in $daysToExpiry day(s) on $ExpirationDate.  Please change your password before it expires to ensure you can continue to work. `n`n" 
  $body += "For instruction on how to change your password please refer to this document on the Employee Zone: http://OurSharepoint/SiteDirectory/ee_Info/Shared%20Documents/NetAdmin/HomeworkerPwChange.doc"
  $body += " `n`nIf you are unable to change your password, please contact us at 215 734-2253 `n`n" 
  $body += "Thank you! `n`nYour SysOps Team"
  
  
   #Employee notification
   if ($SendEmpEmail) {
	#Send-MailMessage -To $emailTo -From $emailFrom -cc "gmartin@myco.com" -Subject $subject -Body $body  -SmtpServer $smtpserver 
	Send-MailMessage -To $emailTo -From $emailFrom -Subject $subject -Body $body  -SmtpServer $smtpserver 
	}

   $logtxt += $today.ToString("d")
   $logtxt +=" Email was sent to $samAccountName for password expiring on $passwordexpirydate`n" 
   
  }


The next section of the foreach generates the text for the mail message and sends it using Send-MailMessage.

Finally, we generate a summary message for the helpdesk.
$logtxt += @"
   
   $count employee(s) notified.
   
   This message is sent from a scheduled task called Notify-PasswordExpiration running on ADMON.  The task queries Active Directory 
   for homeworker accounts whose password are expiring in the next 7 days and emails the employee.  It also notifies Help Desk with a summary. 
   No action is generally necessary except by the notified employees.  Please see the Windows Engineering team if you need assistance.

Task home:
\\ADMON\C$\Netadmin\Notify-PasswordExpiration

Last update:
GjM 
May 2012
"@
	#system notification
  #Send-MailMessage -To $emailFrom -From $emailFrom -Subject "Password Expiration notices" -Body $logtxt  -SmtpServer $smtpserver 
  Send-MailMessage -To $HelpDeskTo -From $emailFrom -cc "gmartin@myco.com" -Subject "Password Expiration notices" -Body $logtxt  -SmtpServer $smtpserver


A couple other notes:
  • We use a job server for running these maintenance tasks. I like to include the UNC to the job so someone can fix it in my absence
  • All of these scripts are signed with a domain-based CA code-signing cert


Leave a comment if you have questions

Linux Backups

Wednesday 25 of April, 2012

Found a backup script and modified it - see below. The script is saved in /etc/cron.weekly and should run at 4:30 every Sunday or Monday (day 0). The tape device is /dev/nst0.



#!/bin/bash
# Create backups of /etc, /home, /usr/local, and...

PATH=/bin:/usr/bin
tape="/dev/nst0"
backupdirs="/etc /root /boot /home /usr/local /var/lib /var/log /var/www/htdocs"

# Make MySQL dumps
echo "Dumping mysql databases"
mysqldump --password=M@ddexit --flush-logs --opt tiki > /usr/local/MySqlDumps/tikidb
echo "Dumping tiki database"
mysqldump --password=M@ddexit --flush-logs --opt mysql > /usr/local/MySqlDumps/mysqldb
echo "Dumping mysql admin database"

echo "Rewinding tape"
mt -f $tape rewind

for path in $backupdirs
do
    echo "System backup on $path"
    tar cf $tape $path 1>/dev/null
    sleep 2
done

echo "System backups complete, status: $?"
echo "Now verifying system backups"

mt -f $tape rewind
for path in $backupdirs
do
    echo "Verifying $path...."
    tar tf $tape 1>/dev/null
    if [ $? -eq 0 ]
    then
        echo "$path: verified"
    else
        echo "$path: error(s) in verify" 1>&2
    fi
    mt -f $tape fsf 1
done

mt -f $tape rewoffl
echo "Please remove backup tape" | wall


Amahi File Sharing Issues

Friday 06 of April, 2012
I run Amahi and use the samba file sharing feature to feed data to my WDTV Live. It worked great - until this week, when we couldn't get to any of the shares. I found a number of issues.

greyhole error
Can't find a vfs module [greyhole]

This was caused by an install problem with greyhole. The link to the greyhole library was missing from /usr/lib/samba/vfs. The fix was reinstalling greyhole. I run an unofficial greyhole build and used this to fix it:

rpm -Uvh --force http://www.greyhole.net/releases/rpm/i386/hda-greyhole-0.9.9-1.`uname -i`.rpm


Next I saw two issues with samba in /var/log/messages. They may have been related:
/var/log/messages
smbd_open_once_socket: open_socket_in: Address already in use

current master browser = UNKNOWN

The fix for this was to disable ipv6 by running:
sysctl net.ipv6.bindv6only=1

I also added this to rc.local so it takes effect at boot time.

Specify a PowerShell Module Manifest

Friday 06 of April, 2012
I loaded the Authenticode (cache) module from PoshCode today. Thanks to Joel Bennett for the module. In order not to have to type the cert info all the time, I had to specify a module manifest with PrivateData. Here's the command that worked:

New-ModuleManifest snippet
New-ModuleManifest H:\Documents\WindowsPowerShell\Modules\Authenticode\Authenticode.psd1 -Nested H:\Documents\WindowsPowerShell\Modules\Authenticode\Authenticode.psm1  -ModuleToProcess "Authenticode" -Author "gmartin" -Company "MyCo" `
-Copy "2012" -Desc "script signing tools" -Types @() -Formats @() -RequiredMod @() -RequiredAs @() `
-fileList @() -PrivateData AE713D19867XXXXXXXXXX622F4B69DB5F4EE01B2
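The point of stuffing the thumbprint into PrivateData is that the module can read it back at run time. Here is a minimal sketch of how a function inside a module could do that; the property chain is standard PSModuleInfo behavior, but this is not necessarily how Joel's Authenticode module consumes it:

function Get-ModuleSigningCert {
    # PrivateData from the manifest is exposed on the module object
    $thumbprint = $MyInvocation.MyCommand.Module.PrivateData
    Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert |
        Where-Object { $_.Thumbprint -eq $thumbprint }
}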