Differing output from Where and Where-Object

One of the lesser known features of PowerShell is a set of “magic” methods that get added to most (all?) collection objects and replace the slower Where-Object and ForEach-Object cmdlets with essentially the same functionality. They’re considered magic because they still aren’t well documented years after they were introduced. (Thank goodness for bloggers!) I’ve used ForEach quite a bit, but I often forget about its Where counterpart and apparently had never done much with it until today, when I ran into a weird issue where I couldn’t set the value of a property on a returned object.

The setup is a classic needle-in-a-haystack problem: you have an array of objects and need to update a property on just one of them. Traditionally you’d do something like this.
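That example would have been something along these lines (a minimal sketch; the `$haystack` array and the `name`/`attr` property names are stand-ins):

```powershell
# A haystack of sample objects standing in for whatever you're really processing
$haystack = foreach ($i in 1..1000) {
    [PSCustomObject]@{ name = "item$i"; attr = 'unset' }
}

# Classic approach: Where-Object picks out the one matching object, and since
# PSCustomObjects are reference types the assignment sticks in the original array
$needle = $haystack | Where-Object { $_.name -eq 'item42' }
$needle.attr = 'found'
```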

It works great, but if you’ve got a really big array of complex objects it can start to take a long time to process. So today I remembered the aforementioned magic methods and figured it would be a lot faster to use Where to do essentially the same thing. Except I got a really unexpected error.
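Reconstructed from memory, the attempt looked roughly like this (the `$haystack` array of PSCustomObjects is a stand-in):

```powershell
# A hypothetical haystack of PSCustomObjects
$haystack = foreach ($i in 1..1000) {
    [PSCustomObject]@{ name = "item$i"; attr = 'unset' }
}

# The faster "magic" Where method instead of the Where-Object cmdlet...
$testElement = $haystack.Where({ $_.name -eq 'item42' })

# ...but this assignment errors: the property can't be set on whatever
# kind of object Where actually handed back
$testElement.attr = 'found'
```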

Maybe it was returning a single-element array? Running “$testElement -is [array]” said “False”, but “$testElement[0].attr = 'something'” worked just fine, so what was going on here? Time for some more “magic” in the form of the pstypenames property.
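Inspecting the mystery object’s type names makes the problem obvious (output abbreviated from what I’d expect on PowerShell 5):

```powershell
$testElement.pstypenames
# System.Collections.ObjectModel.Collection`1[[System.Management.Automation.PSObject, ...]]
# System.Object
```

So the Where method doesn’t return the element (or a plain array); it returns a Collection[psobject] wrapper, which is why `-is [array]` was False while indexing with `[0]` still worked.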

Of course, while I was tinkering with all of that and doing some research on the Where and ForEach methods, I ran across the article I linked further up and figured out the correct solution to the problem.
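In sketch form (hypothetical names again), the fix is to remember that the Where method always hands back a collection and to index into it, optionally using its second argument to stop scanning at the first match:

```powershell
$haystack = foreach ($i in 1..1000) {
    [PSCustomObject]@{ name = "item$i"; attr = 'unset' }
}

# 'First' short-circuits the scan; [0] unwraps the returned collection
$haystack.Where({ $_.name -eq 'item42' }, 'First')[0].attr = 'found'
```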

Finding an item in an array of PSCustomObjects

(This is mostly so I can find it again someday when I need it again.)

When working with REST APIs from PowerShell it’s pretty common to get JSON responses whose information comes back as arrays of PSCustomObjects. If you need to update a property on one of those objects you can’t just do something simple like:
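The snippet was presumably a one-liner in this spirit (hypothetical response shape), which fails because the magic Where method returns a collection wrapper rather than the element itself:

```powershell
# The kind of array of PSCustomObjects a REST call hands back
$items = '[{"id":1,"status":"open"},{"id":2,"status":"open"}]' | ConvertFrom-Json

# Errors: the property can't be set through the collection that Where returns
$items.Where({ $_.id -eq 2 }).status = 'closed'
```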

In order to set the value of a property you’re going to have to find the index of that particular object in the array and then manipulate it directly. Thankfully this is easier than it sounds, because we have access to the static methods of the .NET Array class, FindIndex in particular. The previous example actually ends up being something like this:
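A sketch of that FindIndex approach, using the same hypothetical JSON shape:

```powershell
$items = '[{"id":1,"status":"open"},{"id":2,"status":"open"}]' | ConvertFrom-Json

# FindIndex wants a typed Predicate; a scriptblock with a param() converts cleanly
$index = [array]::FindIndex($items, [Predicate[object]] { param($item) $item.id -eq 2 })

# Now we can manipulate the element in place through the array
$items[$index].status = 'closed'
```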

Standardized Tests for Standardized Parameters

Something you find when writing PowerShell modules that wrap API functions for external systems is that a lot of your functions share a consistent subset of parameters for things like credentials and specifying an endpoint. For example, in the private TeamCity module that I maintain, the parameter block for every function that interacts with a server has:
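The private module itself isn’t public, but the shared block is in this spirit (parameter names assumed):

```powershell
param (
    # Base URL of the TeamCity server, e.g. 'https://teamcity.example.com'
    [Parameter(Mandatory = $true)]
    [string]$Server,

    # Credential used for the REST calls to that server
    [Parameter(Mandatory = $true)]
    [pscredential]$Credential
)
```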

(Whether that is the best pattern I’m still not sure, but it’s beside the point of what I’m talking about here. If you have better ideas I’d love to hear about them!)

If you are writing good unit tests for your functions you need to test those parameters in every single one of those functions, and ideally you want to test them consistently to make sure that FunctionA doesn’t use them slightly differently than FunctionB. Additionally, if I find a better way of testing those parameters I don’t want to have to update dozens (or more!) of Describe blocks. There had to be a way of writing those tests once and then calling them consistently when testing every one of those functions, and it turns out to be pretty simple.

The trick is that Pester is pretty consistent about scope inside Describe and Context blocks, so if we dot-source an external file in the correct context we inherit everything in that file. By declaring variables with consistent test values and wrapping the tests inside a function definition, we can dot-source the file in the scope of a Describe block and then reference those variables and call those sets of tests in every function’s tests.

For example, let’s start with a file called “StandardTests.ps1” that defines two variables and a function that uses them in its tests:
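A reconstruction of what that file might contain (the variable names, parameter names, and Pester 3-style `Should Be` syntax are assumptions):

```powershell
# StandardTests.ps1 -- meant to be dot-sourced inside a Describe block
$standardServer     = 'https://teamcity.example.com'
$standardCredential = New-Object pscredential (
    'testuser', (ConvertTo-SecureString 'testpass' -AsPlainText -Force))

function Test-StandardParameters {
    param ([string]$FunctionName)

    $parameters = (Get-Command $FunctionName).Parameters

    It 'has a Server parameter' {
        $parameters.ContainsKey('Server') | Should Be $true
    }

    It 'has a Credential parameter' {
        $parameters.ContainsKey('Credential') | Should Be $true
    }
}
```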

Then making use of those variables and tests might look something like this:
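Something like this, with `Get-TCBuild` standing in for any function under test:

```powershell
Describe 'Get-TCBuild' {
    # Dot-sourcing here pulls the standard variables and the shared
    # test function into this Describe block's scope
    . "$PSScriptRoot\StandardTests.ps1"

    # The shared parameter tests, identical for every function
    Test-StandardParameters -FunctionName 'Get-TCBuild'

    # ...function-specific tests follow, free to use the standard
    # variables defined in StandardTests.ps1 for their calls...
}
```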

The only thing left is to run the tests!
Successful Test output

Reporting Pester Code Coverage Metrics to TeamCity

As previously mentioned, I’ve been doing a lot of work with PowerShell modules at work, and I have recently gotten all the parts of a full continuous delivery pipeline working for those modules. A big section of that pipeline runs through TeamCity, and while the existing ability to have Pester test results show up in the build results is really great, code coverage is slightly less obvious, though in the end fairly simple.

The trick is to use the -PassThru parameter with Invoke-Pester and then use TeamCity’s build statistics service messages to get the values into the system. The end result will look a lot like this:
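In sketch form (the coverage file pattern is an assumption; the statistic keys are TeamCity’s standard code coverage keys):

```powershell
# Run the tests, measuring coverage, and keep the results object
$results = Invoke-Pester -Path .\Functions -CodeCoverage '.\Functions\*.ps1' -PassThru

$covered  = $results.CodeCoverage.NumberOfCommandsExecuted
$analyzed = $results.CodeCoverage.NumberOfCommandsAnalyzed

# TeamCity picks these service messages out of the build log and turns
# them into the code coverage display in the build results
Write-Output "##teamcity[buildStatisticValue key='CodeCoverageAbsLCovered' value='$covered']"
Write-Output "##teamcity[buildStatisticValue key='CodeCoverageAbsLTotal' value='$analyzed']"
```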

Pester code coverage right in your TeamCity build results!

Simple test coverage check for script modules

I’ve been spending a lot of time at work writing PowerShell modules, and as part of that effort we’ve been trying to make sure we’re doing at least some unit testing on those module functions (using Pester, of course!). Unfortunately we’ve had a few instances where a new function got added to a module without any unit tests. We’ve structured our modules so that every function has its own source file and accompanying tests file, all located in a \Functions\ folder in the project. Ideally the CodeCoverage parameter for Invoke-Pester would catch this sort of problem, but Pester only runs tests from files with a certain name structure, so if it runs across Some-Function.ps1 without an accompanying Some-Function.Tests.ps1 it doesn’t care. Today I finally got tired of finding untested functions and decided to do something about it; the result is Coverage.Tests.ps1:
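Reconstructed here as a sketch (the \Functions\ layout matches the description above; the details are assumptions):

```powershell
# Coverage.Tests.ps1 -- fails the run when a function file has no tests file
Describe 'Function test coverage' {
    $functionFiles = Get-ChildItem -Path "$PSScriptRoot\Functions" -Filter '*.ps1' |
        Where-Object { $_.Name -notlike '*.Tests.ps1' }

    foreach ($file in $functionFiles) {
        It "has a tests file for $($file.Name)" {
            Test-Path ($file.FullName -replace '\.ps1$', '.Tests.ps1') | Should Be $true
        }
    }
}
```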

I’ve got to think that it shouldn’t be hard to add a similar test for my other pet peeve: Missing help comment blocks!

Introducing Virtual Micropolis

Virtual Micropolis Logo

After a few years of owning the domain name, I am finally getting off my proverbial butt and doing something with VirtualMicropolis.com. My original idea had been, as is somewhat usual for me, a bit grandiose. I was going to get the spec moved over there and make it a community for everyone who built Micropolis to come and post their stuff. Because there aren’t any other places on the Internet to build a community (like Flickr, MOCPages, Facebook, Google+, ad infinitum…), or something.

Anyway, the point really came home last weekend in Des Moines where we were displaying our little corner of Micropolis again and we also had the TwinLUG QR Code out on the table next to it. As usual we got several people who tried to use it, and mostly it worked (The lighting was a bit weird), but the overwhelming response to being sent to the TwinLUG site was one of disappointment. What people really wanted to see was lots of pictures and maybe some more stuff about what they were looking at right then. Obviously it was finally time to do something about it.

To get this really rolling, though, I was going to need to scale things back to just a place where we could put up information and pictures about the modules that Jennifer and I own. Almost all of them are ones we designed, with the exception of a few that I bought from a TwinLUG member before he moved out of the country a few years ago (Hi, Gary!). Thanks to the wonderful photography skills of Alyska Bailey-Peterson we had a base of excellent photos to go along with the dreck that I manage to shoot, so we could at least get the site off the ground before having to figure out where we were going to get more good pictures.

For this project I thought we really needed a wiki. A blog or other groupware CMS was going to have too much overhead and complexity for the basic requirements: easily linkable pages, simple protection from spam and other ne’er-do-wells, and some file management capabilities. I finally settled on DokuWiki and I’m pretty happy with the results so far. My one small issue is that for some reason there is no simple way to set text alignment, but everything else is great, so I’m ignoring that as much as possible. If you’re looking for a good wiki platform you should definitely add it to your list of candidates.

As of right now I’ve got pages up for eight modules and material for a couple more before we start to run out of images, but I think it’s a decent start and hopefully we’ll be able to keep some momentum on the project for awhile.

Right now the site is all about our collection and the layouts we have been part of, but I think we would be glad to broaden the content in the not-too-distant future. I do have things locked down so that even if you register for an account you cannot make any edits until I tweak the account, so please contact me if that is your intent.

When did I become _that_ guy?

The other evening @SigridEllis was talking with me about the problems she’s been having with her phone. I am the household tech support, and as such it’s important for me to listen in these sorts of circumstances and try to help out as much as I can, or am asked to, while minimizing the condescension or patronizing tone that people so often associate with those who provide tech support (deservedly or not, depending on the circumstances). Unfortunately this time I was having a horrible time paying attention, though I think I got a pretty good idea of what was happening, because I could not stop thinking, “You know, I’ve never had that kind of trouble with my Windows Phone…”.

I did manage not to say it until much later in the conversation, when I was able to preface it with a statement about how unhelpful the comment was likely to be and that it wasn’t actually apropos of the problem at hand. But I couldn’t help realizing that at some point I have become that guy. Not even ten years ago those guys all seemed to be Macintosh and Linux fans who had a single response to any mention of any Microsoft product, along the same lines as that horrible thought I had so much trouble getting past. The situation with my beloved Windows Phone really is about the same, as far as market share and application availability go, as either of those platforms during the waning heyday of the desktop PC. All the cool new games come out for iOS or Android. Everyone you know has one of those devices, except for the others you have actively searched out and connected with to share your happiness with your chosen platform. As much as I am irritated at the aspects of humanity that show such an affinity for tribalism, it’s always been clear that I’m no exception. I’m just not quite so comfortable with it being so obvious.

Yes, I have invested quite heavily in the Microsoft ecosystem. At this point all of the devices that I use are part of that world and they work really well for me. But please! If you catch me saying something like, “You know I’ve been able to do that on my Windows Phone for years…”, just slap me. I’m pretty sure I’ll deserve it.


Some help might be useful

I was out taking a walk through the neighborhood this morning and came up with an idea that, unusually for me, isn’t crazy and most likely won’t lead to my financial ruin, so I figured it was worth following up on.

The city of Saint Paul has an official beautification project to put sidewalk poetry on newly built or replaced sidewalks throughout the city. The northern section of the St Anthony neighborhood, where I live, is no exception to the placement of these poems and it makes for a nice treat to see one while out and about. It occurred to me that it would be nice to have a handy map of such items, and of course the city does have such a map but it is separate from the maps that other cities and organizations maintain of similar art features and doesn’t include something like the Humpty Dumpty sculpture which is private yard art just a few blocks away from my house.

It seemed to me that building a database of such features that users (or creators) could easily submit to, with associated applications for finding local public art, would not be much of a project to get off the ground, or possibly even to maintain, and that it’s probably something I should just see about doing.

The problem, as is so common with projects like this, is that all of the obvious domains that I could come up with for the project are already taken or are prohibitively expensive to purchase. So the help I need is to come up with a domain name to use that is available. Ideas?

(I’ll post other thoughts about my plans for the features for the site and apps later, though I would be interested to hear what other people think about the concept and how they might use it.)

New Marvel Trades on Nook (12/23/2011)

New Marvel Comics Available on Nook 12/23/2011

It’s been 11 days since the last batch of new comics was released, which seems like an odd number, but at a guess it’s because Sunday is Christmas in the US and Monday seems to be the day many businesses are giving as a holiday. Could it be that instead of the world of physical comics, where a holiday means delayed shipments, the world of electronic comics will give us early shipments? It would be nice, but it seems more likely that Marvel wanted more titles in the catalog for all the people who are about to receive a shiny new device in the next two days. In any case we’ll know whether the two-week release window is the general plan on or around Jan 9, 2012.

This release is fairly Spider-Man heavy but gives us a little mix of vintages, starting with a classic Avengers storyline in “Avengers: Under Siege”. (Personal anecdote: that is the story that ran immediately before I started following the Avengers.) In a similarly dark tone we have the first collection from the recent “Dark Avengers”. I’m looking forward to reading through both, Under Siege for the umpteenth time and Dark Avengers for the first.

The spate of Spider-Man titles that arrives this week continues the odd release pattern we’ve got going for most titles. Doing it right is the Ultimate line with the consecutive release of Volume 3; continuing to make a mess of things are the releases of the Civil War story (pre-Brand New Day), the second volume immediately following Brand New Day, and a volume long AFTER Brand New Day (though previous to “Big Time”, released last week), but still no Brand New Day itself. For those of you keeping track, that gives us the following contemporary issues of Amazing Spider-Man: #532-536, #546-558, #574-577, #648-651.

To balance the complaints about the order of release I do want to point out that at least there are digital releases and I personally find the reading experience to be quite good.

Hope everyone is enjoying the holiday season!

New Marvel Trades on Nook (12/12/2011)

New Marvel Comics Available on Nook 12/12/2011

The list for this week includes a couple of firsts for the Nook releases. First and foremost, these are the first 616 Spider-Man trades released (Ultimate Spider-Man vols. 1 & 2 came much earlier), though the release shows part of the problem with the schedule by putting out books on either side of the One More Day special event without also publishing THAT story. Previous weeks have shown similar issues with the Civil War and Captain America assassination timelines, and I expect it will continue in that vein until someone can talk some sense into whatever poor intern got the job of putting this schedule together.

The second first is the inclusion of the Masterworks trades. Already owning them in physical form, I am still debating whether I’m going to pick them up, though I would imagine they look as good as the modern stuff. It does at least point to some intention to publish some of the classics from the vault.