UPDATE

Puppetlabs now provides stdlib, which includes functions for solving this and many other common problems. Check it out!

(The original post remains below, but using stdlib is a good idea!)
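
For example, stdlib's flatten function collapses the nested arrays from this post in one shot (a quick sketch, assuming the stdlib module is installed; the variable names match the examples below):

$sshusers = flatten([ $dbas, $users, $sysadmins ])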

One of puppet's design goals is to be legible and useful to non-programmers. This is a laudable objective; not all sysadmins know how to write code, or are interested in doing so. However, this sometimes makes it necessary to... work around the limitations of the language.

My prime annoyance: concatenating arrays for use in templates.

There's only one way in puppet language to concatenate arrays, and that's using the += operator on an array that was defined in a higher scope. Since puppet variables are immutable by design, this itself is actually a bit of chicanery: the += operator expands the variable from a higher scope, appends the data on the right side of the operator to that, and creates a new variable in the current scope with the same name as the one from the higher scope.

Sound confusing? Well, it is a bit. Here's an example of what it looks like:


$sshusers = [ "bob", "sally" ]
class ssh::accounts {
$sshusers += [ "tim", "thelma" ]
}

Within the scope of the ssh::accounts class, "sshusers" will be created as a new local variable, which expands to [ "bob", "sally", "tim", "thelma" ], and can no longer be modified. The original "sshusers" variable in the higher scope has not been modified.

So there you have it. That's how you append arrays in puppet. And that's the *only* way you can append to arrays in puppet.

At first, it might not be evident just how limiting this is. But consider the case where you have a lot of groups of users, defined in variables, and you want to use them all as elements of a single array:


$sysadmins = [ "bob", "sally" ]
$users = [ "tim", "thelma" ]
$dbas = [ "kip", "jim" ]

This is where things get nasty.

So, we know our += trick, but that can only combine *two* arrays at a time, so the only way to get all that junk into a single array is to chain += statements to build the mega-array we want. We now have a construct like this:

$sysadmins += $users
$dbas += $sysadmins
$sshusers = $dbas

Well, that works, but what if we want to get at only the DBAs for another template or definition? What if we want one definition that uses $sysadmins and $dbas, but another that uses $users and $sysadmins?

Basically, += makes this technically possible, but ugly. Wouldn't it be better if we could combine an arbitrary number of arrays in a single statement?

While there is, as far as I know, no way to concatenate more than two arrays in puppet, you can still combine them into a single variable, like so:

$allusers = [ $dbas, $users, $sysadmins ]

All right, so this should trigger some warning bells. In a normal language, allusers would be an array which contains 3 other arrays. In puppet, though, there's not really a notion of nested arrays anywhere within the DSL itself; using $allusers as a variable in puppet definitions will work as if all the nested arrays have been expanded into a single array, which is what we really want.

For all practical purposes, within puppet's DSL, arrays that contain multiple sub-arrays function as if they are a single array containing all elements of the sub arrays.
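
For example (a minimal sketch, assuming you simply want to manage each account as a resource), passing the combined variable as a resource title creates one user resource per element, just as if $allusers were a flat array of 6 strings:

user { $allusers:
  ensure => present,
}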

Notice a very important qualification in my statement: "within puppet's DSL." This works fine when you're working within the puppet configuration itself, calling definitions, realizing users, etc; but if you try to use such a combined array in a template, it suddenly turns into a nested array again.

I find this duality very confusing; within the DSL, my variable is, in every way that I can interact with it, a single array of 6 strings. But if I try to use this array in a template, all of a sudden it's composed of 3 nested arrays which are in turn composed of 2 strings each.

Here's an example. Say you'd like to construct an sshd_config "AllowUsers" line in a template, which grants access to all of our users. AllowUsers should look like so:

AllowUsers bob sally tim thelma kip jim

Given that, you might define $sshusers like so, combining your 3 arrays into a single variable:

$sshusers = [ $dbas, $users, $sysadmins ]

And call an ERB template that looks like:


<% sshusers.each { |i| -%>
<%= i + " " -%>
<% } -%>

If sshusers is really an array of strings, this will do what you want: print out each element of the array followed by a space. But that's not what we get if we've combined our 3 smaller arrays, as we did above:

err: Could not retrieve catalog from remote server: Error 400 on SERVER: Failed to parse template ssh/sshd_config_new.erb: can't convert String into Array at /etc/puppet/modules/ssh/manifests/init.pp:79

That's not a very descriptive error (pointing out only the line in which the template is called), but it gives us a clue as to the type conversion problem at the root of our issue. What's happening here is that string concatenation fails since i is not a string. If you omit the '+ " "' stuff and run this template just printing 'i' as you iterate through the array, you see a list of all 6 items all bunched together. But the second you try to manipulate each element of the array, you realize that it's actually operating on 3 arrays, not 6 strings.
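
You can see this for yourself by dropping the concatenation and printing Ruby's inspect instead (a quick diagnostic sketch, not something you'd keep in a real template):

<% sshusers.each { |i| -%>
<%= i.inspect %>
<% } -%>

Instead of six user names, this prints three Ruby arrays, like ["kip", "jim"].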

Bottom line is: you cannot combine more than two arrays in puppet to form an array of strings for use in a template, except by chaining += statements.

You can sort of work around this in some ways. One option would be to pass multiple variables to your template and use conditionals in the ERB to handle them properly. Thus, you have users1, users2, users3,... - but that leaves us with a rather unfortunate hack. Sure, it's fine for a small number of entries, but how many do you want to support? Wouldn't it be better if we could just get a darn array of strings?

I came up with a nasty hack: embedded in my ERB, I declared a function to "flatten" these nested arrays. It drilled down into the sub-arrays, iterated over them, and dumped each individual item into a single new array, which it returned. I then called that function whenever I needed the raw array.

UPDATE: a commenter points out a much cleaner option than my original hack job, so I'm updating the post here in case anybody else needs it (there's really zero reason to use my old method, so it's stricken from the record):


AllowUsers <%= sshusers.flatten.join(' ') %>

Now, here's what I get:

AllowUsers bob sally tim thelma kip jim

Success! Thanks Jay and duritong for pointing out this solution!

I still feel like this is a bit of a hack; why can't I properly concatenate arrays in the DSL? Why are not-concatenated arrays treated as if they were concatenated in the DSL? The workaround here gets the job done, but to me it was not obvious at all what was going on; this behavior should at least be referred to in the documentation.

Here's the moon, a waxing gibbous from Saturday night; read on for details.


My gear: Sony A700, Minolta 500mm f/8 Reflex (a fixed aperture catadioptric lens), tripod.

I found getting good shots more difficult than I had expected. I'm relatively new to photography, and while I understand the basics, trying to shoot the moon pretty much renders all those automatic bells and whistles on your camera useless.

For starters, the metering system isn't very useful; if you leave it on matrix or center-weighted metering with a lens of this length, it's going to blow out the highlights badly due to all the black in the frame. Spot metering is closer to right, but it's still sketchy. The best technique I've found so far is going full-on manual exposure.

I found that the best results were with shutter speeds in the 1/125s range at ISO 200 (at least, this was the best when the moon was about halfway between the horizon and directly overhead - it should give off more light the higher it is in the sky). Incidentally, this isn't far off from the "sunny 16" rule, which makes perfect sense when you think about it; the moon is not a light source in and of itself, but rather reflected sunlight, so it's logical to use the calculation for a sunny day. Sunny 16 underexposes by about 2-3 stops in my tests, due to the impact of the atmosphere.
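
To put rough numbers on that (using my fixed f/8 lens): sunny 16 at ISO 200 calls for 1/200s at f/16, and opening up two stops to f/8 makes that 1/800s. My 1/125s is around two and two-thirds stops more exposure than the rule predicts, which lines up with that atmospheric loss.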

Now, 1/125s is going to be difficult to handhold with a 500mm lens. When hand holding, you need the ISO jacked up to around 1600 or higher to get shutter speeds fast enough. I try to avoid going that high if I can, so I used a tripod and a longer exposure.

Automatic white balance is equally sketchy. It actually did OK sometimes, but it was hit or miss. You either need to set the WB manually, or just plan on fixing it in post processing (I chose the latter).

Now, depending on how accurate your exposure is, you have some work to do in software. The JPEG engine on my A700 did a really poor job with contrast, so I used RAW. I use ufraw and the GIMP; at 1/125 second all I really needed to do was bring up the black point to enhance contrast on the moon's surface. If you underexpose (as I did in this sample) you have to bring the white point down as well.

I had to use the GIMP and ufraw for this since Picasa's contrast adjustments were inadequate. "Auto contrast" is a disaster, but worse is that Picasa "guesses" some initial EV values when using RAW, and those guesses were already clipping highlights. It's not even possible to bring these back down to proper levels within Picasa!

I also applied some unsharp mask (0.4 as the value) in the GIMP. I think I'm hitting the limitations of the lens in terms of resolving power; it just can't fill the A700's entire 12MP sensor. This is another good reason to try to avoid high ISO, as USM will sharpen noise if it exists.

So anyway... that's shooting the moon! It's not hard when you know how to do it, but it took me a bit of time to learn.

I rarely use this site to simply post links, but Bruce Schneier has an excellent article on the security theater of the TSA and other governmental organizations. As he says:

When people are scared, they need something done that will make them feel safe, even if it doesn't truly make them safer. Politicians naturally want to do something in response to crisis, even if that something doesn't make any sense...

Our current response to terrorism is a form of "magical thinking." It relies on the idea that we can somehow make ourselves safer by protecting against what the terrorists happened to do last time.

Schneier is one of the most respected experts in security - electronic or otherwise - and when somebody of his stature speaks out on these issues it gives me some hope that change might be possible.

Not much, mind you.

I've been using Chrome recently, since they've finally released betas for both OS X and Linux.

By and large, it's a great product. It's fast, lightweight, and it has a very minimal UI. I'm nearly ready to throw Firefox away and switch for good (in fact, I have switched on my netbook, where Chrome's advantages are paramount).

I'm not switching on my primary system, though. Why? Well, it turns out that Chrome has no facility to store passwords and encrypt them with a master password.

I mention this limitation not because it's overly interesting from a technical perspective, but because I find the Chrome team's process of repeatedly punting on bugs fairly amusing. Firefox's master password feature is certainly no panacea - indeed, if you care about security greatly, you would never store passwords at all - but it's better than nothing. It prevents casual access to stored passwords, and allows a user to be fairly certain that if they forget to lock their workstation a passerby will not then be able to immediately harvest all their credentials.

But reading through the comments in the Chrome bug tracker, it's clear that the engineers completely discount this use case. They claim (rightfully, of course) that an attacker with physical access to a system would then have the ability to gain much of the information stored therein (via a keylogger or other mechanisms) regardless of whether the browser utilized a master password.

They're right, but they're missing the point. Sure, physical access makes it possible for an attacker to gain information by compromising system integrity, but in the real world this isn't the person you're most likely to need protection from. The encrypted password file, combined with a master password, provides nearly complete protection from the most likely enemy: an attacker of opportunity who would casually grab your credentials if it was easy enough, but is not willing to risk detection by manipulating your system.

Chrome on Linux currently stores passwords in plaintext on the filesystem, without any encryption. How this is deemed superior to Firefox's master password feature - which encrypts stored passwords using 3DES in CBC mode - is beyond me.

The old saying goes that an illusion of security can be worse than no security at all, which is the argument that the Chrome engineers use to downplay the utility of this feature. But Firefox's mechanism provides more than a simple illusion - it really does make it exceptionally difficult for an attacker to get your passwords, even if they have acquired the file. Contrast with Chrome's technique of providing no security at all, and I'm still going to cast my lot with Firefox on systems where I store passwords.

I find Dinosaur Comics to be one of the funnier things I've seen on the internets. It's great, and you should read it.

In homage, or something, I've been generating a random 2-panel combination of comics. It's actually sometimes quite funny, and I'm posting the best results to a wordpress blog.

Anyway, check it out!

I finally got a chance to play Left 4 Dead multiplayer last night, and man... it's a lot of fun.

In some ways the game reminds me of Serious Sam - fast, furious, straightforward. L4D has a bit more depth to it, but it also has a purity of concept that just makes it work so well. There's no fluff here, no needless complexity - you and 3 buddies simply kill zombies, and lots of them.

The awesome B-movie horror styling and the cheesy one-liners from the characters are just icing on the cake.

L4D is on sale for $23.99 for the next few days. If you don't already own it, you should pick up a copy.

It is with great restraint that I describe "The Wire" as merely "the best thing I've ever seen on television." It's tempting for me to call this the greatest cinematic work I've ever experienced, period, but I need a bit more time to contemplate that.

Here is the story of Baltimore, of the War on Drugs, of America. The characters of "The Wire" range from obsessive to idealistic to sadistic to almost completely amoral, but one thing ties them together: they are all cogs in a machine that is utterly immutable.

"The Wire" is massive in scope, and plays out more like a series of 5 season-long movies. Individual episodes never stand on their own, and almost nothing is thrown away - seemingly minor characters and events will continue to echo throughout the course of a season. You cannot miss an episode of this series and you cannot watch it out of order.

The violence in "The Wire" is visceral, but not gratuitous. The street language is amazingly colorful and entirely credible. The sets are often actual Baltimore locations, and the extras are often actual residents. Everything about the show feels real, in a way that I've never seen from another TV program.

As the creators have noted, the main character of the series is really the city of Baltimore. The ensemble cast wends its way through the various organizations that infect the City: the gangs, the dock workers, the police, the politicians, the educators, the press. All of these systems are equally dysfunctional, and their systemic dysfunction ultimately infects the lives of their inhabitants.

It's easy to understand why "The Wire" was never widely accepted. It is not only much more complex than other shows, it's also horrifically bleak in a way that is almost never seen on American television. Please, please do not let that deter you - if you do not see this show, you'll be missing out on something truly incredible.

* This is the same review I placed on Amazon
** Yes, it's even better than Futurama, although it couldn't be more different

Wow, what a surprise this game was.

I played Fallout 2 way back when, and enjoyed it a great deal. I was shocked that anybody would be making a sequel at this point - the old games were turn-based, isometric RPGs. Classics. And it's really hard for me to imagine continuing that tradition now.

Fallout 3 succeeds, in part, by not being bound by this tradition. Bethesda realized that that old style of gameplay had no place in today's market. Even though war never changes, video games do.

So they ripped out the turn-based combat and got rid of the 3rd person view. This is a first-person action RPG. At times it's almost like playing a straight-up shooter. This is, as they say, Oblivion with guns.

And man, is it awesome.

It's easy to romanticize the earlier games. They earned a lot of praise, and rightfully so - when they were released, they were the cream of the crop. Such a compelling and bizarre retro-apocalyptic setting, such freedom to explore the world and interact with it as you will. The player could do and be whatever he wanted. There was nothing else quite like it.

Some of this is lost in Fallout 3. As the 3d environment now becomes more complex, as every line is now voiced by talented actors, the player's options dwindle a bit. But my god - the second you exit Vault 101 and survey the crushed world from a "scenic overlook," you know it really was all worth it.

SPECIAL is still around, underneath it all. While the game plays like a shooter, the dice are still rolling behind the scenes. Skills and perks matter, especially in VATS, which pauses the action of the "action RPG" and turns it into pseudo turn-based combat, if only in brief spurts. VATS is genius. The best of both worlds.

The glue that holds this all together, the common thread, is that this world really feels like Fallout. Everything feels right - the crazy perks, the retro sci-fi artifacts, the bizarre humor... everything is in place. If they'd screwed this up, it wouldn't have worked. But they didn't. They took the world that the first games only showed us from a distant 3rd person view, and they placed us right in the middle of it.

The game takes itself a bit more seriously, but it has to. There are elements here that wouldn't have worked otherwise. Wandering through a disintegrating building, listening to audio recordings of a man's slow degeneration into a mindless ghoul. Descending into a failed Vault, uncovering the disastrous experiments that led all of its inhabitants to their doom. Stumbling across a supermarket filled with raiders, with the corpses of hapless wastelanders strung up on chains.

At times, in the darkest caverns of the Fallout 3 world, you truly feel terror. At times, it feels like you're playing The Road.

The game works on almost all levels. It has its quirks, but it's impossible to care too much about them - there are way more hits than misses. If you play it straight through, sticking to the main plot, you can probably burn through the game in 8-10 hours. But don't do that - take your time, and revel in the horrific glory of the wasteland. You won't be disappointed.

I just bought my first new automobile, a 2009 Volkswagen GTI. This is also the first time I've financed any purchase outside of my mortgage (excepting tricks like using store financing solely to get discounts).

It's hard to claim that I really needed a new car, since the Integra still functions as basic transportation and since Annie's '06 Civic is a great vehicle for everyday use. Still, every month that passes makes the Integra more frustrating to operate - it's 16 years old, and both the exterior and interior are starting to show that age.

Mechanically, though, she's great. Clocking over 185,000 miles, but running like a dream. That's Honda for you.

Anyway, I said to myself, "Self, you can have a nice thing every now and then, even if you don't strictly need it. You've never owned a new car in your life, and right now you can grab the dealers by the balls and walk away with a good price."

This, mind, is after months of obsessive research. I've wanted to replace the Integra for a while, and I've been scouring the internets for a worthy successor. I honestly didn't expect that at the end I would be *buying* such a thing - I really just wanted to know what I should be lusting after.

The GTI was in a close fight with the Civic Si (a close relative of the now-defunct Integra). Both vehicles had almost everything that led me to the Integra to begin with: good gas mileage, fun to drive, nice (but conservative) appearance, compact size (but still able to seat 4 comfortably, 5 in a pinch).

Ultimately, 2 factors tipped the scales in the GTI's favor: first, it's not another Civic (and as much as I do love the Civic, I don't think we need two of the things), and second, I fell in love with the hatchback (which allows the GTI to cram more cargo and passenger volume into a vehicle that's actually shorter than the Civic).

My dealership experience was not at all what I expected. I had done such extensive research on the process that I was ready for a major undertaking: I armed myself with all the information I could find, and I used Edmunds to get an idea of what to expect. I was ready for a fight.

On a whim, though, I decided to try the Edmunds service that automatically gets quotes from area dealerships via email. There was nothing to lose, and it would at least give me a good baseline to start from.

Much to my surprise, one guy came in well under the rest (with a price below both the Edmunds FMV and invoice), and when I shopped the price around, the other dealers (with one exception) basically told me they couldn't touch it. I went out to the lot (dragging the guy in on a Saturday, when he doesn't even normally work) and tried to drive him down a bit further, but he wasn't budging at all on the price of the car, beyond throwing in a couple of extras at cost. I honestly didn't expect anything different, though, given the way the other dealers responded to that first quote.

In retrospect, I think I could have done marginally better with the single dealership that was able to match the price, but I don't think it would have been *much* better - a few hundred at most - and the dealer I went with has a better reputation and is more convenient to me. That's worth a few hundred bucks, I think.

Anyway, I'll put up a picture when I get around to it. So far I've got only minimal buyer's remorse, but we'll see how I feel after I start making those hefty payments...

I swear, my luck with hard drives is really rotten. I just lost the OS drive in my MythTV box, and that marks the second time in as many years (and the 3rd time total).

It shouldn't be surprising. I've got 8 drives in always-on systems, and I was sure to lose another eventually. It's just too bad it wasn't one from the RAIDz array.

Anyway, the last time I lost the primary (and at the time only) drive in my MythTV system, I responded by rebuilding the thing with RAID 1. It chugged along happily for a while with no issue.

At some point, I picked up a small form factor bare bones kit to replace the massive Dell tower that I had been using. In moving to the smaller kit, I was forced to sacrifice the second drive.

Of course, now, I pay the price.

Luckily, the price isn't that high. When I set up my RAIDz array a while back, I offloaded all of the actual media files onto it and exported them via NFS. A drive failure in the MythTV system itself doesn't cause me to lose any of those.

At the same time, I also configured bacula to back up everything else "important" to the RAIDz pool as well, and I rsync those backups to an external drive. This works remarkably well, and until now I've had no cause to use it.

I noticed the drive failure last night, when I tried to upload a newly ripped CD. I didn't have time to do anything then - I just hit the gentoo website and started downloading the latest live CD (since god knows where I put my old one) and told bacula to restore everything to the local filesystem.

This morning, I got up a bit early and swapped out the failed drive with the one that used to be its mirror. I briefly considered trying to recover a bootable system from the outdated mirror, but quickly thought better of it; the data was really stale and would have to be replaced anyway. Might as well just nuke it from orbit and do a bare metal restore.

Once I had the live CD booted, it was pretty straightforward to recover from there. The bacula restore job had finished the night before, so all I had to do was partition the replacement drive and rsync the backup over from the Solaris box.

Unfortunately, I had failed to back up the boot partition. Not a big problem, but I had to go back and recreate it, building a new initrd and creating a new grub.conf. I also failed to create /dev/console and /dev/null on the actual / partition, which caused boot to fail until I went back and did so. Lessons learned there.
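
For reference, recreating those device nodes from the live CD is just a couple of mknod calls (a sketch, assuming the new root is mounted at /mnt/gentoo):

mknod /mnt/gentoo/dev/console c 5 1
mknod /mnt/gentoo/dev/null c 1 3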

I also lost my large "scratch" partition. I tend to keep a collection of useless junk around, and in this case I had already decided that these things were acceptable losses in a recovery scenario. In a way, it's actually nice to have this cleaned out.

The total time from cracking the case to having the system fully running with the prior night's backup was approximately 3 hours. I know I'm probably not going to see 3 9's on my DVR, but that's not a bad turnaround time from my perspective.