

May 22 2019

Skin Alley's Final Coat

Here’s early 70s prog band Skin Alley playing Norway’s Ragnarock Festival in June 1973, the only live footage I can find from their short career:

<!-- more -->

Questionable sartorial choices aside, I quite like it. The song is not especially memorable, but the arrangement has interest and their performance is accomplished. At this point, the band were about six months away from their eventual demise. A fourth and final album, Skintight, would emerge later that year, its revamped and distinctly American-influenced sound alienating long-term fans and resulting shortly thereafter in the dissolution of the group.

Skin Alley were managed by Clearwater Productions, alongside the great ‘lost’ folk-rock act Trees, who were themselves petering out around this time, another victim of fading interest from record labels that had rushed to sign a glut of ‘progressive’ acts in the preceding two years and were now turning their attention to more commercial propositions such as glam. They’d had some early success playing free festivals, although I don’t believe this had translated into significant album sales. Like Trees, Skin Alley signed a deal with CBS and produced two albums before being dropped, although they did scrape another deal that spawned a further couple of releases.

Did they know the jig was nearly up at this point? The rest of the Ragnarock line-up that year, solid but hardly stellar with a distinct whiff of last year’s news about it - Mungo Jerry, The Pretty Things, Culpepper’s Orchard - may have offered a few hints. Perhaps they had an inkling, with the first glimmers of punk on the horizon in alleged reaction to the detached excesses of the prog years. The top-tier prog acts - Yes, Genesis and co. - were steadily laying the foundations of lasting careers, while the second tier (Gentle Giant, Soft Machine, Caravan, etc.) had enough gas left in the tank to see them into at least the second half of the decade. But what was it like to be on the descending curve of your arc, the milieu that birthed you having had its day and marked for succession by the Next Big Thing? Even if Skin Alley had kept faith with their fanbase on that fourth album, it seems unlikely the path of history would have been much different for them. In that sense, the Ragnarock footage, for all its qualities, feels elegiac.

May 04 2019

Ade wants to read The Age of Light by Whitney Scharer

May 02 2019

Ade gave 3 stars to Mrs. Palfrey at the Claremont (Paperback) by Elizabeth Taylor

April 27 2019

bigbubbles

We're in the Endgame

I watched Avengers: Endgame yesterday and naturally, for someone who's already bored on about his favourite MCU movies, I have some thoughts.

SPOILERS below, obviously.



While I was really (really, REALLY) looking forward to this concluding part and expecting to enjoy it at least to the average standard for an MCU film, I was also fairly agnostic about its likely ranking in the pantheon. Infinity War, for all its nice moments, was for me a solidly mid-tier effort, and I pretty much expect that for any of the multi-Avenger line-up movies. All the subsequent Russo/Markus/McFeely films have been failed attempts to recapture the magic of The Winter Soldier, and all the Avengers sequels have fallen short of the benchmark set by Joss Whedon's first installment; IW was no exception on either score. I suspect it can't be done, simply because there is no great way to cram that many lead characters into a movie, do them each sufficient justice, and still maintain a focused, coherent storyline. Perhaps the only time it was ever going to work perfectly was that first Avengers outing. (But then a thing isn't beautiful because it lasts.)

Given that, I'm impressed that Endgame was at least satisfying as a movie. Yes, pretty much any character outside the OG Avengers didn't receive much of a chance to shine, and even some of the core group weren't fully served (it feels more excusable if you allow that characters like Thor, Quill and Strange had the best of the action in the previous installment). But I didn't have much lingering resentment at this. I'm not one of the people who feels that Steve Rogers's ending was out of character, or that Tony Stark's was "unfair" to his character (and ghod, I nearly lost it to bawling like a lickle babby when Pepper said goodbye). There was enough fan service, enough little flashes of brilliance for everyone - even where those only amounted to a few scenes for figures like Scarlet Witch and Valkyrie - and enough of a credible plot to leave me sated. I haven't yet decided where to rank it: it can't be better than Avengers 1 or TWS, as discussed. Better than Thor: Ragnarok or Iron Man 3? Paawhssibly...

My few nits then:
  • No investment in setting up Danvers as the leader of the new Avengers line-up going forward. In fact, barely any teamwork for her at all, let alone the sense of a baton being passed.
  • Coupled with that, almost nothing to hint about the direction for "Phase 4". True, there were pointers for Thor, Loki and Sam/Bucky (although that's really TV business, and I hope there will be more porosity between the TV and movie universes in future). But no real clue. Unless Spider-Man FFH is going to open that up.
  • Banner's new accommodation with Hulk came out of nowhere and really felt like a stretch. That's the downside of a three hour movie that still can't be long enough to include everything.
  • Undoing everything great that came out of Ragnarok really sucks. Listen to your mother and eat a damn salad, Thor!
  • A few missed interactions that would have warmed the old cockles, if we'd had another hour or two running time: Danvers/Fury; Danvers/Valkyrie; (Danvers/anyone); Groot/Rocket; Quill/Stark; Strange/Banner; Rogers/Barnes; Wasp/Rescue maybe.
  • The way time travel is handled in this film (the alternate reality model, and Banner's sketchy explanation about going back becoming your future and your present then being your past) is really the only way it could work given the demands of the plot. Even then, I'm not convinced it properly ties up in every case.
  • Presumably, if Cap replaced all the infinity stones where they came from, the time stone went on ultimately to be destroyed by Thanos and Strange never gets it back. Which seems to nerf his powers.
  • Really don't think too hard about the practical ramifications of removing half the lifeforms in the universe for a five year period and then popping them back exactly where they left.

April 21 2019

Ade gave 4 stars to At the Existentialist Café: Freedom, Being, and Apricot Cocktails (Hardcover) by Sarah Bakewell
bookshelves: partly-read
Really three and a half stars for me, but that feels hard-hearted as it's a good book. After Sarah Bakewell's masterly Montaigne book, I could summarise his philosophy and outlook in two words ("Je soutiens"), but I could no more explain existentialism now than I could the workings of a nuclear power plant. It still feels beyond grasping, and all attempts slide uselessly off it. As for Heidegger and his "dasein", it's the first piece of philosophy I've read about that left me muttering, "This sounds like a whole bunch of Nothing." But that, I am sure, is more a reflection of the nuances of Heidegger than of Bakewell's attempt at explaining him. I enjoyed the biographical elements and anecdotes, and Beauvoir certainly receives her deserved due. To her credit, the author tackles with assuredness and an open mind figures whose views could often be, to say the least, problematic.

March 09 2019

Ade shared a quote
Just try to find an uncompelling photo of Fleetwood Mac taken at any point between 1975 and 1987. I've spent hours scouring Google Images in search of a single Fleetwood Mac band photo to which I am not sexually attracted, and failed every time. (Steven Hyden)

March 05 2019

Ade gave 4 stars to Twilight of the Gods: A Journey to the End of Classic Rock (Hardcover) by Steven Hyden
Part-memoir, part-rumination and occasional analysis of the current state of "Classic Rock" and its likely 'end' in the foreseeable future (will the last Beatle kindly turn off the lights?), Hyden's book is never less than a cracking read for any fan of that ol' time rock 'n' roll. Some of it is doubtless intended to provoke (such as any attempt to define Classic Rock), some of it probably won't convey much outside of North America (the merits of REO Speedwagon, anyone?), the occasional observation is a little trite (e.g. when people grieve for dead rock stars, they're really mourning their lost youth! And Fleetwood Mac, boy are they fucked up, huh?) but thankfully the majority of it does not tread overly familiar ground. I would counsel that it contains a surfeit of Springsteen, an entirely gratuitous chapter about Phish and a brief rave for Japandroids as keepers of the flame, but then again I listen to Motorpsycho and Royal Blood these days so who am I to talk. Also, Hyden seems determined to commingle Led Zeppelin IV and Dark Side Of The Moon as the touchstones of Classic Rock, two albums that may delineate one axis of the genre but ignore at least another entirely. In the interests of research, I even gave in and tried listening to "Blonde on Blonde" (um... nuh). But I can forgive anyone who speaks with as much evident love and as little ironic distance for the music as he does here.

Has the rock era come to an end? Certainly the days of the 'Rock Star', as memorialised in David Hepworth's Uncommon People: The Rise and Fall of the Rock Stars 1955-1994, seem to be over, along with much of the infrastructure that supported them (MTV, magazines, overt racial and sexual discrimination, record conglomerates, ...), but the music itself has probably just been subsumed into our new, all-the-culture-all-the-time Internet age along with disco, punk and ... well no, not you Garth. As Hyden notes, artists from well outside the genre like Beyoncé can now quote freely from classic rock without seeming as irredeemably beached as your dad playing his treasured LPs on Friday night and moaning that "they don't make 'em like this anymore!" Yeah, I love those records and I could also never hear them again without too many regrets, providing somebody today still gives a shit what they're doing (and can write a tune). And if not, there are enough 'lost' classics still being unearthed to make reliving 1971 in real time feel like a viable option.

Hey, enough of my yakkin'. Let's boogie.

January 31 2019

RJ Davies garage
Merthyr Road, Pontypridd, Wales

January 30 2019

Creating vSphere VMs with Ansible

Ansible now contains a decent set of modules for managing virtual machines in VMware vSphere. As ever with Ansible, the key is not so much in knowing how to use these modules (which the docs explain fairly clearly) as in knowing how to organise the playbooks that call them. Here’s one example based on our own recent practice.

<!-- more -->

To use this, you should be running at least Ansible 2.7.5 as the module was broken in older 2.7 releases. I assume you already have the prerequisites, including an account with administrator privileges on your vCenter, a basic knowledge of how to create a new VM in vSphere, and so forth. To enable us to create new VMs for the systems we need to manage with our playbooks, first of all we wrap the vmware_guest module in a role. The role uses a combination of standard or typical default values, global variables for common settings and per-host variables specific to the VM in question. For our own purposes, we only need to be concerned with basic Linux VMs of a mostly similar specification, so we don’t worry about customising the configuration for different OS platforms.

For example, the role defaults might be:

# roles/create-vm/defaults/main.yml
vmware_scsi: 'lsilogic'
vmware_firmware: 'bios'
vmware_disktype: 'thin'
vmware_hw_vers: 13
vmware_netdev: 'vmxnet3'


This defines our standard VM SCSI controller, firmware, disk provisioning, hardware version and network device (all of these are compatible with CentOS, for example).

The global settings are defined in the group variables for ‘all’ hosts, and specify the local vCenter, site-specific names like the vSphere data centre and overall common settings for the VMware modules:

# group_vars/all.yml
vmware_user: "{{ lookup('env', 'USER') }}"
vmware_vc: "vcenter.our.domain"
vmware_datacenter: 'Main Datacenter'


Here we authenticate to the vCenter using our central directory, so we use the logged-in ID of the person running the playbook as the VMware username. Alternatively, you can create a specific account with limited privileges in vSphere for Ansible to use.
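If you take the service-account route instead, the equivalent group variables might look like the following sketch. The account name and vault variable here are purely illustrative, not part of our actual setup:

```yaml
# group_vars/all.yml - service-account variant (names illustrative)
vmware_user: 'svc-ansible@vsphere.local'
# Password held in an ansible-vault encrypted vars file,
# e.g. group_vars/all/vault.yml defining vault_vmware_passwd
vmware_passwd: "{{ vault_vmware_passwd }}"
```

With this in place you would drop the password prompt from the playbook and run with --ask-vault-pass or a vault password file instead.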

Finally, we configure the VM details in the host variables file under host_vars/:

# host_vars/vm1.yml
vmware_cluster: 'Firewalled Production'
vmware_folder: '/Main Datacenter/vm/prod/servers/linux'
vmware_datastore: 'main-datastore-1'
vmware_vms:
  - name: "vm1"
    cpus: 4
    mem: 8
    diskgb: 80
    net: 'PRODNET'
    mac: '00:50:56:de:ad:be'
    os: 'centos7_64Guest'


(VM folder names are prefixed with the data centre name followed by ‘/vm’. Note that in practice with this structure, one can define several VMs together in a list - e.g. within the group variables - but this is not necessary. In most cases, it’s probably cleaner to separate the VM configs by individual host.)
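For illustration, a group-variables file defining several similar VMs in one list might look like this (the names, MAC addresses and sizes are hypothetical):

```yaml
# group_vars/prod-linux.yml - several VMs defined together (illustrative)
vmware_vms:
  - name: "vm1"
    cpus: 4
    mem: 8
    diskgb: 80
    net: 'PRODNET'
    mac: '00:50:56:de:ad:be'
    os: 'centos7_64Guest'
  - name: "vm2"
    cpus: 2
    mem: 4
    diskgb: 40
    net: 'PRODNET'
    mac: '00:50:56:de:ad:bf'
    os: 'centos7_64Guest'
```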

In the top level playbook, we also need to request the password for the vCenter user (or fetch it from a secure vault if using a specific account for Ansible):

# create-vms.yml
- hosts: all
  become: false
  gather_facts: false
  vars_prompt:
    - name: "vmware_passwd"
      prompt: "vCenter password"
      private: yes
  roles:
    - { role: create-vm, when: vmware_vms is defined }


We need to disable fact gathering because the hosts we’re creating may not exist yet, so Ansible can’t connect to them.

Finally, we pull all these variables into the task defined within the create-vm role:

# roles/create-vm/tasks/main.yml
- name: create VMs
  vmware_guest:
    hostname: "{{ vmware_vc }}"
    username: "{{ vmware_user }}"
    password: "{{ vmware_passwd }}"
    cluster: "{{ vmware_cluster }}"
    datacenter: "{{ vmware_datacenter }}"
    datastore: "{{ vmware_datastore }}"
    cdrom:
      type: 'none'
    disk:
      - size_gb: "{{ item.diskgb }}"
        type: "{{ vmware_disktype }}"
    folder: "{{ vmware_folder }}"
    guest_id: "{{ item.os }}"
    hardware:
      boot_firmware: "{{ vmware_firmware }}"
      scsi: "{{ vmware_scsi }}"
      memory_mb: "{{ (item.mem * 1024) }}"
      num_cpus: "{{ item.cpus }}"
      version: "{{ vmware_hw_vers }}"
    name: "{{ item.name }}"
    networks:
      - name: "{{ item.net }}"
        device_type: "{{ vmware_netdev }}"
        mac: "{{ item.mac }}"
    state: present
  with_items: "{{ vmware_vms }}"
  register: vms_deployed
  delegate_to: localhost


The key thing here is that the task is delegated to ‘localhost’, i.e. the Ansible control node, and therefore the connection to the vCenter to create the VM will occur from the host where Ansible is run. (You can use a different host such as a dedicated vSphere management server but Ansible must be able to connect to it and it must have the Pyvmomi library installed.) This task loops through the vmware_vms list and creates each VM defined there through the vCenter.

If you change the settings for any VM, Ansible will attempt to modify its configuration in vSphere if possible. For example, you can adjust the allocated memory in a running VM (assuming the guest OS supports it and hot-adding memory is enabled for the VM) but attempting to shrink a virtual disk returns an error.
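If you want to leave the door open for such live changes, the hardware section of vmware_guest can request hot-add at creation time (if I recall the module options correctly; treat this fragment as a sketch rather than gospel):

```yaml
# Fragment of the create-vm task: ask vSphere to enable hot-add so
# memory/CPU can be raised while the VM is running (guest OS permitting).
hardware:
  memory_mb: "{{ (item.mem * 1024) }}"
  num_cpus: "{{ item.cpus }}"
  hotadd_memory: true
  hotadd_cpu: true
```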

Currently, due to Ansible bug #34105, vmware_guest isn’t fully idempotent if you’re using distributed switches in your vSphere networking configuration; the task will report ‘changed’ every time it is run and you will see a “Reconfigure Virtual Machine” task logged in the vCenter, even if no aspect of the VM has been altered. (There’s a PR for this bug but it doesn’t appear to have been merged yet.) If this concerns you, you can first run a vmware_guest_find task to search for the listed VMs in vCenter, register a variable and use the result of that to drive the creation of any VMs that return ‘failed’ (see my previous post on using multiple values in a registered variable).
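A sketch of that workaround, assuming vmware_guest_find reports failure for names it can't locate (the task names and structure here are mine, not tested against every module version):

```yaml
# Probe the vCenter for each VM, then create only the missing ones.
- name: check whether each VM already exists
  vmware_guest_find:
    hostname: "{{ vmware_vc }}"
    username: "{{ vmware_user }}"
    password: "{{ vmware_passwd }}"
    name: "{{ item.name }}"
  with_items: "{{ vmware_vms }}"
  register: vms_found
  failed_when: false          # a missing VM is expected, not an error
  delegate_to: localhost

- name: create only the missing VMs
  vmware_guest:
    # ... same parameters as the role task above, but driven by the
    # probe results, so the per-VM settings live under item.item ...
    name: "{{ item.item.name }}"
    state: present
  with_items: "{{ vms_found.results }}"
  when: item.failed
  delegate_to: localhost
```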

Obviously, at this point you’d still need to power on the new VM and install an OS on it. In fact, you’d probably instead deploy from a pre-built template, using the template and customization parameters of vmware_guest to configure it. The vmware_guest_powerstate module could then be used to power it up and initialise it, followed by vmware_guest_tools_wait to pause until it’s ready.
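As a rough sketch of that template-based approach (the template name and customization values below are illustrative, and the exact customization keys should be checked against the vmware_guest docs for your Ansible version):

```yaml
# Clone each VM from a pre-built template, power it on, and wait
# for VMware tools so subsequent plays can talk to the guest.
- name: deploy VMs from template
  vmware_guest:
    hostname: "{{ vmware_vc }}"
    username: "{{ vmware_user }}"
    password: "{{ vmware_passwd }}"
    datacenter: "{{ vmware_datacenter }}"
    cluster: "{{ vmware_cluster }}"
    folder: "{{ vmware_folder }}"
    name: "{{ item.name }}"
    template: 'centos7-template'      # hypothetical template name
    customization:
      hostname: "{{ item.name }}"
      domain: 'our.domain'
    state: poweredon                  # create and power up in one step
  with_items: "{{ vmware_vms }}"
  delegate_to: localhost

- name: wait for VMware tools to report in
  vmware_guest_tools_wait:
    hostname: "{{ vmware_vc }}"
    username: "{{ vmware_user }}"
    password: "{{ vmware_passwd }}"
    name: "{{ item.name }}"
  with_items: "{{ vmware_vms }}"
  delegate_to: localhost
```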

January 13 2019

Panasonic GX80 with Darktable

This is a quick summary of how best to set up and use Darktable for processing raw files from the Panasonic DMC-GX80 (GX85 in USA).

For reference, I’m using Fedora 28 with the current Darktable 2.6.0 release. I mainly shoot landscapes or architecture, so some of the following may not apply if you’re a portrait photographer.

<!-- more -->

I was fairly pleased with Darktable for processing my GX80 images - its extraction of detail is particularly good - but found that my results were sometimes a little low in overall contrast and colour images lacked saturation and richness. However, the new filmic module in 2.6.0 appears to mostly resolve these concerns. The other key requirement is ensuring DT has specific camera profile data available for the appropriate aspects of the image pipeline. With these criteria fulfilled, the results from DT can surpass what you can obtain with Snapseed (which in itself is otherwise an excellent app for quickly enhancing in-camera JPEGs).

  1. Obtain the DCP profile ‘style’ files for the Panasonic GX80 (potentially downloadable from er… various places or you can install Adobe DNG Converter under WINE to extract them).
    You could also try the Rawtherapee DCP profile for the GX85. Note that DCP profiles are not guaranteed to work identically across different raw processors.
  2. Convert each of the DCP files to ICC profiles using dcp2icc (use the 32 bit binary, it’s easier), e.g.:
    $ dcp2icc 'Panasonic DMC-GX85 Camera L Monochrome.dcp' 5000
    (It’s not clear whether the colour temperature argument has any effect on the resulting output file; if in doubt, and you believe your DCP profiles contain dual-illuminant values, generate several different output files using values of 2850, 5000 and 6500.)
  3. Copy the resulting ICC files to ~/.config/darktable/color/in/
  4. Install the lensfun-tools package and run lensfun-update-data to download the latest camera and lens correction data files.
  5. If you have the GX80 variant, copy /usr/share/darktable/noiseprofiles.json to a separate folder in your home directory, edit it and change all occurrences of ‘GX85’ to ‘GX80’. Then modify your Darktable menu entry or startup alias to run darktable --noiseprofiles /path/to/modified/noiseprofiles.json instead.
    (This was a suggestion in a DT issue and I’m not entirely clear whether it’s necessary or whether DT knows they’re the same camera, but the current file definitely does not reference the GX80.)
  6. Start Darktable and import your raw files in the usual way.
  7. Select a raw to edit. Make sure you’re editing the .rw2 file and not the JPEG version if you create both.
  8. Disable the basecurve module and set the demosaic module to AMaZE.
  9. Go to the input color profile module. Your Panasonic style profiles should now show up in the profile dropdown. If you want to start by emulating one of these, try them out. I find the colour ones sometimes look a bit washed out initially (you may prefer to stick with the standard color matrix default), although the Monochrome styles are a good starting point for B&W conversions.
  10. Enable the lens correction module. If you’ve updated the Lensfun data correctly, this should have autodetected the correct camera and lens model.
  11. Adjust the tonal range in the exposure module so that you have no clipping (automatic mode is often suitable).
  12. Enable the filmic module (new in 2.6.0; if you don’t have this release, leave the basecurve and demosaic modules unchanged and use the traditional tone/colour modules). Use of this module still seems a bit voodoo but try the ‘auto tune source’ option for the tone-mapping parameters (might make things better or worse) and enable ‘preserve the chrominance’ for colour images (will probably need to back off the Saturation slider). Try tweaking the sliders from this point to adjust the contrast and shadow/highlight detail as required.
  13. You might want to save a new Darktable style at this point for future use as an initial starting point.
  14. Apply tone curve (e.g. S-curve to bring the contrast up a little), local contrast, sharpen, vignette, split tone, etc. modules as required. If you find the image a little noisy in smooth tone areas (e.g. blue sky), enable the raw denoise module with a very low noise threshold (e.g. 0.002). If image colours still look off, try some of the other white balance presets.
  15. Other tricks: to make the image pop a bit more, try the ‘boost’ preset in the highpass module or ‘clarity’ in the equalizer (but lower the mix value to 200-600 range). For a glow effect in B&W, apply the bloom or lowpass modules with a blend mode of multiply, and add masking as required. For landscapes, enable haze removal and try the graduated density module on bright skies.

One nit to watch out for with DT 2.6.0: if you use a non-standard window manager (such as fluxbox), you might encounter bug #12387. Should be fixed in 2.6.1.

January 06 2019

A tale of two roadhouses

“Fibre Broadband is here”, proclaimed the sign grandly on the BT cabinet next to the A41 layby. Behind it lay the quietly mouldering remains of the Cherry Tree Hotel, the multiple voids in its shabby walls exposing its black heart to an apathetic 21st Century. Really, they had fast Internet?

<!-- more -->

I left the car alongside the dual carriageway, parked in a scrubby bay shortly before the roundabout where the A49 broke free from a temporary embrace with its London-bound sibling to hasten south for the heady delights of Shrewsbury and the Marches. Clambering over a rough embankment thrown up to seal the Cherry Tree’s entropy off from the present day, I approached the building warily. Bare vines straggled over the walls and feebly attempted to tear down the former entrance porch. Entering would have been as simple as stepping through the maw of the french window, its vacant panes lying forlornly on the ground in front of it, but I had no intention of proceeding further in that direction without a hard hat and a hefty dose of YOLO. The sign above the front door was now blank wood apart from the lingering cursive of the word “Ales”, while the nameboard outside was merely a vacant frame topped with ornate ironwork and redundant spotlights.

In its heyday, the Cherry Tree had been called the Witch Ball Inn and boasted a swimming pool and dancehall out back, cementing its status as a roadhouse of the 1930s, those roadside hotel/pubs seeking to mark themselves out as worthwhile destinations in their own right. Sited close to the A41/A49 junction at Prees Heath, south of Whitchurch in Shropshire, the Witch Ball later picked up custom from nearby RAF Tilstock, apparently proving popular with visiting American airmen stationed there during the war (although whether the swimming pool was then open under the restrictions of wartime austerity is a matter for speculation). Perhaps to advertise its location to this lofty market, the building still carried the word “HOTEL” painted in large white letters on the roof. It was this more than anything that had first caught my attention while flashing past on the opposite carriageway, an echo of a half-remembered sight from my childhood - hadn’t all such places picked themselves out in this way at one time?

Walking over the mossy tarmac of the former car park, with a pair of spindly, unkempt trees clinging to their small stony island in the middle, no trace of the pool was visible to the rear. A large extension suggested that the Cherry Tree had retained the dancing to the end though; a Friday night spot if you could find a mate willing to drive you all there or chance the breathalyser later.

Next door to the Cherry Tree is a corrugated tin shack garage business, its closed-up showroom housing vintage Morris Minors and an eclectic collection of old toys and antiques. This too is up for grabs, the owner apparently having retired. As I poked around the derelict pub, a car drew up outside the garage and a man briefly hopped out to peer inquisitively into the windows, perhaps weighing up a change of career or possibly just a plot of land that might prove lucrative in future.

Across the dual carriageway lies the Raven Inn, sporting the same mock Brewery Tudor style as the Cherry Tree but still open and serving in the dog days of 2018. It’s hard to escape the feeling that the Raven is gazing across into a mirror reflecting a prophecy of its own inevitable doom: architecture as memento mori, Ozymandias in half-timbered, half-intact black and white. Reviews on TripAdvisor are mixed, with the accommodation faring badly (“never in my life have I experienced a shambles like it”), although the restaurant has a few fans prepared to overlook what sounds like a rather forlorn interior (“see past the decor and you’ll have good food and friendly service”). Forty years ago, your father or grandfather would have made it the penultimate stop when returning from North Wales, to round out the day with a slap-up family meal in genteel surroundings enjoying “traditional cooking”. Now its clientele consists of those who unwittingly stumble over it while googling for “hotel near Whitchurch” ahead of a work trip and connoisseurs who know that a shabby establishment often belies a decent breakfast.

The Raven in fact has a much longer lineage than its Johnny-come-lately upstart over the road, being apparent on maps from the 1880s and probably much further back. It seems likely that it would originally have been a coaching inn, as the location must have been a useful staging post on the long ride either from London to Birkenhead or coming up from Herefordshire towards the north. The Witch Ball and its pool, a large oval behind the hotel, are present on the 1950s map and even the 1970s one. On the latter, the Raven is the “Wild Raven Inn” and has by this point assumed its present outline; most probably, it would have been rebuilt in competing style when the Witch Ball arrived on the scene to cash in on Britain’s nascent motoring boom.

What future now for these punctuation marks of 20th Century road travel? The motorways have abstracted all the long distance traffic, so only those who prefer the scenic route or have local deliveries to make are likely to be passing, and even then modern, reliable vehicles and convenient bypasses mean that you won’t require an overnight stop or proper meal to break your journey. It’s four hours from Cardiff to my home town of Warrington along the A49, and while I’m ready for a break after a drive of that length, I don’t need a full hotel dinner midway to keep me going (although I’m now thinking I should have a swift Coke in the Raven one day for posterity’s sake). Today, a location like this would normally merit a Pizza Hut or drive-thru McDonald’s to catch the family trade coming back from shopping at weekends, but perhaps that market is already saturated. For the Cherry Tree, the war seems over, the building by all appearances too far gone to justify renovation even if its location could once again support two such hostelries. On the other side, the Raven overlooks a normally busy lorry park and weekend bikers’ meet, but they prefer the Midway Truck Stop just before it. The Raven’s own car park at least wasn’t empty when I passed through, and they’re clearly buggering on for now, but absent funds and the impetus to invest in it, its gentle decline will not be reversed any time soon.


August 28 2018

A note about benches

If you’re familiar with a certain place BB knows well, you’ll probably recognise the distinctively ornate and serpentine end of this public bench, being as it is entirely redolent of only one place. Yes, this is, of course, located in…

<!-- more -->

“Porthcawl!”
“Spittal!”
“Berwick!”
“Harrogate!”
“Clitheroe!”

…Well actually, BB was going to say Aberystwyth, as for years we vaguely assumed they were unique to the promenade there, or at least to a certain type of mostly unpretentious Welsh resort - the scales and the forked tail putting us in mind of dragons rather than snakes, hence the mental association. (Despite the presence of Porthcawl in the list above, Bridgend Council actually stored one of their original benches during repair work and then somehow contrived to lose it. But then Bridgend Council appears to suffer from a debilitating mental block when it comes to benches.)

The other somewhat vague and indecisive thought was that someday, we might like to own one like it - until finally, in an idle moment’s Googling, we looked it up and discovered that bench ends like this actually originate on the other side of the country.

This is, in fact, one of the standard platform bench types employed by the North Eastern Railway, later subsumed into the LNER and subsequently the East Coast Main Line of BR. The LNER was never financially the strongest concern of the Big Four companies, particularly after the recession of the 20s and 30s hit the North Eastern coal traffic, and is widely recorded to have been on the verge of bankruptcy when the railways were taken into government control on the outbreak of WW2. During the fifties, rationalisation saw a lot of the intermediate stations and branches of the ECML closed, smaller places such as Alnwick and Seahouses losing their stations. At this time, local councils were able to acquire job lots of redundant former NER platform benches, hence their preponderance in Northumbrian towns such as Berwick (where they have since been maintained and renewed on a like-for-like basis). However, others remember them in situ before this period, so it seems likely that councils were already installing benches of this form anyway, and have continued to do so.

In fact, the pattern continues to be available in the current catalogue of the original foundry, the Ballantines Bo’ness Iron Works, and in those of their competitors. Ballantines went bust a few years ago but have since been bought out by a scion of the original family, so the manufacture of the serpent bench endures. Indeed, the pattern appears to have been an off-the-shelf option since Victorian times (note that the example above also retains the snake’s tongue, which is often reportedly lost as it is the most fragile part of the design). Today, new benches in this style are procured by councils from Logic. However, if you want one for your own garden, or at least a reproduction in non-corroding polyurethane, you can obtain one from Broxap. Warning: you might want a stiff drink to hand before following that link; it’s almost twice as much as BB has ever paid for a bench. Needless to say, an authentic, refurbished original could set you back at least twice that, although you can find the ends alone without slats for considerably less.

Credit to the contributors in the links above for their findings on this subject, particularly the posters in the LNER forum; BB’s own research effort amounts only to persistent Googling.

August 02 2018

An Ansible pattern for lists of files

Problem: we have a potentially long list of usernames (accounts) and a directory containing SSH public keys, one key per file, with each file named after the user who owns it. We want to deploy all the keys for the users in our list.

<!-- more -->

The sets of users and keys are not identical; there may be more users without keys (a common situation, alas), and we may have many other keys belonging to users not in this list.

(You might instead store your users’ keys directly in a list or dictionary, which would obviate the need for the code below, but you’d forever be copy-and-pasting keys into a YAML data structure unless you have some automated way to keep it up to date. Come to that, I hear those crazy kids even store public keys in LDAP directories these days.)
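For the record, that dictionary-based alternative would look something like this (a sketch with hypothetical variable names and truncated keys):

```yaml
# somewhere in your vars:
ssh_keys:
  alice: "ssh-ed25519 AAAAC3Nza...example alice@somewhere"
  bob: "ssh-rsa AAAAB3Nza...example bob@elsewhere"

# then a single task deploys the lot:
- name: add SSH keys from a dictionary
  authorized_key:
    user: "{{ item.key }}"
    key: "{{ item.value }}"
  with_dict: "{{ ssh_keys }}"
```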

First issue: a file lookup throws an error if the file doesn’t exist. We could simply iterate over the list of users and set ignore_errors so that we pass on the ones that don’t have keys, then supply a null default value instead:

- name: add SSH keys
  authorized_key:
    user: "{{ item }}"
    key: "{{ lookup('file', 'path/to/keys/' + item) | default('# NONE') }}"
  with_items: "{{ userlist }}"
  ignore_errors: yes


But this is messy and long-winded: we incur a file lookup (even when it fails) and a remote call for every user. (Does default even work after a lookup? I suspect not: if the lookup raises an error, the default filter never gets a chance to run.)
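One possible refinement, sketched here on the assumption that your Ansible is recent enough for lookups to take an errors parameter (which suppresses the failure so that default can actually apply):

```yaml
- name: add SSH keys
  authorized_key:
    user: "{{ item }}"
    # errors='ignore' makes a failed lookup return nothing instead of raising,
    # so the default filter can supply the placeholder value
    key: "{{ lookup('file', 'path/to/keys/' + item, errors='ignore') | default('# NONE', true) }}"
  with_items: "{{ userlist }}"
```

This still incurs a lookup and a remote call per user, though, so the objection above stands.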

A naïve first pass at solving this might be to go through the list of users, call the stat module to see if a local key file exists for each one and save the results in a list, and then use a conditional test before trying the lookup for the key:

- name: check for a key file for each user
  local_action:
    module: stat
    path: "path/to/keys/{{ item }}"
    get_checksum: false
    get_attributes: false
    get_mime: false
  with_items: "{{ userlist }}"
  register: user_keys

- name: deploy user keys
  authorized_key:
    user: "{{ item.item }}"
    key: "{{ lookup('file', 'path/to/keys/' + item.item) }}"
  with_items: "{{ user_keys.results }}"
  # only deploy keys that exist:
  when: item.stat.exists


(Remember that a registered variable for a looped task contains a results list with one hash per loop item, each comprising the original object, named item, and the module's return values, which for the stat module include a hash named stat.)
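To illustrate, user_keys.results looks roughly like this (abridged, with hypothetical usernames; the exact attributes vary by module and version):

```yaml
user_keys:
  results:
    - item: alice        # the original loop item
      stat:              # the stat module's return value
        exists: true
        # ...many other file attributes...
    - item: bob
      stat:
        exists: false
```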

We use local_action because we're looking for the key files on the Ansible controller node rather than the client. This at least saves us some remote calls, as we only deploy the keys we actually find rather than attempting one for every user, and we reduce the overhead of the stat module slightly by disabling the retrieval of file attributes that we don't need, such as checksums. But we're effectively iterating over the entire list of users twice, once for the users and again for their potential keys, many of which may not exist. That's slow and mostly wasted effort.

A better approach would be to iterate over only a list of keys that we know exist:

- name: deploy user keys
  authorized_key:
    user: "{{ item.item }}"
    key: "{{ lookup('file', 'path/to/keys/' + item.item) }}"
  with_items: "{{ user_keys.results | selectattr('stat.exists') | list }}"


Here we select only the results whose stat hash has a true exists attribute. (Note that we keep the whole result objects rather than mapping out just the stat elements, as we still need item.item for the username; the filter also makes a separate when test redundant.) This may reduce the number of iterations considerably, but it is still a loop stepping through one item at a time, and we haven't avoided doing all that file I/O for the interminable stat lookups.

Ideally, we’d instead generate a listing of all the key files in a single pass (like an ‘ls’ of the directory), then take the intersection of that list with our list of users - i.e. to obtain the list of users for whom we have keys.

My first thought was to use the fileglob lookup to create a list of all the key files. However, fileglob returns the full paths for all the objects it finds. You might think it would be possible to use the Jinja2 map function to apply the basename filter to every element of the list, thus stripping the paths and leaving only the filenames:

key_names: "{{ lookup('fileglob', 'path/to/keys/*') | map('basename') | list }}"


But this doesn’t work for some reason that isn’t obvious to me; instead it breaks the filenames up into a list of single character strings like this:

["u", "s", "e", "r", "1", "u", "s", "e", "r", "2", ...]


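Passing wantlist=True to the lookup should coax a proper list out of it; I offer this as an untested sketch rather than a guarantee:

```yaml
# wantlist=True asks the lookup to return a real list instead of a
# comma-joined string, so map('basename') operates per path
key_names: "{{ lookup('fileglob', 'path/to/keys/*', wantlist=True) | map('basename') | list }}"
```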
However, the find module does give us a list we can filter in that way:

- name: get list of all user SSH key files
  local_action:
    module: find
    paths: "path/to/keys"
    excludes: '*~'
  register: find_keys

- name: derive key names
  set_fact:
    all_keys: "{{ find_keys.files | map(attribute='path') | map('basename') | list }}"


(Note that the find excludes parameter, used here to remove editor backup files from the results, is only available from Ansible 2.5. It isn’t strictly necessary, as only the files that match actual usernames will be used anyway. Alternatively, you could ensure that all your key files are named username.pubkey instead, which is perhaps a bit more intuitive, and then use patterns: '*.pubkey' with find. But you’d have to strip the extensions as well in the next step.)
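In that case, the extension-stripping might look something like this (a sketch, with the regex_replace filter applied element-wise via map):

```yaml
- name: derive key names from username.pubkey files
  set_fact:
    # strip the directory, then strip the .pubkey suffix from each filename
    all_keys: "{{ find_keys.files | map(attribute='path') | map('basename') | map('regex_replace', '\\.pubkey$', '') | list }}"
```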

(Instead of find, we could just run an ls command and process the standard output with split to make a list, or even shell out and call echo dir/*. But that’s spawning another process, which is cheating.)
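For completeness, the cheating version might be something like this (untested sketch; note the Jinja2 method call .split() on the lookup's output, as older Ansible has no split filter):

```yaml
# the pipe lookup runs the command locally and returns its stdout;
# split() without arguments splits on whitespace, so this breaks on
# filenames containing spaces (usernames shouldn't)
all_keys: "{{ lookup('pipe', 'ls path/to/keys').split() }}"
```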

Again, the find module is run locally as the key files are stored on our Ansible controller node. We extract the path attributes from the files key in the return value of the find module, as those contain the path for each file found, and then run basename over them to strip the directory path.

Now we can obtain the intersection of our sets of users and key files with one simple Jinja2 filter, and deploy precisely the set of keys that we need:

- name: derive list of user keys
  set_fact:
    # this could be appended to the previous step instead
    user_keys: "{{ all_keys | intersect(userlist) }}"

- name: install users' keys
  authorized_key:
    user: "{{ item }}"
    key: "{{ lookup('file', 'path/to/keys/' + item) }}"
    state: present
  with_items: "{{ user_keys }}"

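The intersect filter comes from Ansible's set-theory filters and behaves like a set intersection, returning the unique items present in both lists; for example (hypothetical values):

```yaml
- debug:
    msg: "{{ ['alice', 'bob', 'carol'] | intersect(['bob', 'dave']) }}"
  # should print ['bob']
```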

This seems to me quite a neat pattern if you ever need to process a set of files according to some selective criteria or by cross-referencing against a second list. And it’s a lot quicker than watching a list of values scroll steadily up the screen.
