Thursday, 17 July 2014

Female Symphonists

While Wikipedia currently lists more than 900 entries in the category Women classical composers, a diligent search reveals that comparatively few of these have written a symphony - fewer than one in five, according to my very unscientific sampling. And while considering them as a group might make exactly as much sense as lumping together all male symphonists, to wit no sense at all, still it might advocate just a little for positive discrimination. Here then are seven of the best.

Louise Farrenc
Louise Farrenc (1804 - 1875, France)

In 19th century France, opera was king. The public turned away from anything else, indeed from any unfamiliar or new name. Symphonically speaking, they tended to shun anything that was not Beethoven, Haydn, Mendelssohn, Mozart, Schumann, or otherwise German and Great.

Furthermore, the resources necessary to mount a symphonic performance were beyond almost anyone's means, as lamented by Camille Saint-Saëns and others. If the Société des Concerts du Conservatoire didn't choose your work for performance, there were no jobbing orchestras for hire; you had to assemble and pay for your own band personally.

It was against this backdrop that Louise Farrenc sought to buck the trend for exclusively male composers, not to mention symphonists. And she did so with a great measure of success: admired by Berlioz and Schumann, this teacher and scholar composed three superb symphonies.

Farrenc's other accomplishments included the co-founding, with her husband Aristide, of the publishing house Éditions Farrenc in Paris, which remained one of France's leading music publishers for nearly 40 years. She spent 30 years as Professor of Piano at the Paris Conservatory, where her excellent instruction saw many of her students graduate with the Premier Prix.

On two occasions, in 1861 and then again in 1869, she received the Prix Chartier of the Académie des Beaux-Arts.

Selected works for full orchestra:

Emilie Mayer
Emilie Mayer (1812 - 1883, Germany)

The encouragement of a mentor proved invaluable to this 19th century romantic composer. The conductor, baritone, and fellow symphonist Carl Loewe, known in his day as "the Schubert of North Germany", said of Mayer that such a God-given talent as hers had not been bestowed upon any other person he knew. Such an accolade gave her much inspiration and motivation throughout the rest of her very prolific composing career.

Mayer wrote a documented total of eight symphonies during the decade 1847-1857, in addition to numerous chamber works for strings and/or piano... not to mention an opera (Die Fischerin - The Fisherwoman), and a piano concerto! Her works garnered much critical and popular acclaim, and she would travel Europe in the 1850s to attend public performances of her works.

Selected works for full orchestra:

Amy Beach
Amy Beach (1867 - 1944, America)

Amy Beach was a child prodigy who could sing and harmonise accurately by age two. At five, she was composing waltzes. At six she began piano lessons, and by age seven, was giving public recitals of Beethoven, Chopin, Handel, and her own work. As a composer, Amy was almost entirely self-taught, with the exception of a year spent at age fourteen, learning harmony and counterpoint from Junius W. Hill. She made her professional debut in Boston in 1883, and soon after became a soloist with the Boston Symphony Orchestra.

Beach's early writing is mainly Romantic, often compared to Brahms or Rachmaninoff. Later she would move away from tonality and into whole tone scales, using more exotic harmonies and techniques.

The Boston Pops paid tribute to Beach in 2000, when her name was added to the granite wall on Boston's famous Hatch Shell - the only woman ever to have received this honour.

Beach wrote many songs for solo voice with piano accompaniment, together with many more of the sacred and secular choral kinds. Her works also include much piano and chamber music, a mass, and like Emilie Mayer above, a piano concerto and an opera (Cabildo). Today she is probably best remembered for her singular symphonic composition, the celebrated "Gaelic" Symphony, written in response to Dvořák's criticism of American composers.

Selected works for full orchestra:

Ruth Gipps
Ruth Gipps (1921 - 1999, England)

Another child prodigy, Ruth Gipps performed her first composition at a music festival at age eight, after which the work was bought by a publishing house. Soon afterwards she began her performance career in earnest by winning a concerto competition with the Hastings Municipal Orchestra.

In 1936 she entered the Royal College of Music to study theory, composition, piano, and eventually oboe. An accomplished all-round oboe and piano soloist, she was also a prolific composer. A hand injury at age 33 ended her performance career; she decided to focus on conducting and composition.

Her music often shows the influence of her teacher Vaughan Williams. She rejected serialism, twelve-tone music, and other avant-garde trends, and considered her five symphonies her greatest works.

She founded the London Repertoire Orchestra in 1955, as an opportunity for young professional musicians to gain exposure to a wide range of music, and the Chanticleer Orchestra in 1961, a professional ensemble which included a work by a living composer in each of its programs, often a premiere performance. Later she took faculty posts at Trinity College, London (1959 to 1966), the Royal College of Music (1967 to 1977), and then Kingston Polytechnic.

Selected works for full orchestra:

Gloria Coates
Gloria Coates (b. 1938, Wisconsin USA)

American by birth and education, Gloria Coates has lived in Munich, Germany since 1969. Having already written some half-dozen extended, multi-movement orchestral works, she decided while writing the seventh ("Dedicated to those who brought down the Wall in PEACE") in 1990, to revisit them all and relabel them as symphonies.

In so doing, she set herself on a course that would see her, some fourteen years and eight more symphonies later, become known as the most prolific female symphonist of all time.

To a certain extent this distinction is arbitrary. As Kyle Gann writes in his 1999 New York Times article A Symphonist Stakes Her Claim, "'Symphony', after all, is a word open to wide interpretation. It does not, for Ms. Coates, refer to a work in several movements, the outer ones allegro and the second one adagio." He also reports Ms Coates as saying, "It has to do with the intensity of what I'm trying to say and the fact that it took 48 different instrumental lines to say it, and that the structures I was using had evolved over many years. I couldn't call it a little name."

Selected works for full orchestra:

Ellen Taaffe Zwilich
Photo © brightcecilia.net
Ellen Taaffe Zwilich (b. 1939, Florida USA)

After graduating from Florida State University in 1960, Ellen Taaffe Zwilich moved to New York to play with the American Symphony Orchestra under Leopold Stokowski.

She then joined the Juilliard School, where upon her graduation in 1975 she became the first woman to be awarded its Doctorate of Musical Arts in Composition. She gained some prominence when Pierre Boulez programmed her Symposium for Orchestra with the Juilliard Symphony Orchestra.

In 1983, with her first symphony, Zwilich became the first woman to win the Pulitzer Prize for music.

Selected works for full orchestra:

Libby Larsen
Photo © Ann Marsden
Libby Larsen (b. 1950, Delaware USA)

A graduate of the University of Minnesota, Libby Larsen was in 1983 appointed one of the Minnesota Orchestra's two composers-in-residence, making her the first woman to serve as a resident composer with a major orchestra.

Her works exhibit a great mixture of influences: from the Gregorian chant sung by the St. Joseph of Carondelet nuns at Christ the King School in her early childhood, through her mother's boogie-woogie records, her father's Dixieland band (he was an amateur clarinetist), and the many different styles of repertoire introduced to her by Sister Colette, her first piano teacher, to the eclectic direct influences of her college teachers.

Asked about her teachers and influences, she has said "To tell the truth, my teachers have come to me from unexpected places in my musical life. They have been poets, architects, painters and philosophers. The other way I really learn is by reading scores voraciously, from Chuck Berry to Witold Lutoslawski."
(http://libbylarsen.com/index.php?contentID=232)

Her awards include a Grammy, a Clarion, two Honorary Doctorates, and a George Peabody Medal, among many others.

Selected works for full orchestra:

All information and pictures courtesy of Wikipedia, unless otherwise noted.

Friday, 20 June 2014

Unit Testing Is Dead

Haskell logo (Wikipedia)
Don't Bury The Lede!

Fair enough. Here's the PowerPoint takeaway:
  • The future present is multicore!
  • Only functional programming languages (Haskell and Erlang for the purist, but also Scala, OCaml, F#) can scale adequately to cope with this future present.
  • Functional software design eschews mutable state, being purely procedural and "static".
  • Objects and interfaces (and O-O generally) are obsolete.
  • Unit testing, as we used to know it, is dead! Yeah! And TDD/BDD too! Yeah!
  • But we still have to support our legacy O-O systems with unit tests...
  • Here's how to do that without jettisoning statics.

LINQ The Hero

The introduction of Language-Integrated Query (LINQ) into the C# language, with C# 3.0 in November 2007, headlined an impressive list of new features. But in truth, there was only one major new feature delivered in that compiler release. Virtually everything else, with the possible exception of Automatic Properties, was introduced simply to underpin and enable the great LINQ:
  • Anonymous Types
  • Local Variable Type Inference
  • Object and Collection Initializers
  • Lambda Expressions and Trees
  • Extension and Partial Methods
Some examples of these dependencies (illustrated in the sketch after this list):
  1. Local Variable Type Inference is essential when returning values of an Anonymous Type from a query.
  2. Lambda Expressions are required to enable the writing of sufficiently general SQL WHERE clause predicates.
  3. Extension Methods provide the backbone of the "fluent" (method chaining) syntax, upon which the Query Comprehension (using SQL-like keywords) is just compiler syntactic sugar.
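To see how these pieces interlock, here's a minimal sketch - the Customer class and its sample data are hypothetical, invented purely for illustration:

using System;
using System.Collections.Generic;
using System.Linq;

class Customer
{
  public string Name { get; set; }       // Automatic Property
  public decimal Balance { get; set; }
}

static class LinqDemo
{
  static void Main()
  {
    var customers = new List<Customer>   // Collection Initializer
    {
      new Customer { Name = "Ada", Balance = 100m },   // Object Initializers
      new Customer { Name = "Bob", Balance = -5m }
    };

    // Query Comprehension: compiler syntactic sugar...
    var inCredit = from c in customers
                   where c.Balance > 0
                   select new { c.Name };              // Anonymous Type

    // ...for the underlying fluent Extension Method chain:
    var sameThing = customers
      .Where(c => c.Balance > 0)                       // Lambda Expression
      .Select(c => new { c.Name });

    // Local Variable Type Inference ('var') is essential above: the
    // anonymous element type has no name we could possibly declare.
    foreach (var c in inCredit) Console.WriteLine(c.Name);
  }
}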
Naturally, most of these supporting features have found immediate application in multiple other areas. Extension Methods in particular have spawned an entire vocabulary of Fluent APIs (of which my favourite has always been Bertrand Le Roy's FluentPath). These are popular with developers and library code consumers alike, being, in the words of TechPro's Khalid Abuhakmeh, "a fun and discoverable way to allow fellow developers to access functionality".

Villain Of The Piece

But with great power comes, as they say, great heatsinks. And coolest in their response to the proliferation of these extensions, implemented as they are throughout C# using static methods, are the unit test evangelistas. Their point is simple and well-made:
  • Unit testing involves rewiring your dependencies using mocks or "friendlies" which replace those real dependencies for test purposes.
  • Static methods lead to an essentially procedural programming environment, with code and data separated, and without clear objects or interfaces available to be swapped out and substituted (illustrated in the sketch below).
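To make that point concrete, here's a minimal sketch (Invoice is a hypothetical class of my own devising):

using System;

class Invoice
{
  public DateTime DueDate { get; set; }

  // The static DateTime.Now dependency is welded in place: there is no
  // interface or virtual method for a conventional mocking framework to
  // substitute, so this method's result varies with the wall clock.
  public bool IsOverdue()
  {
    return DueDate < DateTime.Now;
  }
}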
So much the worse for static methods, they say. To which I rejoin, so much the worse for your unit testing framework! Not all such tools have intractable bother with statics.

Pex/Moles

Microsoft's Pex and Moles VS2010 power tools, and their VS2012 replacement Fakes Framework (via Shims, though not Stubs), can handle statics reasonably well.
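For illustration, here's a sketch along the lines of Microsoft's own Fakes documentation - it assumes you've generated a Fakes assembly for mscorlib, and it reuses the hypothetical Invoice class from above:

using System;
using Microsoft.QualityTools.Testing.Fakes;

// ...
using (ShimsContext.Create())
{
  // Detour every call to the static DateTime.Now within this context
  System.Fakes.ShimDateTime.NowGet = () => new DateTime(2010, 1, 1);
  var invoice = new Invoice { DueDate = new DateTime(2011, 1, 1) };
  Console.WriteLine(invoice.IsOverdue());  // False, whatever the wall clock says
}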

Typemock

The Typemock Isolator can control the behavior of static methods, just like any other method:
Isolate
  .WhenCalled(() => MessageBox.Show("ignored arg"))
  .WillReturn(DialogResult.OK);
So, your test might look like this:
[Test]
public void TestStaticClass()
{
  Isolate.WhenCalled(() => UserNotification.SomeMethod()).WillReturn(5);
  Assert.AreEqual(5, UserNotification.SomeMethod());
}
Telerik JustMock

JustMock provides for unrestricted mocking of dependent objects, including non-virtual methods, sealed classes, static methods and classes, as well as non-public members and types. Mocking of properties like get calls, indexers and set operations is also supported. JustMock also supports mocking of all classes and methods included in the MSCorlib assembly.
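By way of illustration, a sketch of the static case using JustMock's documented Arrange/Act/Assert API (reusing the hypothetical UserNotification class from the Typemock example above):

using Telerik.JustMock;

[Test]
public void TestStaticClassWithJustMock()
{
  // Prepare the static type for mocking, then arrange the return value
  Mock.SetupStatic(typeof(UserNotification));
  Mock.Arrange(() => UserNotification.SomeMethod()).Returns(5);
  Assert.AreEqual(5, UserNotification.SomeMethod());
}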

Don't Meddle With The IL?

Some of these solutions engender suspicion because of their under-the-hood behaviour. Specifically, there is concern that anything rewriting the actual Intermediate Language (IL) generated by the compiler, for consumption by the jitter, must result in something other than the official written code being tested. But this is an unjustified worry for several reasons.
  • By its very nature, IL is not the code that's finally executed on the end user's processor. What does the jitter do, but transform that IL into something entirely new?
  • Several .NET components cause new IL to be generated at run time. For example, a Regex constructed with RegexOptions.Compiled has fresh IL generated for it on the fly, via Reflection.Emit (see the sketch following this list).
  • Visual Studio design mode is the biggest IL simulator of them all. Just ask yourself, how does it run the constructor for your new user control in design mode, when you haven't even finished typing it in yet, never mind compiling it?!
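Here, as promised, is that sketch - a tiny but complete demonstration of IL being minted at run time, using nothing more exotic than Reflection.Emit:

using System;
using System.Reflection.Emit;

class IlDemo
{
  static void Main()
  {
    // Assemble a method body, instruction by instruction, at run time
    var method = new DynamicMethod("Five", typeof(int), Type.EmptyTypes);
    var il = method.GetILGenerator();
    il.Emit(OpCodes.Ldc_I4_5);  // push the constant 5
    il.Emit(OpCodes.Ret);       // and return it
    var five = (Func<int>)method.CreateDelegate(typeof(Func<int>));
    Console.WriteLine(five());  // 5 - IL that never existed at compile time
  }
}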
In short, these Shimmying frameworks are thoughtfully designed and quite serviceable, and aren't doing anything outlandish that you're not already relying on to a great extent.

Further Reading

Statics and Testability

Miško Hevery, Russ Ruffer and Jonathan Wolter's Guide to Writing Testable Code (November 2008) lists warning signs related to the four most popular flaws in O-O Design. (Google)

Miško Hevery, Static Methods are Death to Testability (December 2008) goes into more detail and answers commentators' concerns with the previous document.

Introductions to Functional Programming

Learn You a Haskell for Great Good!

This is the world's best tutorial introduction to the world's best programming language.

Learn You Some Erlang for Great Good!

This is the world's best tutorial introduction to the world's second best programming language.

Tuesday, 4 March 2014

Book Review: "Threat Modeling: Designing For Security"

Ben Rothke reviews Adam Shostack's new book:
"When it comes to measuring and communicating threats, perhaps the most ineffective example in recent memory was the Homeland Security Advisory System; which was a color-coded terrorism threat advisory scale. The system was rushed into use and its output of colors was not clear or intuitive. What exactly was the difference between levels such as high, guarded and elevated? From a threat perspective, which color was more severe — yellow or orange? Former DHS chairman Janet Napolitano even admitted that the color-coded system presented 'little practical information' to the public. While the DHS has never really provided meaningful threat levels, in Threat Modeling: Designing for Security, author Adam Shostack has done a remarkable job in detailing an approach that is both achievable and functional. More importantly, he details a system where organizations can obtain meaningful and actionable information, rather than vague color charts."
Full review:
http://books-beta.slashdot.org/story/14/03/02/1748257/book-review-threat-modeling-designing-for-security
Adam Shostack
Threat Modeling: Designing for Security
John Wiley & Sons
17 February 2014
ISBN-10: 1118809998
ISBN-13: 978-1118809990

Sunday, 29 September 2013

Natural Sort Order

Ten Before Two But 10 After 2

A recent update to our flagship product takes advantage of the server-side paging capabilities of the DevExpress Grid control. Naturally, this change has involved the migration of much client-side C# code into server-side table changes, triggers, and stored procedures, all written in SQL. One of the casualties was a particularly hirsute C# method which used to sort table contents into what's sometimes called "Natural Order", so that e.g. DC10 would come after DC9, rather than the naive collation or ASCII order, which would have them reversed.

For reasons unknown it fell to me to implement our Natural Sort. Easy, I thought, I'll ask Google for the answer. That's when I discovered there's really no such thing as The Natural Sort Order. It depends on your particular data, and your specific requirements, to a frankly surprising degree. Of course, there is plenty of material to choose from on the web. Jeff Atwood's Coding Horror piece on the subject is quite a good central station for your exploration of the matter. But this plenitude is also a bit of a problem. After an hour or two of research, I'd decided my best plan was to design and implement my own, newly reinvented wheel.

Fields

The basic approach is to partition or "stripe" the input (unsorted) data into a sequence of fields, alternating alphabetical content with numerical, and then treat each field as its own Sort By column - sorting the alpha fields as normal, and the numeric fields numerically. In the above DC9 / DC10 example, this results in an initial alpha field containing "DC" in both cases, followed by a numeric field containing the integers 9 and 10, which are then subjected to a numerical sort.

Some of the examples I'd read performed this latter sort by actually converting the input data field into an integer, then using the language's numeric comparison capabilities. I didn't want to use that approach, because a 32-bit signed integer can only be used for field sizes up to 9, a 64-bit one 18, and so on. I had no specification to assure me customer data wouldn't exceed such an arbitrary bound, so I fell back on the old workaround of keeping the numeric fields in string form, but left-padding them with zeros until all the rows were the same length, allowing an alpha sort to do the job of a numeric one. This is essentially part of what we do when we adopt the ANSI date format, e.g. 2013-09-29, to get correctly ranked dates.
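For the record, here's a rough C# sketch of this striping-and-padding idea - not the production code (which, as described below, ended up in SQL), and padding with spaces rather than zeros, anticipating an adjustment explained under Islands and Seas:

using System;
using System.Text;

static class NaturalKey
{
  const int Width = 200;  // matches the column's maximum string length

  public static string Make(string input)
  {
    var key = new StringBuilder();
    var p = 0;
    while (p < input.Length)
    {
      // alpha field: everything up to the next digit
      var q = p;
      while (q < input.Length && !char.IsDigit(input[q])) q++;
      // numeric field: the run of digits that follows
      var r = q;
      while (r < input.Length && char.IsDigit(input[r])) r++;
      // alpha sits hard left, digits hard right, padding in between
      key.Append(input, p, q - p)
         .Append(' ', Width - (q - p) - (r - q))
         .Append(input, q, r - q);
      p = r;
    }
    return key.ToString();
  }
}

// Sorting by the generated key then yields the natural order, e.g.
// new[] { "DC10", "DC9" }.OrderBy(NaturalKey.Make) gives DC9 before DC10.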

Collations

Notice that it doesn't matter which padding character you use to right-align the numeric fields, just as long as it doesn't come after 0 (zero) in the collation order. This is important later.

One fun fact I found while researching collations was that aa comes after ab in the Danish / Norwegian collation order. For historical-typographical reasons, aa is treated (while sorting) as identical to the letter å, which just happens to be the last letter of their shared alphabet. Never mind; we'll just have to assume anyone using a particular collation order knows what they're doing, and won't be surprised by results which after all should in all cases appear perfectly non-anomalous to them.

Field Sizes

Okay, so now we have this requirement to right-justify our numeric fields by left-padding them with e.g. zeros. What fixed field size should we pad them out to? Well our input data, being stored in a traditional, relational database, has some particular maximum string length. In my case that length was 200. There's nothing to stop our customers filling every single character space with a decimal digit. So we could be looking at a numeric field of width 200.

What about the alpha fields? These don't require padding, since standard left-to-right string comparisons work fine for them. But note that every alpha character "uses up" one position in the input data, rendering that location unavailable for storing a subsequent digit. Long story short, we can stuff any alpha prefix into the left edge of our 200-character field, and still have enough room to fit and pad out the remaining, right-justified, numeric content.

For performance reasons, the ultimate destination for this calculation was a "live" Sort Order column in the relevant database table, rather than a UDF call at SQL SELECT time. That's why my buffer size had to be allocated statically, rather than optimised with reference to the worst case requirements of the data actually in the table; we didn't want a new row of data invalidating the precomputed sort orders of the old data.

Islands and Seas

You might worry about non-digit characters encroaching on the padding space of our numeric fields, and you'd be right to. Actually everything works okay as long as we stick to letters and digits. Anomalies can start to appear when we introduce punctuation characters, especially when using ASCII collation. The numeric digits and the upper- and lower-case letters can be seen to form three disconnected "islands" of codes, surrounded by four seas of punctuation characters.

In practice, these anomalies are mitigated by our customers' sparse use of such punctuation, and tendency to apply it consistently whenever it is used at all. As a further mitigation, I changed the padding character from a zero digit to a space, ensuring that padded-out numeric fields are essentially guaranteed to sort lower than any alpha or other character found in the same region.

Example

The following, correctly sorted data can be seen to illustrate these adjustments. Notice, in the Natural Sort column, the use of the space character as filler, and the fact that the 'h' of 'Coach' occupies the same character column position as the '1' of 'Van 1234' without causing any problem:

Simple Sort   Natural Sort

 Coach 12      Coach  2
 Coach 2       Coach 12
 Van 1234      Van  234
 Van 234       Van 1234

Field Count

Obviously the input data might contain further alternating fields of non-numeric and numeric data. What should we do with subsequent fields? Well, we just package them up in pairs using exactly the same algorithm as the first, and when we have done this enough times to consume all of the input, return the concatenation of all these field blocks as the sortable key.

There is another minor optimisation available here. Obviously the first field pair must have consumed at least one character from the input. This means that the field "buffer" for the second pair can be made one character shorter than the first - say, 199 characters instead of 200. Likewise, if input characters remain after the second or subsequent field pair have been extracted, then that pair must have consumed at least two characters, so the next buffer size can be 197, or 195, or 193, or...

Yes, quite. The law of diminishing returns cuts in quite promptly here, especially since we decided that a total of 3 field pairs would be more than adequate for our customers' requirements (actually there is an auto-generation element in our product, designed to nudge them into using just a single field pair: an alpha prefix, followed by a numerical index). So I just left all my buffers at width 200. You should obviously make use of this optimisation if your input size limit is much lower than 200, or if you decide to use a lot more significant field pairs.

Coding and Testing

This works adequately and was ultimately accepted for production, but I must acknowledge here the excellent work done by our Test Department, both in testing my new UDF out-of-sequence when I asked - quite unreasonably - for an early comparison with the old client-side C# function (come to think of it, how the hell did they even do that?), and also in uncovering pretty quickly all of the corner cases - punctuation, padding characters - mentioned above.

Oh, the code? Yeah sure, here ya go. Should be quite easy to hunt down the few rogue "200"s to adapt it for your use. You might also want to limit the number of iterations to prevent DOS attacks (we use only 3). As I said at the outset, it's still surprisingly specific and might not work for you. For example, it doesn't recognise decimal points / numeric separators. The truth is, there simply does not exist one Natural Sort Algorithm suitable for all data.

CREATE FUNCTION [dbo].[NaturalSort](@input NVARCHAR(200))
RETURNS NVARCHAR(MAX)
AS
BEGIN
  DECLARE
    @count INT = LEN(@input),
    @result NVARCHAR(MAX) = '',
    @p INT = 1, @q INT, @x INT, @y INT
  WHILE @p <= @count
  BEGIN
    -- @x: length of the leading alpha (non-digit) run; the appended '0'
    -- sentinel guarantees PATINDEX finds a digit even when none remains
    SELECT @x = PATINDEX('%[0-9]%', SUBSTRING(@input, @p, @count) + '0') - 1
    -- @q: start position of the numeric run
    SELECT @q = @p + @x
    -- @y: length of the numeric run; '!' plays the same sentinel role
    SELECT @y = PATINDEX('%[^0-9]%', SUBSTRING(@input, @q, @count) + '!') - 1
    -- emit: the alpha run, then space padding, then the right-justified digits
    SELECT @result = @result + SUBSTRING(@input, @p, @x) +
      REPLICATE(' ', 200 - @x - @y) + SUBSTRING(@input, @q, @y)
    SELECT @p = @q + @y
  END
  RETURN @result
END

Three Onions Holding Up A Melon

Who Ordered That?

A favourite Twitter follow is @maanow, with their daily MAA Minute Math problems. These are usually simple little scenarios that take, as implied by the name, a minute or so to solve. But one of the best so far, I don't mind you knowing, took me a little longer than that. It involved three touching 1" radius spheres supporting another sphere of radius 2", and asked: what's the height of this arrangement?
http://maaminutemath.blogspot.co.uk/2013/08/august-7-2013.html
What surprises me about the solution to this one is the appearance of the square root of 69 in the answer. Doesn't that strike you as more than a little bit literally odd?

The solution is arrived at by first noticing that a line connecting the centre of the large sphere to the centre of one of the smaller, touching ones has length 3 (all dimensions may as well be in inches; it's immaterial to the result). Now, this line is the hypotenuse of a right triangle whose third vertex is the centroid of the equilateral triangle joining the centres of the three smaller spheres. That triangle has side length 2, so the distance from each of its vertices to its centroid is 2/√3. Squaring both this length and that of the earlier hypotenuse, en route to the ultimate solution, we discover that curious √69 emerging from the fact that 3³ - 2² = 23 = 69/3, as spelled out below.
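Spelled out, with h denoting the vertical leg of that right triangle (in LaTeX):

h^2 = 3^2 - \left(\frac{2}{\sqrt{3}}\right)^2 = 9 - \frac{4}{3} = \frac{3^3 - 2^2}{3} = \frac{23}{3} = \frac{69}{9}
\qquad\Longrightarrow\qquad h = \frac{\sqrt{69}}{3}

\text{height} = 1 + h + 2 = 3 + \frac{\sqrt{69}}{3} \approx 5.77 \text{ inches}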

That's all, just wanted to draw your attention to this little oddity today.

Thursday, 21 February 2013

The Raven that Refused to Sing

and other stories

Prolificacy's name is Steven Wilson, as you might agree when you've seen the man's 500-page discog here.

Each new Steven Wilson album release is a bigger, fuller, more satisfying art-event than its predecessor. I'm talking of course about the limited deluxe editions. Nothing else is worth having, if you can have instead one of these wondrous objects, while nothing else - no mere collection of tunes and songs - is, in our time, worth our time reviewing. Obviously then, it's especially edifying when - reportedly to beat an internet leak - your preorder arrives, entirely unexpectedly, four days early.

Rivaling the dimensions and heft of some paving slabs, well the 10" ones anyway, these special editions seem to have evolved by a process of continuous improvement into a format which by now should make the fan relax immediately into the familiar luxurious form, simultaneously with the unfailingly new and truly multimedia content. It's like a lottery win of the senses. For consider:

First, it looks and feels great - a solidly substantial objet d'art, with its great 5mm thick hardbound book cover enveloping 128 heavy and glossy illustrated pages full of impressions, stories, lyrics, sleeve notes, mysteries. Next, you get the smell of it, obviously the high aromas of a newly printed, full gloss inky tome. Only at the end of these sensory introductions, and as a final formality, will you eventually get to hear it. Before then, this beautiful thing has already carved its little square snug into your world, and as you begin your first consumption of these sounds (2 CDs) and visions (a DVD and a Blu-Ray), you reflect that Steven could easily have achieved a full sweep of all five senses, by the simple expedient of including a stick of chewing gum. Maybe next time.

The Musicians

Every work of this standard of quality, and each major aspect of such a work, is essentially in itself a collaboration. The original and most vital aspect here being of course the music, that collaboration comprises primarily Steven himself, singing and playing various keyboards (including King Crimson's original mellotron, borrowed from his friend Robert Fripp) and guitars, as befits a solo album. Thence in the astonishing ensemble of legends that is his touring band, alphabetically we have Nick Beggs on bass, Chapman Stick and vocals; Guthrie Govan on lead guitar, and Adam Holzman on keyboards; while Marco Minnemann* hits stuff, and Theo Travis blows into things.

Oh, and Steven's choice of associate producer / recording engineer? Only bloody Alan Parsons.

The Music

The album comprises six pieces, with three of these - satisfyingly enough, for those with an adult's attention span - each being over 10 minutes long. It opens with a song previously released on Steven's earlier live document, the quite recent Get All You Deserve 2CD / DVD / Blu-Ray.



1. Luminol: an uptempo percussive 4-to-the-floor with driving bass, signals the initial intent. Transitioned by Theo's flute and harmony vox / mellotron, it falls and fragments into a typically prog clipped and snipped timesig, before slowing and softening into the story of the busker (one of Steven's ghost stories), with yet more lush harmonies. There are hints of and nods to a shared progressive jazz heritage throughout, culminating in a typically strong KC and mellotron restart and return to quick tempo for the finish. From Wikipedia:
"Luminol", which was first performed by Steven Wilson and his band on the second half of his Grace for Drowning tour, takes its inspiration from a busker, who, according to Wilson, is "there every single day. It doesn’t matter what the weather is like; he’s always there, playing his acoustic guitar and singing these songs. Snow, rain, gale force wind – nothing will stop him from being in his spot. ... He’s the kind of guy who is so set in his routine that even death wouldn’t stop him." Wilson considers the notion "that somebody could be a ghost in life, as well as a ghost in death, somebody who’s completely ignored even in their lifetime – it hardly makes a difference; and death doesn’t make a difference, either; it doesn’t break the routine." [1]
2. Drive Home is soft and gently acoustic, showcasing Steven's famous talent for hook-laden tunesmithery. You need to clear away all the jetsam in your brain and face the truth. Even this gentility itself takes a percussion break, a breather wherein for example a delicate guitar part ambles alone for a few bars. It builds to a mixed (both climactic and anticlimactic) finish.
"Drive Home" is based on a suggestion from illustrator Hajo Mueller. It is "about a couple driving along in a car at night, very much in love; the guy is driving, and his partner – his wife or girlfriend or whoever she is – is in the passenger seat, and the next minute she’s gone." The ghost of the man's partner eventually returns, "saying, ‘I’m going to remind you now what happened that night.’ There was a terrible car accident, and she died, etcetera, etcetera – again, the idea of trauma leading to a missing part of this guy’s life. He can’t deal with the reality of what happened, so he blocks it out – like taking a piece of tape and editing a big chunk out of it." [2]
3. The Holy Drinker indulges a sequence of guitar / keyboard / clarinet solos as well as some welcome overdrumming in its satisfyingly extended (2½ minute) 4/4 introduction, prior to leading into another SW ballad, whose verses are similarly separated by inventive instrumental fills. The middle duet between Fender Rhodes and flute, leading into some big Keith Emerson chops, is particularly worth the ticket price. When the story continues in a lente pianissimo section, its dramatic and mathematical conclusion can be sensed from afar.
According to Wilson, "The Holy Drinker" concerns "a guy who’s very pious, very religious, preachy and self-righteous. I’m thinking of TV evangelist-types – guys who are prepared to tell people that they’re living their lives wrong and that they’re missing something because they don’t believe in God or whatever it is." The man, who, despite criticising other people's lifestyles, is himself an alcoholic, unwittingly challenges the Devil to a drinking competition, with disastrous consequences: "Of course, you can’t beat the Devil at a drinking competition – you can’t beat the Devil at anything – and so he loses. ... He gets dragged to Hell." [3]
4. Vocally, The Pin Drop starts unpromisingly. But instrumentally, this is a consistently strong and driving track, with particularly tortured sax.
"The Pin Drop" addresses "the concept that you can be with someone because it’s comfortable and convenient, not because there’s any love or empathy." Wilson explains that "The song is basically sung by the wife. She’s dead, she’s been thrown in the river by the husband, and she’s floating down in the river while singing this song – from beyond death, beyond the grave, as it were." The song considers "The idea... that sometimes in a relationship there can be so much tension, so much unspoken resentment and hatred, that the tiniest thing can set off a violent episode, and in this case, one that ends in tragedy. The sound of a pin dropping on a floor can be the thing that instigates the fury." [4]
5. Back now in time and context to Foxtrot era Genesis, for the introduction to the little proggy masterpiece that is The Watchmaker. Not a love song, but a tale of touching companionship and failure, human existence, inevitability and frailty. Its second act warms and beats to the musical imagery of clockwork, accelerating subjectively through a spectacular clarinet and guitar duet, before a braking return to the piano-backed, heart-rendingly honest narrative. The endgame, by turns instrumental and vocal, is a particularly pleasing and quirky workout.
The fifth track on the album explores "the story of the watchmaker, the guy who is meticulous about his craft, but he never has any kind of emotional outburst, nor does he express violence or any extreme emotions whatsoever." It concerns "a couple who have been together for 50 years or more, purely because it was convenient and comfortable." Wilson explains that "The watchmaker ends up killing his wife and burying her under the floorboards of his workshop. But, of course, she comes back, because she’s been with him for 50 years; she’s not going to leave him now." The song concludes when "the wife comes back to take him with her, which", Wilson suggests, "is another classic ghost story, in a way." [5]
6. As for the closing, title track: The Raven that Refused to Sing is perhaps the most chillingly haunting ballad on this quite haunty album.
The title track explores the story of "an old man at the end of his life who is waiting to die. He thinks back to a time in his childhood when he was incredibly close to his older sister. She was everything to him, and he was everything to her. Unfortunately, she died when they were both very young." The man becomes convinced that a raven, who visits the man's garden, is something of "a symbol or a manifestation of his sister. The thing is, his sister would sing to him whenever he was afraid or insecure, and it was a calming influence on him. In his ignorance, he decides that if he can get the raven to sing to him, it will be the final proof that this is, in fact, his sister who has come back to take him with her to the next life." [6]
See and hear for yourself:



The six indented comments above, quoted from Wikipedia, are in fact primarily sourced from a particularly illuminating interview with Steven (appearing two weeks ago in musicradar) in which he talks about the initial development of the project, and discusses the background to each song in turn and in generous detail.

The Bonuses

The second CD contains different versions of all of the main songs, curiously labelled "(demo)" - who on earth would demand a demo from an effective supergroup** like this? - together with one "unused idea", Clock Song. Were the pendulum percussion and the musical chimes regarded as too obvious, too Mellotron Scratch? Maybe it just couldn't be bent to inhabit the overarching and otherwise cohesive supernatural / ghost story / fear of mortality / end-of-life regret concept circle of the piece. Whatever the reason, it is indeed little more than a not-fully-developed idea, inessential, though not unpleasant.

The documentary DVD and Blu-Ray (see below) also contain between them:
  • a 96/24 Stereo LPCM of the album;
  • one bonus track - Drive Home - lounge version, also 96/24 Stereo LPCM;
  • instrumental versions of all the album tracks (96/24 Stereo LPCM);
and as you would expect from the progressive world's most prolific 5.1 remasterer, the following 5.1 mixes:
  • DTS 96/24 5.1 surround
  • Dolby AC3 5.1 surround
  • 96/24 5.1 LPCM
  • DTS-HD Master Audio 5.1
YMMV

One reviewer, while describing this new album as "painful, moving, desperate, melancholic and superbly beautiful" and exhorting us to "Buy it", also advises: "It's pretty much a progressive affair. As simple as that. There is nothing here that manages to stretch itself out of the canons of such a well-defined genre." And he may be right; but many long-time fans of Steven Wilson (and of Porcupine Tree in particular) are impressed by the jazz sensibility and free-form styling that have infected his recent work - perhaps inevitably, given the creative cadre of influences here, and his own experiences in remastering those King Crimson, Jethro Tull et al masterpieces for surround audio. To those fans, this kind of prog is new and fresh today, forever heading off in unexpected directions.

The Stories and their Art - Hajo Mueller

While Steven himself provides the quite literally haunting stories for Luminol and The Birthday Party, collaborator Hajo Mueller supplies the concept and illustrations throughout the book, as well as the original idea for the title story which he then reworked with Steven. Both the DVD and the Blu-Ray include his art gallery.

Photography & Documentary - Lasse Hoile

It wouldn't be a new Steven Wilson opus without that final contribution from his long and fruitful collaboration with Lasse Hoile, who here provides for both the DVD and the Blu-Ray, the photographs used in the photo gallery, as well as filming and editing the indispensable studio documentary.

The Gig

Fellow Weegies, let's all catch the touring band at Glasgow's ABC this March 2nd. Update: Oh wait...
* Zappa alumnus supreme Chad Wackerman (Best Drummer Name Ever!) stands in for Marco Minnemann on the North and South American legs of the current tour.
** Actually, each member leads his own band. Hypergroup, then?

Thursday, 14 February 2013

Claims Based Access Control

The previous article on Role Based Access Control described the IIdentity and IPrincipal implementations available with Microsoft .NET 1.0 through 4.0. Designed in 2002, these soon began to creak under the stress of increasingly distributed systems of rising complexity. With WCF in 2006, Microsoft provided another security model, incompatible with the old one; in 2009 it succeeded in combining the best parts of these two implementations into Windows Identity Foundation (WIF), and ultimately, in 2012, into .NET 4.5.

IIdentity and IPrincipal Revisited

The new hotness was Claims Based Authorization. A System.Security.Claim is a statement about an entity, made by another entity known as the Issuer of the claim. The statement takes the form of a key/value pair, which is obviously much more powerful than the true/false settings used in Role Based. For example, Active Directory might issue claims that the current user is named "alice", is an administrator, has email address "alice@microsoft.com", and so on. We would treat these claims, coming from AD, with a high level of confidence.
public class Claim
{
  public virtual string Type { get; }
  public virtual string Value { get; }
  public virtual string Issuer { get; }
  // ...
}
We could set up claims using syntax like this:
var claim = new Claim("name", "alice");
However, recall that previously we found reasons (of localization and breaking admin changes) to avoid using literal labels to identify security resources such as groups. It turns out that we might also want to use the predefined members - essentially namespace URIs - of the ClaimTypes class, to ensure that our claims can be serialized, shared and understood across a federated identity space:
var claims = new List<Claim>
{
  new Claim(ClaimTypes.Name, "alice"),
  new Claim(ClaimTypes.Email, "alice@microsoft.com"),
  new Claim(ClaimTypes.Role, "Sales"),
  new Claim(ClaimTypes.Role, "Marketing")
};
ClaimsIdentity and ClaimsPrincipal

Obviously, claims can carry a lot more detailed information about an entity than roles. For excellent reasons of backward compatibility, claims were integrated into the existing IIdentity / IPrincipal framework through the creation of two new container classes:
public class ClaimsIdentity : IIdentity
{
  public virtual IEnumerable<Claim> Claims { get; }
  // ...
}

public class ClaimsPrincipal : IPrincipal
{
  public virtual IEnumerable<ClaimsIdentity> Identities { get; }
  // ...
}
Now we can set up Alice's principal like this:
var id = new ClaimsIdentity(claims, "Console App");
var user = new ClaimsPrincipal(id);
Thread.CurrentPrincipal = user;
The second parameter to the ClaimsIdentity constructor, authenticationType, is needed because, unlike the case with Role Based Identity, Claims Based Identity allows claims to be attached to e.g. anonymous principals. By default then, if authenticationType is not provided, IsAuthenticated will return false.
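A quick sketch of the difference:

// The same claims list, with and without an authenticationType
var anonymous = new ClaimsIdentity(claims);
Console.WriteLine(anonymous.IsAuthenticated);      // False

var authenticated = new ClaimsIdentity(claims, "Console App");
Console.WriteLine(authenticated.IsAuthenticated);  // True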

Backward Compatibility

Examining the framework source, it's interesting to note that every (non-static) public member in these new classes is virtual. That's because they have been "interposed" in .NET 4.5, so that all the implementation classes we looked at last time (WindowsIdentity / WindowsPrincipal, and GenericIdentity / GenericPrincipal) as well as those we didn't (e.g. FormsIdentity / RolePrincipal) are now derived from these common base classes. This design gives all consumer code access to the claims collection if desired, while still preserving the old interfaces (Name, IsInRole, etc).

The base classes implement the legacy security model with the help of the claims collection. For example, accessing the ClaimsIdentity.Name property causes this collection to be traversed, searching for an entry with type ClaimTypes.Name, while checking for a role is done by searching for ClaimTypes.Role entries.
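In spirit - this is a simplified sketch of my own, not the actual framework source - those lookups reduce to searches over the claims collection:

public class SketchedClaimsIdentity
{
  private readonly List<Claim> claims;
  private readonly string nameClaimType;  // ClaimTypes.Name unless overridden

  public SketchedClaimsIdentity(List<Claim> claims, string nameClaimType)
  {
    this.claims = claims;
    this.nameClaimType = nameClaimType;
  }

  // Traverse the collection; the first claim of the name type supplies Name
  public string Name
  {
    get
    {
      var match = claims.FirstOrDefault(c => c.Type == nameClaimType);
      return match == null ? null : match.Value;
    }
  }

  // Role membership: any claim of the role type carrying the requested value
  public bool IsInRole(string role)
  {
    return claims.Any(c => c.Type == ClaimTypes.Role && c.Value == role);
  }
}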

Incidentally, both of these associations can be overridden using the supplied, overloaded, ClaimsIdentity constructors. So you can use these base classes to benefit from the power of the claims approach in your own custom implementation, while still retaining the flexibility of selecting which ClaimsType values you want the legacy support to treat as names and roles:
public class MyClaimsIdentity : ClaimsIdentity
{
  public MyClaimsIdentity(IEnumerable<Claim> claims, string authenticationType) :
    base(claims, authenticationType, ClaimTypes.Email, ClaimTypes.GroupSid)
  {
  }
}
In the above example, accessing the Name property will search for a claim of type Email (often used as the unique user identifier), while a call to IsInRole will search for a match within the groups to which the user belongs.

Consuming the Principal

To get access to the claims from runtime code, we simply retrieve the IPrincipal from the thread as before, and cast it to a ClaimsPrincipal. Alternatively, and equivalently, we can obtain it in a single step via the handy, static ClaimsPrincipal.Current property:
var user = ClaimsPrincipal.Current;
With this we can now run arbitrary LINQ queries on the Claims collection property, or take advantage of the special FindFirst, FindAll and HasClaim methods, all of which accept either string parameters or lambdas.
var email = user.FindFirst(ClaimTypes.Email).Value;
var adClaims = user.FindAll(claim => claim.Issuer == "AD AUTHORITY");
Credential Types and Transformations

.NET 4.5 boasts out-of-the-box support, via descendants of the SecurityTokenHandler class, for a wide range of industry standard authentication protocols, which it translates automatically to ClaimsPrincipal format. These include:
  • Windows (SPNEGO, Kerberos, NTLMSSP)
  • Forms authentication
  • basic HTTP authentication
  • SSL client certificates
  • WS-Security tokens
  • SAML
Incoming credentials arrive, are processed and transformed, and then delivered to your application in the "final" ClaimsPrincipal. This processing begins with an appropriate SecurityTokenHandler deserializing and validating the incoming token, from which it constructs the "initial" ClaimsPrincipal. This can then be further processed or transformed, for example to validate the incoming claims or translate security-domain groups into application-domain user permissions, via your custom ClaimsAuthenticationManager (to run the following sample code, add references to System.IdentityModel and System.Security):
public class MyClaimsTransformer : ClaimsAuthenticationManager
{
  public override ClaimsPrincipal Authenticate(
    string resourceName, ClaimsPrincipal oldPrincipal)
  {
    // Validate the supplied claims
    var name = oldPrincipal.Identity.Name;
    if (string.IsNullOrWhiteSpace(name))
      throw new SecurityException("User name is missing!");
    // Process claims (LookupCreditLimit is the application's own lookup, not shown)
    var creditLimit = LookupCreditLimit(name);
    var newClaims = new List<Claim>
    {
      new Claim(ClaimTypes.Name, name),
      new Claim("http://myclaims/creditlimit", creditLimit.ToString())
    };
    // Create & return the new ClaimsPrincipal
    var newId = new ClaimsIdentity(newClaims, "Local");
    return new ClaimsPrincipal(newId);
  }
}
Obviously we have to find somewhere to tell the system to use our new custom ClaimsAuthenticationManager class, and the equally obvious place to do that is in the configuration file. There's a new "system.identityModel" section:
<configuration>
  <configSections>
    <section
      name="system.identityModel"
      type="System.IdentityModel.Configuration.SystemIdentityModelSection" />
  </configSections>
  <system.identityModel>
    <identityConfiguration>
      <claimsAuthenticationManager
        type="MyNamespace.MyClaimsTransformer, MyAssemblyName" />
    </identityConfiguration>
  </system.identityModel>
</configuration>
Finally, at the point in our application code where previously we would perform our ad hoc claims processing, polluting our beautiful domain code with extraneous user security permissions logic, we now have essentially this fixed one-liner (no, don't laugh):
Thread.CurrentPrincipal =
  FederatedAuthentication
  .FederationConfiguration
  .IdentityConfiguration
  .ClaimsAuthenticationManager
  .Authenticate(
    "none", // Can pass eg a URL resource for context
    new WindowsPrincipal(WindowsIdentity.GetCurrent()));
Next time: session management.