18 May 2009

pop science

Went to a show last night at Borrowed Time, a now-defunct (last night was its last night) art/performance/community space in a temporarily vacant house. It was nice to see Charles Latham, an anti-folk fellow (I feel like I'm allowed to throw that tag around casually at the moment, since I just wrote a review of Jeffrey Lewis' [fantastic] new album without mentioning it once) whom I've seen before at the Mitten. And of course it's always a treat to hear Emily, whom my colleague A.D. just described, preposterously, as "sensationally bitchy." It'd been too long. Her cough was doing fun things for her singing voice. (A little sad too, tho, and I can't help subvocalizing along with some wistful harmonies...)

But I especially enjoyed getting to hear Deirdre and Conor, a new-to-me Philly duo who played guitar/cello/vocals duets, assisted by a cool old Korg analog drum machine (this one), with earnest and sappy lyrics about domestic love, often using silly science and math metaphors (and also a fight song for accountants). Aw. You know I have a thing for couple bands. They made me think of The Long Lost, not so much for their music (though Deirdre's faint, flat singing is reminiscent of Laura Darling's) as for their general aesthetic, which is not just gentle and sincere but so markedly, overwhelmingly White it's kind of amazing. You don't often hear music that feels so utterly un-inflected by (notionally) "Black" pop styles and yet is clearly very genuine in its own right. Diggin' it. Here's what they look like, presumably taken on their wedding day:



After their set, Conor told me about a project a friend of his had done, called pitchformula, wherein he analyzed Pitchfork reviews to see which words appeared more frequently in positive vs. negative reviews. Which I think is mostly interesting as a way to see how critics write differently about music they like vs. music they don't. Weirdly, one of the words way more likely to appear in negative reviews is "lyrics," while generic instrument terms like "guitars" and "drums" are by far the most predictive of positive reviews. Apparently, critical Pitchfork reviews, at least through 2004 (when he did the project), were less likely to actually talk about the music and more likely to employ "meaningless" value judgments, attacks on the intelligence of the artists and/or listeners, and references to commercialism. It would be really fascinating to know how these results would differ now, five years later; I'd definitely expect to see some substantial changes reflected in both the writers' attitudes and their preferences, following the anti-Rockist revolution, the broadening of P-fork's scope, stature and sensibility, etc. etc. (For some reason, the published results of the project mostly don't include words associated with specific genres; it would have been interesting to see how those skewed.) Also, it would be really interesting to know how these results compare with, for instance, Allmusic reviews, which probably skew even more positive (though there's a much wider pool of reviews to draw on) but should ostensibly have less overt "critical bias," however that would come across.
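For the curious: Wilson's actual method isn't documented here, but the core comparison (which words skew toward positive vs. negative reviews) can be sketched as a smoothed log-odds score over two piles of review text. This is a hypothetical reconstruction, not his code; the function names and the toy reviews below are invented for illustration.

```python
from collections import Counter
import math
import re

def word_counts(reviews):
    """Count word occurrences across a list of review texts."""
    counts = Counter()
    for text in reviews:
        counts.update(re.findall(r"[a-z']+", text.lower()))
    return counts

def predictive_words(positive, negative, smoothing=1.0):
    """Score each word by log-odds of appearing in positive vs. negative reviews.

    Positive scores lean toward positive reviews, negative scores the other way.
    Laplace smoothing keeps words unique to one side from producing infinities.
    """
    pos, neg = word_counts(positive), word_counts(negative)
    vocab = set(pos) | set(neg)
    pos_total = sum(pos.values()) + smoothing * len(vocab)
    neg_total = sum(neg.values()) + smoothing * len(vocab)
    return {
        w: math.log((pos[w] + smoothing) / pos_total)
           - math.log((neg[w] + smoothing) / neg_total)
        for w in vocab
    }

# Toy data, invented for illustration:
positive_reviews = ["the guitars shimmer and the drums drive the record forward"]
negative_reviews = ["the lyrics are lazy and the record feels cynical and commercial"]

scores = predictive_words(positive_reviews, negative_reviews)
```

On this toy input, "guitars" and "drums" come out with positive scores and "lyrics" with a negative one, mirroring the pattern the project reportedly found; a real run would of course need the full review corpus and some care about stopwords and review length.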

Apparently, this guy, Loren Wilson, was primarily interested in using this data collection/analysis to inform his attempt to write songs that would specifically appeal to critics, somewhat along the lines of Komar and Melamid's fascinating and bizarre Most Wanted and Most Unwanted songs (which are definitely worth a listen if you haven't heard them). The resulting songs, somewhat predictably, feature many elements (textures, sounds, structural approaches, etc.) thrown together in somewhat overloaded tracks, the upshot of trying to use many of the "positive" qualities all at once (which Wilson justifies because "complex" and "unexpected" are listed high among the favorable attributes).

It's easy to be skeptical of the legitimacy (scientific, that is, not aesthetic) of this kind of "systematic" approach – given the same lists of words, would many other musicians come up with something that really sounded like this? And – the obvious problem – does the way things sound really have anything to do with how "good" they are? (The answer ought to be no, presumably, but then he's not trying to make music that's "good," just music that will appeal to a certain set of people... albeit people whose criteria for good music should theoretically be especially broad and open. Or maybe not.) In any event, the music he came up with does sound strikingly like certain critical-fave zeitgeists of 2003-04. (Also, the songs are pretty good, though the things that make them good are not really the same as the things that stem from the results of the data – instead, those qualities tend to make them predictable on the one hand and disjointed on the other, not that those things aren't also "interesting.")

Which, again, would make it really interesting to hear how different the results would be if this project were repeated now – if anything, the musical landscape as-approved-by-critics is even more scattered and heterogeneous these days (even though a glance at P4K's Best New Music page does still reveal a somewhat surprising amount of more-or-less straightforward Indie Rock); I wonder if songs self-consciously based on current critical preferences would come off even more hopelessly jumbled.
