Near-future prediction is always a rough game. The 1984 film 2010, an adaptation of Arthur C. Clarke’s 2001: A Space Odyssey sequel, managed a double error that almost perfectly defines the way in which the classic science fiction vision of the future fizzled. It’s important to note the sort of endearing hubris of the film in the first place. In 1968, when 2001 came out, the future was thirty-three years off. But in 1984 the future was only twenty-six years off. There was still the resplendent belief in the onrushing scientific utopia that the culture had promised - a utopia presented as an inevitable historical consequence of the then-present. Hence 2010’s amusing double error - the belief that humanity would be on the verge of going to Jupiter and the belief that the Cold War would be rumbling on.
Instead the American space program spent 2010 winding down, each of the three surviving Shuttles making their penultimate flights. The Cold War is dead and gone. Indeed, the entire apparatus of the future is long since gone, and has in fact quietly disappeared into the rear view mirror. Time memorably spent the tail end of 2009 declaring the aughts to have been the “worst decade ever,” and while this should be taken as Comic Book Guy style hyperbole with periods after every word, the case was compelling enough. A decade-long war on terror culminated in a massive financial crash and the growing realization that electing a black dude President of the United States was not actually going to fix everything.
Even the replacement future hastily drafted in after the original withered on the vine was floundering. It’s not that the Internet was underperforming as such, but on the other hand, its transformative promise was somewhat less than advertised. What had started as revolutionary technology was now just Facebook.
The future had never really been built past the change of the millennium, and without that to anchor it there was just a vast, featureless present. Even technological advancement had become oddly fixed and predictable. The future came in regularly scheduled product launches headlined by Steve Jobs, incremental, predictable, and mostly leaked to rumor sites a few weeks in advance. It was not even a hugely pessimistic historical moment. Rather it was one captured by a grim banality. Nothing changes. Humanity sits past the ghost point, all the ideas and storming progress of the twentieth century ground to a halt. For a hundred years now we had assumed we were building to something - that there was some sort of stable endpoint that all the churn and upheaval pointed towards. Whether that endpoint was utopia or armageddon was up in the air, but the broad idea that history was going somewhere wasn’t. Instead, however, the wheels simply fell off, and the tow truck never came. We sat, abandoned on history’s roadside, left thinking, “is this all there was?” And, seeing no sign of civilization, we tweeted about it.
Tea and Coalitions
It was pretty clear that Labour was toast. Their only lead in the polls came in the immediate aftermath of Gordon Brown becoming Prime Minister, and as Brown did not call a snap election he was, in effect, doomed, not least because he had the charisma of a dead fish and a nasty habit of accidentally getting caught on microphones calling people bigots. This was hardly surprising: Labour had been in power thirteen years, and that’s a hard streak to maintain.
What was interesting, in the lead-up to the election, was that nobody seemed particularly enthused about David Cameron either. Indeed, for much of the election the hot story looked for all the world like Nick Clegg and the Liberal Democrats, who, on the back of Clegg’s performance in the UK’s first televised leaders’ debates, were polling ahead of Labour at various points, and threatening to snatch the entire election. In effect, Clegg was successfully executing the Barack Obama playbook, offering younger voters a sense of an alternative to tired political gridlock from the two major parties. In the wake of the expenses scandal, the message seemed to play well.
But a strong performance in the final debate from David Cameron and a fear campaign over the instability a hung parliament would supposedly cause mostly pulled the Lib-Dems back to Earth, and in the final election result they finished only 1% up on their 2005 performance, actually losing five seats due to the vagaries of the UK electoral system. (Buy any Lib-Dem a drink and they’ll explain with vigor.) The final result on May 6th had the Conservatives with just 36.1% of the vote, Labour with 29%, and the Lib-Dems with 23%. The result was, in fact, a hung parliament, and after five days Cameron formed a coalition government with the Liberal Democrats. Not to peek ahead excessively in the narrative, but on the whole this coalition proved disastrous for the Liberal Democrats, who got very little of their agenda enacted and saw their polling support collapse in a large part because of the understandable anger of people who voted for a left-wing party and got a Tory Prime Minister supported by the very party they voted for.
Six months later, in the United States, the midterm elections for the House of Representatives and Class Three of the Senate took place. The Democrats managed the impressive feat of losing six Senate seats, extending an impressive four-election streak of Democratic losses in Class Three, and getting completely slaughtered in the House of Representatives as a wave of extreme conservatives united under the loose banner of the Tea Party entered office.
In both cases there is something of an official narrative of events. The US elections were supposedly a reaction to the Affordable Care Act, aka Obamacare. The UK elections, despite the failure of the Conservatives to actually attain a majority, showed a mandate for austerity policies and belt-tightening. These are, after all, the way these things work: an election must always, in the master narrative, display some shift in the public’s desires and opinions that is then realized by the government in power.
In practice, this is nonsense in both cases. The US elections were determined not by any sizable number of people who had previously been Obama supporters defecting to the opposition, but by changes in turnout. Obama was elected by young and minority voters who historically have a lower turnout in the midterm elections. In 2010 the voters who turned out were older and whiter, and they voted as they had in 2008. Nobody’s mind had changed. Similarly, in the UK, a full 52%, an outright majority, voted for one of the two left-leaning parties, versus just over a third for the Conservative party. As has been the case in every single UK election since 1935, the left-leaning electorate is larger but split between two parties.
It is, of course, not that simple. A democracy doesn’t function on votes not cast, and the choice between Labour and the Liberal Democrats is a real one, even if both are to the left of the Conservatives. But while the narrative of a left-leaning majority may not be one that actually forms a leftist government, it is at least no more dishonest than the narrative of a rightward turn in either country. In both cases the basic result was the same: the political right gained power considerably out of proportion to their actual support, and used it to enact policies that had the result of ending the Great Recession for the extremely wealthy while deepening its effects on the majority. The frustration, unsurprisingly, was palpable.
The Fame Monster of Peladon
In pop music, meanwhile, the news is Lady Gaga.
It is not that Lady Gaga is the first visibly performative and manufactured pop star. She’s not even the first to wear her plasticity on her sleeve, nor the first to anchor well-produced dance-pop anthems in a heavily performative aesthetic. Yes, she takes it to an impressively complete level, rejecting the entire idea of a personal identity separate from the pop performance, but it’s still a straight-up lift from the Madonna or David Bowie playbooks, played with the ruthless artistic consistency of Kraftwerk or Laibach. Nevertheless, it is worth looking at her in this precise historical moment, this being, in effect, Peak Gaga. The key single at this moment in time is of course “Bad Romance,” which seems in hindsight to be set to go down as her most enduring and iconic number.
The song itself is a straight up bit of uptempo dance pop. It has more than enough hooks to snarl in your head, and manages the useful feat of having multiple memorable segments that it can thus cycle among so as to be catchy without wearing out its welcome. You’re just as likely to end up on the chorus as you are the nonsense syllables. The subject matter is one of the great classics of pop music: this relationship is awful and I love it so much.
None of this explains its impact. This is a single that is unapologetically driven in a large part by its music video. It is not that music videos ever died as an art, but there was a vast fallow period between “the days that MTV played music videos” and the point where YouTube took over the function, and Gaga was one of the first artists to exploit their return emphatically.
“Bad Romance” is not her first music video, but it is the first one of her videos to foreground one of the basic problems that faced Stefani Germanotta in becoming a pop star, which is that her facial features fall significantly outside the narrow band of what is considered normatively and conventionally attractive. In much of her earlier videos and work in general this fact is carefully elided, either through substantial uses of makeup or by the use of masks of various sorts (including her signature oversized glasses). But in the video for “Bad Romance” she instead features herself made up to further emphasize the sense of physical strangeness. Indeed, the video goes further, blending Lady Gaga’s wide-eyed and unsettling visage with latex-clad monsters dancing spasmodically. This image of monstrosity is, of course, central to the project - the album “Bad Romance” is on is called The Fame Monster, its title changing Gaga’s first album, The Fame, into a new form.
The resulting aesthetic is not merely pop spectacle, but pop monstrosity, in the classical sense of monstrousness as something meant to be gazed at and looked upon. But every previous version of a stunningly plastic pop star had been based on a sense of ever-shifting identity: the question of what character the artist will play on a given song or album. Lady Gaga moves beyond that - there is nothing but the raw spectacle. Pop becomes an object intended to be approached as the Other, defined by its very strangeness. The historical moment of Lady Gaga is brief - her next video, for “Telephone,” doesn’t have nearly the impact despite being, in most regards, a better song and having Beyoncé as a collaborator - and her albums have seen declining sales since The Fame Monster. But the point is in many ways less Lady Gaga’s enduring presence than the new relationship with pop that she augurs - one in which we finally treat culture as a monstrous entity to be looked at from beneath as it looms over us, unapproachable and formless.
The Avatar of Inception
Science fiction, at least, is doing fine. The end of 2009 marks the release of James Cameron’s long-awaited followup to Titanic, the sprawling sci-fi epic Avatar, while 2010’s summer is anchored in part by Christopher Nolan’s Inception. Both are interesting films from a certain perspective, in that they are films concerned with questions of identity. Avatar takes a fairly straightforward story of postcolonial aliens and adds to it a whole “becoming the Other” plot that ends in a charmingly posthuman species change. Inception, on the other hand, is an astonishingly well-made execution of a dumb person’s idea of what a smart movie should be like, one that attempts to get “it was all a dream” to actually work as an end-of-film reveal.
Neither are particularly good movies, although they’re certainly not bad movies either. They are simply middling movies. And yet both were very successful, which gives at least a flavor of the moment. As suggested, what’s interesting here seems first and foremost the obsession with identity and the self. The very notion of “I” is in flux. This is not a new theme, of course - science fiction has been playing with this for decades.
But here the notion of identity is bound up in big budget, special effects based spectaculars. Whatever the films’ conceptual territories might be, both were marketed largely through the existence of their impressive and innovative special effects. They invite their audiences to gaze at a big spectacle, as with much of the culture at the time. But the spectacle is oddly self-denying. To gaze at the special effects of Avatar or Inception is to enter a world in which identity is not so much questioned as rejected. These are not films that explore the frontiers of experience and self, after all. The fragile nature of identity is just taken for granted in films that fit smoothly and uncomplicatedly into other genres. Inception is a heist film. Avatar is countless sci-fi films. It’s just that they’re standard issue films from which we have been pushed out.
The implication is that our stories no longer need us. That they can get by perfectly well without us. We are extraneous even to our own cultural objects, which have taken on lives of their own. Tellingly, the other major sci-fi film in this period is Iron Man 2, the point at which Marvel’s movie division really revved up its seemingly unstoppable juggernaut of churning films out of pre-existing properties. Film threatens to become a nearly self-sustaining medium, imbued with enough depth and concepts to simply retread existing ground over and over again. Narrative without people. Having killed the author, we’ve now killed the audience. There is only the endless desert of text itself, wandered by nobody. Only ghosts, with nothing left to haunt but themselves.