Wednesday, March 11, 2009

Can I Refill Your Coffee? You Might Be Here a While

The usual stylistic considerations of theatre—lighting, set, sound, blocking—were absent from the staged reading of John Baxa's “Dead Hours,” one of many exemplary instances of student playwriting to emerge from Kalamazoo's recently concluded Student Playwriting Festival. Lacking the usual trappings of the stage that define a performance, the actors who stepped on stage in tandem to sit on stools before sheet music stands—each one bearing a script—carried the entire weight of portraying the dead hours of a lonely Michigan diner with just one oddly prophetic customer. In turn, they relied more heavily on the script to guide them, truly putting Baxa's scriptwriting talent to the test. He passes.

 While the double feature of Alex Clothier's “Just Kidding” and Ben Harpe's “Glass Closet” likely stole the show for diligent attendees of the Student Playwriting Festival, the mildly exotic nature of a staged reading versus a traditional performance warrants attention, for it alters the mood and character of the piece. The structure of a staged reading can complement some plays, while the nuances of others may get lost in the simplicity. “Dead Hours” falls into the former category.

 Creepy Steve, played by Dwight Trice, is the reason “Dead Hours” works as a staged reading. An acutely prophetic homeless man who works like clockwork—never speaking, except on this one night—he has but a brief appearance and even briefer lines near the end of the play. But he is present on stage for the entire reading—as he would not be in a regular performance—lending his character an omnipresence that gives his words even more weight. Trice's acting talent finds symbiosis with this important role.

 The diner itself—lacking a set to imply its presence—is built by the gossamer imaginings of the audience, the actors' static, nigh-mocking gestures of mopping the floor from their stools or eating cake serving as mortar for the illusion. There is a sense that this diner exists in eternity, and the audience just happens to have stumbled in to watch on this momentous night when Creepy Steve wanders in, eats some cake as usual, then hands out a few tokens of absolute truth that would otherwise go unsaid.

 It is business as usual for the small Michigan diner—coping with disillusionment with illusion. It is also ever more the status quo for this neglected state, where seemingly the only response left to factory closings, underfunded and ignored inner cities, and a crushing recession on top of an already floundering economy is to buy some red hair dye and spend a week mopping floors at a diner, floating in the midst of the morass, looking for answers.

Tuesday, March 10, 2009

Make-Believe War Never Changes, But Real War Changes Everything

“War...war never changes” is the perpetually pessimistic opening of “Fallout 3”'s dreary diatribe against humanity's predisposition towards total annihilation. What utter nonsense—war has changed fundamentally, from tribe against tribe with rocks and arrows, to state against state with machine guns and artillery, to bloc against bloc with nuclear missiles. Neanderthals and early Homo sapiens could never have imagined that 200,000 people could be incinerated in an instant, merely at the behest of a man in an excellent suit sitting behind a fine desk 10,000 miles away.

 The perspective and scale of war have changed tremendously, and while it is perversely comforting to shrug with a coy smile and say that boys will be boys even as exactly such a Rwandan boy hacks another apart with a machete, people too have changed to cope with the ever-growing scale of warfare we have unleashed time and time again in effusive fits of spastic slaying.

 From the Maxim gun's World War I killing fields to the systematic, expertly engineered Holocaust to the completely de-mechanized but nevertheless methodical extermination of Rwandan Tutsis, in the aftermath of war people have invariably tried to make sense of the devastation. Past atrocities inspired great artists and works of art—today, as Jack Daglish said in “Hotel Rwanda,” “I think if people see this...they'll say, 'oh my God, that's horrible,' and then go on eating their dinners.”

 Today, fine art that expresses real anguish and dread is for the sole perusal of snobbish “New York Times” art critics, who are relegated to irrelevance in the public eye while the hoi polloi subscribe to the mass media for endlessly regurgitated opinions about why people hurt each other.

 But as Rod Slemmons put it, “we can all talk about war in the abstract, and about how it advances or distorts American interests, but we only occasionally get to see the faces and hear the voices of the people who actually do the fighting.” His words are the crux of why video games are the newest, most important medium for coping with war.

 Game developers have an obsession with war—one that easily exceeds Hollywood's, at least proportionally—if not because they seek to understand it, then at least because their consumer base is equally obsessed. Until recently, the most popular video games used war as a colorful plastic backdrop for the player's egomaniacal rampage under the guise of an overpowered, immortal, one-of-a-kind supersoldier. The war of games like the “Halo” series is consequence-free and painted in black and white; its only purpose is to provide targets. This attitude stretches back to the days of “Doom,” “Quake,” and “Enemy Territory”; being the only good guy made it easy to figure out who to shoot, and any background these war games had was peripheral—seasoning to give the slaughter some semblance of a point. At least 1999's “Unreal Tournament” accepted that there was no method to its madness—its strange and unfocused story of a bloodsport tournament pokes fun at itself and at other games that use violence as a blank-slate justification for play.

 The earliest games were constrained by a reliance on pure text to situate the gamer within a convincing world, and storytelling was of the utmost importance. Even when games began to utilize graphics and sound, those elements were primitive—there were no realistic facial expressions, no intonation—and so when a game explored a theme, it was largely a textual and plot-driven exploration. As the resources available to developers grew, they nevertheless remained limited, and developers therefore had to choose between storytelling and gameplay.

 The original 1997 iteration of the post-apocalyptic adventure “Fallout,” now just another cult classic with loads of critical acclaim and a weird sense of humor, is draped with a palpable sense of loneliness in the wake of war that broke through the barrier of simplistic graphics. And while parts of it are fun, the pacing is slow and the action staggers. Gamers flocked to “Fallout” for the story and to “Unreal Tournament” for the explosions.

 Perhaps the first game to be universally lauded for successfully combining excellent storytelling with engaging gameplay is “Deus Ex,” from 2000. As the video game industry began to see the light of legitimacy and reap the rewards of exponential advances in technology, vast resources became available to game developers. They could now express ideas through characters that were more human-like than ever before, and could speak to the player through the sights and sounds of the world instead of a text box reading “It is pitch black. You are likely to be eaten by a grue.”

 But instead of following “Deus Ex”'s example, the most popular video games of the early 2000s were the likes of the “Halo” and “Command & Conquer” series, which exploited a backdrop of war for the immediate justification to shoulder an assault rifle without exploring war through their highly detailed and expressive—if scripted—characters.

 “Half-Life 2” may have been the greatest exception, and it won much-deserved recognition for emulating “Deus Ex”'s equal focus on storytelling and gameplay, as well as for using a well-crafted physics engine to immerse the player. However, the real triumph of “Half-Life 2” was combining all of its innovative elements into understated moments that defined the world and conveyed a message.

 When Gordon Freeman, the protagonist, first steps off of a train into the oppressive autocratic fiefdom of Doctor Breen's City 17, his—and the player's—very first impression of the city is a robotic scanner that curiously floats towards his unfamiliar face and snaps a picture of him with a brilliant flash. The feeling of being constantly monitored and controlled is compounded by myriad security cameras and gas-masked police as the player wanders outside into the twisted world he will have to fight against.  

 His world will come to be defined by war, but as a means to an end—the quest for freedom that is the real justification for Freeman's actions. By placing control over Freeman's fate in the player's hands in the face of these complex themes, the game ties the player into its story through the gameplay.

 Games like “Half-Life 2” or “Deus Ex” are not the norm, which is why they receive such high marks in the critical sphere. But games like these are becoming less exceptional as each generation of best-selling games takes greater advantage of the opportunities for amazing storytelling and gameplay that the medium offers.

 The most direct connection to the player is the character or characters that the player controls, and games of late—instead of relinquishing their reliance on war as a foundation—have put more effort into using those characters as a lens through which the player can get a better sense of the war that defines the game's world. Games with a first-person perspective that place the player inside the character's head have seen the best use of this synthesis between story and gameplay, character and player, though some real-time strategy games, like “Company of Heroes,” have made an effort in their cutscenes.

 The fast track to making a character accessible to the player, instead of just a puppet that responds to button presses, is scaling the character down from superhero level to just another soldier. “Call of Duty 4,” the 2007 competitor for the affections of the “Halo”-obsessed gaming populace, places the player in the combat boots of a U.S. Marine and a British special forces soldier instead of a genetically modified, armor-clad supersoldier of earth-shatteringly awesome proportions. Even the next expansion to the “Halo” franchise, the upcoming “Halo 3: ODST,” casts the player as an Orbital Drop Shock Trooper, i.e., one of those guys who never survive the first 30 seconds of the mission in any other “Halo” game.

 If game developers continue to rise out of their shamelessly exploitative slump and instead express war's insidious, universal impact through every avenue—storytelling, gameplay, characters and the world they inhabit—they will be well on their way to making better games, games that could carry an important message and perhaps even fully legitimize the medium. With bigger budgets and better technology than ever, games now pay the greatest attention to detail and realism, but they must also treat war realistically.

Monday, February 23, 2009

Oscar the Grouch

 In Hollywood, money is not accepted. Dreams are the currency of this fairyland, where glancing references to the global recession dragging the world down from yet another deflating speculative bubble are just that—glances out the window of Dorothy's cottage at the wondrous world outside. At the 81st Academy Awards, the Academy sometimes caught a glimpse of the approaching tornado out of the corner of its eye, and it quickly turned away.

 Like its corporate sponsors, the Academy probably would have liked to appear defiant in the face of economic decline, declaring, with a grandiose gesture from the ivory tower of art, that the show must go on. Like its sponsors, the Academy Awards alluded to the global economic crisis just enough to irritate, but not enough to serve a purpose.

 Everyone had a style: Coca-Cola deflected attention by pimping its latest humanitarian cause du jour, Sprint blithely ignored everything like a teenager texting on her new 3G phone, and Hyundai injected oddball phrases like “these troubling times” into the plot of its mediocre car pornography, leaving those vague spectres to hang, creaking in the doorway. Foreign companies like Hyundai and Honda ran commercials with an unprecedented dose of America-centrism—which is appropriate, since this is all America's fault.

 For its part, the Academy occupied itself with the usual fashionable autofellatio that “People” will fawn over for the next year and a half. Academy president Sid Ganis made no speech, rendering the hallowed institution a silent irrelevance at its own show. Instead, the forum was left open to presenters and award-winners to send whatever messages they pleased.

 Though it was rarely genuine and never uncensored, this was the most interesting part. What comes through—both overtly and subtextually—in the acceptance speeches and category introductions acts as a measure of our social climate. From Sean Penn and Dustin Lance Black's tearful rallying cries for equal rights for homosexuals, to Tina Fey saying “a blinking cursor on a blank screen” instead of pen on paper, the concerns and considerations of the outside world were sucked into a swirling microcosm, condensed, and spit back out onto televisions across the world. Except much wealthier and prettier.

 Social commentary and fashion guidance are the only reasons to watch the Oscars anymore. At least it is acknowledged that the dress one starlet wears on one night is just as important as the culmination of years of planning and hopes and dreams and work—the movies, that is. All is well, really—people tend to know much more about which movies they like than about what clothes to wear. It's an instinctual reaction, and there's no reason to trust an arbitrary construct like the Academy with our taste in movies.

 So the Oscars are about everything but the movies now. The video game industry shut down and reformulated its seminal event when it stopped being about the games. Should the Academy do the same? It isn't really up to the Academy—it's up to the viewers at home. There were already viewership fears this year, which means a well-placed blow in a year's time could bring the house of cards toppling down.

 Perhaps our flaccid attempts at controlling our rampant energy overuse could be applied with some effectiveness to the Oscars—always watch the Oscars in as large a group as possible to minimize energy usage and ratings, and turn off the Oscars on unnecessary televisions—especially when no one is in the room. Or, just don't watch them at all.

Monday, February 9, 2009

Caution: Compulsive Caring for Cairns Can Create Critical Complexity

 Andy Goldsworthy's most meaningful works are composed of natural, unaltered materials nearly as eccentric as he is—spit to glue icicles, sharpened twigs to pin bracken—with no greater tools than his own two hands and, occasionally, his teeth. Rock balancing, however, requires abstract tools like a sense of gravity, judgment of form, tactile intuition, and “understanding of the stone” that Goldsworthy clearly has. His egg-shaped cairns have made him a renowned rock balancer, but they are also heavy with significance for Goldsworthy as both an artist and a critic who creates within the greater creations of nature.

 This Scot's deceptively simple art—nature gives him rocks, he stacks them—easily blossoms into infinite meaning and interpretation. The beauty of all his sculptures comes from a simple form with great subjective potential, but the rock sculptures he creates have a comparative permanence that carves out its own niche within his highly specialized art.

 Individually, the general egg form of Goldsworthy's cairns combines overt organic imagery with the covert insinuations innate within human interference with nature. The egg is the minimalistic, aesthetically pleasing symbol of birth; the cairn is rife with the jutting and fractured complexity of stone. The two together form a heavily layered symbol of life, manipulated to hyperbole in its exaggerated scale—Goldsworthy's own take on nature's form.

 Anyone—with balance, practice, and luck—could stack rocks or even approximate Goldsworthy's style of cairn, but his larger-than-life constructions both prove his dexterous mettle and cement his artistic interpretation as something other than a knock-off of nature. The gathering of rocks that eventually becomes a cairn also imposes order upon the random scatterings of nature, while remaining a chaotic sight to behold even when perfect balance is attained.

 Together, Goldsworthy's cairns are symbols of travel that demonstrate the interconnectivity of the world that everyday life glosses over. Cairns are historically path markers for anonymous travelers, but Goldsworthy's are also the focal points of a personal network of ley lines that link the “journeys and places that [he feels] an attachment towards,” as he comments in “Rivers and Tides,” the biographical documentary of his art.

 Be it God's will, random biological chance, or aliens, nature has flourished as an artist. But no artist can reach her highest potential or greatest appreciation for the majesty of her work without a great critic to complement her. For nature, that critic is Andy Goldsworthy.

Tuesday, February 3, 2009

Palin Not Mutually Inclusive

 In 2006, AOL did something that most people have never heard about, which is an odd concept in a digital age where information spreads uncontrollably. The only way not to learn about something anymore is to not seek it out. It appears that many people weren't too concerned with the fact that thousands of individuals' privacy was compromised when AOL released its search logs to researchers. The logs were supposed to have been stripped of identifying data, but reconstruction efforts allowed researchers to pinpoint the identities of many so-called "de-identified" AOL users. Search engine companies, digital pioneers, and privacy advocates alike were taught an important lesson in 2006: no one is safe, and old notions of "privacy" have become obsolete.
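
 To see why stripping names buys so little protection, consider a toy sketch in Python. The queries and the log format below are invented for illustration, not drawn from AOL's actual data or the researchers' actual methods; the point is only that grouping an "anonymous" numeric ID's queries quickly piles up quasi-identifiers that can be cross-referenced with public records.

from collections import defaultdict
import re

# Hypothetical log rows: (numeric user ID, search query). All queries are invented.
log = [
    (711391, "best salmon recipes anchorage"),
    (711391, "churches near 99501"),
    (711391, "how to move to alaska from texas"),
    (424242, "cheap flights to detroit"),
]

zip_pattern = re.compile(r"\b\d{5}\b")  # ZIP codes are strong quasi-identifiers
places = {"anchorage", "alaska", "texas", "detroit"}

# Group every query under its numeric ID and pull out the clues it leaks.
profiles = defaultdict(lambda: {"zips": set(), "places": set()})
for user_id, query in log:
    profiles[user_id]["zips"].update(zip_pattern.findall(query))
    profiles[user_id]["places"].update(w for w in query.lower().split() if w in places)

for user_id, clues in profiles.items():
    print(user_id, clues)
# Cross-reference even this small pile of clues with public records and a
# "de-identified" user can become a named person.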

 But MIT professors and ACLU lawyers weren't the only ones interested in the results. Ars Technica has taken note of two Dutch filmmakers who have taken a particular interest in user 711391, dedicating the "I Love Alaska" series of 13 short movies to her eccentric search strings. Her digital fingerprints paint the portrait of her life, and each film is presented in an appropriately post-modern style: "short and stark, each shows a barren landscape while a narrator reads through the actual list of search queries." The films take the task of providing context out of the hands of the filmmakers and lay it upon the viewer. It is up to each of us to reconstruct 711391's life in whatever manner comes to mind. It is an exercise in creativity, and an education in the issues of privacy and personal liberty that we tend to evade.

A Student of Nature's Form

 Be it God's will, random biological chance, or aliens, nature has flourished as an artist. But no artist can reach her highest potential or greatest appreciation for the majesty of her work without a great critic to complement her. For nature, that critic is Andy Goldsworthy.  

 His most meaningful works are composed of natural, unaltered materials nearly as eccentric as he is—spit to glue icicles, sharpened twigs to pin bracken—with no greater tools than his own two hands and, occasionally, his teeth. Rock balancing, however, requires a sense of gravity, judgment of form, tactile intuition, and “understanding of the stone” that Goldsworthy clearly has. His egg-shaped cairns have made him a renowned rock balancer, but they are also heavy with meaning for Goldsworthy as both an artist and a critic who creates within the greater creations of nature.

 Taken individually, the general egg form of Goldsworthy's cairns combines overt organic imagery with the covert insinuations innate within human interference with nature. The egg is the minimalistic, aesthetically pleasing symbol of birth; the cairn is rife with the jutting and fractured complexity of stone. The two together form a heavily layered symbol of life, manipulated to hyperbole in its exaggerated scale—Goldsworthy's own take on nature's form.

 Anyone—with balance, practice, and luck—could stack rocks or even approximate Goldsworthy's style of cairn, but his larger-than-life constructions both prove his dexterous mettle and cement his artistic interpretation as something other than a knock-off of nature. The gathering of rocks that eventually becomes a cairn also imposes order upon the random scatterings of nature, while remaining a chaotic sight to behold even when perfect balance is attained.

 Taken together, Goldsworthy's cairns are symbols of travel that demonstrate the interconnectivity of the world that everyday life glosses over. Cairns are historically path markers for anonymous travelers, but Goldsworthy's are also the focal points of a personal network of ley lines that link the “journeys and places that [he feels] an attachment towards,” as he comments in “Rivers and Tides,” the biographical documentary of his art.

 This sort of deceptively simple art—nature gives him rocks, he stacks them—blossoms into works of infinite meaning and interpretation when subjected to even limited analysis. Goldsworthy has proven himself as an artist by taking some of nature's best art and working critically with it. Now all he needs is a good critic himself.

Wednesday, January 28, 2009

Rule 5: You Don't Talk About Style Club

 The sample sentences that are used to teach grammar and style are inimitably entertaining, whether they are hilariously inept or simply awesome. Those found in “The Elements of Style” are of the latter variety, employing such suspenseful samples as “He saw us coming, and unaware that we had learned of his treachery, greeted us with a smile.” I should like to write an entire murder mystery or spy thriller based on that one sentence.

 That aside, the punctuation of our fair and grossly convoluted language is its breath, for it controls our breath. And as breath is life, so punctuation is the life-giver of language. That is, unless you are an authentic Ancient Greek, in which case you do not use punctuation or even spacing in your writing and it is a wonder that you get anything done at all.

 I wish to take to heart many of the technical specifications and clarifications of “The Elements of Style,” for my time under the reign of Montessori-inspired education ruined my sense of grammar and punctuation, so it is a wonder I can read and write my native language at all. In all seriousness, I have no significant trouble with the English language—though I do hold grudges against it—but I do not understand the fine mechanics behind some of its more loaded scenarios, such as conjoining various combinations of independent and dependent clauses. I tend to get it right, but without knowing why.

 I shall therefore defer to a book that actually uses the word “indefensible” in reference to certain usages of the comma, for any style guide with as straightforward and austere a title as “The Elements of Style” must be able to rein in the unbridled forces of the English language. I will start by paying closer attention to some of the suggestions in Rule the Fifth, for there are innumerable ways to combine independent clauses. But don't do it with a comma. I must remain especially aware of using “so” in the middle of a sentence, primarily to show a cause-effect relationship, as it is apparently an adverb and so must be preceded by a semicolon. However, I will have to continue my quixotic quest for an adequate way to rectify the situation when I stumble across it, for the book's suggestion to transmute the “so” to an “as” and place it at the beginning of the sentence is awkward and—in the case of a string of causes and effects—much more noticeably repetitive when at the beginning of the sentence. I also will not precede the “so” with a semicolon, for that is a terrible waste of powerful punctuation (like dunking a nuclear power rod into lukewarm bathwater) and, as Vonnegut says, semicolons are “transvestite hermaphrodites representing absolutely nothing. All they do is show you've been to college.”

 I've been to college; I'm proving it right now.