Tuesday, December 14, 2010

games and Games

I have always objected to the term 'videogame'.

Growing up in London in the mid 80's, I knew them simply as Games. My pubescent older brother had a pair of friends who exemplified the odd couple, named Merlin and Richard. Merlin had the stature and affect of a cherub gone wicked, while Richard was a sardonic mountain. Though they and my brother likely got up to all manner of trouble, my defining memories of the raucous trio centered on their playing of (and extensive talking about) Games. Their eyes would widen as they said the word, reverently, with a capital "G", clearly a superlative realm of experience beyond the drab world of 'ordinary games'. If asked to clarify, perhaps by a curious adult, they probably would have said "computer games", and I agree.

In hindsight I have attributed this distinction to the English propensity for simplicity in casual speech. Moving to the United States at age six, I was immediately struck by the obsessive technicality of American language (a tendency some call 'Pentagon Speak'). The blunt English term 'lift' had been elaborated to 'elevator'. 'Car park' was romanticized to 'garage'. 'Hoover' was replaced with the literal 'vacuum cleaner', and the plastic superhero figurines I called plainly 'figures' had been commercialized to 'action figures'. So when a friend first dragged me into his brother's bedroom and over to the NES to "play videogames!", I got the translation, but I wasn't happy about it. After spending most of the rest of my life living in the U.S., I have lost my kneejerk imperialist's distaste for a majority of Americanisms, but not that one. 'Videogame' will forever feel cloying and infantile to me.

In a broader sense, though, I genuinely believe 'videogame' to be a misnomer. It smacks of hasty media classification in the face of an emerging trend. The 'video' refers to the display medium, implying that what sets these games apart from others is the flickering screen. By this line of logic, any televised or visually projected event has the potential to be a 'video' game, and we could have just as easily called it 'telegaming'. What I discuss in this blarg are uniquely computerized games, developed on and for computers. From the first board game simulations written for Ferranti computers in the early 50's to the Commodore 64, through the Nintendo Entertainment System, the PC Gaming explosion of the 90's and now with app games on mobile devices - these are truly computer games. And while the definition of the term 'computer' is shifting daily, it is the unique approach of computing that makes these 'virtual' games possible. There are proponents of renaming the whole affair 'digital gaming', but that is another can of worms I'm not interested in opening.

I am an adherent to 'computer gaming', and most of the time they're just games. But in my private capacity, I will always think of them as Games.

Tuesday, November 9, 2010

Masters of Doom part 2: the Wow Factor

I again have a capable gaming PC, and by capable I mean that I am now able to play all the games I missed over the last 5 years. I am resigned to the knowledge that I will always be staggered behind the latest and greatest developments in PC gaming (and, indeed, the platform in general). XKCD's Randall Munroe sums up the phenomenon perfectly in this strip: (for more, go to http://www.xkcd.com)


And speaking of Half-Life 2, I have chosen to re-enter the world of PC Gaming by picking up where I left off: Episode Two. This continuation of the Half-Life story is fantastic for any number of reasons, but to fully explain my experience of it, I have to share some history with you.

The first videogame to ever make me say "WOW!" was Quake 2. I'm talking slack-jawed, incredulous, protracted WOWWWWing. Sure, I'd been impressed by games before - the Super Nintendo was blossoming as a creative platform around the time I started evaluating games on a level beyond arcade-instant-gratification. I remember people "ooohing" and "aaahing" over Donkey Kong Country and its polished pseudo-3d effects. The technology of gaming had matured to allow very expressive visual art, perhaps for the first time. I am not denigrating the work of game artists before the 90's; getting attitude or atmosphere with precious few pixels was a creative triumph of its own. But increased resolution and color depth allowed the creation of games that even a layperson could call beautiful, and meant that games were less defined by their technical limitations than they had been in the past. The Super Nintendo (and, arguably, the Sega Genesis) marked the culmination of 2d game art.

Around 1997, all my rich friends started getting Playstations and Nintendo 64s for their birthdays and Christmas and Hanukkah. I was not impressed. All of a sudden, Mario was flying around in large 3d worlds and doing karate. Turok: Dinosaur Hunter was visually stunning, but felt like more of a tech demo than a game. Goldeneye had satisfying multiplayer, but often felt staged and hokey in singleplayer. The Playstation delivered well for a number of genres, but all of the games I played that were supposed to be breathtaking already looked dated to me: Resident Evil was as awkward and ugly as the teens glued to it. Final Fantasy VII was a triumph of the RPG medium, but the visuals were painfully stitched together from a mess of 2d and 3d elements that clashed on screen. Even Metal Gear Solid failed to grip me with its drab environs and snail's pace. The Playstation, though inspired in its content, was not powerful enough to do real 3d environments justice, and as such I failed to be wowed by any of its games.

The sinking feeling this game gave me.

I played these games with a detachment that had so far been foreign to my experience of videogames - I didn't care that they were 3d. What was surely a landmark technical achievement in the gaming industry often didn't seem to translate into me, the gamer, having any more fun. Mario 64 was a good game, maybe even a great game, but I couldn't help but feel that it lacked the depth of its SNES predecessor, Mario World. Same with the (later) Metroid Prime. Same with Zelda: Ocarina of Time. For all of its visual impressiveness and way-paving, I always found the content for the Nintendo 64 lacking - a triumph of style over substance. I think I could sense, even at the tender age of 12, that the glorious age of 2d gaming was coming to a close, and that 3d entertainment had not matured to the point of being a worthy replacement.

My disappointment with these supposedly 'revolutionary' titles slowly pushed me to the realization that there was a much less quantifiable element, the immersiveness of a game, that I found so compelling. It was an effect achieved through a very deliberate concoction of elements such as sound design, pacing of the action, player rewards and visual fidelity. The in-game environments needed congruency - a consistent feel in which no part of the virtual reality was displaced enough to break the suspension of disbelief. I played games because I liked being in the environments created, and I was more compelled to spend time with the incredibly polished Zelda: A Link to the Past than I ever was with Ocarina of Time.


Turok: If you search real hard, you might actually find an enemy!

Meanwhile, the experience on the PC at the time was confusing. The transition to 3d hardware acceleration was difficult on a good day and left a bad taste in the mouths of many gamers. PC games had thus far been less visually adept (due to lack of hardware acceleration) but more conceptually adventurous than their console counterparts. The high cost and initial investment involved in making a console game have always steered investors towards 'safe bets', games that appeal to the lowest common denominator. The PC platform, conversely, is an open sandbox in which anyone enterprising enough can attempt their own games, and the channels for distribution are many and varied. As such it is the unofficial home of a vastly quirky and creative gaming culture.

Part of the unique opportunity present on the PC was the interface: a mouse and keyboard comprise a very robust input scheme. Text-based games were possible, and the pointing control of the mouse allowed for a level of interaction not possible on consoles. In addition, the almost endless slew of joysticks, gamepads, racing simulator wheels and whatever-else-people-could-possibly-imagine meant that, when it came to control, PC was king. This allowed for the birth of new genres: the point-and-click adventure, simulators of every scope and focus, the realtime strategy game... and the first-person shooter.

I want to take this opportunity to mention that I am not a violent person. Anyone who knows me personally will tell you as much. Although any conventional child psychologist would have labeled me with ADD, and although I was a very destructive and angry youth, the moment puberty hit I became a skinny, docile teddy-bear. Actually, you'd probably find a moving, talking, 1-foot-tall teddy-bear a lot scarier than me. So when I describe my superlative experience of playing these (absurdly violent) games, please don't misunderstand the root of my excitement. I was not waiting for a socially sanctioned excuse to shoot people in the face - I just enjoyed it when I came across it. (An aside: if you're still uninformed enough to believe that videogame violence correlates in any way with real-world violence, don't bother to share your opinion with me. You've already lost the argument. Seriously.)

Enter Quake 2. Made by the founding fathers of the visceral first-person PC gaming experience, id Software, Quake 2 knocked my socks off. There was a level of detail, of atmosphere in this game that I had never seen before. The in-game environments were believable: chunky in the familiar idiom, and the structures and buildings seemed to have weight to them. Dynamic lighting meant that laserbeams would create an orange glow on the walls around them. The game's locales were somewhat drab, following id convention, but they had an open-endedness that encouraged wandering and exploring without turning down the heat on the action too much. Enemies in this game were smarter: they would wander through the levels to find you. They reacted quickly and semi-intelligently to your attacks, displaying some manner of the will to live and to challenge the player's right to the same. And some of them were quite aggressively scary: robotic dogs that would jump out of the dark at you, attack relentlessly and take some real killing. Cyborgs with large grafted blade arms that wanted nothing more than to hack you to pieces. Landing a shot on one of these guys would register a believable reaction: whiplash, bleeding, visible signs of injury (called 'pain skins' by the developers) and, of course, the trademark giblets if you hit them hard enough.


Ouch.

Quake 2 was a real refinement of id software's design principles: that a first-person action game should be intense, scary, atmospheric, gory and, most of all, immersive. They didn't waste time on hokey story elements (that wouldn't become really viable until the next generation of first-person-shooters, anyway) or glitzy silly special effects, instead opting to make interactions in the game as believable as possible. Your machine gun kicked back as you fired it, requiring you to 'steady your hand' using the mouse. Dead enemies would attract swarms of flies. Grenades bounced around with somewhat believable physics models. Landing on the ground after falling from a height looked, sounded and felt painful to the player. Little details such as these combined to make the experience of being in the game world tangible in a way that was previously unknown to me. These simple stabs at realism brought me into the game in a way that bad voice acting, poor NPC behavior and some silly pubescent sci-fi drama never would have. To this day, playing Quake 2 makes me feel vulnerable and mortal, afraid to get hurt. Quite a far cry from the Rambo superman antics of Doom and its copies. Quake 2 marked for me the maturing of the first-person action genre.

And don't get me wrong: some of these elements had been present in one form or another in the above-mentioned games. Turok's level design was visually impressive, but none of it felt that relevant to the player - aside from feeling like a ghost world, it was a setting that the player wasn't particularly compelled to explore. I dunno, maybe people just like dinosaurs. Or being bored. Resident Evil did very well building dramatic tension and dread, but damn did it suck to play! The simple mechanics of walking across a room were painfully difficult for some inexplicable reason, as if the programmers had purposefully screwed with the laws of physics just to piss me, Tarun Perkins, off. Never mind lining up a zombie in your crosshairs or getting anything useful done. Those apologists who thought the terrible control scheme was a deliberate ploy to heighten the panic can go F*** themselves. This line of reasoning constitutes the worst excuse for poor design ever. Goldeneye had interesting missions and some pretty good level design, but enemies just stood around in the environments and waited to be shot. I felt like I was playing that game through a distancing film, as if Bond himself were several martinis deep at the beginning of each mission. Enemies moved drunkenly and illogically, and their comical animations made me imagine them as rejected stunt men from other games. In-game events such as explosions or helicopter takeoffs seemed to happen slowly, predictably, like they'd been waiting patiently to be triggered. It felt so linear, so call-and-response, and the ultimate suspension of disbelief was never fully achieved. Let's not even get started on the poo-fest that was Hexen II.


Line up so I can shoot you!

Unfortunately, the release of Quake 2 marked the culmination of id's important content contributions to the gaming community. They could only work one formula so far, and it would be simply a matter of time before their imitators started successfully taking the genre in new directions. Imitating your imitators is a sure sign of creative failure, evident first with Quake III, which competed against and was ultimately bested by Unreal Tournament. Doom 3 borrowed heavily from Half-Life (let's see: game starts with a mundane but informative walk through a science facility, only to require that you walk back through it after it's been ravaged and infested by demons. Check. There's a protracted and relatively boring stint on a train. Check. Full of scientist NPCs. Check. Doom 3's marines look and fight surprisingly similarly to Half-Life's... you get it). I will say, though, that it earned my respect for being highly polished and went down in my personal history as the first game I was too afraid to finish. id's current labor of love, Rage, looks set to follow this trend, and I predict that it won't sell as well as they're hoping for.

Yet id had found by this time what it was good at: licensing its engines. Quake 2 spawned a whole generation of games based on its code, games that ambitiously pushed the genre forward. Sin was a great example of a plot-driven shooter (once patched enough to be playable), Soldier of Fortune was gloriously violent, and Kingpin brought the action to the hood with mature language and a flamethrower.

And then there was Half-Life...

This article ended up being much longer than I originally anticipated, and covered a lot more ground. Stay tuned for an actual exploration of Half-Life in Masters Of Doom: Part 3!

Thursday, September 2, 2010

Ben & Jerry's New York Super Fudge Chunk Ice Cream

Ice cream is concentrated satisfaction. It is the twin elements of childish culinary delight - sugar and fat - put through an icy crucible of alchemical condensation. As such, it is also the subject of endless fantasy and lust. And any object of lust has one sole responsibility: to deliver the goods, in as goodly a fashion as possible.

As I see it, ice cream can take one of two approaches in delivering the goods. The first is the road of simplicity: taking the finest (and smallest list of) ingredients, then assembling them in the finest way possible. In the modern world, this seems to be an inconceivably daunting task, at least in the U.S.A. (and let's not even talk about English ice cream, the only other nationality I have with which to compare). Your best bet is to know a local artisan who takes earnest pride in producing a quality product. Which I'm sure you all do, right?

I want my ice cream to be made by this guy.

The problem with the first approach is that it often leads to a phenomenon I refer to as "ice cream disappointment". Let's paint a scene: it's Sunday afternoon in the Universe, and you've spent the day so far lounging or adventuring with your sweetie or accomplice, and the specific brand of bodily need for ice cream arises. Or, for the more sophisticated among us, let us imagine that you have gone out to dinner with a special someone and have ingested and imbibed some very heartwarming comestibles and are ready for a toothsome epilogue. In either case, you have been doing some good living and are in need of an affirmative reward. So you go to order dessert, or in our first example you stroll down to your local creamery, and you decide that of all the offerings on the menu, the one that really calls to you is a plain old chocolate ice cream. Simple, classic and richly pleasing in every capacity. Before you know it, the sweating, mucilaginous treat is before you, and you go in for your first bite. It is in precisely this moment that one can so often experience ice cream disappointment. As the heavy chill transforms to fluid experience, as the familiar sensation of salty, chocolatey earthiness suffuses your consciousness, you can't help but hear the faint suggestion in the back of your head that this ice cream isn't good enough! My entire life has been leading up to this moment of indulgence, or perhaps I have allowed myself this one treat in the face of dietary stricture, and this ice cream isn't it, damnit! I have eaten a product whose sole existence in this reality is to provide me pleasure, and it has failed. You may choose to continue, but even as you do, you are keenly aware that you are now dealing with a system of diminishing returns. Now, I know that this brand of disappointment is not localized to ice cream. However, of all the food items that I eat with any regularity, ice cream is the one that makes me feel this the most acutely.

The second approach is the one Ben & Jerry's takes, and is a good choice for them, considering their acquisition by Unilever Corp. in 2000 all but dashed any possibility of employing the first approach. (For those of you indignant about the suggestion that Ben & Jerry's products have deteriorated since then, save it for someone who doesn't care). Their philosophy is what I call the "treasure chest" philosophy, and it's a very successful workaround. They have decided not to try for purity, instead infusing their ice cream with the promise of constant discovery. Often, when reading a Ben & Jerry's flavor description, you think to yourself, "All that in one little pint of ice cream?" I guarantee, though, that your next few minutes will be rife with adventure, no matter which you choose.

Furthermore, I'm coming to believe that New York Super Fudge Chunk is their best flavor. The description: "Chocolate ice cream with white and dark fudge chunks, pecans, walnuts and fudge-covered almonds." Bam! Chocolate upon chocolate upon chocolate. My favorite three nuts. This ice cream feels like a candy bar exploded into it, and the result is most junkily satisfying. I struggle to stop eating it, for each bite uncovers some new tasty morsel - just as you've forgotten about the stealth white chocolate chunks (which I otherwise don't like), another pops up, and you smile to yourself again. The combination of nuts and chocolate chunks ensures that you don't succumb to textural monotony. I am normally a conservative consumer of ice cream, content to eat only what I need to satisfy my craving in the moment. Ben, Jerry and the folks at Unilever may end up making a glutton of me.

If your tastes seek something a little more refined (but equally difficult to put down), I suggest you seek out Ciao Bella's Pistachio Gelato, or really anything made by that illustrious company. And while that may be another review begging to be written, I'll leave the hard work to you on that one ;).
Need I twist your arm?

Monday, August 30, 2010

Clicky Keyboards!

Ok, time to accrue some nerd cred. I have recently been using a 'clicky' keyboard purchased from Unicomp. To be precise, it's the Spacesaver 104, which means it's moderately less leviathan than the rest. They have a niche - they sell keyboards based upon the old IBM Model M line, which were desirable for any number of reasons. Let's explore some of them:
  • Clicky Keys: these keyboards generate the good old-fashioned clickety-clackety you've learned to miss in the office. The technology behind this was developed in the 80's and is called the 'buckling spring'. This means that each key sits atop a spring that buckles partway through the keystroke, snapping sideways against its housing with that satisfying sound we all love (including everybody on your floor of the building). There is something very basely affirming about pressing these keys - each keypress is a statement! The modern alternative is called the 'rubber dome', which will be present in the vast majority of the keyboards you'll see these days. A look under the keys of such a keyboard reveals rows of little rubber nipples that, while cheap to manufacture and functional, are inferior in quality to their ancestors. Think Arabica vs. Robusta coffee beans. The main reason for this is the next point:
  • Ergonomics: the action on this thing is lovely! Your finger feels cushioned very nicely throughout its downward travel, and I notice a significant (!) decrease in the amount of finger, hand and forearm fatigue compared to my crappy previous Dell keyboard. If you spend an hour plus typing at a time (if you're a student, Facebook chatterer, long-distance-email writer or some kind of hacker) you will feel very happy you invested in one of these.


Quite the compact wonder, wouldn't you agree?

  • Durability: As old IBM keyboards have proven, these things will easily last for decades. There is a large metal plate upon which these keys sit. This keyboard weighs several pounds, and you could probably hurt someone with it. Just remember to clean them every once in a while!
  • Eliteness: the indelible low self-esteem of most geeks requires constant gadget upgrades to ensure their status in the imaginary pecking order. This keyboard will earn you nerd points.
Have I convinced you yet? Here's a link to a far superior article on the subject on Dan's Data: http://www.dansdata.com/clickykeyboards.htm . The length and breadth of appreciation in his writeup is the stuff of which geekdom is made. I, for one, am happy with my purchase and feel no need to go any further into the 'vintage keyboard' market. Browse for your own Unicomp keyboard at http://pckeyboards.stores.yahoo.net/

Hip Hop you shouldn't miss

If you're into hip hop, there are a few albums you should make sure you hear.

Prefuse 73 - Vocal Studies and Uprock Narratives

Guillermo Scott Herren is probably best known under his moniker Prefuse 73, and this was his first album as such. Although many would consider One Word Extinguisher his finest work, and while that album is an hourlong glitch-hop opus, I find that for me it has neither the subtlety nor the staying power of his first. Vocal Studies and Uprock Narratives is what I feel the best hip hop should be: understated and greater than the sum of its parts. It begins with the garbled voices of a number of underground hip hop radio station DJs giving shoutouts and bigups. Then, suddenly, the rug is pulled out from under your feet as Prefuse's signature ribbon of clips and slices unfurls before you. You become acutely aware that this is NOT another collection of instrumentals waiting to be spit upon or an uninspired mix tape from some DJ Shadow knockoff (RJD2, I'm looking at you). This is a unified, refined, singular realization of artistic vision. Beats and samples are vivisected and strung together gleefully like Christmas lights, though nothing about this album is wanton. There is a sensitive intelligence behind each decision and inclusion - listening to Prefuse's finer work elicits the feeling of each track having been pre-approved by a genius (I feel similarly listening to Aphex Twin, an obvious source of inspiration for Herren). What surprises me about this album, what keeps me coming back, what keeps its tracks creeping under my skin is the gentleness of his presentation. Where hip hop is hard, Vocal Studies is soft. Where you expect his cuts to be sharp, they are smooth. The tracks hit, but never do they feel domineering or demanding, two common afflictions of the genre. This album could be considered a contradiction embodied, but I would have to disagree.



MC Solaar - Prose Combat

MC Solaar IS French hip hop. He is France's answer to Bob Marley. He is one of the only rappers I would unreservedly call a poet. It's a shame that most of you reading this will not be able to understand his lyrics, for they are special. This album has more than its fair share of inspiration - it has the condensed feel of a greatest hits record. Floating on the exultant beats provided by Jimmy Jay, Solaar spans the spectrum of emotions in his witticisms and philosophical observations. Jay's tracks are a spectacular jazz and soul retrospective, and the album's backing vocalists and Laurent Vernerey's bass contributions join them seamlessly. Through its horns, slick drum breaks, pulsing grooves and bizarre samples, Prose Combat is as refreshingly new as it is classically old. This album exemplifies the genre during the creative rush of the mid 90's, and is not to be missed by any fan of hip hop.



Q-Tip - The Renaissance

After a decade of relative obscurity, Kamaal Fareed emerged with this, his album of rebirth. Good thing, too - many of you had probably forgotten all about this man, this half of A Tribe Called Quest (ok, let's be fair - 80%). If ever there were a rapper possessed of a gilded tongue, it is Q-Tip. The Renaissance is many things, but what struck me about this album was its maturity. Every moment of every track is older, wiser, more dense with intention than we're used to, especially from New York's good-time fun-loving charmer. The sound is so very him, but so very modern, and so very compelling. Everything is tasteful; each groove is spoonfed to you, lest you tire too quickly. Q-Tip covers a lot of ground in his lyrics and provides a journey through the process of becoming a 40-year-old rapper. He gives credit where credit is due, and sometimes demands his own. Bassist Antuan Barrett provides some of the bounciest, most compelling lines I've heard to date, and a quick list of important album credits includes Raphael Saadiq, Amanda Diva and D'Angelo. My suggestion to you would be to listen the hell out of The Renaissance. Pardon my French.

Friday, August 6, 2010

Masters of Doom and a short history of family geekery

I have just finished reading a book by David Kushner, a journalist and author known for his work covering the music and videogaming industries. The book, Masters of Doom, is a biography of the gaming company id Software and its two founding fathers, John Romero and John Carmack. You'll have to forgive the book's melodramatic and repetitious delivery; you'll be wondering why three full descriptions of the origin of the "Burger Bill" moniker were needed, and you'll never again want to hear the phrase "bit flip". Some sections of the text feel like they were stitched together from a number of disparate articles. What lies beneath the gluttonous delivery, though, is a well-researched and entertaining exploration of the decade that saw PC games explode into the mainstream, and what many alive during that time might refer to fondly as the golden age of PC gaming.

Part of what makes this book so exciting for me is that I was alive during this era! For once, I can read the recounting of historic occasions in which I participated, in this case occurring within the computer geek subculture of the 90's. Granted, my older siblings were more steeped in it than I was -- coming of age around the beginning of '98 meant that I experienced most of these developments via the trickle down effect. They were no less thrilling because of it.

First, though, a short history of my first six years. Like any kid raised during the 80's, I grew up against the alluring backdrop of computer games. They became the object of lustful desire for any red-blooded child who got a taste. Nintendo and Sega were in a full-scale console war and the consumers were the beneficiaries. Some of my earliest memories of my older brother extend back to his frothing excitement over the release of Sega's next big system, the Megadrive. Television ads at the time exclaimed, "I dreamt I was in a Sega! It was wicked!" One of Sega's biggest appeals was its apparent lack of censorship -- while Nintendo was content with psychedelic jumping plumbers, some of Sega's more gruesome titles played out onscreen in a manner that many parents decried, including my own. As a hyperactive and violent child since birth, I was forbidden to play video games at precisely the age I became aware of their hypnotic pleasure. I vividly remember hiding behind the couch or on the top bunk in my brother's bedroom, contenting myself with watching what I could not touch.

Consoles weren't the only appealing virtual escape in our house. My mother, a graduate student at the turn of the 90's, brought home a computer on which she did her schoolwork. My brother, not content to let a gaming opportunity pass, was soon blasting away on games like Bruce Lee and Elite. The latter, now considered a cult classic and the target of many ports, remakes and copies, had an unsurpassed depth and openness to its game world, and may have given many players their first glimpse of the freedom PC games could provide. Admittedly, I never could get into Elite. I was only six, and had no patience for the slow pace of space exploration.

Around the time that Nirvana released Nevermind, our family packed up and relocated to the U.S. It was a shock for all of us, emerging from the concrete oppression of London to discover the rolling hills and forests of upstate New York. This wildness would set the tone for the next couple of years, for we had brought with us no TV and no computer! Though shocking at first, I believe we all found the verdant setting refreshing. I preoccupied myself with that most virtual of realities, pure imagination, exploring and fighting with invisible things.

In late 1993 there were two major developments in our household. One was the birth of my little sister. The other was an IBM PC packing an Intel 8088 processor clocked at 4.77 MHz! It had an industrial-sized red power switch and sounded like a helicopter launch when powered on. Even at the time it was clear this was a dated computer, considering it had been released in 1981, about three years before I was born (!), but the effect its presence had on our family was immediate. It had two 5.25" floppy drives and no graphics card, meaning we had to content ourselves with ASCII text graphics and our imaginations. Interfacing with white-on-black, text-only computers was a wonder akin to staring up at the stars, piecing them together into the constellations we knew should be there. All of a sudden we had channels streaming in again: BBSes and the web! Email and chat rooms! Though the quality of communication was primitive when compared with the standards of the day, it all felt new and brave and exciting. My older sister became entranced with online text-based role-playing games. They had a narrative intrigue and a crumb-trail of incremental rewards that pulled one beyond the binary litany of the blinking cursor.

The aspect of the primordial internet that interested me most was the Bulletin Board System (BBS), an early example of an online forum. A programmer would host a service on their computer or network that would allow others to dial in and interact in any number of ways. The beeps and static shrieks of the modem were a telltale sound in our house, and woe betide the person who picked up the phone during a session, disconnecting the logged-on user. The local BBS scene culminated in biweekly Geekfests, where geeks and anyone who identified with them would hang out and celebrate their culture. These events were marked by gaming on any platform available, quirky and far-out interactions, and probably a lot of weed smoking (I was too young at the time to know about such things). The characters who attended these events were as vivid as any dreamed into virtual existence: the wispy-haired and ethereal Dren Eht Sral (larS thE nerD backwards) and Noid, whose daily attire consisted of welding goggles, gloves with Freddy Krueger finger-knives and a black trenchcoat. It was par for the course in a community that declared the definition of a geek to be "one who bites the heads off of live chickens!"

The geekfests I was allowed to attend were held at a labyrinthine house off of some back country road in Tompkins County, NY. The place looked like a pile of wood cabins dropped from the sky that had since healed together - it was a mess of rooms, corridors and staircases. It was at these geekfests that I first caught wind of the sensation surrounding id's Doom. All gaming and hands-on geekery took place in the basement of the house, which was outfitted with a number of PCs and the Super Nintendo responsible for holding most of my attention each visit. I was enthralled by a recent release, the hovercraft racing game F-Zero, which I was surprised to read in Masters of Doom was enjoyed by the staff of id Software during the development of their early games. My brother and some others shrugged off its gameplay as boring and sterile, but for me it was the pinnacle of technical achievement in gaming to date. Also popular (but never appealing to me) was the controversial Mortal Kombat, whose gorily animated disembowelments were performed on digitized sprites created from photographs of real people! I spent hours in front of that Super Nintendo and am sure my religiosity was a major annoyance to more than one geekfest gamer.

Meters away, a network of several PCs cranked away on Doom, perhaps the most visceral game developed up to that point. I'd heard about the immersion, the tension, the gory graphics and thrilling sound effects. But, to be honest, the first time I saw Doom load up on a PC screen I felt a niggling sense of disappointment. This was it? While Super Nintendo games of the day were smooth, polished and colorful, Doom looked pixellated and dull-paletted. I watched as a player ran around some shapeless labyrinth, chopping up demons with a chainsaw. They groaned in lo-fi. I didn't get it. There were no puzzles, no colorful backdrops, no real sense of a consistent visual style, and the point-shoot-move-repeat game dynamic lost its appeal for me within minutes. This is no revolution, I thought. Someone made a game where you shoot stuff, and it's not as much fun as those overstimulating, epileptically staged simulations at the arcade. I exclaimed aloud, "Doom sucks!"

There was a moment of shocked silence. Then a veritable geek chorus retorted, "Doom rules!"

Hardly the most exciting achievement...

I was not, in the moment, able to see what made Doom so important. The fact that the PC finally had arcade-level action titles made for it was a huge breakthrough for the platform, but that was just the beginning - Doom supported network multiplay! You could assemble a few of your buddies and duke it out together in an arena. Doom invented deathmatch. It was also arguably the first real catalyst for the game modding community, a bunch of self-taught hackers who modified the game's code to implement their own changes. The Barney mod became one of the most infamous examples. The subsequent release of level editors and dedicated modification tools meant that even unskilled players could try their hand at personalization. This game took full advantage of the PC as a platform, allowing people to play in ways they'd never played in the past. And the "first-person shooter" genre (popularized by Doom but contested as to its game of origin) was almost singularly influenced by Doom for the next decade.

Doom earned its many successes, but what made it so timely was its coincidence with other cultural trends. Its hellish theme and unapologetic gore harmonized well with the attitudes of angsty grunge and goth kids who were listening to Nine Inch Nails and Marilyn Manson. Satanic and occult images were flooding the collective teen consciousness through films, literature, Magic: The Gathering and, now, computer games. All of a sudden, everybody knew somebody who practiced Wicca. Vampires, werewolves, zombies and demons were more popular than ever in this generation's mythology. If you don't remember smoking weed while listening to Downward Spiral and playing Doom, you probably weren't teenaged in the early nineties.

Though id's John Romero (Doom's ideological father) cratered his game coding career through an ill-fated and over-inflated flight of ego, John Carmack (the other parent) remains one of videogaming's most revered geniuses. His work on 3d engines and virtual environments continues to be at the forefront of game development. If you've played Games in the last 15 years, chances are you've used some of his code. Though I didn't personally discover the joys of Doom until 1997 (when the family got its first Intel computer with a 66 MHz Pentium processor!), I have reaped countless benefits from id's contributions to gaming.

... And those benefits will have to be explored in a later post, because this is getting ridiculous. Hats off to one of gaming's finest developers! And many thanks to David Kushner for taking me down this whole silly path of nostalgia in the first place. For another good gaming-related read, check out Tom Bissell's Extra Lives.

Tuesday, August 3, 2010

An Excuse

So now I have undertaken to write my own blog.

Why? This is a question fledgling blog authors should be obliged to ask themselves. Who would want to read what I have to say? The answer in this case is nobody, really. Family, maybe. Some friends might poke their heads in from time to time as a vote of sympathy. By putting my thoughts up in the free public domain, I have knowingly created what any passing browser of the web would be justified in calling "another fucking blog".

My reason in this case is that I need an impetus to start writing again. I've been on hiatus now for some extraordinarily long passage of time, for any number of reasons. It's time to resume, and I need an obligation to some arbitrary sense of the public in order to get work done. I am writing for you to ultimately keep writing for myself.

Often lately I have been consuming some media or another, only to find myself starting to review it in my head. Articles and opinions are writing themselves. I've been politely ignoring this tendency over the last year or so, but it will not let up. So here it is: the unsolicited opinion of another blog.
