The Tech - Online Edition: MIT's oldest and largest newspaper & the first newspaper published on the web

The extent to which one could consider me an avid video gamer depends a great deal on your definition of “avid.” On the one hand, I know the difference between Ico and Ecco, and I always invert the Y-axis on my controller. On the other hand, I’ve never played a Final Fantasy game, and survival horror gives me the heebie-jeebies. (I have weak nerves and weak aim — sue me.) I also try to keep up with what games are being released, so I guess that counts for something, although my laptop isn’t quite on par with what is required to play most of these newfangled computer games.

I downloaded the demo for Portal once. Apparently, there’s a glitch in it that changes the portals to computer-crashing wormholes. Go figure. I contemplated e-mailing Valve about it, but decided that I’d rather play the demo for Worms 2 some more. (I don’t have a lot of cash to budget towards video games.)

I’ve always seen video games as a diversion akin to the movies, except that they happen to be more effective at keeping you awake — usually. For some reason, whenever somebody plays as Jigglypuff in Super Smash Bros. Brawl, I start to get uncontrollably drowsy. There are more than a few parallels. Movies have test screenings, and video games have beta phases. Movies have visual effects, and video games have animation artists. Movies have crappy sequels, video games… well, you get the idea.

Keeping in mind that plenty of people spend plenty of time watching television, I find it perfectly reasonable to spend comparable amounts of time playing video games — assuming I can spare the time, of course. Obviously, no MIT student would be so silly as to play video games with tests or problem sets inbound. Yet for some reason, we gamers are often viewed as drooling, glaze-eyed shut-ins. Perhaps the generalization is truer than we admit, but it’s definitely not true of all of us, all the time.

I took CMS.300 (Introduction to Videogame Studies) last semester, and as much as I enjoyed both the class and the looks of stunned envy on others’ faces when I smugly told them that such a class exists, I inevitably worry that the people reviewing my internship application won’t take me seriously after seeing the course on my transcript. Maybe I’m being paranoid (it wouldn’t be the first time), but I think it’s important to give credit where credit’s due, and I vehemently believe that the class was worth the credit.

Believe it or not, a class on video game studies took considerable effort to keep up with, even though it was a labor of love. After all, playing games doesn’t make you a qualified games scholar, just as complaining about games doesn’t make you a qualified critic and driving a car doesn’t make you a mechanic. To be perfectly blunt about it, CMS.300 probably deserved even more commitment than I chose to give it.

I suppose the underlying lesson there, then, is that we gamers need to take ourselves and the analysis-worthiness of our chosen pastime seriously before anyone else can follow suit. We play, but how much do we really understand? There’s a question for the ages. Now, if you’ll excuse me, I have some very important business to attend to. Cliffs of Dover isn’t going to fail at 5% on Expert on its own, y’know. Work, work, work…