Tuesday, June 10, 2014

Thing practiced: nothing but liking things

I’m not sure how or why, but I found myself embroiled in an hour-long argument with a bright and—palpably—earnest young man about video games, their import, and who may or may not be ruining them for everyone. What prompted this was my audible praise for last year’s Gone Home, so I figured it was worth writing about the game, plus some other things I’ve enjoyed recently.

Gone Home

The Fullbright Company, Aug 2013

Gone Home: Polygon’s 2013 Game of the Year. Recipient of glowing reviews in the New York Times, Kotaku, the Atlantic, and umpteen other outlets of opinion. VGX’s Best Indie Game and Best PC Game of 2013; winner of the Best Debut awards at both BAFTA and GDC 2014; Games for Change 2014’s Game of the Year. Also: the recipient of user reviews calling it a “2deep4u indie hipster walking simulator” and “an interactive chick story” that “sucked balls like ICO,” and “quite possibly the worst game of all time.” Why do people feel so strongly? Why might you care?

I grew up in suburban California in the ’90s, and like everyone my age I dropped everything to obsess over Pokémon. Thanks to crippling social awkwardness and my brother outgrowing his N64, I kept playing games—first finishing Ocarina of Time and Mario 64, then spending way more time with SNES emulators and gaming IRC rooms and BBSes than a kid probably should.

When I got older, video games seemed like the perfect medium for art. Games marry visual art, music, and written fiction: all things I loved. But beyond that, video games involve the gamer—they force you to play a part in creating the art. This makes games expressive in a way other media can’t be. What video games deliver is experience itself—and if art is about evoking a response in the viewer, what could be more effective?

As a gamer, though, I hit a dead end. I couldn’t make the transition from kid games to “real” games. Nothing resonated the way Zelda or Chrono Trigger had. The popular games all seemed to be about something else.

It’s not a coincidence that big-budget games both revolve around empowerment and are marketed mainly to teenagers and very young adults, the demographic groups most likely to feel frustrated and powerless. These games let people do all the stuff they can’t in real life: drive hot cars, get hot girls, shoot guys in the face, be the hero. Domination and destruction will always be appealing—and some of these games are masterpieces. As representatives of a whole art form, though, they cover a narrow range. We’re handed adrenaline, victory, and good-vs-evil times twenty, but what about empathy? alienation? romance?

That’s why it’s so heartening to play Gone Home.

It’s hard to write concretely without spoiling it, but I’ll try. Gone Home is short: roughly movie-length on a typical playthrough. You’re dropped onto a porch on a dark and stormy night, and you walk around examining things, piecing together a story from the fragments you find. All you have is a window into a 3D world—you can’t see yourself, there’s no health meter, no number of lives. There are no character models, no cutscenes, no puzzles; there’s no combat, no story branching, no fail state. It’s the opposite of high-octane.

But it’s spellbinding. As you probe the intricately crafted spaces, each element lures you in. The art is sumptuous and hypnotic, and the voice acting is exquisite. It’s all just right—the music, the lighting, each squeak of a floorboard and clack of a light switch—multilayered and cohesive, like when someone’s fingers intertwine perfectly with yours. And Gone Home stays playful throughout: witness interactive food items, endlessly flushable toilets, the inexplicable omnipresence of three-ring binders. If you like, you can heap a pile of toiletries on your parents’ bed, or turn on all the faucets. If you’re scared, you can carry a stuffed animal.

It works. Not because it’s high-concept, but because it’s deeply human. For me, at least, the game unearthed some long-repressed feelings—anxiety, ostracism, the thrill and poignance of a first love; how everything then is either exhilaration or heartbreak.

More than that, though, Gone Home shows that a game can simply tell a story. A story anyone can take part in, one where someone new to games won’t die instantly.

If you’ve ever been intrigued by video games, whether you’re a gamer or not, you should try Gone Home. If you don’t have $20, let me know and you can play it here. If you have $20 but you’re not sure about spending it, try thinking of it as an investment in the Fullbright Company, and in the future of games.

(And if you’re a longtime gamer worried that an influx of new people will dilute gaming culture, I ask this: Have you ever been shunned and called names by people whose approval and acceptance are important to you? Yeah? Me too. It sucks, right? Please, don’t be that person, don’t reject someone’s bid to be allowed to love the things you love. Welcome them in instead; be like John Scalzi, or like John Siracusa. Speaking of which—)

Hypercritical (the podcast)

John Siracusa and Dan Benjamin, Jan 2011–Dec 2012

Every culture has its rock stars, and in the Apple-geek community John Siracusa is a headliner. Appropriately, he is understated and tasteful—an engineer and a humanist—and a delight.

As the title suggests, the premise of the show is that John is hypercritical. The show’s tagline expresses a curmudgeonly despair: “nothing is so perfect that it can’t be complained about.” But the show’s billing sells it short. Rather than the complaining-about-things it purports to be, Hypercritical is the most nuanced take I’ve heard on technology and human endeavor.

John and Dan cover esoteric topics with a missionary zeal: spatial interfaces, fault-tolerant file systems, the agony and ecstasy of game controllers. “Why,” we may ask, listening, “might we care if the file system checksums our data?” John teaches us why. He shows us that analysis and rationality don’t preclude emotional fulfillment, but foster it. That details are superficial only to those who view them superficially. That, by applying earnest (and ruthless) intellectual effort, we can make better things; change lives.

Also, the show is fun. John is a geek’s geek, and an omnivorous one (never before has a man opined so eloquently on toasters). John is witty and smart and self-deprecating, while Dan (the consummate host) alternately teases him/eggs him on/is sage.

But the real genius of the show, and why I love it, is its conscience. Too many tech personalities combine intelligent analysis, on the one hand, with condescension and a weird arrogance on the other. As a high officer in the nerd elite, John could easily make cruel sport of those less sophisticated, less informed, or less bright than him. But he never does. Despite his fervor, John has a steadfast respect for the convictions of others. He thinks hard and doesn’t leap to judgment, and he airs his critics’ critiques whenever relevant.

And John brings this respect for others further. Prejudice in the tech industry has been a hot topic lately—specifically sexism (though racism and anti-LGBT sentiment also get mentions). Now, it’s not news to anyone that people can be assholes, but it seems particularly galling when nerds are doing the discriminating—nerds who at some point (I’m guessing) were themselves excluded from a group, and who lived through the emotional consequences. It seems like a debasement of nerd culture.

But, so, the genius.

To reiterate, everybody loves John. On the totem pole of tech-hero status, he’s that one guy in whichever totem-pole position is the best. John has geek cred. When John talks explicitly about inclusion and exclusion, the nerds (mostly) listen. But he does this only rarely—he knows that people don’t like being “preached to” about their baser human instincts, and he knows that even people who take his point seriously won’t necessarily internalize it in a way that translates to better behavior. So he does something different. He models correct behavior, stealthily and maybe unconsciously, through 159 hours of discourse on disk encryption and chip architecture, layout engines and connector design. I wish there were more.

The Essential Dykes to Watch Out For

Alison Bechdel, Nov 2008

When I was twelve, everything I knew about being a lesbian came from Alison Bechdel’s Dykes to Watch Out For. These days, kids can watch The L Word, Orange Is the New Black, The Fosters, or really anything—but the strip still holds up.

This One Summer

Jillian and Mariko Tamaki, May 2014

Just gorgeous.

Monday, June 2, 2014

Thing practiced: overenthusing

Depending on your level of geekiness, you may have heard Apple’s announcement today that a team of mutant programming virtuosos has been working—in secret, for years—on a new language that will replace Objective-C as the lingua franca of the iPhone, iPad, and Mac—plus presumably whatever Apple televisual, home-automating, and/or wearable products are to come. The circle is at last complete.

The language is called Swift—and, if you’re interested, Apple has released a 500-page ebook detailing the language to let developers get started today.

Technically, Swift has some nice characteristics:

It’s fast. Like C, C++, and Objective-C, Swift compiles down to native code—there’s no VM or interpreter slowing things down at run time. Coupled with some serious compiler optimization, this should make Swift performant enough for any size of application.

It’s safe. Unlike Objective-C, Swift is designed to be memory-safe: variables are initialized before use, array accesses are bounds-checked, and ordinary code can’t scribble over arbitrary memory at run time (unless we explicitly opt into unsafe constructs).

It has first-class functions. Functions can be nested within other functions, passed as arguments to other functions, and stored as values.

It has closures. Along with the code for a function, Swift stores the relevant parts of the environment that was current when the function was created. (This makes higher-order functions like sort and fold possible and powerful.)
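
Both of these fit in a few lines. Here’s a minimal sketch (the names are mine, in later Swift spelling, not examples from Apple’s book):

```swift
// A named function stored as a value and passed to sorted(by:).
func descending(_ a: Int, _ b: Int) -> Bool {
    return a > b
}
let sortedDown = [3, 1, 2].sorted(by: descending)   // [3, 2, 1]

// A closure capturing its environment: makeCounter's `count`
// stays alive after the function returns.
func makeCounter() -> () -> Int {
    var count = 0
    return {
        count += 1
        return count
    }
}
let next = makeCounter()
next()   // 1
next()   // 2
```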

It supports both immutable values and variables. Languages usually choose one or the other, but Swift has both—just type let for immutable values and var for (mutable) variables.
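
In practice that’s just a one-keyword choice (a trivial sketch):

```swift
let pi = 3.14159   // immutable value: reassigning it is a compile-time error
var total = 0.0    // mutable variable
total += pi        // fine
// pi += 1.0       // error: cannot assign to a 'let' constant
```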

Swift is statically typed, but types can be inferred. If the type-checker can tell what type a value or variable should be, we don’t have to write it.

And types can be generic. Functions and methods can be written once against a type parameter, so they work across compatible parameter types—without manually writing overloads, and without resorting to id and using casts everywhere.
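
A sketch of both, with made-up names:

```swift
// Inferred types: no annotations needed.
let greeting = "hello, swift"        // inferred as String

// One generic definition works for any element type.
func swapped<T>(_ pair: (T, T)) -> (T, T) {
    return (pair.1, pair.0)
}
let ints = swapped((1, 2))           // (2, 1)
let strs = swapped(("a", "b"))       // ("b", "a")
```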

Swift gives us namespaces, with its module system.

Memory is managed automatically, and without pointers. Like Objective-C post-ARC, Swift uses compiler magic to track and deallocate instances as needed. We rarely need to annotate references (weak still exists for breaking retain cycles), and there’s no * for reference types. This makes iOS programming more accessible (pro/con), though not as easy as with garbage collection. And since ARC happens at compile time, there’s no performance hit at run time.

Swift has tuples (or product types). We can group multiple values into one compound data type, without using container classes/structs.
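
For example (minMax is my own illustration, not from Apple’s book):

```swift
// A tuple groups related values without defining a struct.
let response: (code: Int, message: String) = (200, "OK")
// response.code == 200, response.message == "OK"

// Functions can return several values at once.
func minMax(_ xs: [Int]) -> (min: Int, max: Int)? {
    guard let first = xs.first else { return nil }   // empty array: no answer
    var lo = first, hi = first
    for x in xs.dropFirst() {
        if x < lo { lo = x }
        if x > hi { hi = x }
    }
    return (lo, hi)
}
// minMax([3, 1, 4, 1, 5]) == (min: 1, max: 5)
```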

It has very-special enums (or sum types). A Swift enum contains a value of one of various specified types. We can use these like plain C integer enums if we like, but Swift enums go further: Swift lets each enum case carry data.

It has pattern-matching to go with tuples and enums. Values can be matched against patterns to decompose them concisely.
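
A small sketch combining the two (Shape is a made-up example type):

```swift
// Each case carries its own associated data.
enum Shape {
    case circle(radius: Double)
    case rectangle(width: Double, height: Double)
}

// Pattern matching decomposes the value and binds its parts.
func area(of shape: Shape) -> Double {
    switch shape {
    case .circle(let r):
        return Double.pi * r * r
    case .rectangle(let w, let h):
        return w * h
    }
}
area(of: .rectangle(width: 3, height: 4))   // 12.0
```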

And much more: lazy properties, property observers, class/struct extensions, buffed-up structs, option types, function currying, type aliasing, and more. And Swift manages to do all this while maintaining smooth interoperability with Objective-C classes.
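
Of that list, option types deserve a quick illustration, since they pervade the language (a minimal sketch, in later Swift spelling):

```swift
// An optional either holds a value or is nil; the compiler
// makes us handle both cases before using the value.
let parsed: Int? = Int("42")     // parsing can fail, so this is Int?, not Int

if let n = parsed {
    print("Parsed \(n)")         // runs only when parsing succeeded
} else {
    print("Not a number")
}
```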

(For an admittedly-trivial demo of some of these features, here’s code for Minesweeper in Swift.)

With ideas drawn (in Chris Lattner’s words) from Haskell, Rust, C#, Ruby, Python, and the last twenty years of PL research and practice, Swift looks like a language even programming-languages nerds should find pleasing. It delivers what everyone wants from a language: an elegant way to interface with software.

More important than Swift’s elegance, though, is its context. Programming languages rise and fall not on language quality but on more-practical factors: which libraries and tools are available, what the industry standard is, and what’s most likely to get you a job. Swift is the first place all these factors converge: a highly relevant platform, a sole platform owner with the will to impose a standard, a human-friendly toolset, and a newly-modern language.

Have you ever wanted to start programming on iOS? Today’s the day.