Wednesday, April 27, 2016

Things practiced: dumb solipsism, self-indulgence, big ball of regret

My first memories of death were of my mother’s mother, then my father’s mother. In Chinese culture there are strictly defined rites around death and mourning: the son breaks a vessel into a thousand pieces to proclaim his grief, the children compete to see who can weep most deeply. Though juvenile and emotionally untutored, I recognized the benefit then: it gives the bereft something to do.

I learned grief as a teenager, when a dear friend killed himself — except I learned it improperly, kept my thoughts to myself, and placated my restless brain by letting it systematically dismantle all the relationships I’d had. I convinced myself I was back to normal while growing gradually more mad; time passed, and I left as soon as I could.

Which is to say: I’ve mostly dealt with grim things by placing them in a box and running away at full speed. This has gotten me good at inconsequential things like brain dissection and numerical analysis but bad at handling emotional complexity.

So maybe I can start by writing something down.


Aimee was the one who taught me that even though our earliest experiences teach us who we are, the determined can transcend them.

I was raped three times pre-adulthood, by people I knew: a cousin, a friend, a housemate. Aimee was raped many more times than that, by a man who was closer and less escapable. She fought back where I did not; she talked about it but I did not, not even to her — not because it was painful but because my experiences seemed pretty normal and I didn’t want to cheapen her experience, which had been abnormally brutal and cruel, by diluting it with mine.

When I was raped I discarded certain assumptions I had held about how the world worked and about how safe I was. But Aimee saw her trauma as a tiny obstacle to be cleared, and once cleared an affirmation of her strength. She always had a sense of the possible, which was admirable, and incredible.

What’s always seemed problematic is that the brave die, and yet we cowardly ones are still here.


Driving through Los Angeles, the landmarks I know best are still the ones from Aimee’s convalescence: hospital, hospital, pharmacy, cancer clinic; the readiest memories still those of sitting in freezing waiting rooms producing the insurance cards, putting them away, filling out paper forms, ad infinitum, uselessly, helplessly.

Maybe it shouldn’t have been a surprise that she was able to handle imminent expectation of death with grace and love and courage and personal sacrifice — but I was always surprised: at the equanimity with which she lost her long hair which she’d so prized, at her tolerance for pain, at her unflinching will to face hard truths.

Had there been a moment when she was afraid to die? I wasn’t willing to ask.

The problem is, as much as you’d like to, you can’t actually take someone else’s weakness or pain or fear.


There’s a time when you expect your life to always be full of new and shiny things, and there’s a day you realize that’s not how it’ll be at all. That what life really becomes is a thing made of losses, of prized things that were there and aren’t any more.

Grief is so uninteresting, I know; I can look at myself and scoff. I want people around me, I dread the moments of solitude; then I bore everyone until they leave, or talk nonsense about the fungibility of time, tradeoffs, undoing. The madness is fading somewhat but clarity doesn’t take its place.

I promised to protect her; she told me I could not. In the end she was right.

Friday, March 6, 2015

Thing practiced: just endorsing things I like again

The Slate Political Gabfest

There are regrettably few things in my life I do religiously—hit the gym: no; practice violin: no; post in my practice blog: definitely not. But at 10:30 every Friday morning, I am guaranteed to be listening to the Slate Political Gabfest.

Every week, the stars – Emily Bazelon, David Plotz, and John Dickerson – discuss three noteworthy current issues, all in a deliciously incisive way. The show is a delight: it’s both ultra-conversational (the hosts banter, jab at each other, and digress) and deeply thoughtful. Without being nerdy or inaccessible, the hosts place ongoing events within larger historical and political currents—which is a nice reprieve from the fleeting news-bite-y nature of most political coverage.

The Employee of the Month show

Speaking of podcasts, Catie Lazarus puts out a show that’s hilarious, heartening, hard to explain, and highly recommended. Start with Rachel Maddow in 2013, Jon Stewart a couple weeks ago, or anyone else on the list whose work you might’ve seen or admired.

Canon’s excellent 24mm and 40mm pancake lenses

I’ve always favored inexpensive-but-high-quality gear, and to my taste the Canon EF 40mm f/2.8 and EF-S 24mm f/2.8 pancake lenses are perfect. They’re tiny – not much thicker than a lens cap – but have superb optical performance: amazingly sharp, no color fringing, almost no distortion, no ghosts. And they cost $149 each, which counts as spare change in the lens market.

(the two pancake lenses on a table)

Build quality feels great: these lenses are tough but still light, unlike the dinky-feeling kit lenses or nifty fifty. Macro performance is sharp and close; autofocus is fast and near-silent; bokeh is smooth; portability is unparalleled.

Some sample shots with the 40mm on a 1.6x crop sensor:

And with the 24mm:

The 40mm (unlike the 24mm) works on full-frame Canon cameras and would probably make a fine walkaround lens there.

Where these lenses fit most beautifully, though, is in assembling an inexpensive but excellent photography kit around a Canon crop-sensor DSLR. The nicest images you can buy for $800: Canon Rebel SL1 ($399; the smallest and lightest DSLR available) + 24mm pancake ($149) + 40mm pancake ($149) or 50mm f/1.8 ($115) + a monopod or mini-tripod. That’s $663–697 for body and glass, leaving about $100 of the budget for support. Cheap, portable, functional.

Stephen Jay Gould’s essay collections

Stephen Jay Gould’s writing has the odd effect of making me feel both intelligent and extremely dumb: intelligent, because he clearly respects his readers’ intelligence and so never patronizes or trivializes complex concepts; profoundly dumb, because he draws from a seemingly infinite bank of data and ideas in illustrating his larger themes, showing a multifaceted erudition that makes me, at least, feel like a nitwit. Luckily, this is a feeling I relish. (My recommended first read: Hen’s Teeth and Horse’s Toes.)

Friday, December 5, 2014

With news of the recent failures-to-indict in Ferguson and Staten Island, some people have been wondering how grand juries fit into United States criminal procedure. I couldn’t find a handy FAQ to link to, so here’s mine.

Warning: I am not a lawyer; I am not a law student; I’m not even well-schooled in television legal dramas. My only qualifications to comment are (a) having spent lamentably many hours in a law library reading and (b) being a huge nerd.

What is a grand jury?

In the United States we have two kinds of juries: grand juries and trial juries. Grand juries hear evidence from the prosecution, then decide whether or not it’s worth spending time and money to bring the defendant to trial. Grand juries don’t determine whether a defendant is guilty or not guilty, but only whether or not to bring charges against him.

Why did a grand jury decide these cases, instead of a trial jury?

Going through a grand jury is a preliminary step that happens before a case goes to trial. In criminal procedure, every case passes through a series of steps: from the initial identification of a suspect to the ultimate sentencing and punishment of a convicted criminal. At each stage, some people are screened out, and their cases are dropped.

If the grand jury does indict, the case then goes to arraignment, where the defendant is informed of the charges and can plead guilty or not guilty. (Most criminal cases end here, with the defendant taking a plea bargain: he pleads guilty in return for more favorable sentencing or reduced charges.) If the defendant pleads not guilty, the case proceeds through the defense’s procedural motions, if any, and finally to a jury or bench trial for determination of guilt.

Why do we screen criminal cases before going to trial?

Two of the stated goals of the criminal justice system are (1) efficiency and (2) the presumption of innocence. The rationale for screening goes:

(1) Because the system has limited resources, it should screen out whichever accused persons it believes cannot be convicted of a crime. Everyone else should be passed through the process expeditiously.

(2) Given limited resources and human flaws, the system will make mistakes. If mistakes are going to be made, they should go in the direction of making sure innocent persons are not convicted (which necessarily means some guilty persons will be set free).

Are all cases screened by a grand jury?

No. All federal felony cases go through a grand jury before trial, but many states use a preliminary hearing in front of a judge instead.

Why are grand juries “grand”? Is it because their rulings are superior to other juries’ decisions?

No. The name just refers to size: grand juries generally seat more jurors. (Grand means “large” in French; contrast the petit (small), or trial, jury.)

But the grand jury in Ferguson was fairly small, wasn’t it?

Yes – only twelve jurors. Federal courts always use grand juries of 16–23 citizens, but states set their own size requirements, ranging from 5 to 23.

What usually happens in grand jury proceedings?

In a grand jury review, only the prosecution presents evidence; the defense is not allowed to participate. No judge is present. Since the prosecutor is the only one controlling the proceedings, a grand jury usually hears only evidence implicating the defendant. Accordingly, grand juries almost always decide to indict.

(This is not what happened in the Darren Wilson or Daniel Pantaleo cases — in both cases the prosecutors presented evidence from the defense’s side.)

Where are the rules governing the criminal justice system defined?

As with other areas of law, criminal procedure is defined by a motley mix of constitutional law, legal codes, and culture.

Both federal and state courts are required to adhere to the U.S. Constitution; most relevant are Article III Section 2 and the Fourth, Fifth, and Fourteenth Amendments. Some individual states’ constitutions also address criminal procedure (usually to extend rights for defendants). Similarly, the Federal Rules of Criminal Procedure define the specifics for federal courts, while states use a mix of state codes, statutes, and past decisions to conduct their criminal procedures.

As we know, though, the culture of the justice system’s participants (police, prosecutors, defense attorneys, courts) makes up a large part of how criminal law is actually applied. The law tries to standardize the behavior of its agents, but often can’t.

Do other countries use grand juries to screen out cases?

No.

I can see how the decisions in the Michael Brown and Eric Garner cases failed the victims’ families and communities. But how did these decisions fail criminal law itself?

We know that the criminal justice system—like all human institutions—is imperfect, and will introduce some injustice. Some innocent people will be imprisoned, and some guilty men will walk free; we accept incorrect rulings as the cost of living under any system. But the goal, always, is for the law to be fair: all we can hope for, and work towards, is a system that treats people in the same situation the same way. By treating the deaths of black men at the hands of white police officers differently from all the others, we’ve failed.

Sunday, October 12, 2014

“[What books change is] your capacity of feeling. Art opens the heart.”
—Jeanette Winterson, Art Objects


(my actual bookshelf)

 

Recall the Greek myth of Narcissus—a beautiful youth falls in love with his own reflection, and dies. This is meant as a warning: such are the dangers of recognizing no reality but our own.

It can seem hopelessly old-fashioned these days to read books, but they’re the best salve for humanity I know. Some excellent alternative realities:

1. Fun Home by Alison Bechdel
An exquisite meditation on fathers and mothers, discontent and self-scrutiny, hate, grief, and catharsis.

2. Orlando by Virginia Woolf
Intimate, weird, enchanting.

3. Written on the Body by Jeanette Winterson
Obsession, dissected.

4. The Amazing Adventures of Kavalier & Clay by Michael Chabon
A rapturous telling of the 20th-century American dream.

5. The Remains of the Day by Kazuo Ishiguro
On what happens when you defer happiness.

Tuesday, June 10, 2014

Thing practiced: nothing but liking things

I’m not sure how or why, but I found myself embroiled in an hour-long argument with a bright and—palpably—earnest young man about video games, their import, and who may or may not be ruining them for everyone. What prompted this was my audible praise for last year’s Gone Home, so I figured it was worth writing about the game, plus some other things I’ve enjoyed recently.

Gone Home

The Fullbright Company, Aug 2013

Gone Home: Polygon’s 2013 Game of the Year. Recipient of glowing reviews in the New York Times, Kotaku, the Atlantic, and umpteen other outlets of opinion. VGX’s Best Indie Game and Best PC Game of 2013; winner of the Best Debut awards at both BAFTA and GDC 2014; Games for Change 2014’s Game of the Year. Also: the recipient of user reviews calling it a “2deep4u indie hipster walking simulator” and “an interactive chick story” that “sucked balls like ICO,” and “quite possibly the worst game of all time.” Why do people feel so strongly? Why might you care?

I grew up in suburban California in the ’90s, and like everyone my age I dropped everything to obsess about Pokémon. Thanks to crippling social awkwardness and my brother outgrowing his N64, I kept playing games—first finishing Ocarina of Time and Mario 64, then spending way more time with SNES emulators and gaming IRC rooms and BBSes than a kid probably should.

When I got older, video games seemed like the perfect medium for art. Games marry visual art, music, and written fiction: all things I loved. But beyond that, video games involve the gamer—they force you to play a part in creating the art. This makes games expressive in a way other media can’t be. What video games deliver is experience itself—and if art is about evoking a response in the viewer, what could be more effective?

As a gamer, though, I hit a dead end. I couldn’t make the transition from kid games to “real” games. Nothing resonated the way Zelda or Chrono Trigger had. The popular games all seemed to be about something else.

It’s not a coincidence that big-budget games both revolve around empowerment and are marketed mainly to teenagers and very young adults, the demographic groups most likely to feel frustrated and powerless. These games let people do all the stuff they can’t in real life: drive hot cars, get hot girls, shoot guys in the face, be the hero. Domination and destruction will always be appealing—and some of these games are masterpieces. As representatives of a whole art form, though, they cover a narrow range. We’re handed adrenaline, victory, and good-vs-evil times twenty, but what about empathy? alienation? romance?

That’s why it’s so heartening to play Gone Home.

It’s hard to write concretely without spoiling it, but I’ll try. Gone Home is short: roughly movie-length on a typical playthrough. You’re dropped onto a porch on a dark and stormy night, and you walk around examining things, piecing together a story from the fragments you find. All you have is a window into a 3D world—you can’t see yourself, there’s no health meter, no number of lives. There are no character models, no cutscenes, no puzzles; there’s no combat, no story branching, no fail state. It’s the opposite of high-octane.

But it’s spellbinding. As you probe the intricately crafted spaces, each element lures you in. The art is sumptuous and hypnotic, and the voice acting is exquisite. It’s all just right—the music, the lighting, each squeak of a floorboard and clack of a light switch—multilayered and cohesive, like when someone’s fingers intertwine perfectly with yours. And Gone Home stays playful throughout: witness interactive food items, endlessly flushable toilets, the inexplicable omnipresence of three-ring binders. If you like, you can heap a pile of toiletries on your parents’ bed, or turn on all the faucets. If you’re scared, you can carry a stuffed animal.

It works. Not because it’s high-concept, but because it’s deeply human. For me, at least, the game unearthed some long-repressed feelings—anxiety, ostracism, the thrill and poignance of a first love; how everything then is either exhilaration or heartbreak.

More than that, though, Gone Home shows that a game can simply tell a story. A story anyone can take part in, one where someone new to games won’t die instantly.

If you’ve ever been intrigued by video games, whether you’re a gamer or not, you should try Gone Home. If you don’t have $20, let me know and you can play it here. If you have $20 but you’re not sure about spending it, try thinking of it as an investment in the Fullbright Company, and in the future of games.

(And if you’re a longtime gamer worried that an influx of new people will dilute gaming culture, I ask this: Have you ever been shunned and called names by people whose approval and acceptance are important to you? Yeah? Me too. It sucks, right? Please, don’t be that person, don’t reject someone’s bid to be allowed to love the things you love. Welcome them in instead; be like John Scalzi, or like John Siracusa. Speaking of which—)


Hypercritical (the podcast)

John Siracusa and Dan Benjamin, Jan 2011–Dec 2012

Every culture has its rock stars, and in the Apple-geek community John Siracusa is a headliner. Appropriately, he is understated and tasteful—an engineer and a humanist—and a delight.

As the title suggests, the premise of the show is that John is hypercritical. The show’s tagline expresses a curmudgeonly despair: “nothing is so perfect that it can’t be complained about.” But the show’s billing sells it short. Rather than the complaining-about-things it purports to be, Hypercritical is, instead, the most nuanced take I’ve heard on technology and human endeavor.

John and Dan cover esoteric topics with a missionary zeal: spatial interfaces, fault-tolerant file systems, the agony and ecstasy of game controllers. “Why,” we may ask, listening, “might we care if the file system checksums our data?” John teaches us why. He shows us that analysis and rationality don’t preclude emotional fulfillment, but foster it. That details are superficial only to those who view them superficially. That, by applying earnest (and ruthless) intellectual effort, we can make better things; change lives.

Also, the show is fun. John is a geek’s geek, and an omnivorous one (never before has a man opined so eloquently on toasters). John is witty and smart and self-deprecating, while Dan (the consummate host) alternately teases him/eggs him on/is sage.

But the real genius of the show, and why I love it, is its conscience. Too many tech personalities combine intelligent analysis, on the one hand, with condescension and a weird arrogance on the other. As a high officer in the nerd elite, John could easily make cruel sport of those less sophisticated, less informed, or less bright than him. But he never does. Despite his fervor, John has a steadfast respect for the convictions of others. He thinks hard and doesn’t leap to judgment, and he airs his critics’ critiques whenever relevant.

And John brings this respect for others further. Prejudice in the tech industry has been a hot topic lately—specifically sexism (though racism and anti-LGBT sentiment also get mentions). Now, it’s not news to anyone that people can be assholes, but it seems particularly galling when nerds are doing the discriminating—nerds who at some point (I’m guessing) were themselves excluded from a group, and who lived through the emotional consequences. It seems like a debasement of nerd culture.

But, so, the genius.

To reiterate, everybody loves John. On the totem pole of tech-hero status, he’s that one guy in whichever totem-pole position is the best. John has geek cred. When John talks explicitly about inclusion and exclusion, the nerds (mostly) listen. But he does this only rarely—he knows that people don’t like being “preached to” about their baser human instincts, and he knows that even people who take his point seriously won’t necessarily internalize it in a way that translates to better behavior. So he does something different. He models correct behavior, stealthily and maybe unconsciously, through 159 hours of discourse on disk encryption and chip architecture, layout engines and connector design. I wish there were more.


The Essential Dykes to Watch Out For

Alison Bechdel, Nov 2008

When I was twelve, everything I knew about being a lesbian came from Alison Bechdel’s Dykes to Watch Out For. These days, kids can watch The L Word, Orange Is the New Black, The Fosters, or really anything—but the strip still holds up.


This One Summer

Jillian and Mariko Tamaki, May 2014

Just gorgeous.

Monday, February 3, 2014

Thing practiced: storytime with canids

Tools used: D3.js, pen + paper, PubMed (references)

You’re much too beautiful! Do you want us all to feel depressed?

For 382 days, a twenty-seven-year-old man chose to consume nothing but fluids, salt, vitamins, and yeast. He survived, dropping from 456 to 180 pounds—and weighed in at 196 five years later.

Fasting is in vogue now, and not just among self-flagellants. Fans of intermittent fasting say hunger hones our bodies’ recovery mechanisms and preserves lean tissue while eliminating fat. Paleo Diet adherents—who argue that obesity, diabetes, and heart disease stem from the mismatch between our evolutionary history and the modern environment—remind us that humans evolved to thrive when meals were rare. So (how) do our bodies survive when we’re not eating?

All our energy derives from breaking carbon-carbon bonds, and we have various carbon backbones available to burn. Glycogen, a branched carbohydrate, is our fast fuel: all our cells maintain a stash, and all can quickly mobilize it as sugar when needed. Stockpiling glycogen is inefficient, though, because we package it with water. (We think of carbohydrates as yielding 4 calories per gram, but we reap only 1-2 calories per gram of hydrated glycogen.)
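To put rough numbers on that parenthetical, here’s a back-of-the-envelope sketch in Python. The ~3 g of water bound per gram of glycogen and the ~500 g total body store are ballpark assumptions of mine, not figures from the post:

    # Back-of-the-envelope: why hydrated glycogen is a poor energy store.
    CARB_KCAL_PER_G = 4.0        # dry carbohydrate
    WATER_PER_G_GLYCOGEN = 3.0   # assumed g of water bound per g of glycogen
    BODY_GLYCOGEN_G = 500.0      # assumed total store (liver + muscle)

    effective_kcal_per_g = CARB_KCAL_PER_G / (1 + WATER_PER_G_GLYCOGEN)
    total_kcal = BODY_GLYCOGEN_G * CARB_KCAL_PER_G

    print(f"{effective_kcal_per_g:.1f} kcal per g of hydrated glycogen")  # ~1.0
    print(f"~{total_kcal:,.0f} kcal in the whole glycogen store")         # ~2,000

Roughly 2,000 kcal all told: about one day of resting energy, which is why glycogen can only ever be the fast fuel.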

A larger store of potential energy is in body proteins, which are linear polymers of amino acids. Proteins serve critical functions (skin, ligaments, muscle, enzymes, hormones), so we preserve them as much as possible. Though proteins can and do get broken down for energy, catabolizing more than about half will kill us.

Our main fuel store is fat, which is both efficient to store (because it’s not hydrated) and otherwise useless (meaning it’s easily expendable). Theoretically, a man with 20% body fat could survive off his fat layer for weeks.
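As a sanity check on “weeks,” here’s a sketch using the 170-pound, 20% body fat man from the figure below; the 9 kcal per gram of fat is the standard figure, while the 1,800 kcal/day resting burn is my assumption:

    # How long could the fat layer alone power a resting body?
    weight_kg = 170 * 0.4536         # the 170-lb man from the figure
    fat_kg = weight_kg * 0.20        # at 20% body fat
    FAT_KCAL_PER_G = 9.0             # standard energy density of fat
    RESTING_KCAL_PER_DAY = 1800.0    # assumed resting expenditure

    fat_kcal = fat_kg * 1000 * FAT_KCAL_PER_G
    print(f"~{fat_kcal:,.0f} kcal of stored fat")                  # ~139,000
    print(f"~{fat_kcal / RESTING_KCAL_PER_DAY:.0f} days at rest")  # ~77

That’s on the order of 75 days, so “weeks” is, if anything, conservative.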

Stored energy reserves in a 170-pound, 20% body fat man; one day’s resting energy usage is shown for reference.

In practice, though, calories from different sources aren’t interchangeable. Our bodies don’t literally burn fuel, but instead send it through energy-harvesting pathways that accept specific inputs. Some cell types lack the machinery needed to process certain molecules; for example, red blood cells don’t have mitochondria, so they can use only glucose (sugar) for energy. Other tissues can’t use certain fuels because they’re not physically accessible; for instance, the brain can’t burn fat because fatty acids can’t permeate the blood-brain barrier. And our bodies can interconvert our energy stores in only limited ways: we can convert glucose to fat, and protein to glucose or fat, but we can’t convert fatty acids to either glucose or protein.
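Those one-way conversions are easy to lose track of, so here’s the same set of rules restated as a tiny lookup table (the names are mine, purely illustrative):

    # One-way macronutrient conversions, restating the paragraph above.
    CAN_CONVERT_TO = {
        "glucose": {"fat"},             # excess sugar can be stored as fat
        "protein": {"glucose", "fat"},  # protein can become either
        "fat":     set(),               # fatty acids can't become glucose or protein
    }

    def convertible(source: str, target: str) -> bool:
        return target in CAN_CONVERT_TO.get(source, set())

    assert convertible("protein", "glucose")
    assert not convertible("fat", "glucose")  # why starving still costs body protein

The empty set on the fat row drives the rest of the story: any glucose the body still needs once glycogen runs out has to be paid for in protein.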

When we’re starving, our bodies prioritize two things: maintaining an uninterrupted flow of energy to the brain and spinal cord, and preserving as much body protein as possible. Normally, our brains run solely on glucose. During starvation, though, glucose is precious: we can create it only by destroying body protein. To conserve protein, the brain reduces its glucose usage by burning the ketone bodies β-hydroxybutyrate and acetoacetate as fuel. Likewise, muscle tissue and major organs like the heart and liver usually burn a mixture of glucose and fatty acids, but switch to a ketones-added, lower-glucose, higher-fat mixture.

How brain and muscle adjust their energy usage as starvation progresses: the brain switches to use ketone bodies (β-OHB and AcAc) as its primary fuel, while muscle replaces most of its glucose usage with fatty acids (FFA). (Percentages shown are oxygen equivalents.)

During the first phase of fasting, as we run down our short-term glycogen stores, the liver ramps up its ketone production to allow us to make these fuel usage transitions.


As starvation progresses, blood concentrations of ketone bodies β-hydroxybutyrate (β-OHB) and acetoacetate (AcAc) rise by orders of magnitude; free fatty acid (FFA) concentration doubles.

These metabolic transitions spare vital organ and muscle proteins, but we do still burn some protein—both to produce the minimal glucose some tissues still need and to provide the four-carbon intermediates required to catabolize ketones and fat. Throughout this “steady” state of fasting, which can last weeks to months, our bodies gradually cannibalize our protein stores. Eventually, enough protein is consumed that our organ systems lose function and we die, usually when our respiratory muscles fail. (We die of starvation even if we’ve retained large stores of fat.)

So, what does any of this have to do with a plan to lose weight? Not much, I hope. It’s tempting to think that 20 or 30 days of buckling down and getting hardcore will turn us into perfect, beautiful people. If only.

we’re all fucked up skinny bitch