Things practiced: dumb solipsism, self-indulgence, big ball of regret
My first memories of death were of my mother’s mother, then my father’s mother.
In Chinese culture there are strictly-defined rites around death and mourning:
the son breaks a vessel into a thousand pieces to proclaim his grief, the children compete to see who can weep most deeply.
Though juvenile and emotionally untutored I recognized the benefit then: it gives the bereft something to do.
I learned grief as a teenager, when a dear friend killed himself —
except I learned it improperly,
kept my thoughts to myself,
and placated my restless brain by letting it systematically dismantle all the relationships I’d had.
I convinced myself I was back to normal while growing gradually more mad;
time passed, and I left as soon as I could.
Which is to say: I’ve mostly dealt with grim things by placing them in a box and running away at full speed.
This has gotten me good at inconsequential things like brain dissection and numerical analysis but bad at handling emotional complexity.
So maybe I can start by writing something down.
Aimee was the one who taught me that even though our earliest experiences teach us who we are, the determined can transcend.
I was raped three times pre-adulthood, by people I knew: a cousin, a friend, a housemate.
Aimee was raped many more times than that, by a man who was closer and less escapable.
She fought back where I did not; she talked about it but I did not, not even to her — not because it was painful but because my experiences seemed pretty normal and I didn’t want to cheapen her experience, which had been abnormally brutal and cruel, by diluting it with mine.
When I was raped I discarded certain assumptions I had held about how the world worked and about how safe I was.
But Aimee saw her trauma as a tiny obstacle to be cleared, and once cleared an affirmation of her strength.
She always had a sense of the possible, which was admirable, and incredible.
What’s always seemed problematic is that the brave die, and yet we cowardly ones are still here.
Driving through Los Angeles, the landmarks I know best are still the ones from Aimee’s convalescence: hospital, hospital, pharmacy, cancer clinic; the readiest memories still those of sitting in freezing waiting rooms producing the insurance cards, putting them away, filling out paper forms, ad infinitum, uselessly, helplessly.
Maybe it shouldn’t have been a surprise that she was able to handle imminent expectation of death with grace and love and courage and personal sacrifice — but I was always surprised: at the equanimity with which she lost her long hair which she’d so prized, at her tolerance for pain, at her unflinching will to face hard truths.
Had there been a moment when she was afraid to die? I wasn’t willing to ask.
The problem is, as much as you’d like to, you can’t actually take someone else’s weakness or pain or fear.
There’s a time when you expect your life to always be full of new and shiny things, and there’s a day you realize that’s not how it’ll be at all. That what life really becomes is a thing made of losses, of prized things that were there and aren’t any more.
Grief is so uninteresting, I know; I can look at myself and scoff. I want people around me, I dread the moments of solitude; then I bore everyone until they leave, or talk nonsense about the fungibility of time, tradeoffs, undoing. The madness is fading somewhat but clarity doesn’t take its place.
I promised to protect her, she told me I could not. In the end she was right.
For whatever we lose(like a you or a me)
it's always ourselves we find in the sea
(from an e.e. cummings poem that autoplays in my head every time I picture the ocean, i.e. ~10,000× while coding this)
Despite being terrible at it, I find graphics programming pretty fun.
Building realistic 3D environments in real time blends physics and computer science—graphics programmers have to satisfy two conflicting objectives:
to simulate reality as accurately as possible,
while spending at most 16 or 33 milliseconds on each frame (to sustain 60 or 30 frames per second).
Modeling the ocean is a good example of a situation where tradeoffs are required.
Fluid dynamics is built on the Navier–Stokes equations, a set of nonlinear partial differential equations that describe the flow of viscous fluids.
Solving these equations numerically can model the ocean faithfully,
but doing so in real time is computationally infeasible.
Instead, I tried to simulate an ocean scene using this approach:
1. Generate realistic water surface height fields to model waves, using empirical knowledge of ocean phenomena.
2. Account for optical processes acting on ocean water: reflection and refraction from a water surface, the color filtering behavior of ocean water, and maybe more-complex effects like caustics and godrays.
3. Render a realistic sky gradient using known properties of atmospheric scattering.
4. Think about computational-resource cost versus quality gained and simplify where possible.
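As a taste of step 1, a water height field can be faked with a handful of directional sinusoids before graduating to a statistics-based wave spectrum. This is an illustrative sketch with my own names, not the project's actual code:

```scala
// Toy water height field: a sum of a few directional sine waves.
// Real implementations evaluate an empirical ocean spectrum with an
// FFT; this only sketches the idea of superposed traveling waves.
case class Wave(amplitude: Double, wavelength: Double,
                speed: Double, dirX: Double, dirZ: Double)

def height(waves: Seq[Wave], x: Double, z: Double, t: Double): Double =
  waves.map { w =>
    val k = 2 * math.Pi / w.wavelength                    // wavenumber
    val phase = k * (w.dirX * x + w.dirZ * z) + w.speed * k * t
    w.amplitude * math.sin(phase)
  }.sum
```

Summing many such waves (with amplitudes chosen to match observed ocean statistics) gives a surface that starts to look plausibly wind-driven.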
Results so far (underwhelming but workable) (click):
Jamie xx’s new album In Colour is finally out, and it’s so good.
Do you ever get that feeling,
when you’re up at 2 am and almost alone,
of being enveloped in something with maybe someone,
of melancholy becoming euphoric?
This album feels like that.
It’s dense and fleshed out, not downbeat and cryptic like The xx’s past work — but still lonely, and lovely. Listen with good headphones.
Because I loved this album and its cover so much, I built a totally unsanctioned album-themed beat-detecting visualizer here:
(If the presets are unsatisfactory, drag and drop your own audio file onto the page.)
Thing practiced: documenting neat buildings so I don’t forget
I’ve always had a reverence for beautiful buildings, and for New York as their apotheotic urban motherland.
Cities (and humans) have to perpetually reinvent themselves to stay alive, and New York is the best example of this I know. There’s vast beauty in the uniformly-timeworn structures of Renaissance Florence or the Campidoglio, yes—but a modern city needs more.
Architecture, as Vincent Scully says, is a conversation between generations.
What New York’s architecture shows us is that we can both safeguard the past and believe that today can be just-as-good or better.
I’ve always felt sacrilegious making sushi, being neither Japanese nor highbrow. But I do love fish, the sea, and the idea of paying obeisance to fish and the sea—and so last night we had sushi. My notes:
### Tools and things
Rolled sushi is forgiving: the ingredients are on the inside, so the pieces don’t need to be as visually perfect as in nigiri. This means you can get away with a non-fish-specific knife, as long as it’s sharp. Also helpful: a colander for rinsing rice, a sushi rolling mat, a rice spatula, and an automatic rice cooker.
There are regrettably few things in my life I do religiously—hit the gym: no; practice violin: no; post in my practice blog: definitely not. But at 10:30 every Friday morning, I am guaranteed to be listening to the Slate Political Gabfest.
Every week, the stars – Emily Bazelon, David Plotz, and John Dickerson – discuss three noteworthy current issues, all in a deliciously incisive way. The show is a delight: it’s both ultra-conversational (the hosts banter, jab at each other, and digress) and deeply thoughtful. Without being nerdy or inaccessible, the hosts place ongoing events within larger historical and political currents—which is a nice reprieve from the fleeting news-bite-y nature of most political coverage.
Speaking of podcasts, Catie Lazarus puts up a show that’s hilarious, heartening, hard to explain, and highly recommended. Start with Rachel Maddow in 2013, Jon Stewart a couple weeks ago, or anyone else on the list whose work you might’ve seen or admired.
Canon’s excellent 24mm and 40mm pancake lenses
I’ve always favored inexpensive-but-high-quality gear, and to my taste the Canon EF 40mm f/2.8 and EF-S 24mm f/2.8 pancake lenses are perfect. They’re tiny – not much thicker than a lens cap – but have superb optical performance: amazingly sharp, no color fringing, almost no distortion, no ghosts. And they cost $149 each, which counts as spare change in the lens market.
Build quality feels great: these lenses are tough but still light, unlike the dinky-feeling kit lenses or nifty fifty. Macro performance is sharp and close; autofocus is fast and near-silent; bokeh is smooth; portability is unparalleled.
Some sample shots with the 40mm on a 1.6x crop sensor:
And with the 24mm:
The 40mm (unlike the 24mm) works on full-frame Canon cameras and would probably make a fine walkaround lens there.
Where these lenses fit most beautifully, though, is in assembling an inexpensive but excellent photography kit around a Canon crop-sensor DSLR. The nicest images you can buy for $800: Canon Rebel SL1 ($399; the smallest and lightest DSLR available) + 24mm pancake ($149) + 40mm pancake ($149) or 50mm f/1.8 ($115) + a monopod or mini-tripod. Cheap, portable, functional.
Stephen Jay Gould’s essay collections
Stephen Jay Gould’s writing has the odd effect of making me feel both intelligent and extremely dumb: intelligent, because he clearly respects his readers’ intelligence and so never patronizes or trivializes complex concepts; profoundly dumb, because he draws from a seemingly infinite bank of data and ideas in illustrating his larger themes, showing a multifaceted erudition that makes me, at least, feel like a nitwit. Luckily, this is a feeling I relish. (My recommended first read: Hen’s Teeth and Horse’s Toes.)
With news of the recent failures-to-indict in Ferguson and Staten Island, some people have been wondering how grand juries fit into United States criminal procedure. I couldn’t find a handy FAQ to link to, so here’s mine.
Warning: I am not a lawyer; I am not a law student; I’m not even well-schooled in television legal dramas. My only qualifications to comment are (a) having spent lamentably many hours in a law library reading and (b) being a huge nerd.
What is a grand jury?
In the United States we have two kinds of juries: grand juries and trial juries.
Grand juries hear evidence from the prosecution, then decide whether or not it’s worth spending time and money to bring the defendant to trial.
Grand juries don’t determine whether a defendant is guilty or not guilty, but only whether or not to bring charges against him.
Why did a grand jury decide these cases, instead of a trial jury?
Going through a grand jury is a preliminary step that happens before a case goes to trial.
In criminal procedure, every case passes through a series of steps: from the initial identification of a suspect to the ultimate sentencing and punishment of a convicted criminal. At each stage, some people are screened out, and their cases are dropped.
If the grand jury does indict, the case then goes to arraignment, where the defendant is informed of the charges and can plead guilty or not guilty. (Most criminal cases end here, with the defendant taking a plea bargain: he pleads guilty in return for more favorable sentencing or reduced charges.) If the defendant pleads not guilty, the case proceeds through the defense’s procedural motions, if any, and finally to a jury or bench trial for determination of guilt.
Why do we screen criminal cases before going to trial?
Two of the stated goals of the criminal justice system are (1) efficiency and (2) the presumption of innocence. The rationale for screening goes:
(1) Because the system has limited resources, it should screen out whichever accused persons it believes cannot be convicted of a crime. Everyone else should be passed through the process expeditiously.
(2) Given limited resources and human flaws, the system will make mistakes. If mistakes are going to be made, they should go in the direction of making sure innocent persons are not convicted (which necessarily means some guilty persons will be set free).
Are all cases screened by a grand jury?
No. All federal felony cases go through a grand jury before trial, but many states use a preliminary hearing in front of a judge instead.
Why are grand juries “grand”? Is it because their rulings are superior to other juries’ decisions?
No — the name was given because grand juries are generally larger in number of jurors. (Grand means large in French; contrast with a petit (small), or trial, jury).
But the grand jury in Ferguson was fairly small, wasn’t it?
Yes – only twelve jurors. Federal courts always use grand juries of 16–23 citizens, but different states have differing requirements for state grand jury sizes, ranging from 5–23.
What usually happens in grand jury proceedings?
In a grand jury review, only the prosecution presents evidence; the defense is not allowed to participate. No judge is present. Since the prosecutor is the only one controlling the proceedings, a grand jury usually hears only evidence implicating the defendant. Accordingly, grand juries almost always decide to indict.
(This is not what happened in the Darren Wilson or Daniel Pantaleo cases — in both cases the prosecutors presented evidence from the defense’s side.)
Where are the rules governing the criminal justice system defined?
As with other areas of law, criminal procedure is defined by a motley of constitutional law, legal codes, and culture.
Both federal and state courts are required to adhere to the U.S. Constitution; most relevant are Article III Section 2 and the Fourth, Fifth, and Fourteenth Amendments. Some individual states’ constitutions also address criminal procedure (usually to extend rights for defendants).
Similarly, the Federal Rules of Criminal Procedure define the specifics for federal courts, while states use a mix of state codes, statutes, and past decisions to conduct their criminal procedures.
As we know, though, the culture of the justice system’s participants (police, prosecutors, defense attorneys, courts) makes up a large part of how criminal law is actually applied. The law tries to standardize the behavior of its agents, but often can’t.
Do other countries use grand juries to screen out cases?
Mostly not. Grand juries originated in England, but England abolished them in 1933, and today the United States is nearly alone in still using them; most other common-law countries screen cases with a preliminary hearing before a judge instead.
I can see how the decisions in the Michael Brown and Eric Garner cases failed the victims’ families and communities. But how did these decisions fail criminal law itself?
We know that the criminal justice system—like all human institutions—is imperfect, and will introduce some injustice. Some innocent people will be imprisoned, and some guilty men will walk free; we accept incorrect rulings as the cost of living under any system. But the goal, always, is for the law to be fair: all we can hope for, and work towards, is a system that treats people in the same situation the same way. By treating the deaths of black men at the hands of white police officers differently from all the others, we’ve failed.
Scheming humans have always faced a basic problem: how can we communicate securely in the presence of adversaries?
Since ancient times, the art of secret communication, or cryptography, has been crucial for governments and the military.
But today cryptography affects us all.
As messages are increasingly transmitted through public channels, cryptography prevents malicious interceptors from using our information against us.
The evolution of cryptography can be split into two eras by the invention of asymmetric ciphers in the 1970s.
Historically, encrypting and decrypting a message had been symmetric processes — that is, unscrambling the message required knowing the key that had been used to scramble it.
This begat the problem of key distribution: before sending a secret message, the sender would have to take great precautions to transport her encryption key safely to the recipient.
In asymmetric cryptography, the keys used to encrypt and to decrypt a message are different.
This means that the recipient can make his encryption key public, as long as he keeps his decryption key private.
What we need for an asymmetric-cryptography protocol to work is a trapdoor one-way function.
This is the mathematical equivalent of a physical lock: easy to process in one direction (snapping the lock closed), but hard to undo (opening the lock), unless we have a special secret (the key).
In RSA – the first public-key cryptosystem, and still the most popular – the trapdoor function exploits mathematical features of modular exponentiation, prime numbers, and integer factorization.
Let’s throw together a toy implementation in Scala.
Helpful math functions
A number is prime if it has exactly two divisors: 1 and itself. Sometimes it’s useful to have a list of small primes, so we need a Sieve of Eratosthenes:
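A sketch of how that sieve might look (the helper names here are my own, not necessarily the original post's):

```scala
// Sieve of Eratosthenes: returns all primes up to `limit`.
def primesUpTo(limit: Int): List[Int] = {
  val isComposite = Array.fill(limit + 1)(false)
  for {
    i <- 2 to math.sqrt(limit.toDouble).toInt
    if !isComposite(i)                 // i is prime; mark its multiples
    j <- i * i to limit by i
  } isComposite(j) = true
  (2 to limit).filterNot(isComposite).toList
}
```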
What we really want, though, is large primes – ones higher than 2^500.
To get a prime that large, we can’t sieve up from 1; instead, we find a prime by taking a random number, checking if it’s probably prime, and repeating if necessary.
The primality test of choice in real systems is Miller-Rabin, but we’ll use Solovay-Strassen to keep things simple:
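Solovay–Strassen combines the Jacobi symbol with Euler's criterion: for prime n, a^((n−1)/2) ≡ (a/n) (mod n) for every a. A sketch, with illustrative names:

```scala
import scala.util.Random

// Jacobi symbol (a/n), defined for odd n > 0.
def jacobi(a: BigInt, n: BigInt): Int =
  if (a == 0) { if (n == 1) 1 else 0 }
  else if (a % 2 == 0) {
    val t = n % 8
    val s = if (t == 3 || t == 5) -1 else 1      // (2/n) factor
    s * jacobi(a / 2, n)
  } else {
    val s = if (a % 4 == 3 && n % 4 == 3) -1 else 1  // quadratic reciprocity
    s * jacobi(n % a, a)
  }

// Solovay–Strassen probabilistic primality test.
def isProbablyPrime(n: BigInt, iterations: Int = 20): Boolean =
  if (n == 2) true
  else if (n < 2 || n % 2 == 0) false
  else (1 to iterations).forall { _ =>
    val a = BigInt(n.bitLength, Random) % (n - 2) + 2   // random in [2, n-1]
    val j = jacobi(a, n)
    j != 0 && a.modPow((n - 1) / 2, n) == (BigInt(j) + n) % n
  }
```

Each round a composite number survives with probability at most 1/2, so twenty rounds leave a false-positive chance below one in a million.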
Finally, the main reason we’re interested in primes is so we can do calculations modulo our prime. To do this, we need the Extended Euclidean algorithm:
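The extended algorithm returns the Bézout coefficients along with the gcd, and the modular inverse falls out directly (names are my own):

```scala
// Extended Euclid: returns (g, x, y) with a*x + b*y = g = gcd(a, b).
def extendedGcd(a: BigInt, b: BigInt): (BigInt, BigInt, BigInt) =
  if (b == 0) (a, BigInt(1), BigInt(0))
  else {
    val (g, x, y) = extendedGcd(b, a % b)
    (g, y, x - (a / b) * y)
  }

// Modular inverse of a mod m, valid when gcd(a, m) = 1.
def modInverse(a: BigInt, m: BigInt): BigInt = {
  val (g, x, _) = extendedGcd(a, m)
  require(g == 1, "inverse exists only when gcd(a, m) = 1")
  ((x % m) + m) % m    // normalize into [0, m)
}
```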
The RSA system
An asymmetric encryption system has two parts: a public key and a private key.
In theory, encrypting a message with the public key can only be reversed by decrypting with the private key.
In RSA, we encrypt a message by computing msg^e mod n, using the publicly-known information n and e.
This is the crux of RSA’s security: modular exponentiation is easy to do, but exceedingly hard to undo.
To make it possible to retrieve the message from the ciphertext, we build a trapdoor into our encryption routine by making n the product of two large primes p and q (which we keep private).
We can then reconstruct the message using a calculated decryption exponent d.
To choose the right decryption exponent, we first need a value φ, derived from n’s factorization, such that x^φ ≡ 1 (mod n) for any x coprime to n. Luckily, Euler’s totient function gives us just this:
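For n a product of two distinct primes, the totient is a one-liner (naming is my own):

```scala
// Euler's totient of n = p * q, with p and q distinct primes:
// φ(n) = (p - 1) * (q - 1)
def totient(p: BigInt, q: BigInt): BigInt = (p - 1) * (q - 1)
```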
As long as we choose our public exponent e so that it doesn’t share a common factor with the totient φ, we can decrypt using the inverse of e mod φ:
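Key generation might then look like this (a sketch; Scala's BigInt conveniently exposes gcd and modInverse from java.math.BigInteger, and the class names here are illustrative):

```scala
case class PublicKey(n: BigInt, e: BigInt)
case class PrivateKey(n: BigInt, d: BigInt)

// Given secret primes p and q, derive the key pair.
def makeKeys(p: BigInt, q: BigInt,
             e: BigInt = BigInt(65537)): (PublicKey, PrivateKey) = {
  val n = p * q
  val phi = (p - 1) * (q - 1)
  require(phi.gcd(e) == 1, "e must be coprime to the totient")
  val d = e.modInverse(phi)      // d * e ≡ 1 (mod φ)
  (PublicKey(n, e), PrivateKey(n, d))
}
```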
Encryption and decryption are now trivial:
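Each is a single modular exponentiation (again, names are illustrative):

```scala
// c = m^e mod n
def encrypt(message: BigInt, n: BigInt, e: BigInt): BigInt =
  message.modPow(e, n)

// m = c^d mod n
def decrypt(cipher: BigInt, n: BigInt, d: BigInt): BigInt =
  cipher.modPow(d, n)
```

With the textbook toy parameters p = 61, q = 53 (so n = 3233, φ = 3120), e = 17, d = 2753: encrypting 65 gives 2790, and decrypting 2790 recovers 65.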
(In reality, RSA is never used to encrypt messages — for n to be large enough to encode any reasonable length of message, the computing resources required for even the “easy” process of encryption are prohibitively high. Instead, RSA is used to safely deliver symmetric keys, which can then be used with block or stream ciphers to encrypt and decrypt large messages.)
Growing up in coastal California it’s easy to forget the truth: it’s weird that we can live here at all.
We owe our survival to massive hydraulic works—aqueducts and siphons and weirs bringing us water from Hetch Hetchy or the Oroville or the Colorado River.
I’m not sure how or why, but I found myself embroiled in an hour-long argument with a bright and—palpably—earnest young man about video games, their import, and who may or may not be ruining them for everyone. What prompted this was my audible praise for last year’s Gone Home, so I figured it was worth writing about the game, plus some other things I’ve enjoyed recently.
Gone Home: Polygon’s 2013 Game of the Year. Recipient of glowing reviews in the New York Times, Kotaku, the Atlantic, and umpteen other outlets of opinion. VGX’s Best Indie Game and Best PC Game of 2013; winner of the Best Debut awards at both BAFTA and GDC 2014; Games for Change 2014’s Game of the Year. Also: the recipient of user reviews calling it a “2deep4u indie hipster walking simulator” and “an interactive chick story” that “sucked balls like ICO,” and “quite possibly the worst game of all time.” Why do people feel so strongly? Why might you care?
I grew up in suburban California in the ’90s, and like everyone my age I dropped everything to obsess about Pokémon. Thanks to crippling social awkwardness and my brother outgrowing his N64, I kept playing games—first finishing Ocarina of Time and Mario 64, then spending way more time with SNES emulations and gaming IRC rooms and BBSes than a kid probably should.
When I got older, video games seemed like the perfect medium for art. Games marry visual art, music, and written fiction: all things I loved. But beyond that, video games involve the gamer—they force you to play a part in creating the art. This makes games expressive in a way other media can’t be. What video games deliver is experience itself—and if art is about evoking a response in the viewer, what could be more effective?
As a gamer, though, I hit a dead end. I couldn’t make the transition from kid games to “real” games. Nothing resonated the way Zelda or Chrono Trigger had. The popular games all seemed to be about something else.
It’s not a coincidence that big-budget games both revolve around empowerment and are marketed mainly to teenagers and very young adults, the demographic groups most likely to feel frustrated and powerless. These games let people do all the stuff they can’t in real life: drive hot cars, get hot girls, shoot guys in the face, be the hero. Domination and destruction will always be appealing—and some of these games are masterpieces. As representatives of a whole art form, though, they cover a narrow range. We’re handed adrenaline, victory, and good-vs-evil times twenty, but what about empathy? alienation? romance?
That’s why it’s so heartening to play Gone Home.
It’s hard to write concretely without spoiling it, but I’ll try. Gone Home is short: roughly movie-length on a typical playthrough. You’re dropped onto a porch on a dark and stormy night, and you walk around examining things, piecing together a story from the fragments you find. All you have is a window into a 3D world—you can’t see yourself, there’s no health meter, no number of lives. There are no character models, no cutscenes, no puzzles; there’s no combat, no story branching, no fail state. It’s the opposite of high-octane.
But it’s spellbinding. As you probe the intricately-crafted spaces, each element lures you in. The art is sumptuous and hypnotic, and the voice acting is exquisite. It’s all just right—the music, the lighting, each squeak of a floorboard and clack of a light switch—multilayered and cohesive, like when someone’s fingers intertwine perfectly with yours. And Gone Home stays playful throughout: witness interactive food items, endlessly flushable toilets, the inexplicable omnipresence of three-ring binders. If you like, you can heap a pile of toiletries on your parents’ bed, or turn on all the faucets. If you’re scared, you can carry a stuffed animal.
It works. Not because it’s high-concept, but because it’s deeply human. For me, at least, the game unearthed some long-repressed feelings—anxiety, ostracism, the thrill and poignance of a first love; how everything then is either exhilaration or heartbreak.
More than that, though, Gone Home shows that a game can simply tell a story. A story anyone can take part in, one where someone new to games won’t die instantly.
If you’ve ever been intrigued by video games, whether you’re a gamer or not, you should try Gone Home. If you don’t have $20, let me know and you can play it here. If you have $20 but you’re not sure about spending it, try thinking of it as an investment in the Fullbright Company, and in the future of games.
(And if you’re a longtime gamer worried that an influx of new people will dilute gaming culture, I ask this: Have you ever been shunned and called names by people whose approval and acceptance are important to you? Yeah? Me too. It sucks, right? Please, don’t be that person, don’t reject someone’s bid to be allowed to love the things you love. Welcome them in instead; be like John Scalzi, or like John Siracusa. Speaking of which—)
Every culture has its rock stars, and in the Apple-geek community John Siracusa is a headliner. Appropriately, he is understated and tasteful—an engineer and a humanist—and a delight.
As the title suggests, the premise of the show is that John is hypercritical. The show’s tagline expresses a curmudgeonly despair: “nothing is so perfect that it can’t be complained about.” But the show’s billing sells it short. Rather than the complaining-about-things it purports to be, Hypercritical is, instead, the most nuanced take I’ve heard on technology and human endeavor.
John and Dan cover esoteric topics with a missionary zeal: spatial interfaces, fault-tolerant file systems, the agony and ecstasy of game controllers. “Why,” we may ask, listening, “might we care if the file system checksums our data?” John teaches us why. He shows us that analysis and rationality don’t preclude emotional fulfillment, but foster it. That details are superficial only to those who view them superficially. That, by applying earnest (and ruthless) intellectual effort, we can make better things; change lives.
Also, the show is fun. John is a geek’s geek, and an omnivorous one (never before has a man opined so eloquently on toasters). John is witty and smart and self-deprecating, while Dan (the consummate host) alternately teases him/eggs him on/is sage.
But the real genius of the show, and why I love it, is its conscience. Too many tech personalities combine intelligent analysis, on the one hand, with condescension and a weird arrogance on the other. As a high officer in the nerd elite, John could easily make cruel sport of those less sophisticated, less informed, or less bright than him. But he never does. Despite his fervor, John has a steadfast respect for the convictions of others. He thinks hard and doesn’t leap to judgment, and he airs his critics’ critiques whenever relevant.
And John brings this respect for others further. Prejudice in the tech industry has been a hot topic lately—specifically sexism (though racism and anti-LGBT sentiment also get mentions). Now, it’s not news to anyone that people can be assholes, but it seems particularly galling when nerds are doing the discriminating—nerds who at some point (I’m guessing) were themselves excluded from a group, and who lived through the emotional consequences. It seems like a debasement of nerd culture.
But, so, the genius.
To reiterate, everybody loves John. On the totem pole of tech-hero status, he’s that one guy in whichever totem-pole position is the best. John has geek cred. When John talks explicitly about inclusion and exclusion, the nerds (mostly) listen. But he does this only rarely—he knows that people don’t like being “preached to” about their baser human instincts, and he knows that even people who take his point seriously won’t necessarily internalize it in a way that translates to better behavior. So he does something different. He models correct behavior, stealthily and maybe unconsciously, through 159 hours of discourse on disk encryption and chip architecture, layout engines and connector design. I wish there were more.
Depending on your level of geekiness, you may have heard Apple’s announcement today that a team of mutant programming virtuosos has been working—in secret, for years—on a new language that will replace Objective-C as the lingua franca of the iPhone, iPad, and Mac – plus presumably whatever Apple televisual, home-automating, and/or wearable products are to come. The circle is—at last—complete.
The language is called Swift—and, if you’re interested, Apple has released a 500-page ebook detailing the language to let developers get started today.
Technically, Swift has some nice characteristics:
Like C, C++, and Objective-C, Swift compiles down to native code—there’s no VM or interpreter slowing things down at run time. Coupled with some serious compiler optimization, this should make Swift performant enough for any size of application.
Unlike Objective-C, Swift code is proven safe at compile time—that is, no compiled code can cause an access violation at run time (unless we explicitly mark our code as unsafe).
It has first-class functions.
Functions can be nested within other functions, passed as arguments to other functions, and stored as values.
It has closures.
Along with the code for a function, Swift stores the relevant parts of the environment that was current when the function was created. (This makes higher-order functions like sort and fold possible and powerful.)
It supports both immutable values and variables.
Languages usually choose one or the other, but Swift has both—just type let for immutable values and var for (mutable) variables.
Swift is statically-typed, but types can be inferred.
If the type-checker can tell what type the value/variable should be, we don’t have to write it.
And types can be generic.
Functions and methods are automatically generalized by the type-checker, so our functions can be used across compatible parameter types—without manually writing overloads, and without resorting to id and using casts everywhere.
Swift gives us namespaces, with its module system.
Memory is managed automatically, and without pointers.
Like Objective-C post-ARC, Swift uses compiler magic to track and deallocate instances as needed. We don’t need to explicitly specify strong vs. weak references any more, or use * for reference types. This makes iOS programming more accessible (pro/con), though not as easy as with garbage collection. And since ARC happens at compile time, there’s no performance hit at run time.
Swift has tuples (or product types). We can group multiple values into one compound data type, without using container classes/structs.
It has very-special enums (or sum types).
A Swift enum contains a value of one of various specified types.
We can use these like plain C integer enums if we like, but Swift enums go further: Swift lets each enum type carry data.
It has pattern-matching to go with tuples and enums.
Values can be matched against patterns to decompose them concisely.
And much more: lazy properties, property observers, class/struct extensions, buffed-up structs, option types, function currying, type aliasing, and more. And Swift manages to do all this while maintaining smooth interoperability with Objective-C classes.
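To make a few of these concrete, here's a small sketch of my own (not an example from Apple's book) that puts enums with associated values, pattern matching, tuples, type inference, and first-class functions together:

```swift
// An enum whose cases carry data, decomposed with pattern matching.
enum Shape {
    case circle(radius: Double)
    case rectangle(width: Double, height: Double)
}

func area(of shape: Shape) -> Double {
    switch shape {
    case .circle(let r):
        return Double.pi * r * r
    case .rectangle(let w, let h):
        return w * h
    }
}

// Tuples and type inference: `size` is inferred without annotations.
let size = (width: 3.0, height: 4.0)

// First-class functions: `area` passed straight into map.
let total = [Shape.circle(radius: 1),
             .rectangle(width: size.width, height: size.height)]
    .map(area)
    .reduce(0, +)
```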
(For an admittedly-trivial demo of some of these features, here’s code for Minesweeper in Swift.)
With ideas drawn (in Chris Lattner’s words) from Haskell, Rust, C#, Ruby, Python, and the last twenty years of PL research and practice, Swift looks like a language even programming-languages nerds should find pleasing. It delivers what everyone wants from a language: an elegant way to interface with software.
More important than Swift’s elegance, though, is its context. Programming languages rise and fall not on language quality but on more-practical factors: which libraries and tools are available, what the industry standard is, and what’s most likely to get you a job. Swift is the first place all these factors converge: a highly relevant platform, a sole platform owner with the will to impose a standard, a human-friendly toolset, and a newly-modern language.
Have you ever wanted to start programming on iOS? Today’s the day.