Friday, June 29, 2012

Scrolling in Lion

Fair warning: this will be a Mac-centric geeky entry and I will not be explaining the terminology or context.

Apple decided to incorporate certain iOS-isms into Mac OS X. I guess somebody in the user experience group decided this was the direction OS X had to take, and as I know from painful experience that I have almost zero expertise in user interface design, I'll defer to those who allegedly know.

That said, somebody needs to drag the iTunes and AppKit teams into a conference room and bang heads together until they get the actual user experience right. (It wouldn't hurt to have the Safari team in there, too.)

The problem is that by doing away with AppKit's former scrolling behavior, Apple has made it much more difficult to navigate within windows that have embedded, scrolling subviews. Imagine gesturing on a MacBook's trackpad to scroll down an iTunes artist page in which there are scrolling views for songs, videos and movies. You want to get to the "movies" section, which isn't visible on the screen. However, the cursor enters the "music" section and you're suddenly scrolling within that instead of the surrounding view.

You sigh, then look for the scrollbar that appears at the far side of the surrounding view as you scroll through it. Then you realize that the scrollbar vanishes when the scrolling movement stops. You move the cursor over the spot where it appears, but merely hovering there doesn't bring the bar back: only scrolling will do that.
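An aside for the developers in the audience: the disappearing scrollbar is Lion's new overlay scroller style, which AppKit exposes through NSScrollView and NSScroller. Here's a minimal sketch (in Swift, which postdates this post) of how an app can opt a scroll view back into the always-visible, pre-Lion style; treat it as an illustration, not a recommendation:

```swift
#if canImport(AppKit)
import AppKit

// Lion introduced overlay scrollers (NSScroller.Style.overlay), which are
// drawn over the content and fade out when scrolling stops -- the
// disappearing scrollbar described above. A scroll view can opt back into
// the always-visible, pre-Lion style:
let scrollView = NSScrollView(frame: NSRect(x: 0, y: 0, width: 400, height: 300))
scrollView.hasVerticalScroller = true
scrollView.scrollerStyle = .legacy  // always visible, as before Lion

// The system-wide default depends on the attached input devices and the
// user's setting in System Preferences:
let style = NSScroller.preferredScrollerStyle
print(style == .overlay ? "overlay scrollers" : "legacy scrollers")
#endif
```

There's also a user-level escape hatch: the "Show scroll bars: Always" setting in the General pane of System Preferences restores the old always-visible behavior system-wide.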

The arrow keys work, but only for the outermost view. They might work on the inner views too, but I haven't tried that; in any case, you'd first need to get focus into whichever inner view you want to manipulate. I don't know how to do that without selecting something in the inner view, which is not what I want to do.

I use hardly any iOS apps, so I don't know how many include embedded scrolling subviews, but on iOS I haven't experienced the problem I just described in Lion. Part of the problem, I just realized, is that iTunes in OS X allows its window to consist of nothing but embedded subviews: there's no requirement that the outermost view be accessible at all times via, for example, an outermost strip just inside the enclosing window. That means there's no place to put the cursor within the iTunes window boundary to indicate "my gestures apply to the outermost view". (Or if there is such a place, it's not obvious.)

The pre-Lion AppKit, and before it AppKit in NeXTStep, had a solidly-designed set of user interface elements and behaviors. Other versions of OS X tweaked the visual appearance of the elements and sometimes the default layouts (e.g., the location of the scrolling arrows), but these changes didn't disturb the logic of the fundamental behaviors. With Lion, AppKit has stepped into quicksand because the user experience metaphors and hardware/software environment of iOS don't completely match those of OS X.

I hope the user experience and AppKit teams are already thinking long and hard about those user experience metaphors in OS X. It's long past time for an overhauled human interface guidelines document (which Apple issued early in the life of OS X and proceeded to violate whenever the guidelines were inconvenient, but never mind that). The upgraded document would be valuable in itself, but the thinking behind it — the thinking needed to arrive at a consistent model of user interaction and a coherent set of user experience metaphors — is what is really crucial.

The forgotten man in the ACA ruling

One reason a lot of people thought the Supreme Court would strike down the Affordable Care Act was the grilling that Obama's Solicitor General took at the Justices' hands during oral arguments. Some of us — certainly including me — had forgotten about those oral arguments, which garnered Solicitor General Donald B. Verrilli, Jr. harsh and often snide criticism in the media for what those critics considered an awkward and fumbling performance.

The New York Times has a brief reminder article about Verrilli today.

I can't resist spoiling perhaps the best part of the piece.

“Let me just say on that point that people who say there’s no such thing as bad publicity have no idea what they’re talking about,” he said. “There is definitely bad publicity. Being on the wrong end of a Jon Stewart monologue is bad publicity.”

Thursday, June 28, 2012

SCOTUS and the Affordable Care Act

There's lots of coverage about the U.S. Supreme Court's decision today that upholds most of President Obama's signature health care reform legislation, the Affordable Care Act. I'm therefore not going to link to any specific article; if you haven't already read or watched anything about it, do a Web search.

My only thought on the subject is, I'm wondering how President Obama and the Democrats generally will fumble the (unfortunately) important battle over the public perception of the Court's decision and of the Act itself. Because if recent history is any indication, they will screw up and lose that battle to the Republicans, even if the Dems have the facts on their side. Republicans in the last twenty years have excelled at repeating The Big Lie (which varies according to which issue they're pushing) often enough that people who aren't paying attention start believing it ("Saddam Hussein is linked to the 9/11 terrorists", for example).

A worrying and potentially more significant byproduct of today's decision is discussed in a blog post in the New York Times, namely, the potential gutting of the Commerce Clause. We may end up regretting this decision as much as we regret Citizens United in the not too distant future.

Oh, and with this Constitutional Twister of a decision (by which I mean it bends first one way, then another; you should see the disparate alliances Chief Justice Roberts cobbled together in different parts of the decision to muster a bare majority for the whole), Roberts looks like he excels at playing the public-relations game. He has, at first blush, apparently improved the Court's battered reputation, about which he is known to care a good deal. Whether he is worth respecting for his judicial principles and dedication to a genuinely nonpartisan reading of the law and the Constitution is, in my opinion, still gravely in doubt. I suspect, though, that a few decades hence, he and this era's Court will not be as well-regarded as he would like.

Goodman on Ann Curry

If there's a silver lining to the strange brouhaha surrounding Ann Curry's tenure on the Today show, leading up to her ouster as its cohost, it's that the fuss has resulted in what looks like a future classic Tim Goodman column for The Hollywood Reporter.

Can’t this network do anything right? We’re a long way past “this is starting to get embarrassing” and nudging up against “this is starting to get pathological.”

Goodman reminds us again of NBC's fascinatingly and hilariously awful mishandling of the Jay Leno-Conan O'Brien-Tonight Show succession. You would think one reputation-damaging, talent-related fiasco a decade would be enough, but no, the network can't seem to get enough.

All those years of Couric’s sincerity, which seemed so false to some (ahem), might have trained morning show viewers to want a certain someone. And perhaps Curry is not that person.

But is NBC absolutely sure that Savannah Guthrie is that person? I’m willing to bet that NBC has no idea whether Guthrie is the new It Host. Whoever made this decision probably got tired of having the bean counters in their office saying: “We’re hemorrhaging money, and we think it’s Curry. You’ve got to do something.”

I don't watch the networks' nauseatingly chipper morning products so I don't know or care whether Curry was any good in her role. I'm just grateful her travails gave Goodman an excuse to unleash the snark on the hapless NBC.

Wednesday, June 27, 2012

If we can't have Captain Kirk ...

In the category of "I don't know if this is good news or bad news for Obama" is a survey that finds more Americans think President Obama would be better able to handle an alien invasion than Mitt Romney.

No, really: I read this in a Los Angeles Times story.

And the survey didn't stop there.

It even asked which superhero Americans would turn to first in the event of an alien invasion. (It's the Hulk.)

I happen to believe that there must be extraterrestrial life, and very likely intelligent extraterrestrial life (the universe is just too big for Earth to be the only place harboring life of any kind), but I don't believe in the Hulk, so ... I'm not sure where this survey leaves me.

Tuesday, June 26, 2012

Denying science in North Carolina

Politicians in North Carolina are trying to legislate not merely in ignorance of science, but in defiance of it. According to a Los Angeles Times article:
The result is House Bill 819, a measure that would require sea level forecasts to be based on past patterns and would all but outlaw projections based on climate change data.

The bill is the result of a state commission's prediction of a disturbing 39-inch rise in sea level by 2100. "[C]oastal business and development leaders" were concerned that this prediction would cost them and current homeowners millions, so they pressured legislators to do something about it.

But perhaps I'm being too harsh. Perhaps the developers have genuine concerns about the quality of the research conducted by the commission.

[Tom] Thompson, director of the Beaufort County Economic Development Commission, called the 39-inch prediction "dishonest statistically" and no better than a coin flip. In an interview, he dismissed climate change as "a phobia" pushed by environmentalists.

John Droz Jr., NC-20's science advisor, said commission scientists were "bent on promoting their personal political agenda." NC-20's projections "are entirely about the science" and have nothing to do with developers, or economics, Droz wrote in a letter to the News & Observer newspaper.

Let's see: Thompson, who does not appear to have scientific training, dismisses climate change outright, while Droz, who presumably has some kind of scientific training if he's a "science advisor", impugns the motivations of scientists with whom he disagrees (or perhaps is paid to disagree). One completely dismisses a scientific finding with plenty of evidence behind it, the other disingenuously attacks the character of the messengers. There's a pair you can rely on.

Perhaps they don't trust the commission's scientists because they know how untrustworthy they themselves are.

Or maybe they just have a lot of money on the line.

The politicians, of course, are all about the money, and I'm sure the business community has thrown plenty of it to push Bill 819.

If I lived in North Carolina, I'd be plenty sore that some of my elected representatives were behaving like jackasses, ignoring the best scientific recommendations available in favor of avowedly biased, financially motivated, antiscientific claptrap.

The great pity is, none of those elected representatives will be alive to see how wrong they were about climate change, and how devastating the consequences are for future generations. It's also a great pity that those of you who keep electing these venal, ass-ignorant politicos won't be alive to see how badly your children, grandchildren and great-grandchildren fare in the damaged world you leave them in your own ignorance and willful blindness.

Aaron Sorkin being glib

I didn't think I'd be linking to a Hollywood Reporter story that wasn't written by Tim Goodman, but I had to comment on this remark by writer/producer Aaron Sorkin:
I think it [the United States] is the greatest country in the world, and a lot of it has to do with what Emily Mortimer's character said: "We are a country that keeps saying that we can do better." It's also a country where I'm allowed to write a show like this. I'm glad we're that. I'm two generations removed from being blacklisted in Hollywood, I surely would have been one of those guys called in front of the committee, and we're not that country anymore.
[emphasis added]

Really, Mr. Sorkin? Really?

You know what the surest sign is that a country is susceptible to the authoritarian thinking that Joseph McCarthy exemplified?

Denying that it's possible.

Don't ever think your neighbors — or you — are immune to the kind of hypernationalism, jingoism, paranoia, and blame-mongering that the U.S. displayed so disgracefully in the 1950s. Hell, we demonstrated these same terrible traits again in the wake of the 11 September 2001 attacks on the World Trade Center and the Pentagon — and some, especially the more despicable commentators and politicians who style themselves as "conservatives", continue to indulge their basest instincts along the same lines. (Some of the latter may just see dollar signs and not be true believers themselves, but they're just as despicable.)

The ugly truth is, we are still that country.

That doesn't mean there's something uniquely broken about the United States. The U.S. is no worse than most other countries in that respect.

The ugliness that McCarthy so enthusiastically spewed, and encouraged so many others to spew in their turn, is deeply embedded in human nature. To deny that is to leave yourself open to falling victim to it. That's why these dark impulses have resurfaced again and again: in the Sudan, in Rwanda, in the former Yugoslavia, in India, and perhaps most tragically and unforgettably, in Germany in the 1930s and 1940s.

I've never watched any of Sorkin's TV shows or movies. I only hope they're not as glib as he was in this interview.

Sunday, June 24, 2012

Speaking of where the jobs are ...

In the previous post, I asserted:
Tech, being concentrated on intellectual property, doesn't need a huge number of people to do anything: it needs a limited number of people to think of things. To the extent that the tech industry makes tangible goods, those goods are made overseas.

... Tech is a high-profile but totally insufficient source of employment.

I had been thinking of the kind of work performed by software companies like Facebook, Google, and Twitter. Companies like these don't have retail arms, nor do biotech firms like Genentech. Even hardware makers like H-P tend not to have their own stores; those that do, like Dell and Gateway (if either still has any), have a vanishingly small number.

The exception to the foregoing, of course, is Apple, which has a robust, growing and high-profile retail operation. In spite of that high profile, I confess I had completely forgotten about these stores until I ran across the New York Times' lengthy piece discussing some former workers' dissatisfaction with their pay and working conditions.

It's worth reading the whole piece, especially if you've always suspected there was something a little, well, off about the Apple fans of your acquaintance — if, in other words, you've wondered if they didn't qualify as "fanatics" rather than mere "fans". However, if you simply can't be bothered to read it, then at least note this telling statement:

The Internet and advances in computing have created untold millionaires, but most of the jobs created by technology giants are service sector positions — sales employees and customer service representatives, repairmen and delivery drivers — that offer little of Silicon Valley’s riches or glamour.

In other words, these service jobs are no more likely to elevate you into the middle class than working at Walmart or Starbucks. And yet, these jobs constitute the majority of those created by high tech. The high-paying jobs are quite few, by comparison, and totally inadequate to creating a middle class.

The Times article simply reinforces what I said in that last post: this country isn't creating things any more, but instead is providing services. Since altogether too many of those services are not worth a great deal of money, the majority of people aren't making a great deal of money — hence, fewer and fewer people can genuinely afford the kind of life that we have convinced ourselves we require. And my guess is that even if you're frugal, you're still having a tough time making ends meet on the salaries offered by most of the service jobs out there today.

To dig ourselves out of the hole we're in, we must do something either to lower our overall costs (so our limited incomes go further), or to increase our overall income. Neither of these is easy to do, and don't believe any politician who tells you otherwise (I'm looking at you, Mitt Romney).

And while we could think in terms of fixing our services-based economy to work better (again, either by lowering costs or increasing incomes), wouldn't it be nice if we started to restore our self-respect and our national security by restoring our ability to build tangible goods again at the same time? I'm not just talking about making new and modern factories: I'm also talking about not forgetting the hard-won lore and wisdom of the people who spent years observing and understanding what it takes to build things, big and small — tapping these people's brains before they're all dead and we have to relearn what they knew on our own.

What I'm talking about is anathema to free-marketeers because I'm talking about creating and implementing a national strategy, one that places the interests of the entire nation above the interests of shareholders. It's an audacious thought, to defy Adam Smith's invisible hand — and yet, this is what some of the U.S.'s most successful competitors like Germany and China have done. There are probably pitfalls to be avoided along the way (China's in particular is not a model we should emulate), but before you dismiss the idea out of hand, ask yourself this:

Has the last thirty to forty years of free-market sloganeering — in particular, the mad dash to deregulate everything in the name of unfettered capitalism — left this country better off? Has it left you better off?

If you answered "yes" to either of those questions, you're either part of the 1 percent, or you haven't been paying attention. Enron and the toxic-mortgage meltdown are the best-known catastrophes that arose from reckless deregulation in the last decade, but they're merely the tip of a very ugly iceberg. More shoes are waiting to drop (one fell on Jamie Dimon recently), and some of them could be much less abstract than the ones we've seen so far: for instance, I have a bad feeling about the long-term health consequences of hydraulic fracturing, which is not subject to clean-water regulations promulgated by the EPA (thanks to shady dealings between the energy industry, Congress and the George W. Bush administration).

Jobs, deregulation, declining wages, increasing costs: it's such a tangled web we allowed big business to weave. Yet we have to start untangling it, if we're to keep alive at least some of what makes our nation great. And time's a-wasting.

Saturday, June 23, 2012

The lie of the free market

"If you work hard, you will succeed. If you don't succeed, it's because you didn't work hard."

That's essentially the credo of free-marketeers everywhere. It's intuitively appealing in part because it's such a simple formulation.

That credo rests on an unspoken assumption that remains unspoken because it seems so obvious: the playing field is level. Everybody has an even chance to climb the ladder.

It's time for us to acknowledge that the playing field isn't level. Not in the United States.

I'm not talking about the well-known phenomenon that wealth begets wealth. That has always been the case, and will always be. I'm talking about the reality that if you want to go from working at Walmart to earning a decent, middle-class living, the odds are stacked hugely against you.

Don't believe me? Then tell me: where are the jobs?

Even if you have a college degree, the jobs are scarce. And if you don't have a college degree, the jobs are not only scarce, they're totally inadequate to supporting a family.

Where are the good-paying jobs of yesteryear? Sent offshore.

Where is the seed crop, so to speak, for creating new good-paying jobs? It's either on Wall Street, figuratively and literally speaking, or in tech. The trouble is, Wall Street today is essentially parasitic and tech industries are all about intellectual property. Being parasitic, Wall Street doesn't create real value, it creates imaginary value by leveraging real assets in irresponsible ways. Tech, being concentrated on intellectual property, doesn't need a huge number of people to do anything: it needs a limited number of people to think of things. To the extent that the tech industry makes tangible goods, those goods are made overseas.

Wall Street, in other words, is a morally bankrupt source of employment. Tech is a high-profile but totally insufficient source of employment.

The U.S. economy today is geared toward providing services, not manufacturing goods. It's an emphasis that is guaranteed to spiral this country downward, both in terms of economic prowess and the difficult-to-measure capacity for self-respect. Except for food (and not even all of that), we don't make things for one another: we offer services to one another. The trouble with that is, we still need tangible goods.

Kevin Phillips in the 2006 book American Theocracy explains that since the 1980s, the financial-services industry has played an outsized role not only in the economy, but in policymaking. The industry skewed federal legislation and regulation to benefit itself and in the process promoted the transfer of good-paying jobs overseas. With those jobs went the middle-class standard of living.

At one time, this nation might have been the land of opportunity free-marketeers still think it is. But no longer. The playing field stopped being level twenty or even thirty years ago, and it has been tilting away from the 99% ever since. It's time for us to acknowledge that reality — and to stop calling attempts to restore some kind of equilibrium "socialist" or other phony, stupid epithets.

And it's time to call those supposed defenders of the free market who are perpetuating the distorted status quo what they are: self-interested tools who are looking out for nobody but themselves. That's the capitalist way, of course, but it only works when we all have the economic freedom to pursue our own self-interest. We don't have that today.

(This post wouldn't have happened if I hadn't been inspired — or perhaps the word is "outraged" — by tonight's episode of Moyers & Company in which Matt Taibbi and Yves Smith excoriated our too-big-to-fail financial institutions.)

Thursday, June 21, 2012

Paying for music

A blog post by an NPR intern, Emily White, explains that while she has 11,000 songs in her music collection, she has only ever bought 15 CDs. She has, she says, no attachment to physical media.

I wish I could say I miss album packaging and liner notes and rue the decline in album sales the digital world has caused. But the truth is, I've never supported physical music as a consumer. As monumental a role as musicians and albums have played in my life, I've never invested money in them aside from concert tickets and T-shirts.

It seems to me she is confusing physical media with paying for something she readily admits she enjoys. But let's set that aside for the moment. What's the bottom line for her?

What I want is one massive Spotify-like catalog of music that will sync to my phone and various home entertainment devices. With this new universal database, everyone would have convenient access to everything that has ever been recorded, and performance royalties would be distributed based on play counts (hopefully with more money going back to the artist than the present model). All I require is the ability to listen to what I want, when I want and how I want it. Is that too much to ask?

That's a beautiful vision, up to the last couple of sentences. As for those last sentences, well, how fucking entitled can you get?

If you don't like enriching giant agro-corporations, does that entitle you to take food at the store?

Tell me, Emily, since that Utopian infinite music library you envision doesn't exist yet, how exactly do you expect those musicians you love to earn a living today?

I don't get this mentality that if you can steal, it's okay. And what really gets me is pretending that you're not stealing, that you're trying to make a better world.

What absolute horseshit.

As I've grown up, I've come to realize the gravity of what file-sharing means to the musicians I love. I can't support them with concert tickets and T-shirts alone. But I honestly don't think my peers and I will ever pay for albums. I do think we will pay for convenience.

More succinctly: "I like what you do. I just can't be bothered to pay for it."

You may have gotten older, Emily, but you haven't "grown up". Adults pay for what they want.

Thursday, June 14, 2012

Getting past the religious divide

David Bornstein has an Opinionator piece in the New York Times about the Chicago-based Interfaith Youth Core (IFYC), whose mission is to foster more meaningful interaction between those of different faiths (and those who do not profess a religious faith at all), with the ultimate goal of increasing mutual understanding.

“We can show in a quite rigorous way that when you become friends with someone of a different faith, it not only makes you more open-minded to people of that faith, it makes you more open-minded about people of all other faiths. It makes you more tolerant generally,” says Putnam. “That’s the fundamental premise of the Interfaith Youth Core’s work.”

While I find the premise immensely appealing, I also find it hard to be optimistic that IFYC's model is a workable one for society at large. IFYC operates among college students, who are among the better-educated members of our nation: they are, in short, part of the elite. Also, being young, they're more likely to be open to new ideas and new experiences. Now, while I think much of the dysfunction and polarization caused by religious sectarianism is fostered by elite members of society (specifically, media-savvy and ambitious religious leaders and politicians), these elites aren't generally uninformed: rather, they've made the conscious decision to foment sectarianism because it strengthens the influence they have over their followers. Even if you think I'm wrong, even if you think those leading the sectarian charge are sincere in their beliefs, getting them to back down from their often incendiary rhetoric toward The Other is a tough sell because they're only human, and being human, have a tendency to double down when their cherished beliefs are challenged. (You might even look on my skepticism of IFYC's work as evidence of that tendency, and you'd probably be right.)

But I think the real challenge for those who want to believe in the IFYC model is to consider those who aren't of the elite: those who can't attend college, or who won't, or who didn't and never will. How do you foster engagement with those of a different (or no professed) faith outside a college campus?

It's the same problem that makes racial animus so hard to eliminate. It's easy in this society, in which individualism is all but canonized as a sacred right, to create a fortress for yourself into which you admit only those who pass your tests for admission. Heck, I cherish individualism myself. I think the crucial question is how you view those outside your fortress: are they strangers, who simply are unknown to you, or enemies? The problem is that a lot of extremely faithful religious believers in this nation are acting as if the rest of us are enemies, not mere strangers. That may not be how these people actually think of us, but that's how they're coming off because of the rhetoric they choose to applaud (from the aforementioned religious and political elites).

For myself, by the way, I have no animus toward believers, though I can see how you might think otherwise if you've read much else on this blog. I'm more than willing to live and let live. The reason my dander is frequently up about religion is that I consider it an extremely private matter, and I vehemently object to attempts to export it into the public sphere. Thundering rhetoric proclaiming that "this is a Christian nation", for instance, is simply beyond the pale: it is exactly the kind of establishmentarianism that the sainted Pilgrims (forgive my sarcasm, but that is how they're portrayed) sought to escape in Europe. I will not stand for my identity as a non-believer being trampled by the misguided zeal of sectarians. I demand my right to exist and to a minimal degree of respect and dignity in the civil sphere. It is the same right you, as a believer, insist upon, and it is the same right we are both guaranteed by the non-sectarian Constitution.

In spite of my skepticism, I really, really, really, really hope that efforts like IFYC's succeed. A lot, after all, is at stake.

Americans celebrate diversity. But one of the mistaken beliefs about diversity is that it leads to greater tolerance. Putnam’s research indicates that, unless people make a concerted effort to build bridges, diversity leads to greater social fragmentation — with lower rates of trust, altruism and cooperation. “What ethnic diversity does is cause everybody to hunker down and avoid connection,” he explained. “It’s not just the presence of diversity in your neighborhood. You’ve got to actually be doing things with other people in which you have a personal attachment. Diversity is hard, not easy.”

If we are to renew our democracy — and is there anyone who thinks that's not necessary? — I think working hard at getting along, at understanding what our diversity actually means and why it's important, is a worthy goal to set for ourselves as a nation. It's not as sexy as winning World War II or putting a man on the moon, but it's at least as important. And libertarians, don't fret: unlike those efforts, we don't actually need government to show the way. This one can and probably should start at the bottom, with each of us.

Monday, June 11, 2012

A good use of social media

Horace Mann Academy alumni have set up a couple of Facebook pages to air and to discuss sexual abuse allegations against former faculty and staff, according to the New York Times. There are accounts surfacing elsewhere, too.

I am, as you might suspect, no fan of social media. I honestly don't get all the fuss about Facebook or Twitter. However, this use of social media I understand and appreciate. I hope the virtual gathering places help the victims come to terms with what happened.

By the way, if you haven't read the original account that sparked the sudden flurry of stories and discussion, it's worth casting an eye over it. The allegations are deeply disturbing. However, as hard as it is to remember in such a charged atmosphere, they're still — and likely will remain — just that: allegations.

Sunday, June 10, 2012

On class reunions

I'm in the part of my life in which class reunions are supposed to figure, to a greater or lesser extent: greater, if your glory days coincided with your school days, or lesser, if you're like most of us. (TV's Buffy the Vampire Slayer got it right when Buffy snapped at a would-be suicidal classmate, "Every single person down there is ignoring your pain because they're too busy with their own.")

I was on good terms with most of my classmates, and I was never the class leper (sadly, there was always at least one). Attending small schools meant that most everybody shared the same experiences and environment, not to mention that it wasn't difficult to figure out everybody's name. Graduating meant the tight-knit community of familiar faces was disbanding. It hurt, a lot, to lose that every four years.

So those must have been pretty good days for me, modulo graduations, right?

Er, no.

When I think back on those days, I remember my incredible discomfort in my own skin. I was a hopeless outsider who through luck and a gift for mimicry had figured out how to ape everyone else just well enough to pass for a normal person. But like a singer who masters a foreign-language song phonetically but grasps not a whit of its meaning, I lived in dread that I'd stumble. Every day was a nerve-racking, exhausting performance, and like any actor I both dreaded and longed to hear what the critics -- my classmates -- thought. The reviews weren't verbal so much as behavioral: were people still looking me in the eye? Did they sidle away as I approached? Did I hear about next weekend's party?

It was a colossal waste of effort, but I didn't know any better. Besides, rampaging hormones generated a surfeit of nervous energy that had to find an outlet.

Why, then, was graduation painful? For the same reasons a lot of people take comfort in less than perfect situations: a preference for the familiar, and an accompanying fear of the unknown. Besides, even for me, school wasn't completely horrible. I made a few genuine friends. Academically, I did well enough. And thankfully, my social skills improved, however slowly and painfully.

Still, I'd never go back to that time.

Even if I didn't feel so ambivalent about those days, a reunion wouldn't be that compelling. I'm not the same person I was then, and neither are my former classmates. A reunion plays on picking up old threads, but they're not attached any more. Not to me, anyway.

Those enthused about reunions say that we're all curious about our old classmates: who got married to whom, who exceeded expectations, who took an unexpected path. Sometimes this is phrased more sardonically: "Don't you want to see who got fat and old?"

Not really.

Looking back, I realize that what I enjoyed with most of my classmates was what a mature person would call "cordial relations", not friendship. I didn't know the difference back then. Perhaps for me, there was no difference. But that's neither here nor there. The point is, what I had back then simply isn't a strong enough inducement to revisit those times. I know now that we simply weren't that close.

I have nothing against you, my former classmates. I'm just not that curious about you. I doubt you're that curious about me, either. (You're not missing much if you are.) We're practically strangers.

So enjoy yourselves when you gather. Just don't expect me to join you.

Thursday, June 7, 2012

R.I.P. Ray Bradbury

I dipped my toe into the vast ocean of science fiction when I was young. The dozens of dog-eared paperbacks at the local library fascinated me -- and scared me: I was susceptible to nightmares and it didn't take much to set me off.

I don't remember when or how I figured this out, but at some point I came to understand that SF broke down into two large categories: the "hard" SF works that were technology-centric, and everything else (which, curiously enough, didn't have a descriptive name like "hard"). "Everything else" could include fantasies, more psychologically complex works, alternate histories ... basically everything that didn't have to do with spaceships or robots. I own to being a little vague about "everything else" because my preference was strongly for hard SF, not least because it didn't give me nightmares.

The Big Three hard SF authors were Asimov, Heinlein and Clarke, the "deans of science fiction," as critics and commentators proclaimed again and again. Yet annoyingly enough, a lot of commentators insisted on adding a fourth name: Bradbury.

Bradbury's works fell into the nightmare-inducing category for me. The Martian Chronicles was my first encounter, when a well-meaning relative gave me a copy as a Christmas present. The stories weren't to my taste, yet they were weirdly compelling and compellingly weird; they left me unsettled, yet unable to comprehend why. If Asimov's tales were big, bold Statements Of The Future done in broad but exact strokes, Bradbury's tales were Impressionist sketches of less clear-cut subjects. Bradbury had an eye for how humanity's less noble instincts -- or simply bad luck -- could diminish or subvert its technological prowess.

Here's the effect a Bradbury story could have on me: the image I took away from his famous short story "A Sound of Thunder" was not of the hunting party, nor of the carnivorous dinosaur that pursued them. Rather, it was that of the squashed butterfly. I have never seen a filmed version, nor have I read the story in decades, yet in my mind's eye, clear as day, is that hapless insect, embedded in the thick coating of mud on a hiking boot. It is huge, perfectly formed (no torn wings), and its markings are delineated by thick, garish black fuzz that almost seems to glow, it's so prominent. It is like a leering, demonic face, mocking me with the promise of no happy ending for humanity. The "sound of thunder" closing the story is perversely anticlimactic by comparison.

No Asimov tale ever left so strong an impression, and I read a lot more of his output than Bradbury's.

So even though I never much cottoned to his work, I can but tip my imaginary hat to Ray Bradbury, and thank him for showing me that science fiction stories could be more than space operas with heroic square-jawed engineers. I might not have liked the worlds he opened up, but I, and science fiction, needed them.

Bradbury passed away Tuesday (5 June 2012). Michiko Kakutani penned a nice appreciation.