Monday, January 30, 2012

Greenwald on Panetta and Obama

Specifically, Salon's Glenn Greenwald blogged about Panetta's interview with CBS News' Scott Pelley. Greenwald says Panetta's defense of the assassination by drone of Anwar al-Awlaki "viscerally conveys the rigidly authoritarian mindset driving" not just current Administration policy, but the viewpoints of nearly all of Obama's would-be Republican opponents.

Greenwald lays out in detail why Panetta, not to mention Obama, is completely wrong on the law and possibly wrong on the key fact, i.e., Awlaki's being a terrorist. The central premise of Greenwald's argument, though, is incontestable:
Here we have the U.S. Defense Secretary, life-long Democrat Leon Panetta, telling you as clearly as he can that this is exactly the operating premise of the administration in which he serves: once the President accuses you of being a Terrorist, a decision made in secret and with no checks or due process, we can do anything we want to you, including executing you wherever we find you. It’s hard to know what’s more extraordinary: that he feels so comfortable saying this right out in the open, or that so few people seem to mind.
Barack Obama has been entirely too fond of secrecy as President, and he has no claim on the loyalties of those of us who voted for him in 2008 hoping for a pullback from our rush toward the authoritarian national security state for which George W. Bush laid the foundations. Whatever you think of Awlaki, his assassination was a profound betrayal of one of our most crucial national principles: due process of law.

What's appalling is, there is no compelling alternative to Obama in either of the two main political parties. It's not clear that even Ron Paul would want to give up this secret power, were he somehow to gain the office (which would be a mistake for the country for a host of reasons). Every one of the mainstream candidates for President is a cheerleader for anti-terrorism measures, no matter how fatally they assault the Constitution.

Why software engineering estimates are so often wrong

Courtesy of Marco.org, a link to an explanation in the form of an imaginary hike from San Francisco to Southern California.

I found this quite amusing, and there's a grain of truth to it -- no, actually there are several grains of truth to it. However, you rarely go into a development project as ignorant of the upcoming terrain as the imaginary hikers are. Still, the reality is that engineers often are inordinately optimistic about these things. Or, if they're not optimistic, they're so pessimistic about the effort required that their estimates are no more useful than an optimist's; worse, those inflated estimates antagonize their managers and can harm their careers.

Thursday, January 26, 2012

On "the social graph"

"The social graph" is one of the many, many buzzwords and expressions that have flitted through my consciousness without taking hold. Fortunately, Maciej Ceglowski put more than a little thought into that expression (or rather, the concept behind it), and concluded that "The Social Graph is Neither".
There's no way to take a time-out from our social life and describe it to a computer without social consequences. At the very least, the fact that I have an exquisitely maintained and categorized contact list telegraphs the fact that I'm the kind of schlub who would spend hours gardening a contact list, instead of going out and being an awesome guy.
And the social networks we currently have are ... well, I can't improve on Ceglowski's description:
Social networks exist to sell you crap. The icky feeling you get when your friend starts to talk to you about Amway, or when you spot someone passing out business cards at a birthday party, is the entire driving force behind a site like Facebook.
I don't think this has to be so, but it certainly is so today. It's the only reason Facebook and other social networks have value to investors.

I've spent a little time around Facebook and am surprised by the lack of subtlety it allows. In spite of its much-vaunted privacy settings, nobody seems much interested in discriminating between "friends" and "friends of friends" and "everyone". And why would they be? Ceglowski makes the point that modeling the complexity of human interactions is hard, and we've never had a successful mapping of those interactions in software. Facebook's privacy (or, if you like, audience-dissemination) settings are a crude tool with which to control sharing of information, so there's no point in obsessing over who gets to see the news that your aunt is sick. (I would guess that you could choose to share only with selected Facebook "friends", but that virtually eliminates Facebook's advantage over email.)

The other thing about Facebook and other social networks is, they promote oversharing. "Oversharing" typically implies trading in intimate details the rest of us don't want to know, and there's certainly some of that, but I mean something different. What people overshare is trivia. Passing thoughts that in the past would have made up a moment's brief conversation with your companion of the moment now are memorialized on the Internet and shared with just about everyone. In principle there's nothing wrong with that, but in practice, I find these constant, niggling little updates to be like flies buzzing past my ear: I want to swat them away as the annoyances they are.

The sharing fostered by social networks is a lousy analogue to an actual human relationship.

Religious landscape survey

For various reasons I had occasion to look up religious affiliation figures for the U.S. I happened upon The Pew Forum's U.S. Religious Landscape Survey, which is the only such survey with which I'm familiar. If you're curious about the composition (by religion) of arguably the most religiously minded industrialized society in the world, this is a gold mine.

A couple of fascinating tidbits:
  • Buddhists in the U.S. slightly outnumber Muslims.
  • Mormons outnumber Buddhists and Muslims combined.
Here is perhaps the biggest surprise to me:
The survey finds that constant movement characterizes the American religious marketplace, as every major religious group is simultaneously gaining and losing adherents. Those that are growing as a result of religious change are simply gaining new members at a faster rate than they are losing members. Conversely, those that are declining in number because of religious change simply are not attracting enough new members to offset the number of adherents who are leaving those particular faiths.
I know that some religions require that a nonbeliever marrying an adherent must adopt that religion, but surely that doesn't explain all the comings and goings. I always thought that faith was a core part of one's identity, but evidently that is not always the case.

Jay Leno sued over Romney joke

Per the BBC:
A lawsuit has been filed in California suing US comedian Jay Leno for what it calls "racist" comments on the Sikh shrine, the Golden Temple of Amritsar.
Why on earth are Sikhs so up in arms about this? I have no use for Leno as a comedian, but I'm at a loss to see how his joke was racist. The joke's target was Romney, or at least, his great wealth. Nothing I've read suggests that Leno mentioned Sikhs or identified the image as the Golden Temple.

The charge of racism concerning what to most of us was a middling joke is disturbingly reminiscent of the extreme touchiness exhibited by some Muslims, and as I argued with reference to those hypersensitive Muslims, there is a point beyond which catering to a religious group's sensitivity about what is sacred is actively harmful to a truly pluralistic, peaceful and denominationally neutral polity.

This lawsuit seems irrational and looks like an abuse of the legal system. And this remark comes from somebody who acknowledges the appalling resurgence of bigotry as a socially acceptable attitude in the U.S. If Sikhs far and wide agree that Leno was racist, they have a lot of work to do to explain why to the rest of us.

Wednesday, January 25, 2012

So, what are the ties that bind?

I have been reading Michael Sandel's 1996 book Democracy's Discontent. Its thesis is that the polarized and non-nuanced political culture in the U.S. works against any notion of civic responsibility, and thus, any hope of good government. In particular, the trend of Supreme Court decisions over the last 150 years has been away from applying the Constitution, and in particular the Bill of Rights, exclusively to the federal government: the Court instead has found that Constitutional protections apply to all levels of government. What this has engendered is what Sandel pithily, if a bit misleadingly on occasion, calls an emphasis of the right over the good: that is, we as a nation prioritize individual rights over our collective good (as expressed through a national government).

So it was with a certain weariness that I read of New Hampshire's new law giving parents "greater control over course materials taught in school".
Both the House and Senate voted to override the governor's veto of HB 542, which requires school districts to adopt policies to allow "an exception to specific course material based on a parent's or legal guardian's determination that the material is objectionable."
Those who agree with the bill, including some commenters to this article, say that it's only right that parents should be the final arbiter of what their kids are taught.

As one who "desires nothing more than just the ordinary chance to live exactly as he likes and do precisely what he wants" (to quote Alan Jay Lerner from My Fair Lady), I find the principle of individual choice to be enormously compelling. And yet, consider where New Hampshire's law leads us.

What led to New Hampshire's law was one parent's disapproval of a book taught in a personal-finance course. I can imagine any number of other books to which various people might object. So can you.

I can also imagine whole subjects that might be taboo. You don't have to resort to citing the stereotypical fundamentalist Christian straw man who might object to a biology course on the grounds that it might mention evolution: a committed member of People for the Ethical Treatment of Animals (PETA) could object to the same course on the grounds that dissection is repugnant.

What is a public school -- what is a public education system -- supposed to teach?

That's not an exasperated question. That's a genuine query that has two further, deeper queries behind it:

What can we agree that kids need to learn?

and

Is a child's education supposed to prepare her to be a good citizen?

I don't have answers to these questions. They're more than ripe for answering, though. They also touch on a central issue in Sandel's book: what ties us together as a nation?

The trouble with the primacy of individual rights in our political culture is that it does absolutely nothing to promote any sense of commonality among us. I look forward to seeing what recommendations Sandel has for restoring a sense of unity, or at least comity, among us.

(On a totally separate note, boo to the Union Leader for failing to provide a byline to the linked article. It was short, but it didn't write itself.)

Wednesday, January 18, 2012

Paula Deen: a study in bad taste

I could easily learn to love a diet consisting exclusively of Paula Deen's recipes. I've watched more than a few episodes of her shows, Paula's Home Cooking and Paula's Best Dishes, so I know a lot of her go-to ingredients: mayonnaise, butter, cream, cooking oils of all types. I also know that, contrary to what a lot of you probably think, she occasionally has prepared lighter and more healthful fare.

I won't deny that I occasionally thought about trying out some of her less virtuous recipes, but I could practically feel my arteries hardening at the prospect. No, I watched not to plan meals, but just to gaze in fascinated horror at the delicious-looking yet health-devastating results. And as for those lighter, more healthful dishes I mentioned, I changed the channel when she prepared them: they just weren't interesting enough. If she and her producers deliberately slanted her shows in a high-caloric, heart-stopping direction, I imagine it was at least in part because of viewers like me.

So the fact that Deen publicly acknowledged she has Type 2 diabetes didn't surprise me. How could it? The food she is so well-known for showcasing practically invited the disease.

Her announcement, though, isn't what prompted me to blog this piece: a celebrity suffering from a disease isn't usually edifying or entertaining. What made Deen's story interesting (to use a neutral term) was how long she waited between the time she found out she had diabetes and the time she decided to announce it. That interval was three long years, during which time she wasn't in the hospital or in seclusion: she was making more episodes of her TV shows and otherwise continuing to play up the kind of cooking and lifestyle that undoubtedly played a major role in her becoming diabetic.

The question is, how did those three years, not to mention the six or seven before that (during which time her first show was in first-run production), affect her fans' health?

If you're feeling charitable, you can give her a pass on those earlier six or seven years, arguing that until she was diagnosed she might genuinely have been unaware of how her diet was affecting her own health, to say nothing of her viewers'. It's hard to believe her doctor didn't have words with her on that subject long before the diagnosis. But even if you give her the benefit of the doubt, how do you explain her silence for the last three years?

Here's how Deen herself answered Al Roker, who asked that very question:
"I intentionally did it, Al," Deen said. "I said 'I'm going to keep this close to my chest for the time being,' because I had to figure things out in my own head. I could have walked out said, 'Hey y'all, I have been diagnosed with Type 2 Diabetes,' and walked away. I had nothing to give. I wanted to bring something to the table ... I did not [want to] let diabetes stand in the way of enjoying my life."

When asked directly if her eating habits led to her diagnosis, Deen demurred.

"Certainly, that is part of the puzzle," Deen said. "But there are other things that can lead to diabetes."
Figure what out in her own head? Bring what to the table?

And if diet "is part of the puzzle", what stopped her from mentioning that to her fans?

Deen's excuse for her three-year silence is so fishy, it reeks. But what really frosted me were remarks quoted in Julia Moskin's article in the New York Times.
“I’ve always preached moderation,” she said. “I don’t blame myself.”
Is she kidding?

Deen's show was called Paula's Home Cooking. You know, cooking you do at home. You could be forgiven for assuming she meant it to be everyday cooking. Furthermore, in a New York Post piece from last August, she fired back at Anthony Bourdain's sour remarks (he called Deen "the most dangerous person in America") by saying:
You know, not everybody can afford to pay $58 for prime rib or $650 for a bottle of wine. My friends and I cook for regular families who worry about feeding their kids and paying the bills.
It's hard not to conclude that Deen intended Paula's Home Cooking, at least, to be a guide to everyday meals for "regular families". Even if you didn't intend to deep-fry sweet potatoes that night, the show certainly implied that deep-fat frying, and liberal doses of mayo, and copious amounts of beef, and so on, were okay in your diet. You might gain a little weight, but look at how happy Paula was making and serving the food! You and your family would be happy, too!

Trying to distance herself from the foreseeable consequences of her relentless promotion of her style of cooking is bad enough. To add insult to injury (or perhaps a better metaphor would be, to add frosting to the cake), though, at the same time Deen revealed she suffered from diabetes, she announced that she was now a paid spokesperson for a diabetes medication. According to the Times article, Deen cut for herself and her two sons
... a multiplatform endorsement deal with Novo Nordisk, the Danish pharmaceutical company that makes Victoza, a noninsulin injectable diabetes medication that she began promoting on Tuesday morning.
Let's run down her options.

She could have concealed her diabetes entirely.

She could have announced that she was diabetic, that she'd be revisiting her recipes in that light (which she has indeed suggested she'll be doing), and left it at that.

But she didn't. Instead, she waited three years to announce her diabetes, and in the same breath unveiled herself as a shill for a diabetes drug.

The unavoidable conclusion is that Deen deliberately concealed her diagnosis until she was in a position to leverage it for her own benefit.

She refused even to risk damaging her brand until she was ready to spin the news to her profit. And that profit will come right out of the pockets of those whose health suffered by being her fans. Yep: first they paid for her cookbooks, now they'll shell out for her drug. Talk about getting it coming and going.

Deen should be ashamed of herself.

Chris Dodd on Blackout Day

Former senator Christopher J. Dodd, now chairman and CEO of the Motion Picture Association of America, issued a statement in response to the Web Blackout Day protest. Blackout Day is an effort by some of the World Wide Web community to call attention to badly written legislation pending in the U.S. Congress that could, if abused, permit censorship of sites alleged to permit access to pirated content. (Visit fightforthefuture.org for more info on the legislation, which you might have heard about under its acronyms "SOPA" and "PIPA" or "Protect IP".)

Dodd's angry.
Only days after the White House and chief sponsors of the legislation responded to the major concern expressed by opponents and then called for all parties to work cooperatively together, some technology business interests are resorting to stunts that punish their users or turn them into their corporate pawns, rather than coming to the table to find solutions to a problem that all now seem to agree is very real and damaging.
Who are you, Mr. Voice Of The MPAA, to be calling anybody a "corporate pawn"?

Oh, and to which "chief sponsors of the legislation" do you refer? I haven't heard squat from Rep. Lamar Smith, chief sponsor of the House bill. Sen. Patrick Leahy has expressed only condescension for his Senate bill's detractors:
Perhaps if these companies would participate constructively, they could point to what in the actual legislation they contend threatens their websites, and then we could dispel their misunderstandings. That is what debate on legislation is intended to do, to fine-tune the bill to confront the problem of stealing while protecting against unintended consequences.
Of course, he ignores Congress' refusal to hear from the bill's opponents (see below).

But again, Chris, do enlighten us: before Sen. Marco Rubio and a few others only today (today!) expressed their reservations or explicitly reversed their prior support for these bills, who, exactly, "responded" to opponents?

Congress so far has done exactly jack and shit to educate itself on the potential technological and legal fallout of the pending legislation, as Joshua Kopstein reported in December. His account vividly portrays elected representatives as not merely ignorant of the Internet's technological underpinnings (forgivable), but as totally uninterested in becoming informed about them (not forgivable). In the face of a resolute refusal to understand just what the hell they were legislating into existence, it was not merely appropriate, but absolutely necessary, to throw cold water into our elected officials' collective face. Thus, Blackout Day. And without Blackout Day, do you think Sen. Rubio or anyone else would have taken a second look? You'd think, for instance, that Sen. Rubio would have taken the time to understand the bill before he decided to co-sponsor it -- and yet, here he is, withdrawing his support. Evidently it took that dash of cold water.

Dodd continues:
It is an irresponsible response and a disservice to people who rely on them for information and use their services. It is also an abuse of power given the freedoms these companies enjoy in the marketplace today. It’s a dangerous and troubling development when the platforms that serve as gateways to information intentionally skew the facts to incite their users in order to further their corporate interests.
As opposed to content providers intentionally skewing the facts to incite their users in order to further their corporate interests?

Please. Sen. Dodd, that was a particularly foolish thing to say. You don't represent some little guy about to be squashed by Big Bad Corporate Interests: you represent Big Bad Corporate Interests who have a well-publicized history of excessive zeal in squashing actual little guys.

The fact that Dodd and the MPAA have stubbornly refused to climb down from their haughty perch and acknowledge the fatal flaws in the legislation they played such a big part in crafting shows how absolutely untrustworthy they are on this subject. Whether they're as ignorant as some of our elected representatives or they're simply evil, they cannot be trusted to find solutions to the problem of unauthorized access to content by themselves.

As Fight for the Future's educational video reminds us, Big Content is the same bunch that fought to make the videocassette recorder and the MP3 player illegal. Yes, creators need to be paid. However, they can't be allowed to run the whole show. To quote from the video:
How far will they take all this? The answer at this point is obvious:

As far as we'll let them.
Until Dodd and the MPAA prove that they give two figs and a damn about anyone else in the fight to protect Big Content (and we're talking about big content providers, not little folk), they have no moral standing to protest or to lecture.

[UPDATE: Changed reference from "Lamar Alexander" to the correct "Lamar Smith"; my apologies to Sen. Alexander.]

Tuesday, January 17, 2012

Bill Kling

I had no idea who Bill Kling was (it turns out he's the founder and president emeritus of the American Public Media Group), but this interview by Adam Bryant for the New York Times made me sit up and take notice.
You have to be willing to go into a room and say, “Why can’t this happen?” And then have someone look at you and say, “That’s the dumbest question anybody ever asked.”

Even though you are the C.E.O., you have to allow and encourage that kind of feedback. Because you can sink a company if you come in with a load of ideas and innovation and creativity that’s bigger than the company can carry. So you’ve got to have people coming back and saying, “We know that,” or “We understand where you’re going with it, but it’s not something we can do at this point.”
And:
A mentor of mine taught me that every perspective is additive, because every person sitting in a room is looking at things differently. Each of them has a different perspective. They come from a different way of thinking and different experiences. And their collective perspective gives you a better outcome. So you have to value the perspectives and try to organize those perspectives in some useful way that lets you go forward. Anybody who tells you that they can do it all themselves needs an ego adjustment.
Kling generally evinces a perceptiveness born of a refreshing lack of egocentrism. This is the kind of guy for whom I'd be happy to work. Check out what he has to say: it'll be well worth your while. And if you run a company, think about incorporating his wisdom into your own management practice.

Monday, January 16, 2012

More on the ring that stopped the show

I thought it was merely a funny story, albeit mortifying for the poor fellow at its heart: during a quiet passage of a symphony, a front-row patron's phone went off, echoing throughout the hall and causing the conductor to halt the performance until the offending phone was silenced.

The details are key:
  • The phone was an iPhone.
  • The owner, who is anonymous in the New York Times account everyone has read, said he had been issued the phone only the day before by his employer.
  • The owner had hit the "Silent" button before the concert started, and believed this would keep the phone, well, silent.

It turns out that, first of all, I have to 'fess up to totally bogus technical advice:
... he seems to have been laboring under the understandable misconception that merely locking the screen turns the iPhone "off". (For future reference, the easiest way to silence your iPhone is to use the "vibrate" button and to keep it out of contact with resonant surfaces like hard tabletops.)
In fact, current iPhones have a Ring/Silent button, so I was completely off base suggesting he had merely locked the screen. I still use an original iPhone, so I am unacquainted with more recent models' hardware and software (the original iPhone cannot run iOS 4 or later). I therefore should never have piped up with my unsolicited and smug advice. I thought about deleting that part of the paragraph, in fact, but finally let it stand. Ah me: hindsight is always 20/20.

Anyway, it seems that the iPhone's behavior is more subtle than I thought. The Ring/Silent button, according to Apple's developer guidelines, is meant to suppress sounds the user didn't "expect or explicitly request, such as ringtones or new message sounds". The phone rang because the fellow had (apparently by accident) set an alarm earlier in the day. An alarm is deemed an intentional action, and is therefore not subject to the Ring/Silent button's "silent" setting.

Daring Fireball's John Gruber argued that Apple's design decision, to allow explicit user actions to override the "silent" setting, was correct. His rationale heavily favors those who use the iPhone as an alarm clock:
... if the mute switch silenced everything, there’d be thousands of people oversleeping every single day because they went to bed the night before unaware that the phone was still in silent mode.
It seems to me this is the kind of mistake you make only once, assuming you figure out the root cause. Still, there's one vote for the status quo.

Andy Ihnatko has a lengthy counterargument, grounded in the principle of least surprise.
The key question to ask is “When the user slides the switch to ‘Mute’, what does he or she think is going to happen?” They’re most likely to think that their iPhone will be completely silent until they flip that switch back.
Ihnatko concedes that Gruber's unsuspecting sleeper will be disagreeably surprised when she wakes up late the next morning, but says that this bad outcome is better than what happened to the fellow at the concert.
My philosophy is “It’s much better to be upset with yourself for having done something stupid than to be upset with a device that made the wrong decision on its own initiative.” Every time I screw up and take responsibility for my own stupidity, it’s another Pavlovian stimulus that encourages smarter future behavior. If I forgot to unmute my phone after a movie, I’m a dumbass. But if my iPhone makes noise during the movie despite the fact that I’d deliberately chosen to silence it, I can only conclude that the dumbasses in this equation reside about 3,000 miles west of here.
Marco Arment, in turn, agrees with Gruber. He boils down the situation to its essence (emphases in the original):
The user told the iPhone to make noise by either scheduling an alarm or initiating an obviously noise-playing feature in an app.

The user also told the iPhone to be silent with the switch on the side.

The user has issued conflicting commands, and the iPhone can’t obey both.

...

When implementing the Mute switch, Apple had to decide which of a user’s conflicting commands to obey, and they chose the behavior that they believed would make sense to the most people in the most situations.
Arment is the one who cited and quoted the Apple developer guidelines I mentioned above. They're a fine explanation, but not entirely appropriate to what is essentially a user-experience question. The fact that the developer guidelines explain Apple's rationale means exactly squat to the vast majority of iPhone users, who, after all, aren't developers.

It turns out, though, that Apple's manual for the current iPhone running iOS 5 sets forth the behavior (though without giving a rationale for it) on page 146:
When set to silent, iPhone doesn’t play any ring, alert, or effects sounds. It does, however, play Clock alarms.
Not that the humiliated patron of the arts who started this back-and-forth would likely have seen this: as noted, he had only been issued the phone the day before. He would have to have been an idle (or obsessive) fellow indeed to have made it through the hundreds of pages of the user guide to find this sentence in only a day, or even the similar one on page 49. And if he had been the kind of person to have read the entire user manual that quickly, he also would have realized he had set a bogus alarm, and would have cancelled it prior to the concert.
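Arment's framing, two conflicting commands of which only one can win, is easy to capture in a toy decision table. The following sketch is purely illustrative (it is obviously not Apple's implementation, and the sound-category names are my own invention); it just encodes the documented rule that "silent" suppresses rings, alerts, and effect sounds but lets Clock alarms through:

```python
# Toy model of the documented iPhone Ring/Silent behavior.
# Category names here are hypothetical labels, not Apple API values.

# Sounds the user did not "expect or explicitly request"
UNEXPECTED = {"ringtone", "new_message_alert", "effect"}
# Sounds treated as explicit user requests
EXPECTED = {"clock_alarm"}

def should_play(sound: str, silent_switch_on: bool) -> bool:
    """Return True if the sound is audible given the switch position."""
    if not silent_switch_on:
        return True              # switch set to "ring": everything plays
    return sound in EXPECTED     # "silent": only explicit requests play
```

Under Ihnatko's rule, by contrast, `should_play` would simply return `not silent_switch_on`, and the patron's forgotten alarm would have stayed silent.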

Ihnatko's position makes more sense to me than Apple's because it's so easily stated: "Silent" means silent, period. It also emphasizes the primacy of a hardware control over software ones, which is intuitively sensible to me. There's nothing subtle, nor should there be, about a physical button: it should be a big, inarguable override to whatever you might have done in software. The reason is, you're likely to use the hardware control when a situation requires enforcing swift and sure behavior on your phone. When you enter the symphony hall, you're concentrating on the concert ahead, not on the weekly reminder to take out the trash that you set months ago which will, it turns out, go off during the concert. Can you imagine instead having to unlock your phone and search it for all the potential noise-making triggers you might have set? Madness.

I'm sure there is or will be a bug filed about this incident with Apple. I only wish I could see the discussion among the engineering staff.

Sunday, January 15, 2012

Freemason conspiracy

I caught a documentary weighing the claims of conspiracy theorists that Freemasons planted hints of their plans to take over the world (again, Pinky!) on the back of the dollar bill. Freemasons, the claim goes, don't want the rest of us to understand the meaning of the symbols because they don't want us to know they're ushering in their vaunted New World Order.

I don't want to spoil anyone's fun, but ...

If you wanted to keep a secret, would you embed clues to it in a piece of paper that hundreds of millions see every day?

Saturday, January 14, 2012

"The Rise of the New Groupthink", Susan Cain

Susan Cain's piece in the New York Times Sunday Review argues that American society is "in thrall" to the notion that forced gregariousness is key to "creativity and achievement". "Most of us now work in teams, in offices without walls, for managers who prize people skills above all. Lone geniuses are out. Collaboration is in."

I haven't worked in a non-software development business environment in years, so I don't know whether Cain is accurately portraying how the world works, or rather, how the world is trying to get work done. I definitely agree with the following point, though.
But there’s a problem with this view. Research strongly suggests that people are more creative when they enjoy privacy and freedom from interruption. And the most spectacularly creative people in many fields are often introverted, according to studies by the psychologists Mihaly Csikszentmihalyi and Gregory Feist. They’re extroverted enough to exchange and advance ideas, but see themselves as independent and individualistic. They’re not joiners by nature.
I can't stress enough the importance of giving people with genuinely creative work to do the time and space to do that work by themselves. A person tasked with bringing something new into the world through the power of his brain cannot succeed if he's constantly forced to interact with others.

Cain's other point, about the most creative people often being introverted, is a point that is slowly dawning on the world. I was unaware of that truth until I started reading The Introvert Advantage: How to Thrive in an Extrovert World by Marti Olsen Laney. I don't often link to books, but this one has had an enormously liberating effect on me; it explained why I have always reacted so negatively to certain situations, like forced socializing and team-building exercises, and has allowed me to consider different strategies of responding. I'm not, as Cain put it, a joiner by nature. (To be clear, let me add that I'm not among the most creative people: I'm just introverted.)

As I hinted above, the software development environments in which I've worked have all acknowledged programmers' absolute need for extended intervals of solitary work. I haven't worked for many different employers, but I can't imagine "the new groupthink" being successfully employed anywhere software is being written. Meetings and collaborations are necessary, of course, especially at the start of projects when goals are being set and no clear solution has been settled upon. After all have agreed on how a project is to be carried out, though, meetings are necessary evils at best, useful only to ensure that people are still on target and haven't created problems for themselves or others. Otherwise, software developers need to be as free as they can to focus.

After citing studies that show collaboration to be less effective as a creative strategy than previously thought, Cain notes the intriguing counterexample provided by the Internet:
The one important exception to this dismal record is electronic brainstorming, where large groups outperform individuals; and the larger the group the better. The protection of the screen mitigates many problems of group work. This is why the Internet has yielded such wondrous collective creations. Marcel Proust called reading a “miracle of communication in the midst of solitude,” and that’s what the Internet is, too. It’s a place where we can be alone together — and this is precisely what gives it power.
As an explanation, this is mere hand-waving (and I suspect Proust did not mean that reading constituted mutual communication such as we're discussing). My own experience suggests a concrete explanation for why certain types of electronic communication work well for collaboration. As I noted several weeks ago, communications technologies come in two flavors:
The phone and other synchronous communication media, like instant messaging, are fundamentally unwelcome distractions. The ring of the phone or the chime of the new message is an imperative that must be responded to right now, even if the response is to let the call go to voicemail or to ignore the incoming message. Such distractions break one's concentration and can be significant impediments to progress.

Email represents a good tradeoff, in principle, between the sender's and receiver's priorities. The sender generally wants an answer sooner rather than later, but some respect is given to the receiver's need to get real work done.
Email is an asynchronous communication technology, as is Usenet news. Asynchronous electronic communication in general strikes the same balance between the sender's and receiver's needs as email. Many Internet communications protocols are asynchronous, and that's why they're so well-suited to collaborations. (It helps that those who collaborate via the 'net have been trained, by long use of the Internet, to expect that collaboration should work in this way.)
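The distinction between synchronous and asynchronous communication can be sketched in code. This is my own toy illustration, not anything from the post or any real protocol: a "phone call" forces the receiver to handle the message immediately, while an "email" lands in an inbox the receiver drains only when a stretch of focused work is done.

```python
import queue

# Synchronous: the "phone call" demands the receiver's attention right now,
# interrupting whatever the receiver was doing.
def phone_call(handler, message):
    return handler(message)

# Asynchronous: the "email" is dropped in an inbox; the sender returns
# immediately, and the receiver reads on its own schedule.
inbox = queue.Queue()

def send_email(message):
    inbox.put(message)  # sender does not wait for a response

def check_inbox(handler):
    """Drain the inbox whenever the receiver chooses to."""
    replies = []
    while True:
        try:
            msg = inbox.get_nowait()
        except queue.Empty:
            break
        replies.append(handler(msg))
    return replies

# Usage: two messages arrive while the receiver works uninterrupted,
# then both are handled together at a convenient moment.
send_email("status update?")
send_email("meeting moved to 3pm")
print(check_inbox(lambda m: f"read: {m}"))
```

The sender's and receiver's priorities are decoupled exactly as described above: `send_email` never blocks, and `check_inbox` runs only when the receiver decides the time is right.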

Extroverts, call your meetings: we know you love them, and we even understand that you benefit from them. Understand, though, that we introverts don't. Give us the space, and time, we need to be effective (and happy, for that matter).

Friday, January 13, 2012

The ring that stopped the show

Generally I don't have sympathy for those whose cell phones ring out in the wrong places -- during a movie, say. But I feel for the fellow whose phone went off during Tuesday's concert by the New York Philharmonic. According to Daniel J. Wakin's account in the New York Times:
The unmistakably jarring sound of an iPhone marimba ring interrupted the soft and spiritual final measures of Mahler’s Symphony No. 9 at the New York Philharmonic on Tuesday night. The conductor, Alan Gilbert, did something almost unheard-of in a concert hall: He stopped the performance.
Gilbert felt obliged to stop the music because the ring kept going for some time. Why didn't the offending phone's owner, referred to in the article as "Patron X", kill the noise immediately?
Patron X said he had no idea he was the culprit. He said his company replaced his BlackBerry with an iPhone the day before the concert. He said he made sure to turn it off before the concert, not realizing that the alarm clock had accidentally been set and would sound even if the phone was in silent mode.

“I didn’t even know phones came with alarms,” the man said.
(Patron X's best guess is that he set an alarm earlier in the day without realizing it.)

I believe him. It's easy to imagine being the poor schmuck handed a brand-new gadget and having to use it before becoming fully conversant with its behavior. In his case, he seems to have been laboring under the understandable misconception that merely locking the screen turns the iPhone "off". (For future reference, the easiest way to silence your iPhone is to use the "vibrate" button and to keep it out of contact with resonant surfaces like hard tabletops.)

I would have been ticked off if I had been in the audience. I wasn't, though, so I'm free to extend some long-distance sympathy to you, Patron X.

[UPDATE: my "for future reference" advice was totally off-base, and the issue is more complicated than I thought. See my follow-up post for the details.]

Tuesday, January 10, 2012

Ricky Williams

Not being a sports fan, I'm generally mystified and bored by sports stories. Every once in a while, though, I run across something special. This time, it's Les Carpenter's thoughtful piece on Ricky Williams for Yahoo! Sports. (I'm as shocked as you are that a Yahoo!-linked piece is worth reading.)

Williams is almost a decade older than most of his Baltimore Ravens teammates, and the age difference tells. Nevertheless, he still loves the game, so being the resident team loner isn't the burden you might think it is.
On another team, one with greater expectations for him, he might be uncomfortable. But the Ravens’ locker room is always carefree and Harbaugh has a boyish enthusiasm. Everybody seems secure. The coaches don’t yell much. And that fits Williams at this point in his life. Here he is content to fade into the background.

“I do feel like a loner but I think it’s because I look at things differently than other people,” he says. “There’s a quote out of Carl Jung’s autobiography and … he’s talking about when he was a kid and he saw a pattern when he was a kid of aloneness and separateness because he sees things that most people don’t and he wants to talk about them, but most people don’t.

“And so I kind of feel like that. When you look at the world like that, especially as a football player, chances are you aren’t going to find a lot of people to relate to.”
I love the idea that a self-described loner can carve out a niche for himself in as extravagantly extroverted a career as pro football. It gives introverts everywhere a boost to know that such a thing is possible.

I wish Williams all the best in his football career and in his post-football life, whenever that starts.

And it's a great pleasure to read pieces as well-written as this one, so thanks to you, Mr. Carpenter.

Thursday, January 5, 2012

Style icon

The New York Times has a profile of what I can only describe as "style icon" Tyler Brûlé, magazine publisher, radio station founder, and boutique owner.
The common thread behind these disparate ventures is Mr. Brûlé himself, who embodies the border-agnostic sophisticate whom the Monocle brand is built around. His globe-trotting persona (cocktails-with-Danish-diplomats intellectualism, sleeper-seat jaunts to Taipei) has inspired legions of followers, who hang on his oracular pronouncements on what’s next.
Nobody forced me to read this. Nobody is forcing me to read his magazines (and I haven't) or buy his favored goods (again, I haven't), a sampling of which the Times helpfully lists in a sidebar accompanying the article.

And yet, Brûlé bugs the hell out of me.

When there is so much wrong with the world -- when so many lack even basic necessities like food and water -- it seems obscene to give men like him such attention.

"Fashion" and "style" of the sort the Times fetishizes in this piece are such insipid and inappropriately frivolous preoccupations these days, I'd be embarrassed if I gave a damn.