Wednesday, November 30, 2011

Writing has changed

In an article about the late diplomat and foreign policy expert George Kennan, Todd Purdum quotes from a 1992 entry in Kennan's diary:
“The dispatch of American armed forces to a seat of operations in a place far from our own shores, and this for what is actually a major police action in another country and in a situation where no defensible American interest is involved—this, obviously, is something that the Founding Fathers of this country never envisaged or would ever have approved. If this is in the American tradition, then it is a very recent tradition.”
Forget the political content of that statement and consider only the elegance of the style. Can you imagine reading anything this gracefully expressed in any mainstream publication today, let alone in a diary (whether or not intended for eventual public view)?

Kennan was not a professional writer (not primarily, anyway). His writing can fairly be considered representative of how well-educated Americans of his generation expressed themselves on paper.

Forgive me if I lament how badly the standard for good writing has slipped since Kennan's time.

Eliminating email

AtoS, a firm of IT specialists (or, as they like to style themselves, "business technologists"), is looking to eliminate email in the workplace. At least, Chairman and CEO Thierry Breton is making this his goal. From the company's zero-email manifesto:
The volume of emails we send and receive is unsustainable for business. It is estimated that managers spend between 5 and 20 hours a week just reading and writing emails. Furthermore, they are already using social media networking more and spend around 25 per cent of their time searching for information.
I'm sympathetic to this point, but AtoS's goal of using social media and collaboration tools to replace email entirely is misguided.

First, it would help to know if the increased use of "social media networking" is for business purposes exclusively, or if it includes workers' personal social media presences. If the latter is the case, then it's wrong to cite this point in favor of moving further in the direction of social media for workplace communication.

More importantly, though, if your job is in sales or some other area that requires instant communication, social media and the old standby, the telephone, might be good replacements. However, one reason so many of us no longer use the phone for routine communication is that our jobs do not require instant communication. Quite the contrary: our jobs require the ability to concentrate on whatever problem it is we're solving. The phone and other synchronous communication media, like instant messaging, are fundamentally unwelcome distractions. The ring of the phone or the chime of the new message is an imperative that must be responded to right now, even if the response is to let the call go to voicemail or to ignore the incoming message. Such distractions break one's concentration and can be significant impediments to progress.

Email represents a good tradeoff, in principle, between the sender's and receiver's priorities. The sender generally wants an answer sooner rather than later, but some respect is given to the receiver's need to get real work done. The problem is that some people don't understand when not to send email. Sometimes covering your ass by passing along barely relevant information isn't helpful. Sometimes you need to refrain from sending that hilarious but totally useless rant to your colleagues. Sometimes you should consider if there's another way to get your question answered than by bothering the person who's supposed to be getting that critical project done.

Breton and those who feel as he does probably see more useless email than front-line workers because management always generates and receives more useless administrivia than people who actually make products or render services. Making "zero email" a goal, though, doesn't address the fundamental problem that some communication simply isn't necessary. Teach people to say only what they need to say, rather than forcing everyone to make communication rather than real work their primary focus.

Tuesday, November 29, 2011

Wise words from JWZ

JWZ (or jwz, as he usually spells it) is Jamie W. Zawinski, a self-described winner of "the Netscape Startup Lottery" and semi-legendary engineer. In his blog entry "Watch a VC use my name to sell a con", jwz takes Michael Arrington to task for
... trying to make the point that the only path to success in the software industry is to work insane hours, sleep under your desk, and give up your one and only youth, and if you don't do that, you're a pussy. He's using my words to try and back up that thesis.
Read the blog entry to find out the far more sensible advice jwz has for young developers today. Along the way he explains why venture capitalists like Arrington want you to believe that working insane hours is the path to success.
When a VC tells you what's good for you, check your wallet, then count your fingers.
(Link courtesy of Hacker News.)

Monday, November 28, 2011

The cult of ignorance

Not sure why this made it onto John Gruber's radar, but Isaac Asimov's brief quote about the "cult of ignorance" in the U.S. is all too appropriate to our time.

Barney Miller on DVD

Finally, somebody's releasing all of Barney Miller on DVD.

Sony evidently didn't see enough of a market for the 1970s TV cop comedy, and after releasing only the first three seasons, announced it wouldn't release any more. That was very bad news for those of us who have somewhat clearer memories of it than the New York Times' Mike Hale, who could only muster the faint praise, "But even though the show was a comedy, its mood was informed by the darkness of the times and the troubles of New York City, where it was set."

Hale's right, but doesn't go far enough. The joy of Barney Miller is that it made that darkness hilarious. The show reveled in confronting the detectives of the 12th Precinct with absurdity after absurdity, all rooted in the difficulty ordinary people have seeing eye to eye in the pressure cooker of New York City. For the most part, the detectives were jaded enough merely to be sardonically amused by their unwilling guests. The witty banter was generally low-key, a refreshing contrast to the dopey histrionics of other shows like Happy Days, the 900-lb. gorilla of ABC's comedy lineup at the time. Even the staging was relaxed, with more long, uncut sequences than usual in sitcoms, giving it the feel of a play. Barney Miller, a little like another contemporary, All in the Family, also wasn't shy about using silence to build tension, both in service of a joke and occasionally to highlight a more dramatic moment. Unlike M*A*S*H, however, Barney Miller never forsook comedy for preachy drama, remaining smart and funny right to the end.

So it's great news that Shout! Factory is making all eight seasons available in a 25-disc box set, even if some of us who own the earlier Sony-issued seasons are feeling bitter toward Sony for essentially making us pay twice. It's worth it to be able to watch all the hilarities at the ol' 1-2 any time we like. Raise a mug of Yemana's legendarily bad coffee and be happy.

Brownback vs. Sullivan

Another politician, another imbroglio. This time fate dragged Kansas governor Sam Brownback into the spotlight when Kansas high school student Emma Sullivan tweeted a disparaging remark about the governor. Brownback's staff spotted the tweet and made enough of a stink about it that Sullivan's principal hauled her into his office and ordered her to write an apology. ABC News has the story.

It never seemed to occur either to the principal or to Brownback's staff that the story might not reflect well on them if it became known to a larger audience. As it did, of course.

Ironically, I think this will turn out to be a net win for Brownback. His damage control was perfect:
“My staff over-reacted to this tweet, and for that I apologize. Freedom of speech is among our most treasured freedoms,” Brownback said in a statement.
The notion of punishing an unknown teenager for a rude tweet -- and it was merely rude, not even insightful -- is grade-A stupid, so stupid that it's hard to imagine any seasoned politician countenancing the effort. My guess is that Brownback, no novice pol, genuinely believes his staff overreacted. Whether or not I'm right, though, Brownback looks magnanimous, albeit at his staff's expense. The net boon to his reputation is probably great enough that nobody on the staff will lose his or her job.

However, I'd like to know more about Karl Krawiak, Sullivan's principal. Here's an excerpt from an early account of the incident in the Wichita Eagle:
The principal “laid into me about how this was unacceptable and an embarrassment,” Sullivan said. “He said I had created this huge controversy and everyone was up in arms about it … and now he had to do damage control.

“I’m mainly shocked that they would even see that tweet and be concerned about me,” she said. “I just honestly feel they’re making a lot bigger deal out of it than it actually was.”
How is it that Sullivan, 18, is so much wiser than Krawiak, who one assumes is older than she?

Somebody decided it was better to cater to the governor's clueless and oversensitive staff than to exercise a little common sense and say, "This isn't a big deal." Was it Krawiak or somebody higher in the school district? That is, was Krawiak threatened with disciplinary action by his superiors, or does he come by his servility naturally?

Thursday, November 24, 2011

Killing the golden goose

The golden goose, in this case, is Web content people want to consume, and what's killing it is the overweight and highly irritating context of ads and other noncontributory nonsense surrounding it. I've been seeing more pieces about this since Brent Simmons' The Pummeling Pages started heating up the blogosphere. Rian van der Merwe's "Please let this not be the future of reading on the web" echoes Simmons' complaints and provides examples and analysis of a couple of egregiously bad pages (also read the comments to van der Merwe's piece for more analysis).

What the mainstream publishing industry's presence on the Web has trained me to do is to be expert at ignoring its ads. Anything that flashes, bounces, or otherwise behaves like a five-year-old desperate for the teacher's permission to go to the bathroom simply doesn't register with my brain. If something other than legitimate content obscures the whole page, boom! -- the page goes.

Is that what you advertisers want?

Of course, at home I have broadband, so the considerable additional bandwidth required for this absurd barrage of advertising isn't too burdensome (yet). Wandering about outside, though, is another matter. As the owner of an older phone that doesn't support 3G, I have learned that Web pages load only after interminable waits, nearly always because third-party ads take forever to load and the pages are designed not to display content until the ads have finished downloading. I all but completely avoid the Web when I'm stuck with the EDGE connection.

Is that what you advertisers want?

Here's a perceptive comment from someone claiming to work for "a big media company":
... the systems in place are difficult to remove. I tried to sell my bosses on the Deck (didn’t work, we have an in-house ad networks with thousands, if not millions already already invested, tough to change something when, from the purely bottom-line perspective, the status quo works for publishers). A membership system would require infrastructure changes and a total rethinking of the entire system. That’s not something that’s going to happen overnight. In short, the status quo works, it makes money. And that makes it very tough to convince publishers that something needs to change. Over the long run current practices are hurting publishers because they’re alienating their readers, but publishers are only looking at the bottom line, and, at the moment, they’re not seeing that hurt yet.
Greg Golebiewski responded and further illustrated the problem:
Back in 2008 when we started pitching publishers about other, better, ways of content monetization (including small or nano payments; they can work for large publishers as well), we received a unanimous “no way,” even though most of our interlocutors were well aware of the changes in the marketplace and willing to admit that advertising might not be enough to sustain their online presence. Still, they were rejecting anything other than ads. One of our contacts then, had the guts to explain why: publishing is run by advertising guys and they see the other streams of revenue as undermining their role. Most journalists are against direct payments as well. Imagine one or two popular columnists receiving all the tips or on-demand payments, and the rest of the writers close to nothing.
If you work in publishing and care more about content than ads, you might want to rethink working for a big publisher, assuming Golebiewski's right about who's running the show.

The trend of the advertising tail wagging the content dog is driving a lot of us away from the ad-supported Web. I, cheapskate extraordinaire, broke down and started paying for the New York Times. Other discriminating readers are going to look for alternative support systems like micropayments. We're looking to escape the torrent of ads -- not just because they're numerous, but because they consume ridiculous amounts of bandwidth and they actively interfere with the primary reason we're on the Web in the first place.

Publishers and advertisers both, wake up before you crush your golden goose under the weight of your ads. Find a better way, because for too many of us, this one isn't working.

Wednesday, November 23, 2011

Give us a better-written future

It amused me to stumble over a small nest of pieces wondering whether it's possible to resurrect the Star Trek franchise as a new TV series. It seems the whole megilla was kicked off by a pair of essays by Graeme McMillan, "Why Isn't There a New Star Trek TV Show Already?" and "Why Star Trek Might Not Work For Today's TV Execs". Susana Polo followed up with "Is Star Trek Unpalatable to the Television Industry's Modern Tastes?". Alyssa Rosenberg cited both Polo's and McMillan's pieces in her own "Would Star Trek Work On Television Today?", while Erik Kain built on Rosenberg's proposal in his "Making Star Trek for This Generation" and Alex Knapp "snarked at both of them" (Rosenberg and Kain) and came up with a proposal for three interlocking series in his blog entry "How to Reboot Star Trek for Modern TV". My goodness.

All of these writers seem to agree that nothing like Trek exists on TV today, and wonder whether the reason is that TV execs have decided the audience doesn't care for utopian futures. They also wonder if perhaps the broad tonal palette of all the series is perceived to be off-putting: that is, they wonder if modern TV series are expected to be all action, or all comic, or all dramatic, or ... well, you get the idea. And finally, they wonder if TV execs believe science fiction on TV only works if it has long story arcs, as the remake of Battlestar Galactica did.

These folks start with the premise that Trek as a whole, or if pressed one particular Trek series (though they can't agree which), has been good television. So good, in fact, that TV simply can't figure out how to market it.

I'm sorry: I have to laugh.

I've admitted liking the original Star Trek -- to being mildly obsessed by it at one time, in fact.

That doesn't mean I think Trek in any of its incarnations, and I've seen them all, was very good as a TV show. Quite the contrary. The writing and acting for the most part were substandard. Actually, that's being kind: the writing and the acting for the most part were wretched. Any time a Trek series wandered into the areas of character development or emotional conflict, the writing sank into the morass of melodrama and the acting followed. Patrick Stewart and John Billingsley are the only franchise regulars to have escaped with their dignity intact (and Stewart had more than his share of close calls).

I've never spent time with hardcore Trekkers, as they prefer to be called, but I will guess that their affection for all the series stems from the fundamental hopefulness of Gene Roddenberry's vision of the future, rather than from the episodes being great TV. I, too, have an affection for a hopeful, rather than a dystopian, future. The thing is, the Trek franchises only gave us adolescent portrayals of such a future. These days, even portrayals of adolescents are more emotionally complex and real than most Trek scripts.

My guess is that McMillan's first piece nailed the reasons no new Trek series are in the works: a concern that a series would weaken the newly-strong movie franchise, and uncertainty over who has the right to make a series at all (it's more complicated than you think; read McMillan's piece for details). That is, the reasons probably have little to do with the franchise being wrong in some way for contemporary TV audiences.

But whatever the reason(s) for the dearth of Trek, as far as I'm concerned, this Trek-less period is a good time for somebody else's vision to take hold.

Babylon 5 showed us an alternate and just as hopeful future back in the 1990s, but while J. M. Straczynski's series was exquisitely well-plotted, dialogue and acting were usually abysmal (always excepting Peter Jurasik's and Andreas Katsulas' superb performances). The Battlestar Galactica remake showed everyone how to make a space opera for the modern age, with engrossing writing and compelling acting, but it was a tad grimmer than most Trek fans would probably like.

Can't somebody produce a deep-space series that shows us a hopeful future while being written for adult sensibilities?

It's long past time that the Trek franchises' stranglehold on the public imagination was broken. Make a smart TV show I won't cringe to share with my friends. Show me a future I can actually believe in.

Food labels

It's an old blog entry (from early August 2011), but Anahad O'Connor's piece on serving sizes is still worth reading if you haven't thought much about how food labels can mislead.

Manufacturers of processed foods have to list the amounts of fat (generally broken down further into specific kinds of fat), cholesterol, sodium, protein, and a number of other -- what does one call them, "characteristics"? -- of their products. The nutritional facts are always listed on a per-serving basis, and therein lies the rub: the manufacturer gets to decide what size a serving is. Naturally, the way to make your not terribly healthful product look better is to make the serving size as small as you can. Critics have been after the Food and Drug Administration for years to require manufacturers to use more realistic serving sizes in their calculations. O'Connor's piece reported on findings by the Center for Science in the Public Interest that highlighted the "worst offenders" in misleading food labels.

Perhaps I'm getting more reactionary as I get older, but it seems like forcing the FDA to mandate more realistic portion sizes is merely letting Americans get lazier and dumber.

I've been reading food labels for years. It didn't take long for me to figure out that no matter what the label called a "portion", I knew how much of that bag of chips I was going to eat. Eight, ten, or twelve "portions" in that "supersized" bag? Uh, that depends on how hungry I am. And as for canned soups, one of the categories specially called out on CSPI's list of worst offenders, I know I'm not going to eat just half a can: I'm going to eat the whole thing.

For years, then, I've been doing mental arithmetic in the store aisles to calculate just how much sodium and saturated fat I would actually be taking in if I picked up one of these products. It hasn't been hard, and if it had been, I'd have carried a cheap pocket calculator.
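That store-aisle arithmetic amounts to nothing more than multiplying each per-serving figure by the label's serving count. Here's a minimal sketch, using made-up numbers for a hypothetical can of soup (all figures are illustrative, not taken from any real label):

```python
# Hypothetical per-serving figures from a can of soup's Nutrition Facts panel.
sodium_mg_per_serving = 890      # mg of sodium per serving
sat_fat_g_per_serving = 2.5      # g of saturated fat per serving
servings_per_container = 2.5     # the manufacturer's (optimistic) serving count

# Eating the whole can means scaling every figure by the serving count.
sodium_mg_total = sodium_mg_per_serving * servings_per_container
sat_fat_g_total = sat_fat_g_per_serving * servings_per_container

print(f"Whole can: {sodium_mg_total:.0f} mg sodium, "
      f"{sat_fat_g_total:.2f} g saturated fat")
```

With these made-up numbers, the "reasonable-looking" 890 mg per serving becomes 2,225 mg for the whole can -- roughly a full day's recommended sodium in one sitting, which is exactly the kind of thing the label's serving-size games obscure.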

Forcing companies to change what constitutes a "portion" of their products isn't going to make people look at the labels if they aren't already doing so. The information is already there for the reading, and has been for years.

(Well, most of the information, anyway. The one change I would favor is for manufacturers to disclose the per-container totals for items whose per-serving percentage is zero, because manufacturers are allowed to round down to zero.)

I understand the impulse to compel reporting more "realistic" portion sizes. But it's pointless: your portion is not my portion, and the manufacturer's idea may not match either of ours. Besides, this is an area where I think consumers have to take some responsibility. If we're not going to think even a little about how much we're eating and how much fat (or sodium, or protein, or ...) we're taking in as a result, the companies selling us our foodstuffs aren't responsible for what happens to us. There's a point past which "help" becomes "coddling".

Tuesday, November 22, 2011

Don Young vs. Douglas Brinkley

Rep. Don Young (R-AK) was participating in hearings on drilling in the Arctic National Wildlife Refuge when he got into a heated exchange with historian Douglas Brinkley. Evidently taking umbrage at Brinkley's temerity for interrupting Young's hectoring lecture to correct the lawmaker's previous misstatement of Brinkley's name, Young burst out:
“I'll call you anything I want to call you when you sit in that chair. You just be quiet.”
Brinkley's breach of Congressional etiquette (nowhere else would it be considered out of line to correct someone for calling you by the wrong name, but it seems lawmakers deem themselves worthy of special consideration) pales in comparison to Young's delusion of importance.

Don Young, if you were anywhere near as smart as the people who testify at Congressional hearings, you'd have a real job. Instead, you're a sorry little man desperate for respect, and you use your public office to extort it from people during hearings. But guess what? A lot of them see through your little game of ... well, let's call it "Overcompensation", and some of them will call you on it. While a distant part of them might feel a little sorry for your, um, shortcomings, they will not put up with your infantile temper tantrums and tragic efforts to shore up your low self-esteem.

This sorry practice of Congress calling people in to "testify" when in reality lawmakers want to puff and preen and scold has got to stop. While I have my doubts that many lawmakers are intellectually capable of understanding it, the testimony they ought to be soliciting is of vital importance to getting the nation's business done. (At least, it should be important. Otherwise, Congress shouldn't be wasting people's time.) If lawmakers aren't going to listen and to learn something, they should give up their seats in favor of those who will. And asinine displays like Don Young's should automatically disqualify the perpetrating lawmaker from further public service.

Don Young, you're a self-important (and, the evidence suggests, little) prick.

Sunday, November 20, 2011

"What is Sony Now?", Bryan Gruley and Cliff Edwards

In my circle of acquaintances, Sony has had a crummy reputation for decades. I was one of the last to give up on it, and that was only after I had blown good money on a dual cassette deck and a multi-CD changer, both of which developed problems within six months of purchase. As the hardware suggests, this was way back in '92 or so.

Almost from the moment competition showed up for Sony's Walkman, the company was said to overcharge for its merchandise. The quality justified the cost, though -- at first. Then companies like Aiwa started making high-quality electronics for slightly less. Simultaneously, Sony's quality control nosedived.

I have heard no indications that Sony's quality-control or pricing problems have turned around in the years since I started reflexively ignoring the company's products. Bryan Gruley's and Cliff Edwards' article for BusinessWeek gives a pretty good idea why. Although the company's problems certainly predate CEO Howard Stringer's tenure, he hasn't helped.
It’s not lack of sleep, though, that irritates him when it’s suggested that Sony is not thought of as the innovator it once was. “Oh, f–k, we make so much more than we used to,” he says. He ticks off some of the products coming out this year, including binoculars that can record video and goggles for watching 3D video games and movies. “Don’t tell me that Sony technology isn’t great.”
Hey Howard: Sony technology isn't great. Get your head out of your ass, and/or stop blowing smoke up ours.

Making more crappy products isn't the answer: making better products is. But as long as management won't acknowledge the mediocrity of what it makes, Sony is screwed.

(Thanks to LongReads for the link.)

Friday, November 18, 2011

"Rock weak"

You couldn’t call it unexpected, but when “rock week” came to The X Factor for two nights, it turns out only one of them actually knew what to do with a rock song. Rock weak was more like it.
That's Tim Goodman's verdict on The X Factor's stunt theme.

Goodman was a music critic before he became a TV critic, so it's surprising to find him so annoyed by The X Factor's failure to live up to his expectations. He even admits, "any rock fan knows that most of these singing competition shows are as far away from the essence of actual rock and roll than [sic] any brain can imagine".

Or maybe it's touching that after years of reviewing TV, "taking bullets" for his readers, he still can hope that these shows will be better than they are.

Tuesday, November 15, 2011

Eddie Murphy

Rolling Stone has an interview with Eddie Murphy conducted by Brian Hiatt. I can take Murphy or leave him alone as a performer, but I have to give him credit for seeming to have his head screwed on right.
This whole period of documenting an artist's work, movies, records, all this shit, it's 100 years old, if it's that. It's brand-new. Beethoven and those fuckers couldn't even listen to their shit, do you know how hard it was to find a mother fucker with a violin that worked back then? And his stuff went through the ages. Technology has it to where they gonna play this stuff forever. But the reality is, all this shit turns into dust, everything is temporary.
(And if stuff doesn't turn to dust, there will be so much of it that much of it will simply be buried in the hard drive of history. Any way you look at it, pop culture is evanescent.)
After all these years, I've done well and I'm cool. I feel comfortable in my skin, I've saved some paper, everybody's healthy, my kids are beautiful and smart, doing different things, it's all good. I'm trying to maintain my shit like this, and do a fun project every now and then.
Good for him. Not only does he deserve his leisure time, but by taking it, he frees up roles for younger performers.

Monday, November 14, 2011

Cain on American Muslims

The big news about Republican presidential candidate Herman Cain today centers on his comments in an interview with the Milwaukee Journal Sentinel. Per the Journal Sentinel's article, Cain "stumbled badly Monday when attempting to answer a question about whether he agreed or disagreed with President Barack Obama's approach to handling the Libyan crisis."

Me, I don't much care. First, no candidate is an expert on everything. Do I wish Cain paid more attention to current events, considering he's running for the highest office in the land? Well, of course. But even if he were a dedicated policy wonk (and all signs are he's exactly the opposite), he's also running for the highest office in the land, which means his most important skill is bobbing and weaving around questions that could alienate large voting constituencies. Let's not kid ourselves: we won't know anything substantive about Cain's policymaking unless he becomes President, and I'm fervently hoping that won't happen because we already know enough about Cain to make me think we'd be better off with Charlie Sheen in the office. In the meantime, if you think it's so easy to master the myriad details of being President of the United States without actually being President, go ahead: run. I'll bet a not insubstantial sum that you would be reduced to a quivering blob before the end of your first no-holds-barred Q-and-A with the national press.

The second reason I don't care is that Cain said a couple of even more disturbing things in his interview with GQ.

He's scared of the idea of a Ron Paul presidency, which at first blush is the mark of a reasonably sane person ... but here's why Cain is scared:
I am puzzled by what he stands for. Puzzled by some of his extreme statements, like "End the Fed!" "End everything!" Can't we fix something?
If you're puzzled by Ron Paul, it's because you haven't been paying attention. Ron Paul is by far the most self-consistent candidate to run for President in ages. You might not agree with his uncompromising libertarianism, but you can't deny that he talks the talk and would walk the walk as President (if Congress and the Supreme Court allowed). He is no half- (or less) informed babbler like Cain, either: Paul's Congressional tenure has allowed him to become familiar with a lot of policy issues, especially fiscal and foreign-policy matters. I don't want to live in a President Ron Paul-run country, but I accord him a measure of respect for thoroughly understanding and holding fast to his principles. That Cain either genuinely doesn't understand Ron Paul, or is willing to pretend that he doesn't, means that he doesn't take his competition seriously -- and therefore, that he doesn't take his campaign seriously.

Far more disquieting for anyone who is genuinely interested in assessing Herman Cain as a potential leader is this exchange:
Devin Gordon: What did you think about the fuss around your comments about Muslims. [Cain said in March that, if elected, he wouldn't feel "comfortable" appointing a Muslim to his cabinet] Did you think that you were treated fairly in that conversation?

Herman Cain: No, because a lot of people misrepresented what I said. I know that there are peaceful Muslims, and there are extremists. I have nothing against peaceful Muslims. Nothing whatsoever. But I also know that we must be careful of extremists and we must be careful of the tendency by some groups in this country to infuse their beliefs into our laws and our culture.

Devin Gordon: Do you think that there is a greater tendency among the Muslim faith for that kind of extremism?

Herman Cain: That would be a judgment call that I'm probably not qualified to make, because I can't speak on behalf of the entire Muslim community. I have talked with Muslims that are peaceful Muslims. And I have had one very well known Muslim voice say to me directly that a majority of Muslims share the extremist views.

Chris Heath: A majority?

Herman Cain: Yes, a majority.

Devin Gordon: Do you think he's right?

Herman Cain: Yes, because that's his community. That's his community. I can't tell you his name, but he is a very prominent voice in the Muslim community, and he said that.

Chris Heath: I just find that hard to believe.

Herman Cain: I find it hard to believe.

Chris Heath: But you're believing it?

Herman Cain: Yes, because of the respect that I have for this individual. Because when he told me this, he said he wouldn't want to be quoted or identified as having said that.

Alan Richman: Are you talking about the Muslim community in America? Or the world?

Herman Cain: America. America.
I know only one group of people who are supposed to take the word of one living man as gospel. I think we can safely assume Cain isn't a closet Catholic and his anonymous "source" of Muslim information isn't the Pope.

Herman, do you seriously accept the word of one man on this?

I accept my best friends' assertions of their children's names. Beyond that, I tend to seek corroboration. Especially if they start making vast generalizations about ethnic, religious, cultural, or social groups to which they nominally belong.

You know who's qualified to make sweeping generalizations about all Muslims in the U.S.? Nobody.

I assume Cain is prone to conspiracy theories, because only such a fear-centered mindset explains why one man's opinion could set Cain's own in stone: Cain is already disposed to believe the worst.

Either that, or Cain is cold-bloodedly playing to what he knows are his party's basest instincts, never mind the consequences for Muslims, the bogeymen of the day.

A paranoiac or a conscienceless asshole ... which is Cain?

More to the point, why is anybody thinking seriously about making this man President?

Quieter bank fees

It shouldn't surprise anybody that in the wake of BofA's very public debit card fee-increase debacle, banks are raising other fees on their customers -- very quietly. The banks are betting that customers won't notice.

This trend will continue until customers vote with their feet and move their money out of the big banks. The only thing management and shareholders will understand is lost revenue directly attributable to lost customers.

Isn't it time that your money mattered more to a bank than its money?

Sunday, November 13, 2011

Jeffrey Sachs on the new progressivism

I don't generally buy into historical cycles. In my opinion, such cycles are usually illusions in the eyes of their beholders, oversimplified interpretations of actual events shaped to suit the observer's contemporary agenda.

So in recommending to your attention Jeffrey Sachs' opinion piece, "The New Progressive Movement", I admit up front that I buy into Sachs' likely oversimplified interpretation of history.
Twice before in American history, powerful corporate interests dominated Washington and brought America to a state of unacceptable inequality, instability and corruption. Both times a social and political movement arose to restore democracy and shared prosperity.
Those two times, per Sachs, were the Gilded Age and the Roaring Twenties. Both eras were followed by tightened regulation to rein in the excesses of the monied class. Sachs thinks the "Reagan Revolution" represents a third era of inequality, instability and corruption, and the Occupy movement the expected (and welcome) reaction.

(As an aside, I think corruption is most effectively curbed by social pressure. That is, while it's important to make specific corrupt actions illegal, what really keeps it in check is the disapprobation of one's family, friends, and neighbors. The more indifferent we are to it, the better it flourishes.)

Sachs, like others, has a few concrete recommendations for the Occupy movement, for its silent supporters, and for those, like me, who aren't comfortable with it (well, it is meant to make a lot of people uncomfortable) but who appreciate that it has changed the national conversation from an obsessive focus on conservative talking points back toward reality. Sachs' recommendations are sensible and require that the movement make its voice heard in the commercial and political spheres. (Sorry, I can't be enthusiastic about the idea of restructuring the country along anarchic lines, as I think some Occupy protesters want.)

Trendiness and age

Edith Zimmerman laments in the New York Times that she's starting to feel old at the tender age of 28, and it's the Internet's fault.
... the Internet is a new kind of barometer for keeping track of exactly how old you feel: how many things you don’t get, how many mini-Internet worlds you can’t find the door to; exactly how many crickets in the world you can no longer hear chirping.
That's not how I think of the Internet (or rather, the World Wide Web), but one can understand Zimmerman's perspective considering how she earns her living.
For the past three or four years my job has been, in some capacity or another, to stay on top of Internet trends and viral videos and memes and other nerdy and non-nerdy things that take up all my time and energy and days and nights and dreams and thoughts.
No wonder she's feeling old. Trendspotting as an avocation I can understand, but as a vocation?

Trendiness is for the young. I don't mean that only the young can be trendy; I mean that the older you are, the less likely you are to give a crap about trends.

The young are still figuring out who they are and what they want out of life. Trends keep them in the mainstream, sometimes ahead of it.

As you age, a couple of things happen. First, you find that some of what you're told is new, isn't. You're reminded of things you've seen before. In a word, you start to get jaded. Or, if you're less cynical, you become "worldly".

Second, ever so gradually, you discover who you are and what you want out of life. You want to dig deeper into your particular interests rather than flit from one thing to the next. Trendiness is all about flitting, so you lose your taste for it.

Zimmerman thinks she's feeling old. The reality is, she's growing up. She should be happy about that.

(She would probably make herself happier if she got into a different line of work, but she'll figure that out.)

Tuesday, November 8, 2011

"The Elusive Big Idea", Neal Gabler

We don't have big ideas any more. That's the essence of Gabler's opinion piece in the New York Times (from back in August, by the way; it got lost among my browser tabs). What we have instead is a glut of information that lets us pretend we're thinking.

Do I agree with Gabler? Meh. I suppose I do, and yet, I can't bring myself to care.

Big ideas like the ones Gabler cites with admiration -- those of John Rawls, Einstein, Marshall McLuhan, and Betty Friedan, for instance -- contribute to a shared culture and world view (if only because we're arguing about them rather than thousands of smaller things), but it's hard to say they help us make better sense of the world. Rather, they help us to make some sense of the world. It might not be better. It might not even be right. Gabler approvingly cites Marx and Freud as big-idea men, but doesn't mention that many of both men's ideas have been discredited, and that some of those ideas have been blamed for a lot of human misery.

Am I arguing for the banishment of big ideas? No, not really. Without them, humanity would make no forward progress, especially in the sciences. Newton's big ideas allowed a lot of important, productive, and useful work to be done before Einstein showed us where Newton was wrong -- and Einstein himself couldn't have conducted his research had he not been educated about, among other things, Newton's ideas.

Yet we're not just living in an age of unprecedented access to information. We're living in an age in which it is unprecedentedly easy to rebut ideas convincingly -- or at least convincingly enough for a lot of non-specialists. It used to be that credibility could be judged by the glossiness of one's presentation -- not that that was a good way of judging, but it was a way, and it still is the way a lot of us judge credibility. But with the Web lowering the barrier to publishing to virtually nil, idiots can appear as credible as geniuses.

We're also a lot more skeptical than previous generations, and not just because we've seen more ideas like Marx's and Freud's rebutted or debunked. Over the last four decades we've seen authorities of all sorts lose their credibility: government, the media, scientists (over self-aggrandizing fools like Pons and Fleischmann, allegations of corporate influence-peddling in scientific research, and claims that political bias has skewed research), and religious leaders (Jim Bakker and Jerry Falwell, for instance). Whether the mistakes have been intentional or not, no one is trusted by everyone to be "an authority" or "an expert" any more. Those who have a "big idea" will have a hard time finding a receptive audience even if they penetrate the noise.

In spite of finding even more anti-big idea factors than Gabler cites, I still don't know that I buy into his thesis. It just smacks too much of the griping older people do about the way the world is, because the world isn't the same as it was when they were young. They might be right, but we'll never know because we all roll our eyes and ignore it -- even those of us who aren't young any more.

(Full disclosure: I have had a hard time slogging through Gabler's biography of Walt Disney.)

Sandy Wood

The Texas Monthly has a short profile of Sandy Wood, who has been the voice of the syndicated StarDate program for twenty years. I hear the program on the local newsradio station, and it's always a bit odd, though not unwelcome, to hear what the article calls Wood's "almost otherworldly voice" breaking into the run of the station's crisp, forceful anchors and reporters.

(The article is also available via the New York Times.)

"What is the American Dream?", James Gustave Speth

It's almost a cliché now that consumerism, or the mass consumption of consumer goods, is not the key to happiness. The question arises, though: what is? And what did the colonists really mean by "the pursuit of Happiness" in the Declaration of Independence? The latter is the question answered in "What is the American Dream?: Dueling Dualities in the American Tradition", an entry on the Center for a New American Dream's blog.

Speth traces Jefferson's use of the phrase "pursuit of Happiness" back to
... two very different notions: the idea from John Locke and Jeremy Bentham that happiness was the pursuit of personal pleasure and the older Stoic idea that happiness derived from active devotion to the public good and from civic virtue, which have little to do with personal pleasure.
Speth himself thinks James Truslow Adams introduced a third facet to the phrase in his 1931 book The Epic of America.
I believe James Truslow Adams' vision of the American Dream is at least as compelling as that of Lincoln. Adams used the phrase, "the American dream," to refer, not to getting rich or even especially to a secure, middle class lifestyle, though that was part of it, but primarily to something finer and more important:
"It is not a dream of motor cars and high wages merely, but a dream of a social order in which each man and each woman shall be able to attain to the fullest stature of which they are innately capable, and be recognized by others for what they are, regardless of the fortuitous circumstances of birth or position."
Yet another interpretation of "the pursuit of happiness" comes from what Andrew Carnegie called "the gospel of wealth". Carnegie tied "improved conditions" for all Americans to "material development", which in turn was due (of course) to the unhindered operation of competition and the free market.

As a nation we've given ourselves over to one understanding of "happiness", the understanding promulgated by Carnegie. Speth thinks there's compelling empirical evidence that this deeply libertarian and individualistic pursuit of happiness through "material development" isn't actually making us happier.

I've never thought of myself as a Stoic but I do find myself increasingly repelled by the modern compulsion to consume. I've had to fend off friends and family who urge me to replace my TV, for instance, it being a (seemingly) ancient cathode-ray tube model. My counterargument is, it still works and it's good enough for my (evidently, comparatively simple) needs. More and more, I see ads on the Web for goods and services I just don't need, and I wonder: what am I missing, that so many other people seem to want them?

Clearly, I'm getting older ... but maybe, just maybe, it means I'm getting a little wiser, too. At least I've discovered that for me, "the pursuit of happiness" requires more than buying stuff. What exactly that is, I'm not sure. I'll keep looking, I guess.

For the rest of you, maybe it's time to reconsider whether resurrecting the overheated, consumption-based consumer sector is the best way to return this country to something like its former greatness -- or the best way to seek your own personal happiness, for that matter. Focusing on consumption at the expense of everything else is, after all, how we got ourselves into our current economic tar pit. It also hasn't made a lot of us happy, has it?

Monday, November 7, 2011

"The Tweaker", Malcolm Gladwell

Had enough Steve Jobs stories? Me too. But Malcolm Gladwell's New Yorker piece is worth reading because it counterbalances the misleading elegies that made Jobs out to be a protean creator like da Vinci or Edison.

In brief:
Jobs’s sensibility was editorial, not inventive.
But to get a feeling for what that actually means, you'll have to read Gladwell's article for yourself.

(Thanks to Kottke for the link.)

Thinking about how to get to deep space

An early exposure to, and subsequent mild obsession with, the original Star Trek left me with a curious malady: I could muster absolutely no enthusiasm for contemporary space missions. Compared to the magical craft on Trek, NASA's rockets and shuttles were (and still are) so primitive. How could I get excited about a fragile space station, or tiny probes that needed months to reach a mere asteroid?

Fortunately (because I believe we need to be able to leave Earth if the human race is to survive in the long run), not everyone is so eager to run before learning how to walk. The New York Times reported in mid-October on DARPA's "100-Year Starship Study".
Participants — an eclectic mix of engineers, scientists, science fiction fans, students and dreamers — explored a mix of ideas, including how to organize and finance a century-long project; whether civilization would survive, because an engine to propel a starship could also be used for a weapon to obliterate the planet; and whether people need to go along for the trip. (Alternatively, machines could build humans at the destination, perhaps tweaked to live in non-Earth-like environs.)
There are a lot of great ideas cited in this article; do yourself a favor and check it out.

One thing the article makes clear is, the future probably won't look like Roddenberry's vision. (Does that mean we'll be spared the touchy-feely creepiness of people like Deanna Troi? Whew.)

I had to smile at this:
The $1.1 million study — $1 million from Darpa, $100,000 from NASA — will culminate with the awarding of a $500,000 grant to an organization that will take the torch for further work.

Darpa would then exit the starship business, sidestepping interrogation by Congress during the next budget hearings of why it was spending taxpayer money on science fiction dreams.
I guarantee that the cost of a Congressional investigation into where this money went would dwarf DARPA's and NASA's paltry contributions. Yet I think there's a better than even chance such an investigation will occur. Sigh.

Oh, and there was at least one participant who shared my impatience with current goals:
Some speakers said they thought the first goal over the next century should be colonizing the solar system, starting with Mars.

Dr. Obousy, for one, made his preference known in a couplet:

On to the stars!

Cowards shoot for Mars.

"The Fierce Imagination of Haruki Murakami", Sam Anderson

I was glad to read the New York Times magazine's profile of Murakami, because I knew nothing about the celebrated author. A good friend loaned me South of the Border, West of the Sun several years ago; while I remember nothing specific about the book, it left a good impression and I've been meaning to check out more of his work. Murakami's most recent novel, 1Q84, is out in English translation.

Murakami's voice is unique, in my experience, but I've often wondered how much of my perception could be attributed to the inevitable compromises of translation.
When Murakami sat down to write his first novel, he struggled until he came up with an unorthodox solution: he wrote the book’s opening in English, then translated it back into Japanese. This, he says, is how he found his voice. Murakami’s longstanding translator, Jay Rubin, told me that a distinctive feature of Murakami’s Japanese is that it often reads, in the original, as if it has been translated from English.
This shouldn't have surprised me (though it did). South of the Border, West of the Sun, like all of his works (or so I'm told), is steeped in Western influences. Murakami's worlds are odd melanges of Japanese and Western bits jostling up against one another. They're this close to our reality, but definitely aren't.

(As an aside, Murakami calls Jorge Luis Borges "a hero". Again, this is no surprise. Both Borges' and Murakami's works induce disorientation, a vertigo of the rational mind, while somehow making one yearn for more.)

My one criticism of Anderson's piece is that he tried to make his journey to interview Murakami sound like an episode out of a Murakami novel. It's a tired gimmick found in too many long-form profiles: "Let's exemplify the subject's oeuvre!" Moreover, Anderson held on to the conceit too long, perhaps in an effort to convey the strangeness of Murakami's fiction. If so, it didn't work. You have to read Murakami himself to get that sensation.

Miscellaneous musings, 6 Nov 2011

  • On a local newscast I saw that a few Oklahomans expressed surprise at the damage from the state's recent string of earthquakes. I'm surprised there wasn't more damage, considering the region doesn't have the quake-safety mentality of more seismically active areas.

  • I wandered through a local gourmet food plaza and discovered a pork specialty store that imports lardo, the cured and flavored pig fat that originated in Italy. It's expensive -- not as much as the truffles in a nearby stall, but enough to keep me from buying it on impulse. Still, I hear it calling to me: people have compared it to fine butter when it's melted on toast. I'm lipophobic but butter and buttery substances slip through my defenses.

  • I will never understand why anyone (aside from close friends and relatives) gives a damn about the trial of Michael Jackson's doctor. At Thanksgiving I will give thanks that daily coverage has ended.

  • I'm ambivalent about the Occupy protests: I share the anger, but I don't see what the protests have accomplished or will accomplish other than showing that a lot of people are, well, angry. I'm sorry, though, that I missed publicizing Saturday's Bank Transfer Day effort. While BTD's Facebook page notes "the Bank Transfer Day movement was neither inspired by, derived from nor organized by the Occupy Wall Street movement", the groundswell of popular dissatisfaction with big companies in general and big banks in particular fuels both efforts.

  • My hunch, by the way, is that even with prior warning, credit unions probably had a hard time handling even the modest number of people wanting to move their money Saturday. I therefore suggest that you take the spirit of the BTD message to heart and move your money this week. The point, after all, wasn't for Saturday to be the only day to move your money out of a big bank: the point was to move your money out of a big bank, period. And credit unions are not your only option: there are smaller local banks that might be worthy of your patronage, too.

  • Speaking of credit unions, it's worth remembering they're run by human beings, just like big banks, so credit unions are not immune to stupidity, incompetence, or malice. A credit union to which I belonged had its operations taken over by a federal agency due to unspecified problems, and eventually the assets were transferred to a larger credit union. (Regrettably, I didn't inquire as to the nature of the problems. I wish I had, because holders of accounts at credit unions are not merely customers: they're more like shareholders, and I have a hunch they're entitled to more information about the institution than account holders at a bank.)

Friday, November 4, 2011

Intellect, authority and the political parties

Just as the two parties differ in their attitudes toward authority, they diverge in the value they place on intellect. In both cases, the two parties might have something to learn from each other.
The article is "Why Our Candidates Disappoint Us" by Drew Westen. It's an analysis that resonates, especially if you've read John Dean's Conservatives without Conscience, which, contrary to its inflammatory title, is more about the authoritarian nature of conservative politics than any supposed lack of empathy among conservatives themselves.

Confident men or confidence men?

The confidence we experience as we make a judgment is not a reasoned evaluation of the probability that it is right. Confidence is a feeling, one determined mostly by the coherence of the story and by the ease with which it comes to mind, even when the evidence for the story is sparse and unreliable. The bias toward coherence favors overconfidence. An individual who expresses high confidence probably has a good story, which may or may not be true.
So Daniel Kahneman wrote in his New York Times magazine piece, "Don’t Blink! The Hazards of Confidence".

For some reason I had never made the connection between two things I've known for quite a while: that humans are terrific at finding patterns where they may or may not exist (hence the compelling nature of conspiracy theories), and that we often believe what we want to believe, not what is real. Kahneman makes the connection explicit and illustrates in an unsettling way how it governs the mistakes we make in our investing.

Not that confident investment advisers are necessarily con men, but the effect on investors may well end up being the same. It's a good article. Do yourself a favor and read it before you send more money your adviser's way.

Thursday, November 3, 2011


I had some minor periodontal surgery today. It's the first time I've had needles poking into the roof of my mouth in a few decades, and I'm pleased the experience wasn't nearly as traumatic as that first time (which kept me away from dentists for years). All in all, it was something of a non-event, to my relief.

What I didn't expect was that the post-operative dietary restrictions would reveal unsuspected cravings. No pickles or ice cream, the latter of which I could have anyway; no, I find myself longing for salt and maybe a little grease. Chips would be nice, but I'd settle for the hearty lentil soup I made for myself just the night before. Unfortunately the soup requires just a bit of chewing, so it's not on.

When I was a kid my idea of a good day's eating would have been having several shakes in a row. Not so as an adult. Give me my salt, my grease, a hint of astringency perhaps. Does anybody make pureed salt-and-vinegar potato chips?

Tuesday, November 1, 2011

Snow goons

I always enjoy Scott Tipton's Comics 101 blog, not least because he includes a lot of pictures (no surprise there). From a couple of weeks ago, here's a soon-to-be seasonally appropriate image courtesy of an old Strange Adventures comic. Tipton called the image "snowgoons.jpg" and that pretty much sums it up. I got a kick out of it and I expect you will, too.

(Since I don't know his policy on internal links, let me also provide a link to the entire column, which is about his visit to the 2011 New York Comic Con.)