Send email incessantly. Your messages will be redirected to /dev/null in no time.
If you spend more than a third of your day sending mail or otherwise trying to get people's attention, you're not communicating effectively. Do less typing and more thinking.
I don't care who you are: if you post to a mailing list a dozen times in the same day, I will classify you, mentally at least, as an unmitigated nuisance.
Friday, April 29, 2011
Wednesday, April 27, 2011
"The New Geopolitics of Food," Lester Brown
Those who have been paying attention saw this coming thirty or forty years ago, but if you haven't, well, this article on how food scarcity underlies the current unrest in the Middle East, and promises only to get worse in the future, shows that we're teetering on the brink of some extremely unpleasant times.
This is not merely a story about the booming demand for food. Everything from falling water tables to eroding soils and the consequences of global warming means that the world's food supply is unlikely to keep up with our collectively growing appetites.

Between purchases by richer nations of land in poorer nations for growing crops, which takes us right back to the good old days of nineteenth-century imperialism, and protective restrictions on exports from grain-rich countries, we're looking at the global economy trying to solve an acute crisis -- hunger -- with Adam Smith's invisible hand. What no one wants to say is that a lot of people in poorer countries will starve if that hand is allowed to sweep through unimpeded.
Not that the land purchases, in particular, have gone unimpeded. Unfortunately, that just means more people in the richer country are at risk of malnutrition or poverty or both. That's better than large numbers of people in the poorer country dying, but not tremendously better.
Or, of course, the countries could go to war. I'm not venturing an opinion as to how that stacks up against mass starvation.
Like it or not, the human race is way overdue for a reality check on its lack of population controls.
Last June I wrote about how overpopulation warnings have had a history of not coming true, making a lot of people sanguine that human ingenuity would always fend off catastrophe. But do you know the price of gleefully outrunning the natural limits on population? You have no margin for error. If, while outrunning those limits, you stumble, you will fall -- hard. And that means a lot of people will suffer and die.
If we don't learn to restrain ourselves as a species, Nature will take care of overpopulation in its own brutal and unforgiving way.
(I found the article courtesy of LongReads. I'd offer my thanks, but I have to admit I don't feel particularly happy now that I've read it.)
Obama releases birth certificate
Per the AP. He did it to quash the lingering "debate" over his citizenship.
It won't.
The birthers are loons, impervious to evidence and motivated solely by an irrational antipathy toward Obama. I'm sure Obama and his advisors know that, and are only hoping the full birth certificate will forestall mainstream Republican efforts to stoke this particular fire in the 2012 election. However, "mainstream Republicans" are, if not quite an oxymoron, certainly an endangered species. Don't be surprised if the Republican primaries feature presidential hopefuls who continue to bleat this arrant nonsense to their increasingly out-there base.
Like the non-debate between evolution and creationism (aka intelligent design), the "birther debate" is a bogus controversy: the only ones who think there's anything to discuss are those who don't know what the hell they're talking about.
Monday, April 25, 2011
When everybody can be an artist
Stefan Goldmann, an electronic musician, reviews the evolution of electronic music since the 1970s, with special attention to how the lowering of barriers (actually, their elimination) to entry has transformed the business model. Boiling down the thoughtful essay (translated from German) to its essential message, new artists can't emulate their predecessors' path to success: the world has changed too much. Whereas in the past it was possible to mimic the sound of a popular artist and earn a decent return on investment (from the label's perspective, anyway), it's simply not possible to succeed that way today because low-cost technology lets literally anyone be a copycat. Although not everyone will try their hand, enough will, and already have, to flood distribution channels, i.e., the Internet. Faced with a million choices, how does one choose?
Thirty years ago, "a couple of hundred artists and bands gained access to an audience of millions." Today?
Today a couple millions [sic] artists try to reach a few hundred people. Or like the contemporary pun puts it, “In the future everyone will be world-famous for 15 people.”

The only way to succeed is to be different, Goldmann says, to stand out from the crowd. Of course, that's not really news: it has been the characteristic principle of those considered great artists throughout history. Goldmann merely points out that this truth is no longer just the key to one's artistic reputation, but the key to making a living.
To Muni Metro operators
I want to support unions, I really do. But the San Francisco Municipal Railway's operators union makes it really hard. And the worst of the bunch are the Metro operators.
When you witness their slow, inattentive, and often ineffective "work" on the Metro -- the electric streetcar portions of the transit system -- and you reflect that the Metro is supposed to be where the "best" operators wind up, it's all but impossible to feel any sympathy for these men and women.
That's why their authorization of a strike is so galling. Yes, as the article notes, it's "a fairly common tactic in labor negotiations," but I can't think of municipal workers who are less respected as a group than Muni operators.
Of course I've encountered considerate, more than competent, eminently respectable Muni operators. Unfortunately, they're the exception, not the norm. And I understand that being on the front lines of customer service for a system whose failures are far more celebrated than its successes is a wearing, demoralizing position in which to find oneself. I therefore cut Muni drivers a lot of slack: I don't expect them to be chipper and I don't hold gruffness against them.
But the Metro operators spend a third of their runs isolated from passengers. While underground, their trains are run by computer. They don't have to take fares from passengers at any underground station. I've seen them read newspapers while the trains roll through the tunnels. They really have that little to do while going to and coming from downtown.
Yet these supposedly best operators can't be bothered to explain delays to passengers, or to check that the doors are clear before closing them.
With the lowest burden of passenger interaction, Muni Metro operators as a group still manage to do it worst. That's contemptible.
(This rant was prompted by Akit's letter to the Muni operators union, with which I agree for the most part, but which reflection made me narrow down to Metro operators.)
Saturday, April 23, 2011
Physics and hand-waving
If you are not a physicist but have any sense of physics, you probably think its unanswered questions lie on the frontiers, among the stars or within subatomic particles. That was certainly this non-physicist's impression before reading "The Man Behind the Curtain," Tony Rothman's essay in American Scientist. 'Tis not so. According to Rothman, the discussions in introductory physics textbooks and courses fail to remark upon, much less to explain, some quite deep open questions.
... friction produces heat and hence an increase in entropy. It thus distinguishes past from future. The increase in entropy—the second law of thermodynamics—is the only law of Nature that makes this fundamental distinction. Newton’s laws, those of electrodynamics, relativity … all are reversible: None care whether the universal clock runs forward or backward. If Newton’s laws are at the bottom of everything, then one should be able to derive the second law of thermodynamics from Newtonian mechanics, but this has never been satisfactorily accomplished and the incompatibility of the irreversible second law with the other fundamental theories remains perhaps the greatest paradox in all physics. It is blatantly dropped into the first days of a freshman course and the textbook authors bat not an eyelash.

Rothman cites several other examples of papered-over "holes" and claims others "abound throughout physics." His counsel is for physicists to distinguish between the accuracy with which they can describe what happens and the uncertainty that frequently bedevils their attempts to explain why it happens.
I'm generally adept at reproducing the twists and turns of a teacher's explanation, so I'm particularly susceptible to the sleights of hand a teacher can use to evade logical inconsistencies. If I had pursued physics as a major, I would very likely have acquired an understanding of it that was the equivalent of a safe path through a deadly swamp. It would have been suitable for passing on to others so they, too, could avoid falling into marshy pits, but it would have been less than useful for teaching how to drain those spots, that is, how to carry out the hard work of real physics research myself. And as Rothman remarks, "It seems to me that such an approach is both intellectually dishonest and fails to stimulate the habits of inquiry and skepticism that science is meant to engender." Hear, hear.
Friday, April 22, 2011
How I Met Your Mother
I'm way out of the demographic for this one, but there's something irresistible about HIMYM's brazen absurdities, flights of fancy, and sharp timing (helped out by editing). I knew Alyson Hannigan would be appealing (is there anyone who watched Buffy the Vampire Slayer and didn't like Willow?), but the other main players, especially Neil Patrick Harris, shine.
Is this Friends for the current generation? I don't know because I never saw the iconic sitcom, having been turned off of it by the slavish adulation of so many of its fans. I'd like to think HIMYM doesn't inspire such cult-like devotion, but what do I know? And I get enough of a kick out of HIMYM that I don't care, either.
More than a marathon man
After running the London Marathon, Sam Robson ran home -- an additional 99 miles.
“I had to have regular breaks to refill my water and whenever I stopped my legs seized up so I couldn’t rest for long. In terms of tiredness, my legs felt pretty good and the worst bit was I had to keep eating to replace all the calories I was burning.

“I woke up feeling surprisingly good. My hip was a bit sore but it feels okay now and my legs are fine.”

Wow.
Thursday, April 21, 2011
Halliburton fracking info, week 22
A bit late this week, but no matter: there's been no change to Halliburton's fracking fluids disclosure page since week 20.
(If you want to know "week 20 from what?" you can either follow the previous-entry links backwards or go directly to the original entry about Halliburton's fracking information.)
Monday, April 18, 2011
An anniversary quake
Today, 18 April 2011, is the 105th anniversary of the great San Andreas fault quake that devastated San Francisco in 1906.
What better day, then, for the Bay Area to experience another temblor?
According to the U.S. Geological Survey, the quake occurred at 3 PM Pacific time, measured just 1.4 in magnitude, and was centered at 37.603°N, 122.452°W, which is on the Peninsula between Pacifica and San Bruno.
Felt more like a 3 or 4 to me, but there are always caveats when judging a quake's intensity:
- The preliminary measurement by agencies like the USGS is almost always revised a little bit down the line. The recent Japan quake, for instance, went from an 8.9 to a 9.0.
- One's perception of a quake depends heavily on where one experiences it -- both one's distance from the epicenter and the type of ground on which one is standing (or sitting, or ...).
UPDATE: According to a California/Nevada fault map centered at 37°N, 122°W, both quakes occurred on the San Andreas Fault. Isn't coincidence wonderful?
Sunday, April 17, 2011
AOL is as clueless about layoffs as it is about content
Read Eric Snider's piece, "Leaving in a Huff," to see how AOL thoroughly mishandled the layoffs of freelancers for one of its specialty Web sites. The site, Cinematical, was one of the properties placed under Arianna Huffington's control as part of AOL's acquisition of the Huffington Post.
Snider is able to distill bitterness into sardonicism, making his account thoroughly entertaining. The portrait he paints of the company is altogether consistent with the mental image conjured up by the AOL Way slides.
At the time of the Huffington Post acquisition I wrote:
AOL, I daresay, will have the same trouble with any acquisition that it had with Netscape: its bureaucratic dedication to trendspotting as a substitute for creative vision will suck the life from and crush the souls of any imaginative and creative people that join the company through those acquisitions. The best will leave, sooner or later, and AOL once again will find itself without anybody to do the real work of generating good content (or making good technology, as the case may be).

I was wrong in that not everyone is leaving of his or her own accord: as Snider notes, AOL/HuffPo has been terminating the services of freelancers, preferring to keep the work for full-time, in-house employees. However, I was right that the company is driving off the talent that made its acquired properties valuable in the first place. We'll see if people continue to frequent those properties out of habit, or if the sites lose their cachet just as they have lost their best writers.
Be warned: KidZania is coming
The Morning News has a profile of KidZania, an "edutainment" experience franchised around the world and expected to arrive in the U.S. by 2013. The chain's premise is that kids can learn by play-acting at real jobs.
Sounds harmless enough at first blush. The trouble is that KidZania at its heart seems more interested in indoctrinating children than in educating them.
Your kids will grow up seeing corporate logos on their clothes, in your pantry, and on the 'net (or TV, or both). Give them a chance to develop some defense against the marketing blitz that is modern culture. Even before the first KidZania opens in the U.S., I'll bet that if you can afford the entry fee, you can also afford a trip to a park. Go to the park, okay?
(Thanks to Kottke for the link.)
... at the heart of the concept and the business of KidZania is corporate consumerism, re-staged for children whose parents pay for them to act the role of the mature consumer and employee. The rights to brand and help create activities at each franchise are sold off to real corporations, while KidZania’s own marketing emphasizes the arguable educational benefits of the park.

KidZania embeds corporate sponsors deep into the experiences it provides.
When Kahori Roskamp’s daughter took part in a cooking class activity, Roskamp found that “it was about eating chicken nuggets, probably frozen, which I don’t find very healthy, educational, or interesting.” During the activity, TV screens played a commercial for the chicken nugget company on a loop. “It was more about promoting the products rather than creating a fun, educational place for children,” Roskamp said.

Does anyone think that these companies' priority isn't to make kids more loyal customers?
Your kids will grow up seeing corporate logos on their clothes, in your pantry, and on the 'net (or TV, or both). Give them a chance to develop some defense against the marketing blitz that is modern culture. Even before the first KidZania opens in the U.S., I'll bet that if you can afford the entry fee, you can also afford a trip to a park. Go to the park, okay?
(Thanks to Kottke for the link.)
Asking the young to think about insurance
I know what you're thinking: "now there's a snoozeworthy topic." That's what I thought when I was 25, too.
Still, "protoblogger" Dave Winer thought it worthy of an entry on scripting.com entitled, "To the young brilliant minds."
Winer is well-intentioned, and is himself an example of one who needed (and luckily for him, had) catastrophic health care insurance at the not completely expected age of 47. Still, 47 ... that's an eternity away when you're 25, I remember that much. Winer's harangue would have fallen on deaf ears in my case, and I have no doubt that most of today's 25-year-olds feel the same way.
In our twenties we're not even conscious of feeling invulnerable: we just do feel that way. Slowly, fitfully, we come to realize that we're mortal. Some of us receive an unwelcome wakeup call in the form of the death of a friend or family member close to our own age. Many of us find our taste for risk vanishes when we become parents. The rest of us just slow down and become a bit more reflective, a bit more conscious of all we have to lose.
So urging a 20-something to think about health insurance is a losing battle even today, even with the news awash in stories of how health care costs stand to bankrupt the government.
It doesn't help that health insurance as we know it is not altogether wonderful. Consider the case of Jill Bolte Taylor, Ph.D., a neuroanatomist who suffered a stroke at the not so very ripe old age of 37. In her book My Stroke of Insight, Taylor recalled her thoughts as she endured the opening stages of the stroke, waiting for help to arrive:
My paralyzed arm was partially recovered and although it hurt, I felt hopeful that it would recover completely. Yet even in this discombobulated state, I felt a nagging obligation to contact my doctor. It was obvious that I would need emergency treatment that would probably be very expensive, and what a sad commentary that even in this disjointed mentality, I knew enough to be worried that my HMO might not cover my costs in the event that I went to the wrong health center for care.

(My Stroke of Insight, hardcover edition, p. 56)
In any case, the young brilliant minds Winer addresses are not the ones in greatest peril. They're likely in high demand at businesses which provide good health care benefits. Those businesses have long since accepted the need to provide such benefits to stay competitive. On the other hand, some sectors -- the fast food industry comes to mind -- have little incentive to provide those benefits because demand for those jobs far outstrips supply. If you're 25 and you suffer a stroke while you're working as a cashier for McDonalds, good luck footing your medical bills.
Saturday, April 16, 2011
Adobe EULA gripes
I had occasion to download a new version of Flash Player from Adobe. As a conscientious end user, I opted to read the end user license agreement (EULA) before proceeding with installation.
First gripe: what possessed Adobe to lump all the different translations into one giant file? It's one thing for the printed manual to include a dozen translations in the same booklet. It's quite another -- the height of laziness, in fact -- for the online version of the manual to be in such an unhelpful form. If Adobe can't be bothered to create single-language versions of the EULA, each available at its own link, I conclude that Adobe doesn't give a crap about me, and that its software's quality probably reflects such a crummy attitude.
Second gripe: well, first read section 9.5 of the EULA, regarding certificate authorities and indemnification:
You agree to hold Adobe and any applicable Certification Authority (except as expressly provided in its terms and conditions) harmless from any and all liabilities, losses, actions, damages, or claims (including all reasonable expenses, costs, and attorneys fees) arising out of or relating to any use of, or reliance on, any service of such authority, including, without limitation (a) reliance on an expired or revoked certificate, (b) improper verification of a certificate, (c) use of a certificate other than as permitted by any applicable terms and conditions, this agreement or applicable law; (d) failure to exercise reasonable judgment under the circumstances in relying on issuer services or certificates or (e) failure to perform any of the obligations as required in the terms and conditions related to the services.

I'd like somebody at Adobe to tell me: after that list of exceptions and exclusions, what exactly is left over? As regards digital certificates, does Adobe promise anything whatsoever?
Tuesday, April 12, 2011
Martin Rees on science and religion
Lord Martin Rees, Britain's Astronomer Royal, was interviewed by The Guardian's Ian Sample in the wake of Lord Rees' winning of the Templeton Prize. There is much to be said about (not necessarily "for") the prize and what it means as regards the interaction between science and religion: the Guardian has a number of articles and opinion pieces on the subject that I plan to read. Perhaps it was unsurprising, then, that much of Sample's interview focused on Lord Rees' opinion of religion and his personal practice thereof. That it was not surprising does not mean that focus was well-received by Lord Rees, though.
I am sorry you focused on science and religion rather than what I think are the interesting things I do.

I felt rather sorry for Lord Rees, whose discomfort with Sample's heavy-handed attempts to elicit controversial remarks was evident from the start.
The literature of dictators
Courtesy of The Browser, a light Foreign Policy piece surveying the writings of some of the twentieth century's better-known absolute rulers.
As you might expect of men whose egos drove them to the pinnacle of power in their countries, their writing, by and large, lacks finesse. After all, not only were they convinced of their prowess in all things, but they were unburdened by editors, too.
On the other hand, consider this bit of (translated) verse:
Open the door of the tavern and let us go there day and night,
For I am sick and tired of the mosque and seminary.
I have torn off the garb of asceticism and hypocrisy,
Putting on the cloak of the tavern-hunting shaykh and becoming aware.
The city preacher has so tormented me with his advice
That I have sought aid from the breath of the wine-drenched profligate.
Leave me alone to remember the idol-temple,
I who have been awakened by the hand of the tavern's idol.

Granted, on its face it might not be the greatest poetry you've ever read. But doesn't it become more interesting, and revealing, when you learn it was written by the Ayatollah Ruhollah Khomeini?
Saturday, April 9, 2011
A Somali pirate in a German court
Courtesy of The Browser, an English-language version of a great article in Der Spiegel about a Somali pirate's experiences in a German court, where he is on trial for piracy against a German-flagged vessel. He's utterly bewildered by the norms of a society under the rule of law.
Caveat: you have to believe the young man in question is telling the truth. I gave him the benefit of the doubt.
I wasn't inclined to feel sorry for a Somali pirate, but this one's youth and background made me question whether the world's energies would not be more productively directed toward fixing Somalia (for some definition of "fixing") rather than pursuing and prosecuting desperate young men like this one. The effort seems akin to treating a gunshot victim by mopping blood off the floor rather than taking her to a hospital.
Friday, April 8, 2011
Last year's model
I stumbled across an intriguing-looking badge on a blog and clicked through to Last Year's Model, whose motto is, "Saving the planet through sheer laziness." The point is to encourage people not to buy new gadgets if they have perfectly functional ones that are slightly older.
It's an idea with which I was already on board a long time ago. My only quibble is, "last year" is such a short horizon. I think shooting for five years is reasonable. (I know, I know: just contemplating being behind the technology curve for that long would make some trendinistas' heads explode.)
Life's short, yes, but just because you can buy something doesn't mean you must.
Giants generous to Loewenstein
It's nice to read a story of classiness.
A full World Series share came to $317,631.29, a life-changing windfall for a gravely ill career clubhouse attendant. The Giants’ players had sole determination over how the cash was doled out among employees – partial shares and cash awards were awarded in addition to full shares – and they’d voted [David] Loewenstein the maximum.
Open source vs. closed source
In response to Jon Stokes' opinion piece on Facebook's open-sourcing of its datacenter design, I found a strikingly insightful comment from user "brentbordelon" on what really distinguishes an open-source project from a closed-source project.
"Open Source" is really little more than a PR stunt. There's very little difference between open source systems and their outcome vs. closed source systems.
In successful examples of both you have a few brilliant people doing the huge majority of the things that makes the venture successful. The difference is really only concerned with the "tagalongs":
In the closed-source world, a tagalong is a yes man or some middle management person who wants everyone above them to think they play a significant role in the success of the venture. In reality, they usually do more to hinder progress than help. Eventually, they end up losing their jobs as it becomes increasingly evident that they are not adding any real value.
In open-source scenarios, the approach to tagalongs is very different. Anyone and everyone can say, "...and I helped!", even if all they did was git some code and look at it. Instead of one or two people getting in the way, you have a multitude. Many of which submit actual code. Unfortunately the majority of code is either novice crap or is so laden with someone else's totally different vision as to be useless. Fortunately, with a few key players being able to decide what ACTUALLY gets accepted and what doesn't, an open source project can progress.
More of the tagalongs are helpful than "brentbordelon" admits, but his main point holds true in my experience: it takes a core of dedicated, motivated, creative, and effective people to make a successful project.
Thursday, April 7, 2011
Comodo and bogus SSL certificates
Oh boy. The Comodo Group, a certificate authority, was tricked into issuing SSL certificates for well-known domains to a bogus client. As Bruce Schneier wrote, "This isn't good."
I gave some background on this technology in December and August 2010. My blog entries aren't perfect but they give you the flavor of what's going on and why the Comodo story is important.
There are ways to mitigate the vulnerabilities introduced by corrupt or sloppy CAs (the Comodo Group appears to fall into the latter category), but some if not all of them require far-reaching changes in even more fundamental Internet technologies, like the Domain Name System (DNS).
In short, there's no easy fix.
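The core weakness is structural: a browser trusts a flat list of certificate authorities, and any one of them can vouch for any domain. Here is a toy model of that trust logic (illustrative CA and domain names, no real cryptography) showing why a single tricked CA undermines every site on the Web:

```python
# Toy model of the CA trust system -- no real cryptography, just the trust
# logic. A browser accepts a certificate if ANY CA in its trust store
# signed it; nothing binds a particular domain to a particular CA.
TRUSTED_CAS = {"RootCA-1", "RootCA-2", "Comodo"}  # illustrative trust store

def browser_accepts(domain, issuing_ca):
    # Real validation also checks signatures, expiry, and hostname match;
    # all of that is elided here.
    return issuing_ca in TRUSTED_CAS

# The legitimate certificate for a well-known domain:
print(browser_accepts("mail.example.com", "RootCA-1"))  # True
# A certificate for the SAME domain issued by a different trusted CA --
# say, one tricked into issuing it -- is accepted just as readily:
print(browser_accepts("mail.example.com", "Comodo"))    # True
# Only a CA outside the trust store is rejected:
print(browser_accepts("mail.example.com", "ShadyCA"))   # False
```

Because every trusted CA is equally authoritative for every domain, the system is only as strong as its sloppiest member; that is what makes proposals to pin certificates or anchor them in DNS so appealing, and so disruptive.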
Labels:
certificate authorities,
security,
technology
Tuesday, April 5, 2011
"Does the Universe Need God?", Sean Carroll
Courtesy of The Browser, an absolutely fascinating excerpt from a forthcoming book, The Blackwell Companion to Science and Christianity (due out in July 2012 according to one source).
Carroll's question is provocative, but his answer is compellingly well-reasoned and thoughtful. It helps that he sensibly circumscribes the problem space:
Some questions science has more or less answered: "What happens when something catches on fire?" But "Where did the universe come from?" is not one of these questions. So we are not faced with a matter of judging the merits of a mature and compelling scientific theory. Rather, we are trying to predict the future: will there ever be a time when a conventional scientific model provides a complete understanding of the origin of the universe? Or, alternatively, do we already know enough to conclude that God definitely helps us explain the universe we see, in ways that a non-theistic approach can never hope to match?
A brief and quite comprehensible explanation of the state of our cosmological knowledge follows. Along the way, Carroll clears up misconceptions that have arisen around concepts and terms of art that have seeped into the wider public consciousness, like "the Big Bang."
Then he surveys the various theories purporting to explain how the universe came to be. It is in these theories that science and religion contend with one another, and what becomes clear in Carroll's survey is that which theory -- or which type of theory -- one believes is very much a philosophical rather than scientific question.
First, Carroll notes the attractiveness for the theologically inclined of the scientific observation that the universe's physical characteristics are exceptionally well-suited for our form of life. Certain physical parameters could have taken many different values, resulting in many different types of universe; even small differences from what we have measured could have resulted in a universe in which life on Earth could not have existed (at least, not in its current form). Since we seem to exist by virtue of a long shot, probabilistically speaking, the idea that there must have been an entity "setting" these particular values is attractive at first blush. However, Carroll points out possible alternative explanations:
- Other values for those parameters might have led to other forms of life emerging, forms whose characteristics we can't imagine.
- We got extremely lucky that the universe came into being in this way.
- Different parts of the universe have different values for the physical parameters, and we happen to live in one of the parts where life is possible.
Life may be very fragile, but for all we know it may be ubiquitous (in parameter space); we have a great deal of trouble even defining "life" or for that matter "complexity," not to mention "intelligence." At the least, the tentative nature of our current understanding of these issues should make us reluctant to draw grand conclusions about the nature of reality from the fact that our universe allows for the existence of life.
(That's one problem I've always had with the anthropic principle: it always has struck me as unjustifiably specific to humans.)
The "we got lucky" explanation is certainly possible, but from the standpoint of a cosmologist it doesn't lend itself to the imaginative flights of fancy that the alternative explanations do. (That's my opinion. Carroll dismisses "we got lucky" on the basis that it is highly improbable, which is consistent with his assignment of probabilities to each of these alternative explanations.)
The third alternative explanation requires introducing the concept of the "multiverse." A universe is a region with a single set of values for fundamental physical parameters, and therefore a particular set of physical laws. If universes with different values for the same physical parameters exist, the whole collection of universes is called the multiverse.
There are different theories for how a multiverse could have arisen, but the bottom-line question for Carroll is the likelihood of a multiverse coming into existence versus the likelihood that a (or rather, the) universe was created by a deity.
One popular objection to the multiverse is that it is highly non-parsimonious; is it really worth invoking an enormous number of universes just to account for a few physical parameters? As Swinburne says:
To postulate a trillion trillion other universes, rather than one God in order to explain the orderliness of our universe, seems the height of irrationality.
It's here that we arrive at the question of what is philosophically preferable in a theory. Simpler theories are better, but what is the measure of simplicity? Carroll's answer is, the theory that can be expressed most compactly (for a technical definition of "compact") is the one scientists prefer.
... The physics of a universe containing 10^88 particles that all belong to just a handful of types, each particle behaving precisely according to the characteristics of its type, is much simpler than that of a universe containing only a thousand particles, each behaving completely differently.
There are legitimate objections to the notion of a multiverse. However, whatever challenges arise from positing a multiverse to explain our physical reality, those same challenges, and worse, confront "the God hypothesis." For instance, one fundamental question facing an explanation of the universe as God's handiwork is, why is the universe so much more complex than it needs to be for the purpose of creating man?
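The "compact expression" criterion Carroll invokes is, roughly, minimum description length: a theory is simpler if its rule book is shorter, regardless of how many objects obey the rules. A household compressor makes the point concrete (the particle names and counts below are my own illustration, not Carroll's):

```python
# Loose illustration of "compact description = simpler theory" using zlib
# as a crude stand-in for minimum description length. Particle names and
# counts are illustrative.
import random
import zlib

random.seed(0)

# Many particles, only a handful of types: the rule book is tiny and repeats.
uniform = ",".join(random.choice(["electron", "quark", "photon"])
                   for _ in range(2_000))

# Fewer rules repeated nowhere: every particle carries its own arbitrary rule.
bespoke = ",".join(f"particle{i}:rule{random.getrandbits(64)}"
                   for i in range(2_000))

few_types_size = len(zlib.compress(uniform.encode()))
many_rules_size = len(zlib.compress(bespoke.encode()))
print(few_types_size < many_rules_size)  # True: regularity compresses away
```

The first description compresses to a fraction of the second, because its regularity can be factored out; in the same spirit, 10^88 particles of a few types make for a shorter "program" than a thousand particles each with bespoke behavior.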
Likewise, a multiverse that arises due to the natural dynamical consequences of a relatively simple set of physical laws should not be discounted because there are a lot of universes out there. Multiverse theories certainly pose formidable problems, especially when it comes to making predictions and comparing them with data; for that reason, most scientists would doubtless prefer a theory that directly predicted the parameters we observe in nature over a multiverse ensemble in which our local environment was explained anthropically. But most scientists (for similar reasons) would prefer a theory that was completely free of appeals to supernatural agents.
In my experience, the typical rejoinder to this kind of objection is, "God's reasoning is beyond our comprehension." This assertion is a showstopper: there is no way to convince a nonbeliever that it's true, and no way to convince a believer that it isn't.
What is often overlooked is that there is a corresponding showstopper argument if you come at things from the opposite direction. "God hypothesis" advocates cite various roles God could have played or could be playing in the unfolding of the universe, but they all have the purpose of explaining why things happen. And as Carroll writes:
... the ultimate answer to "We need to understand why the universe exists/continues to exist/exhibits regularities/came to be" is essentially "No we don't."
Carroll thinks that some people need to find causes for all things, while others are able to see the universe as a thing beyond the need for causation. It's this fundamental difference in mindset that explains why some people need a "God hypothesis" and others don't.
There is no reason, within anything we currently understand about the ultimate structure of reality, to think of the existence and persistence and regularity of the universe as things that require external explanation. Indeed, for most scientists, adding on another layer of metaphysical structure in order to purportedly explain these nomological facts is an unnecessary complication.
It's a terrific read.
Stiglitz on the top one percent
In a Vanity Fair article, economist Joseph Stiglitz writes of the enormous, and growing, inequality between what the top 1% of Americans earn (and are worth) versus the other 99%. He proceeds to cite the reasons this yawning chasm is detrimental not just to the nation as a whole, but also to those extraordinarily wealthy people themselves.
The article is bound to stir up accusations of class warfare among those who feel the rich are the only remaining group it is permissible to demonize, but this statistic is hard to wave away:
The upper 1 percent of Americans are now taking in nearly a quarter of the nation’s income every year. In terms of wealth rather than income, the top 1 percent control 40 percent. Their lot in life has improved considerably. Twenty-five years ago, the corresponding figures were 12 percent and 33 percent.
(Thanks to The Browser for the link.)
Halliburton fracking info, week 20
Sometime in the last two weeks, Halliburton added a third state to its fracking fluids disclosure page: North Dakota joins Pennsylvania and (South) Texas. The three North Dakota formulations are:
- North Dakota Bakken Hybrid Formulation 1
- North Dakota Bakken Hybrid Formulation 2
- North Dakota Bakken Hybrid Formulation 3
A prefatory explanation to the formulations explains a little about why the region is being explored.
The Upper Devonian-lower Mississippian-age Bakken Formation, a relatively thin stratigraphic unit that covers a large portion (>200,000 mi²) of the Williston Basin, contains the largest oil accumulation in the contiguous 48 states. The Bakken Formation, which is found only in the subsurface of the Williston Basin, ranges in depth from 11,000 ft at the center of the Williston Basin to just over 3,000 ft along its northern limit. The Williston Basin is located in Montana, Wyoming, South Dakota and North Dakota in the United States and Saskatchewan and Manitoba in Canada.
Discovered in 1951, the Bakken has seen aggressive growth in development activity in recent years with the successful application of advanced horizontal drilling, fracturing, and completions technologies.
The Wikipedia entry on the Bakken Formation has more details, including the history of how estimates of the recoverable amount of oil have changed just in the past five years. Presumably, the provable reserves of natural gas in the area are commensurately large.
Labels:
fracking,
halliburton,
hydraulic fracturing,
natural gas
Monday, April 4, 2011
Yes, I'm conflicted
Last entry:
Our failure [to produce new genius writers] is due to not valuing them. We don't value them because we don't value great writing. And we don't value great writing because we don't value intellectualism in any form.
...
[James] has put his finger on a very serious shortcoming about our culture.
The entry before that:
The 'net-addled might not be as intellectually rigorous or as deeply thoughtful as earlier generations. It doesn't matter. If it's a problem, future generations will make those characteristics a priority (again).
Not quite a direct contradiction, but close. Maybe the moon changed phases last night.
I can hold mutually incompatible opinions more or less simultaneously. Well, at least that should help me pass a Turing test.
Why we mint athletes instead of writers
In an excerpt from his book Solid Fool's Gold: Detours on the Way to Conventional Wisdom, sports writer Bill James advances a familiar argument in a refreshingly blunt way. To the question, "Why are we so good at developing athletes and so lousy at developing writers?" James says the answer is, we work a lot harder at the former than the latter.
First, we give them [potential athletes] the opportunity to compete at a young age.
Second, we recognize and identify ability at a young age.
Third, we celebrate athletes' success constantly. We show up at their games and cheer. We give them trophies. When they get to be teenagers, if they're still good, we put their names in the newspaper once in a while.
Fourth, we pay them for potential, rather than simply paying them once they get to be among the best in the world.
James has been taken to task, justifiably, for a badly conceived comparison between London at the time of Shakespeare (and Ben Jonson, Christopher Marlowe, and Francis Bacon) and Topeka, Kansas, a comparison James intended to be the key statistical underpinning for his argument. The comparison is wrong and helps his argument not a bit. However, the criticisms I've read don't argue that his central premise is flawed. We do put a lot more effort into identifying and nurturing athletes than writers, or more broadly, intellectuals.
More pugnaciously, James also says that sports get a bad rap for misdirecting racial aspirations, and he's tired of the calumny.
People in the sporting world in 1950 were just as racist as people in other parts of society—but people in the sporting world got over it a hell of a lot faster, because we cared more about winning than we did about discriminating. Because the sporting world was always ahead of the rest of the world in breaking racial barriers, black kids came to perceive sports as being the pathway out of poverty. For this we are now harshly and routinely criticized—as if it was our fault that the rest of society hasn't kept up.
I have no idea if he's right. It seems a bit facile as arguments go. However, it's good food for thought, no?
On the other hand ...
Some jackass Ph.D ex-athlete pops up on my TV two or three times a year claiming that a young black kid has a better chance of being hit by lightning than of becoming a millionaire athlete. This is nonsense as well as being a rational hash.
In the absence of actual statistics in support of his assertion, I'll call foul on James. I don't think it's nonsense at all: I think it's a realistic appraisal of the odds. (James seems to have trouble putting statistics into proper context: that failure is the crux of the flaw in his Topeka-vs.-London argument, too. I find that failure a bit ironic in a sports writer, considering that stats rule sports analysis.)
Look, it's not our fault that the rest of the world hasn't kept up. It's not our fault that there are still barriers to black kids becoming doctors and lawyers and airline pilots. Black kids regard the athletic world as a pathway out of poverty because it is. The sporting world should be praised and honored for that. Instead, we are more often criticized because the pathway is so narrow.
I like his scrappy tone. I suspect, though, that his argument, again, is too facile.
First, I'd like to see hard numbers. Second, is the problem really racial discrimination, which is what I assume he means by "barriers"? It's not at all clear to me that low rates of racial diversity in various fields -- if the rates are low; again, I don't have numbers (James doesn't cite any, either) -- are tied to race rather than income. As I understand things, parental income tends to be a more reliable predictor of children's education and future status than race.
In spite of the weaknesses in James's argument, attributable to overreach, I think his fundamental point is sound: we pay more attention to identifying and developing athletes than we do to identifying and developing writers (or any intellectuals). Why is that? Here's his answer:
We still have Shakespeare. We still have Thomas Hardy and Charles Dickens and Robert Louis Stevenson; their books are still around. We don't genuinely need more literary geniuses. One can only read so many books in a lifetime. We need new athletes all the time because we need new games every day—fudging just a little on the definition of the word need. We like to have new games every day, and, if we are to have a constant and endless flow of games, we need a constant flow of athletes.
The fallacy of thinking "we don't genuinely need more literary geniuses" becomes more breathtaking the longer you think about it. If Shakespeare sufficed, why did later ages produce Hardy or Dickens or Stevenson? For that matter, why did we need Shakespeare when the classic Greek tragedies sufficed for Western civilization for a couple of thousand years?
Why wouldn't we need and want more literary geniuses, for crying out loud?
Certainly Shakespeare remains important and relevant today. So did Sophocles in Shakespeare's time. Shakespeare, though, was a better fit for his audience than Sophocles was. So we should expect that someone from our own time and circumstances would connect more intimately with us than Shakespeare does. That person might join Shakespeare in the esteem of future generations.
James is wrong that our failure as a society to produce new genius writers is due to our not needing them. Our failure is due to not valuing them. We don't value them because we don't value great writing. And we don't value great writing because we don't value intellectualism in any form.
If we did, we'd be paying physicists what rookie baseball players make, we'd have multiple channels dedicated to following philosophical debates instead of golf, and our kids would grow up idolizing those who analyze viral proteins instead of NBA all-stars.
It's a pity James doesn't do a better job of supporting his own argument, because he has put his finger on a very serious shortcoming about our culture.
Fretting about the 'net
I shun celebrity gossip sites and Web videos because I decided a while back that if I'm going to kill brain cells, I'd rather do so over a well-aged bottle of amber bliss with a good friend or two. Recently, though, I realized I have another weakness that, while not killing as many brain cells, certainly isn't giving the ones I still have the kind of workout they need, and is taking up an inordinate amount of time to boot. The culprit is the melancholic essay about the 'net's detrimental effect on our social lives and/or mental health. (That's right, essays just like this one.)
Nicholas Carr's Rough Type blog is dedicated to investigating the effects of the 'net on our minds; Carr has even written a book on the subject, The Shallows: What the Internet is Doing to Our Brains. (Visit the Rough Type blog for the actual link, in case he gets cash for the clicks. I haven't read it, by the way, so don't ask me if it's any good.) I read a lot of what Carr posts; I don't agree with it all, but I appreciate that somebody is reading and thinking about the subject.
That said, reading about the 'net's detrimental effects on our minds ultimately is pointless. People have anecdotal evidence to support this narrow finding or that one: the 'net is decreasing our ability to focus, unless it isn't; we're losing our capacity for deep thought, except when we aren't; etc. Few well-designed studies have been done. Nobody can say whether our 'net-addled brains work differently today than brains did a hundred years ago, much less say whether any differences are beneficial.
Moreover, there's something narcissistic about 'net-fretting. We who spend a lot of time online tend to obsess about it in ways that bemuse, if not amuse, others. We think we're hyperaware of our environment, imagining ourselves to be fish who realize they live in water. It's probably truer to say we're self-absorbed: we can't get enough about how the 'net has changed us.
Nicholas Carr's Rough Type blog is dedicated to investigating the effects of the 'net on our minds; Carr has even written a book on the subject, The Shallows: What the Internet Is Doing to Our Brains. (Visit the Rough Type blog for the actual link, in case he gets cash for the clicks. I haven't read it, by the way, so don't ask me if it's any good.) I read a lot of what Carr posts; I don't agree with it all, but I appreciate that somebody is reading and thinking about the subject.
That said, reading about the 'net's detrimental effects on our minds ultimately is pointless. People have anecdotal evidence to support this narrow finding or that one: the 'net is decreasing our ability to focus, unless it isn't; we're losing our capacity for deep thought, except when we aren't; etc. Few well-designed studies have been done. Nobody can say whether our 'net-addled brains work differently today than brains did a hundred years ago, much less say whether any differences are beneficial.
Moreover, there's something narcissistic about 'net-fretting. We who spend a lot of time online tend to obsess about it in ways that bemuse, if not amuse, others. We think we're hyperaware of our environment, imagining ourselves to be fish who realize they live in water. It's probably truer to say we're self-absorbed: we can't get enough about how the 'net has changed us.
So why write another navel-gazing piece? Actually, I'm hoping this won't turn out to be one. Believe it or not, this professional pessimist and part-time cynic is going (to try) to offer a non-cynical, mostly non-pessimistic take on the subject. For this surprising turn you can thank Alice Gregory, who penned (can I still use that term?) a book review last November that Carr made the focus of a recent blog entry.
Gregory's review of Gary Shteyngart's novel Super Sad True Love Story is really an excuse, and not a bad one, for an extended reflection on how technology and circumstance have changed the way she thinks since she graduated from college. She's not happy with the changes.
"In the past year, I graduated from college, got a desk job, and bought an iPhone: the three vertices of the Bermuda Triangle into which my ability to think in the ways that matter most to me has disappeared. My mental landscape is now so altered that its very appearance must be different than it was at this time last year. I imagine my brain as a newly wretched terrain, littered with gaping chasms (What’s my social security number, again?), expansive lacunae (For the thousandth time, the difference between “synecdoche” and “metonymy,” please?), and recently formed fissures (How the fuck do you spell “Gyllenhaal?”). This is your brain on technology."

I'll hazard a guess that she's wrong to make technology the primary culprit. College requires you to spend hours, days, or even weeks on a single topic. Most jobs neither encourage nor permit that kind of singlemindedness. You're engaged with the outside world, and the outside world is wont to destroy your carefully constructed plans for organizing the day. This happens whether or not you own an iPhone.
(I can't resist observing that Gregory writes in a very fresh-out-of-school way. Flourishing her erudition alongside her earthiness shows us and future employers that she not only received a good education, but can write for the common man too.)
Certain software, though (and by "technology" Gregory really is talking about software, not hardware), does habituate us to what she calls "the primitive pleasure of constant and arbitrary stimulation." Where I think she errs is in assuming such stimulation is entirely pleasurable. Up to a point it is: we all like new experiences. However, eventually we grow tired of being stimulated and we need to rest. Gregory finds herself wearier and wearier, and perhaps as a consequence, finds much about social media to be a chore.
Sometimes 'net obsession goes beyond being a chore, and turns into something that sounds like a narcotic.
"Opening Safari is an actively destructive decision. I am asking that consciousness be taken away from me. Like the lost time between leaving a party drunk and materializing somehow at your front door, the internet robs you of a day you can visit recursively or even remember. You really want to know what it is about 20-somethings? It’s this: we live longer now. But we also live less. It sounds hyperbolic, it sounds morbid, it sounds dramatic, but in choosing the internet I am choosing not to be a certain sort of alive. Days seem over before they even begin, and I have nothing to show for myself other than the anxious feeling that I now know just enough to engage in conversations I don’t care about."

[Grumpy aside: it's "the Internet," capitalized. It's still a proper name, even today.]
None of the foregoing would have gotten me to comment on this review, though. What spurred me to do that was a bit of amateur prognostication of What All This Means For Our Future.
"The internet’s most ruinous effect on literacy may not be the obliteration of long-format journalism or drops in hardcover sales; it may be the destruction of the belief that books can be talked and written about endlessly. There are fewer official reviews of novels lately, but there are infinitely more pithily captioned links on Facebook, reader-response posts on Tumblr, punny jokes on Twitter. How depressing, to have a book you just read and loved feel so suddenly passé, to feel—almost immediately—as though you no longer have any claim to your own ideas about it."

I don't know what to make of ominous visions like this. My inclination is to observe that in spite of books, radio, films, television, and now the Internet, people still see plays. They no longer command the same degree of attention, but they survive. I'll warrant that books, and deep discussion and appreciation thereof, will survive, too.
Anyway, I've stopped worrying about humanity's future.
If you were to transport Aldous Huxley from his heyday to right now, would he conclude that we had escaped or fulfilled the dystopian freak show of Brave New World?
If the Greek philosophers who first imagined "democracy" could see us now, would they think we had distorted human society all out of reason, and fallen from what they would consider their own state of relative grace?
Whether you answered "yes" or "no" to the foregoing, consider this:
What does it matter?
We can't and won't go back to an earlier state of society. We always look forward. The result might be something that looks similar to what came before, but it will have arisen in response to current needs and current abilities, and it will never be exactly the same as what came before.
So fretting that the 'net is changing the way we think isn't going to get us not to use the 'net. It might get us to use it less, but that's all. None of us wants to turn the clock back, even if we could.
The 'net-addled might not be as intellectually rigorous or as deeply thoughtful as earlier generations. It doesn't matter. If it's a problem, future generations will make those characteristics a priority (again). That's how our future brains will take care of themselves. That's why hand-wringing like Gregory's doesn't impress me any more.
Humanity will muddle through somehow.
Saturday, April 2, 2011
That fracking dirty water
I'm sorry that ambient overload kept me from reading a New York Times piece about wastewater from fracking until now. Specifically, the article talks about how the wastewater is being sent to sewage treatment plants that aren't designed to cope either with its high level of salts or relatively high level of radioactivity.
I'm also sort of sorry that I read the article, because it makes abundantly clear that the only ones more overwhelmed by the situation than those treatment plants are federal and state regulators charged with protecting us from that hazardous wastewater.
Actually, not all state regulators are overwhelmed. The word "overwhelmed" suggests that they're attempting to cope. In the case of Pennsylvania's Department of Environmental Protection, nothing could be further from the truth.
"In 2009, E.P.A. scientists studied the matter and also determined that certain Pennsylvania rivers were ineffective at sufficiently diluting the radium-laced drilling wastewater being discharged into them."

"Asked about the studies, Pennsylvania regulators said they were not aware of them."

It's hard to tell if the Pennsylvania D.E.P. is complacent, or complicit.
UPDATE: I wrote about the risks of reusing the wastewater in early March. As with measuring for radioactivity in such wastewater, there are no regulations covering wastewater reuse.
"Johnson & Johnson's Quality Catastrophe," David Voreacos, Alex Nussbaum, Greg Farrell
Courtesy of Longreads, a disturbing portrait of a megacorporation gone wrong in the last decade.
The CEO claims the problems largely have been confined to one division and are being addressed. Others see a companywide culture that places profit and shareholder interests above strict quality control. The CEO's credibility would be greater (not high on an absolute scale, but still, greater) if the company didn't have a pattern of doing everything it could to keep people from finding out things -- things like how often it recalled products, and how widespread those recalls were.