
Sunday, June 24, 2012

Speaking of where the jobs are ...

In the previous post, I asserted:
Tech, being concentrated on intellectual property, doesn't need a huge number of people to do anything: it needs a limited number of people to think of things. To the extent that the tech industry makes tangible goods, those goods are made overseas.

... Tech is a high-profile but totally insufficient source of employment.

I had been thinking of the kind of work performed by software companies like Facebook, Google, and Twitter. Companies like these don't have retail arms, nor do biotech firms like Genentech. Even companies that make hardware, like H-P, don't tend to have their own stores; those that do, like Dell and Gateway (if either still has any), have a vanishingly small number.

The exception to the foregoing, of course, is Apple, which has a robust, growing and high-profile retail operation. In spite of that high profile, I confess I had completely forgotten about these stores until I ran across the New York Times' lengthy piece discussing some former workers' dissatisfaction with their pay and working conditions.

It's worth reading the whole piece, especially if you've always suspected there was something a little, well, off about the Apple fans of your acquaintance — if, in other words, you've wondered if they didn't qualify as "fanatics" rather than mere "fans". However, if you simply can't be bothered to read it, then at least note this telling statement:

The Internet and advances in computing have created untold millionaires, but most of the jobs created by technology giants are service sector positions — sales employees and customer service representatives, repairmen and delivery drivers — that offer little of Silicon Valley’s riches or glamour.

In other words, these service jobs are no more likely to elevate you into the middle class than working at Walmart or Starbucks. And yet, these jobs constitute the majority of those created by high tech. The high-paying jobs are quite few, by comparison, and totally inadequate to creating a middle class.

The Times article simply reinforces what I said in that last post: this country isn't creating things any more, but instead is providing services. Since altogether too many of those services are not worth a great deal of money, the majority of people aren't making a great deal of money — hence, fewer and fewer people can genuinely afford the kind of life that we have convinced ourselves we require. And my guess is that even if you're frugal, you're still having a tough time making ends meet on the salaries offered by most of the service jobs out there today.

To dig ourselves out of the hole we're in, we must do something either to lower our overall costs (so our limited incomes go further), or to increase our overall income. Neither of these is easy to do, and don't believe any politician who tells you otherwise (I'm looking at you, Mitt Romney).

And while we could think in terms of fixing our services-based economy to work better (again, either by lowering costs or increasing incomes), wouldn't it be nice if we started to restore our self-respect and our national security by restoring our ability to build tangible goods again at the same time? I'm not just talking about making new and modern factories: I'm also talking about not forgetting the hard-won lore and wisdom of the people who spent years observing and understanding what it takes to build things, big and small — tapping these people's brains before they're all dead and we have to relearn what they knew on our own.

What I'm talking about is anathema to free-marketeers because I'm talking about creating and implementing a national strategy, one that places the interests of the entire nation above the interests of shareholders. It's an audacious thought, to defy Adam Smith's invisible hand — and yet, this is what some of the U.S.'s most successful competitors like Germany and China have done. There are probably pitfalls to be avoided along the way (China's in particular is not a model we should emulate), but before you dismiss the idea out of hand, ask yourself this:

Has the last thirty to forty years of free-market sloganeering — in particular, the mad dash to deregulate everything in the name of unfettered capitalism — left this country better off? Has it left you better off?

If you answered "yes" to either of those questions, you're either part of the 1 percent, or you haven't been paying attention. Enron and the toxic-mortgage meltdown are the best-known catastrophes that arose from reckless deregulation in the last decade, but they're merely the tip of a very ugly iceberg. More shoes are waiting to drop (one fell on Jamie Dimon recently), and some of them could be much less abstract than the ones we've seen so far: for instance, I have a bad feeling about the long-term health consequences of hydraulic fracturing, which is not subject to clean-water regulations promulgated by the EPA (thanks to shady dealings between the energy industry, Congress and the George W. Bush administration).

Jobs, deregulation, declining wages, increasing costs: it's such a tangled web we allowed big business to weave. Yet we have to start untangling it, if we're to keep alive at least some of what makes our nation great. And time's a-wasting.

Saturday, June 23, 2012

The lie of the free market

"If you work hard, you will succeed. If you don't succeed, it's because you didn't work hard."

That's essentially the credo of free-marketeers everywhere. It's intuitively appealing in part because it's such a simple formulation.

That credo rests on an unspoken assumption that remains unspoken because it seems so obvious: the playing field is level. Everybody has an even chance to climb the ladder.

It's time for us to acknowledge that the playing field isn't level. Not in the United States.

I'm not talking about the well-known phenomenon that wealth begets wealth. That has always been the case, and will always be. I'm talking about the reality that if you want to go from working at Walmart to earning a decent, middle-class living, the odds are stacked hugely against you.

Don't believe me? Then tell me: where are the jobs?

Even if you have a college degree, the jobs are scarce. And if you don't have a college degree, the jobs are not only scarce, they're totally inadequate to supporting a family.

Where are the good-paying jobs of yesteryear? Sent offshore.

Where is the seed crop, so to speak, for creating new good-paying jobs? It's either on Wall Street, figuratively and literally speaking, or in tech. The trouble is, Wall Street today is essentially parasitic and tech industries are all about intellectual property. Being parasitic, Wall Street doesn't create real value, it creates imaginary value by leveraging real assets in irresponsible ways. Tech, being concentrated on intellectual property, doesn't need a huge number of people to do anything: it needs a limited number of people to think of things. To the extent that the tech industry makes tangible goods, those goods are made overseas.

Wall Street, in other words, is a morally bankrupt source of employment. Tech is a high-profile but totally insufficient source of employment.

The U.S. economy today is geared toward providing services, not manufacturing goods. It's an emphasis that is guaranteed to spiral this country downward, both in economic prowess and in the difficult-to-measure quality of self-respect. Except for food (and not even all of that), we don't make things for one another: we offer services to one another. The trouble with that is, we still need tangible goods.

Kevin Phillips, in his 2006 book American Theocracy, explains that since the 1980s, the financial-services industry has played an outsized role not only in the economy, but in policymaking. The industry skewed federal legislation and regulation to benefit itself, and in the process promoted the transfer of good-paying jobs overseas. With those jobs went the middle-class standard of living.

At one time, this nation might have been the land of opportunity free-marketeers still think it is. But no longer. The playing field stopped being level twenty or even thirty years ago, and it has been tilting away from the 99% ever since. It's time for us to acknowledge that reality — and to stop calling attempts to restore some kind of equilibrium "socialist" or other phony, stupid epithets.

And it's time to call those supposed defenders of the free market who are perpetuating the distorted status quo what they are: self-interested tools who are looking out for nobody but themselves. That's the capitalist way, of course, but it only works when we all have the economic freedom to pursue our own self-interest. We don't have that today.

(This post wouldn't have happened if I hadn't been inspired — or perhaps the word is "outraged" — by tonight's episode of Moyers & Company in which Matt Taibbi and Yves Smith excoriated our too-big-to-fail financial institutions.)

Thursday, June 21, 2012

Paying for music

A blog post by an NPR intern, Emily White, explains that while she has 11,000 songs in her music collection, she has only ever bought 15 CDs. She has, she says, no attachment to physical media.

I wish I could say I miss album packaging and liner notes and rue the decline in album sales the digital world has caused. But the truth is, I've never supported physical music as a consumer. As monumental a role as musicians and albums have played in my life, I've never invested money in them aside from concert tickets and T-shirts.

It seems to me she is confusing physical media with paying for something she readily admits she enjoys. But let's set that aside for the moment. What's the bottom line for her?

What I want is one massive Spotify-like catalog of music that will sync to my phone and various home entertainment devices. With this new universal database, everyone would have convenient access to everything that has ever been recorded, and performance royalties would be distributed based on play counts (hopefully with more money going back to the artist than the present model). All I require is the ability to listen to what I want, when I want and how I want it. Is that too much to ask?

That's a beautiful vision, up to the last couple of sentences. As for those last sentences, well, how fucking entitled can you get?

If you don't like enriching giant agro-corporations, does that entitle you to take food at the store?

Tell me, Emily, since that Utopian infinite music library you envision doesn't exist yet, how exactly do you expect those musicians you love to earn a living today?

I don't get this mentality that if you can steal, it's okay. And what really gets me is pretending that you're not stealing, that you're trying to make a better world.

What absolute horseshit.

As I've grown up, I've come to realize the gravity of what file-sharing means to the musicians I love. I can't support them with concert tickets and T-shirts alone. But I honestly don't think my peers and I will ever pay for albums. I do think we will pay for convenience.

More succinctly: "I like what you do. I just can't be bothered to pay for it."

You may have gotten older, Emily, but you haven't "grown up". Adults pay for what they want.

Thursday, June 14, 2012

Getting past the religious divide

David Bornstein has an Opinionator piece in the New York Times about the Chicago-based Interfaith Youth Core (IFYC), whose mission is to foster more meaningful interaction between those of different faiths (and those who do not profess a religious faith at all), with the ultimate goal of increasing mutual understanding. Bornstein quotes the political scientist Robert Putnam:

“We can show in a quite rigorous way that when you become friends with someone of a different faith, it not only makes you more open-minded to people of that faith, it makes you more open-minded about people of all other faiths. It makes you more tolerant generally,” says Putnam. “That’s the fundamental premise of the Interfaith Youth Core’s work.”

While I find the premise immensely appealing, I also find it hard to be optimistic that IFYC's model is a workable one for society at large. IFYC operates among college students, who are among the better-educated members of our nation: they are, in short, part of the elite. Also, being young, they're more likely to be open to new ideas and new experiences. Now, while I think much of the dysfunction and polarization caused by religious sectarianism is fostered by elite members of society (specifically, media-savvy and ambitious religious leaders and politicians), these elites aren't generally uninformed: rather, they've made the conscious decision to foment sectarianism because it strengthens the influence they have over their followers. Even if you think I'm wrong, even if you think those leading the sectarian charge are sincere in their beliefs, getting them to back down from their often incendiary rhetoric toward The Other is a tough sell because they're only human, and being human, have a tendency to double down when their cherished beliefs are challenged. (You might even look on my skepticism of IFYC's work as evidence of that tendency, and you'd probably be right.)

But I think the real challenge for those who want to believe in the IFYC model is to consider those who aren't of the elite: those who can't attend college, or who won't, or who didn't and never will. How do you foster engagement with those of a different (or no professed) faith outside a college campus?

It's the same problem that makes racial animus so hard to eliminate. It's easy in this society, in which individualism is all but canonized as a sacred right, to create a fortress for yourself into which you admit only those who pass your tests for admission. Heck, I cherish individualism myself. I think the crucial question is how you view those outside your fortress: are they strangers, who simply are unknown to you, or enemies? The problem is that a lot of extremely faithful religious believers in this nation are acting as if the rest of us are enemies, not mere strangers. That may not be how these people actually think of us, but that's how they're coming off because of the rhetoric they choose to applaud (from the aforementioned religious and political elites).

For myself, by the way, I have no animus toward believers, though I can see how you might think otherwise if you've read much else on this blog. I'm more than willing to live and let live. The reason my dander is frequently up about religion is that I consider it an extremely private matter, and I vehemently object to attempts to export it into the public sphere. Thundering rhetoric proclaiming that "this is a Christian nation", for instance, is simply beyond the pale: it is exactly the kind of establishmentarianism that the sainted Pilgrims (forgive my sarcasm, but that is how they're portrayed) sought to escape in Europe. I will not stand for my identity as a non-believer being trampled by the misguided zeal of sectarians. I demand my right to exist and to a minimal degree of respect and dignity in the civil sphere. It is the same right you, as a believer, insist upon, and it is the same right we are both guaranteed by the non-sectarian Constitution.

In spite of my skepticism, I really, really, really, really hope that efforts like IFYC's succeed. A lot, after all, is at stake.

Americans celebrate diversity. But one of the mistaken beliefs about diversity is that it leads to greater tolerance. Putnam’s research indicates that, unless people make a concerted effort to build bridges, diversity leads to greater social fragmentation — with lower rates of trust, altruism and cooperation. “What ethnic diversity does is cause everybody to hunker down and avoid connection,” he explained. “It’s not just the presence of diversity in your neighborhood. You’ve got to actually be doing things with other people in which you have a personal attachment. Diversity is hard, not easy.”

If we are to renew our democracy — and is there anyone who thinks that's not necessary? — I think working hard at getting along, at understanding what our diversity actually means and why it's important, is a worthy goal to set for ourselves as a nation. It's not as sexy as winning World War II or putting a man on the moon, but it's at least as important. And libertarians, don't fret: unlike those efforts, we don't actually need government to show the way. This one can and probably should start at the bottom, with each of us.

Monday, June 11, 2012

A good use of social media

Horace Mann School alumni have set up a couple of Facebook pages to air and to discuss sexual abuse allegations against former faculty and staff, according to the New York Times. There are accounts surfacing elsewhere, too.

I am, as you might suspect, no fan of social media. I honestly don't get all the fuss about Facebook or Twitter. However, this use of social media I understand and appreciate. I hope the virtual gathering places help the victims come to terms with what happened.

By the way, if you haven't read the original account that sparked the sudden flurry of stories and discussion, it's worth casting an eye over it. The allegations are deeply disturbing. However, as hard as it is to remember in such a charged atmosphere, they're still — and likely will remain — just that: allegations.

Sunday, June 10, 2012

On class reunions

I'm at the stage of life in which class reunions are supposed to figure, to a greater or lesser extent: greater, if your glory days coincided with your school days; lesser, if you're like most of us. (TV's Buffy the Vampire Slayer got it right when Buffy snapped at a would-be suicidal classmate, "Every single person down there is ignoring your pain because they're too busy with their own.")

I was on good terms with most of my classmates, and I was never the class leper (sadly, there was always at least one). Attending small schools meant that most everybody shared the same experiences and environment, not to mention that it wasn't difficult to figure out everybody's name. Graduating meant the tight-knit community of familiar faces was disbanding. It hurt, a lot, to lose that every four years.

So those must have been pretty good days for me, modulo graduations, right?

Er, no.

When I think back on those days, I remember my incredible discomfort in my own skin. I was a hopeless outsider who through luck and a gift for mimicry had figured out how to ape everyone else just well enough to pass for a normal person. But like a singer who masters a foreign-language song phonetically but grasps not a whit of its meaning, I lived in dread that I'd stumble. Every day was a nerve-racking, exhausting performance, and like any actor I both dreaded and longed to hear what the critics -- my classmates -- thought. The reviews weren't verbal so much as behavioral: were people still looking me in the eye? Did they sidle away as I approached? Did I hear about next weekend's party?

It was a colossal waste of effort, but I didn't know any better. Besides, rampaging hormones generated a surfeit of nervous energy that had to find an outlet.

Why, then, was graduation painful? For the same reasons a lot of people take comfort in less than perfect situations: a preference for the familiar, and an accompanying fear of the unknown. Besides, even for me, school wasn't completely horrible. I made a few genuine friends. Academically, I did well enough. And thankfully, my social skills improved, however slowly and painfully.

Still, I'd never go back to that time.

Even if I didn't feel so ambivalent about those days, a reunion wouldn't be that compelling. I'm not the same person I was then, and neither are my former classmates. A reunion plays on picking up old threads, but they're not attached any more. Not to me, anyway.

Those enthused about reunions say that we're all curious about our old classmates: who got married to whom, who exceeded expectations, who took an unexpected path. Sometimes this is phrased more sardonically: "Don't you want to see who got fat and old?"

Not really.

Looking back, I realize that what I enjoyed with most of my classmates was what a mature person would call "cordial relations", not friendship. I didn't know the difference back then. Perhaps for me, there was no difference. But that's neither here nor there. The point is, what I had back then simply isn't a strong enough inducement to revisit those times. I know now that we simply weren't that close.

I have nothing against you, my former classmates. I'm just not that curious about you. I doubt you're that curious about me, either. (You're not missing much if you are.) We're practically strangers.

So enjoy yourselves when you gather. Just don't expect me to join you.

Thursday, June 7, 2012

R.I.P. Ray Bradbury

I dipped my toe into the vast ocean of science fiction when I was young. The dozens of dog-eared paperbacks at the local library fascinated me -- and scared me: I was susceptible to nightmares and it didn't take much to set me off.

I don't remember when or how I figured this out, but at some point I came to understand that SF broke down into two large categories: the "hard" SF works that were technology-centric, and everything else (which, curiously enough, didn't have a descriptive name like "hard"). "Everything else" could include fantasies, more psychologically complex works, alternate histories ... basically everything that didn't have to do with spaceships or robots. I own to being a little vague about "everything else" because my preference was strongly for hard SF, not least because it didn't give me nightmares.

The Big Three hard SF authors were Asimov, Heinlein and Clarke, the "deans of science fiction" as critiques and commentaries about SF proclaimed again and again. Yet annoyingly enough, a lot of commentators insisted on adding a fourth name: Bradbury.

Bradbury's works fell into the nightmare-inducing category for me. The Martian Chronicles was my first encounter when a well-meaning relative gave me a copy as a Christmas present. The stories weren't to my taste, yet they were weirdly compelling and compellingly weird; they left me unsettled, yet unable to comprehend why. If Asimov's tales were big, bold Statements Of The Future done in broad but exact strokes, Bradbury's tales were Impressionist sketches of less clear-cut subjects. Bradbury had an eye for how humanity's less noble instincts -- or simply bad luck -- could diminish or subvert its mere technological prowess.

Here's the effect a Bradbury story could have on me: the image I took away from his famous short story "A Sound of Thunder" was not of the hunting party, nor of the carnivorous dinosaur that pursued them. Rather, it was that of the squashed butterfly. I have never seen a filmed version, nor have I read the story in decades, yet in my mind's eye, clear as day, is that hapless insect, embedded in the thick coating of mud on a hiking boot. It is huge, perfectly formed (no torn wings), and its markings are delineated by thick, garish black fuzz that almost seems to glow, it's so prominent. It is like a leering, demonic face, mocking me with the promise of no happy ending for humanity. The "sound of thunder" closing the story is perversely anticlimactic by comparison.

No Asimov tale ever left so strong an impression, and I read a lot more of his output than Bradbury's.

So even though I never much cottoned to his work, I can but tip my imaginary hat to Ray Bradbury, and thank him for showing me that science fiction stories could be more than space operas with heroic square-jawed engineers. I might not have liked the worlds he opened up, but I, and science fiction, needed them.

Bradbury passed away Tuesday (5 June 2012). Michiko Kakutani penned a nice appreciation.