The Libertarian Standard » BK Marcus
Property - Prosperity - Peace
A new website and group blog of radical Austro-libertarians, shining the light of reason on truth and justice.

Was Robin Hood a Marxist? Mon, 16 Jun 2014 14:37:33 +0000

Simon Schama could use a dose of classical-liberal theory. Most of us can be forgiven for knowing Marxist theory better than the liberal tradition — it’s hard not to drink Marxism in with our schooling and culture — but popular historical narrative really does suffer by the omission of the "bourgeois historians" whom Marx himself credits as the precursors of his class theory.

In the BBC TV series A History of Britain, Schama asks about the English Peasants’ Revolt of 1381, "Was this a class war, then?" (a term, he explains parenthetically, that "we’re not supposed to use since the official burial of Marxism"). A pause, while the camera angle changes to closeup. "Yes," he says plainly. "It was."

"Not surprisingly," writes Schama in the print version of A History of Britain, "it was in the second half of the fourteenth century that the legends of Robin Hood … first became genuinely popular."

But as I write in "Class War in the Time of Robin Hood" in today’s Freeman, Schama is appealing to the wrong class theory if he wants to explain the mindset of the commoners marching on London in the 14th century.

I’m far from the first to offer a libertarian revision of Robin Hood’s politics, but where I focus on the ideology of his earliest historical audience, most other treatments focus on the particulars of the legend.

Some examples:

On the other hand, Ayn Rand seems to have been happy to leave Robin Hood to the socialists:

"It is said," Rand has Ragnar Danneskjöld concede in Atlas Shrugged, that Robin Hood "fought against the looting rulers and returned the loot to those who had been robbed, but that is not the meaning of the legend which has survived.

What do you think: is Robin Hood worth claiming for our tradition?

When Evil Institutions Do Good Things: The FCC’s PTAR Law Thu, 12 Jun 2014 17:55:55 +0000

In my Freeman article "TV’s Third Golden Age," the summary subtitle that the magazine chose was "Programming quality is inversely proportional to regulatory meddling." I couldn’t have said it better. But does that mean that everything the FCC does makes television worse?

All laws and regulations have unforeseen consequences. That usually means unintended damage, but there’s no law of history that says every unplanned outcome is pernicious.

If you’re an advocate of a free society — one in which all arrangements are voluntary and there is the least coercive interference from governments or other thugs — history will present you with an unending series of conundrums. Whom do you side with in the Protestant Reformation, for example? The Catholic Church banned books and tortured scholars, and their official structure is one of hierarchy and authority. Easy enemy, right? Clear-cut bad guy. But the Church had kept the State in check for centuries — and vice versa, permitting seeds of freedom to root and flourish in the gaps between power centers. Whereas the Protestant states tended to be more authoritarian than the Catholic ones, with Luther and Calvin (not to mention the Anglicans) advocating orthodoxy through force. There’s a reason all those Northern princes embraced the Reformation: they wanted a cozier partnership of church and state.

This is certainly not the history I was taught in my Protestant private schools.

Similarly, most of us were schooled to side with the Union in the Civil War, to see Lincoln as a savior and the Confederacy as pure evil. But as much as the war may have resulted, however accidentally, in emancipating slaves, it also obliterated civil liberties, centralized power, strengthened central banking and fiat currencies and — to borrow from Jeffrey Rogers Hummel’s great book title — enslaved free men.

"Father Abraham," as the pietists called him after his assassination, was a tyrant whose primary goal was always what he actually achieved: central power over an involuntary union. Recasting this guy as an abolitionist hero is one of the many perverse legacies of America’s official history. But it’s a mistake to simply reverse the Establishment’s verdict and claim that the Confederacy was heroic. Plenty of Johnny Rebs were fighting a righteous battle against what they rightly deemed to be foreign invaders, but even if you ignore the little problem of the South’s "peculiar institution," the Confederate government was no more liberal than its Northern rival. "While the Civil War saw the triumph in the North of Republican neo-mercantilism," writes Hummel, "it saw the emergence in the South of full-blown State socialism."

Reading history without taking sides may fit some scholarly ideal (actually, it seems to be a journalistic ideal created by the Progressive Movement to pass off its views as the only unbiased ones), but it is not a realistic option. We cannot do value-free history. If we try, we instead hide or repress our biases, which makes them a greater threat to intellectual integrity.

Neither can we say, "a plague on both their houses," and retreat to the realm of pure theory, libertarian or otherwise. We have to live in the real world, and even if we are not activists or revolutionaries, the same intellectual integrity that must reject "neutrality" also requires that we occasionally explore the question of second-best or least-evil options.

I remember several years ago, when my very libertarian boss surprised me by speaking in favor of increased regulation of banking. His point was that the banks were not free-market institutions; they were government-created cartels enjoying a political privilege that protected them from the consequences of the market while they surreptitiously depleted our property and spoiled the price system that drives all progress in the material world. Ideally, he’d want the government out of banking altogether, but in the meantime having them do less damage was better than letting them do more.

It may seem anticlimactic to follow the Reformation, Civil War, and fractional-reserve banking with a little-known FCC rule about TV programming from almost half a century ago, but I’ve been reading television history for a while now (1, 2, 3, 4) as illustrative of larger patterns in political history.

The Prime Time Access Rule (PTAR) was a law instituted in 1970 to limit the amount of network programming allowed during TV’s most-watched evening hours.

According to industry analyst Les Brown, the PTAR was adopted

to break the network monopoly over prime time, to open a new market for independent producers who complained of being at the mercy of three customers, to stimulate the creation of new program forms, and to give the stations the opportunity to do their most significant local programming in the choicest viewing hours. (Les Brown’s Encyclopedia of Television)

If you still accept the official myth that the airwaves are "That most public of possessions given into the trust of the networks," as Harlan Ellison describes them in The Glass Teat, and that the federal government’s job is to manage the radio spectrum in the best interests of that public, then I’m sure you don’t see any problem with PTAR. (You can read my paper "Radio Free Rothbard" [HTML, PDF] for a debunking of this official piety.)

But a libertarian could easily jerk his or her knee in the opposite direction. How dare the central government tell private station owners what they can and can’t air on their own stations, right?

The problem with such an ahistorical take on the issue is that broadcast television was a creature of the state from the beginning. Radio may have had a nascent free-market stage in its development, but television was a state-managed cartel from the word go.

So am I saying that PTAR was a good thing? Is it like the possibly beneficial banking regulations imposed on a cartelized banking system? Should we view CBS versus FCC as the same sort of balance-of-power game that Church and State played before the early modern period of European history?

Maybe, but that’s not why I find PTAR an interesting case for the liberty-minded historian. As is so often the case with laws and regulations, PTAR’s main legacy is in its unintended consequences.

"Despite the best of intentions," writes historian Gary Edgerton in The Columbia History of American Television, "the PTAR failed in almost every respect when it was implemented in the fall of 1971."

[P]ractically no local productions or any programming innovations whatsoever were inspired by the PTAR. In addition, any increase in independently produced programming was mainly restricted to the reworking of previously canceled network series, such as Edward Gaylord’s Hee Haw and Lawrence Welk’s The Lawrence Welk Show.… Rather than locally produced programming, these kinds of first-run syndicated shows dominated the 7 to 8 P.M. time slot.

This renaissance of recently purged rural programming was certainly not the FCC’s goal, but the creation of the first-run-syndication model is one of the great unsung events in media history.

A quick note on terminology: to the extent that I knew the word "syndication" at all when I was growing up, I took it to be a fancy way of saying "reruns." For example, Paramount, the studio that bought the rights to Star Trek after the series was cancelled, sold the right to rerun the program directly to individual TV stations. When a local TV station buys a program directly from the studio instead of through the network system, that’s called syndication. But syndication isn’t limited to reruns. Studios created first-run TV programs for direct sale to local stations as far back as the 1950s, but they were the exception. The dominant syndication model was and is reruns. But two events created a surge of first-run syndication: (1) PTAR, and (2) the rural purge I obliquely alluded to above.

I write about the rural purge here, but I’ll summarize: as the 1960s turned into the 1970s, television network executives did an about-face on their entire approach to programming. In the 1960s, each network tried to win the largest possible viewership by avoiding controversy and appealing to the lowest common denominator in public tastes. This meant ignoring the rift between races, between generations, and between urban and rural sensibilities — what we now call red-state and blue-state values — in the ongoing culture wars. This approach was dubbed LOP (Least Objectionable Program) theory.

Basically, this theory posits that viewers watch TV no matter what, usually choosing the least objectionable show available to them. Furthermore, it assumes a limited number of programming choices for audiences to pick from and implies that networks, advertising agencies, and sponsors care little about quality when producing and distributing shows. (Gary Edgerton, The Columbia History of American Television)

By the end of the decade, however, NBC vice president Paul Klein (who had christened LOP theory just as its tenure was coming to an end) convinced advertisers that they should stop caring so much about total viewership and focus instead on demographics, specifically the Baby Boomers — young, politically radicalized, and increasingly urban TV viewers — who were most likely to spend the most money on the most products. CBS was winning the battle for ratings, but Klein pointed out that their audience was made up of old folks and hicks, whereas NBC was capturing the viewership of the up-and-comers.

Klein may have worked for NBC, but it was CBS that took his message to heart, quite dramatically. In 1970, the network rocked the TV world by cancelling its most reliably popular shows: Petticoat Junction, Green Acres, The Beverly Hillbillies, Mayberry RFD, Hee Haw, Lassie, and The Lawrence Welk Show.

In Television’s Second Gold Age, communications professor Robert J. Thompson writes,

CBS, in an effort to appeal to a younger audience made socially conscious by the turbulent 1960s, had dumped its hit rural comedies in the first years of the 1970s while their aging audiences were still placing them in Nielsen’s top twenty-five. Critics, who for the most part had loathed the likes of Petticoat Junction and Gomer Pyle, loved some of what replaced them.

I loved what replaced them, too: Mary Tyler Moore, All in the Family, M*A*S*H, and the like. "Several members of Congress," Wikipedia informs us, "expressed displeasure at some of the replacement shows, many of which … were not particularly family-friendly." But that was the point: the networks were no longer aiming to please the whole family: just the most reliable consumers.

But despite capitalism’s cartoonish reputation for catering only to the bloated hump of the bell curve, that’s not how the market really works. It is how a cartel works, and the broadcast networks behaved accordingly, both before and after the rural purge. In the 1950s and ’60s, they aimed for the largest possible viewership and to hell with minorities of any sort. The demographic revolution changed the target, but not the tactic: aim for the big soft mass. That’s certainly how the big players would behave in a free market, too, but the telltale sign of freedom in the economy is that the big players aren’t the only players. Fortunes are made in niche markets, too, so long as there aren’t barriers to entering those niches. As I’ve said, TV is descended from radio, and Hoover and his corporatist cronies had arranged it so that there could only be a few big players.

That’s where we come back to the FCC’s Prime Time Access Rule of 1970. PTAR created a hole at the fringe of the prime-time schedule, just as the rural purge was creating a hole in the market. All those fans of Hee Haw and Lawrence Welk didn’t just go away, and they didn’t stop spending their money on advertised products, either. Before PTAR, the multitude of fans of "rural" programming would have had to settle for mid-afternoon reruns of their favorite shows (the way Star Trek fans haunted its late-night reruns around this same time). But the rural fans didn’t have to settle for reruns, and they didn’t have to settle for mid-afternoons or late nights. They could watch new episodes of Hee Haw or Lawrence Welk at 7 PM. In fact, those two shows continued to produce new episodes, and the local stations, which were no longer allowed to buy from the networks for the early evening hours, bought first-run syndicated shows instead. The Lawrence Welk Show, which had started in the early 1950s, continued for another decade, until Welk retired in the early ’80s. And the repeats continue to run on PBS today. Hee Haw, believe it or not, continued to produce original shows for syndication until 1992.

I loved Mary Tyler Moore, and I didn’t care so much for Lawrence Welk, but what I really love is peaceful diversity, which cannot exist in a winner-takes-all competition. The rise of first-run syndication was a profound crack in the winner-takes-all edifice of network programming.

The strategy CBS, NBC, and ABC had gravitated toward for short-term success — namely, targeting specific demographics with their programming — also sowed the seeds of change where the TV industry as a whole would eventually move well beyond its mass market model. Over the next decade, a whole host of technological, industrial, and programming innovations would usher in an era predicated on an entirely new niche-market philosophy that essentially turned the vast majority of broadcasters into narrowcasters. (Gary Edgerton, The Columbia History of American Television)

This idea of "narrowcasting" is the basis of quality in entertainment (and freedom in political economy, but that’s another story).

I’m not out to sing the praises of the FCC for increasing economic competition and cultural diversity — these consequences were entirely unintended — but we do have to recognize PTAR as a pebble in Goliath’s sandal, distracting him for a moment from David’s sling.

Check Your Privilege Mon, 24 Mar 2014 11:17:22 +0000

In a recent Freeman article, “Check Your Context,” columnist Sarah Skwire brought my attention to a popular meme on the political left, both online and off: “Check your privilege.”

At its gentlest, this is advice to raise our awareness of those aspects of our personal histories that may lead to complacent assumptions about how the world works, assumptions that may limit the scope of our moral imaginations.

When it is less gentle (which is often), it is a dismissal of the opinions of anyone who is insufficiently poor, or, more likely, insufficiently left-wing. [Read the rest of the article.]


Yes, We Have No Bananas Mon, 24 Feb 2014 12:28:29 +0000

In a recent post on my personal blog (“Is mediocrity intelligent?”), I talked about the importance of a diversity of strategies — even apparently “wrong” ones — to the long-term survival of a species. The corollary, of course, is that overinvestment in any single strategy can be catastrophic.

We see this issue at play in modern agribusiness.

As Popular Science informs us,

The 1923 musical hit “Yes! We Have No Bananas” is said to have been written after songwriters Frank Silver and Irving Cohn were denied in an attempt to purchase their favorite fruit by a syntactically colorful, out-of-stock neighborhood grocer.

It seems that an early infestation of Panama disease was already causing shortages in 1923. But the out-of-stock bananas in question were not the Cavendish variety we all eat today; they were Gros Michel (“Big Mike”) bananas, and they were all that American banana lovers ate until the 1950s, when the disease finally finished them off.

I would love to know what a Gros Michel banana tastes like. I’m a big fan of bananas and eat them every day. (Actually, I drink them, blended into smoothies.) But the reason I only know the taste of Cavendish — and the reason you do too, unless you’re old enough to have had some Gros Michel mixed into your pablum — is that Cavendish bananas are resistant to the strain of disease that wiped out our original bananas. We have to assume that the Plan B bananas we now enjoy are only second best as far as flavor goes. They may not even be first best at survival, because the banana industry is searching for a Plan C banana to take the place of the Cavendish once the inevitable crop disease sends it the way of the Gros Michel — something that they predict will happen in the next decade or two. (See Banana: The Fate of the Fruit That Changed the World by Dan Koeppel.)

Why are bananas so vulnerable to these blights? Why aren’t agricultural scientists worried about our other favorite fruits — apples, for example?

Because there are many different types of apples. I’m dizzied by the variety at our local produce warehouse.

But not only is there just the one type of banana at the greengrocer’s and in supermarkets; each banana you’ve probably ever eaten is a clone of every other banana you’ve eaten. One genetic pattern manifested billions of times over, across millions of households in the past half century. And each Gros Michel was a clone of every other one, too. That’s because bananas reproduce asexually (as do potatoes, another food that’s especially vulnerable to disease — remember the Irish potato famine?).

Cavendish DNA is different enough from Gros Michel DNA that the disease that targeted the one species was no threat to the other. But any infection that can kill one Cavendish plant can wipe out the worldwide supply.

There are many reasons food activists attack Big Agribusiness — some good, some bad, and some wacky. One criticism that seems eminently reasonable to me is a concern that Big Agra puts all its billions of eggs in one giant basket.

Once upon a time, genetic diversity in farm products was built into how farming took place. Farmers farmed local land with local genetic strains of plants and animals. Chickens may have come from Asia, and Europe never saw a tomato until the Spanish brought some back from the New World, but even as trade began to go global several centuries ago, the limits of transportation and technology meant that gene pools could be local and diverse in a way that is much harder in our era of global overnight shipping and transnational corporate bureaucracies.

If an infestation wipes out the Golden Delicious, I can eat Fuji apples instead. But if the Cavendish disappears tomorrow, there isn’t yet a different banana to take its place.


In “Is mediocrity intelligent?” I wrote about the time my professor presented to the “artificial life” department at Bell Labs. In the context of a communications-research lab, artificial life was about using the lessons of biology, ecology, and evolution to make telephone networks more robust.

You may think that agriculture is more “natural” than phone switches and fiberoptics, but farming often short-circuits nature’s mechanisms to suit our short-term goals. One of the main such strategies of nature is diversity. And as I tried to illustrate in that post with the concept of the genetic deme (an isolated and seemingly inferior gene pool within a species), diversity means that what looks like an inferior strategy today could turn out to be the salvation of the species tomorrow.

As Larry Reed wrote recently in the Freeman,

Statists — those who prefer force-based political action over spontaneous, peaceful, and voluntary initiatives — excel at distilling their views into slogans. (“A Slogan Worth Your Bumper?”)

But what I find revealing is the contradictions at play in the juxtaposition of different bumper stickers on the same car. (And when you see a whole bunch of bumper stickers on the same car, odds are you’re driving behind a left-wing statist.)


Last weekend, at a red light, I was behind a minivan that brandished three bumper stickers:

One said, “Women for Obama.”

If that wasn’t enough to declare the driver’s politics, the next bumper sticker made the claim that strong public schools create strong communities.

The last bumper sticker advised us in rainbow colors to “Celebrate Diversity!”

(Pop quiz: Are bumper stickers #2 and #3 in accord or at odds?)

Now, it’s a standard complaint against leftists that they talk diversity while pushing ideological conformity. Political correctness, and all that.

But to me the greater irony is that the Left consistently pushes centralization. Eat local, buy local, but decide everything in Washington DC.

I know that there are left-wing decentralists, and perhaps they genuinely do see the important parallels between genetic diversity and political federalism, between local communities and local authority. But I keep thinking of a story Tom Woods tells of his attending a decentralist conference back in the 1990s, where he happily discovered like-minded activists from both Left and Right. But to the apparent delight of the left-wing so-called decentralists, the highlight of the event was the keynote speaker: Vice President Al Gore.

No, in my experience, the vast majority of people with Buy Local bumper stickers, as with the Celebrate Diversity crowd, are also often, e.g., Women for Obama — that is to say, champions of ever-more-centralized authority. I’m confident that the driver in front of me at the intersection saw no irony in celebrating diversity while advocating strong public schools — and an even stronger central government.

But in the biosphere, where diversity rules, order is spontaneous. That spontaneous order is both the cause and the result of overwhelming diversity. There are no central strategies in evolution, only in the human world, and only in recent human history. Evolution gave the natural world hundreds of varieties of banana. The United Fruit Company (hardly a free-market firm, by the way) gave us only one.

Batman vs James Bond Tue, 28 Jan 2014 12:20:42 +0000

In recent months, my wife and I have been catching up on the Daniel Craig trilogy of 007 movies, and I’ve been watching Batman cartoons with my seven-year-old son. So my thoughts have been full of action heroes — particularly the Dark Knight and Her Majesty’s secret servant.

I remember my father complaining about both characters and contrasting them to the lone-hero tradition of hardboiled detectives and their fictional forebears, the cowboys.

G.I. vs Private Eye

In fact, my father’s point to my preteen self was a continuation of a point he made to me when I was about my son’s age. I’d just gotten a set of “Undercover Agent” accessories for my GI Joe doll (we didn’t call them action figures back then). Gone were the camouflage fatigues and assault rifle; now Joe sported a dark trench coat and a walkie-talkie.

I said, “Look, Dad: It’s GI Private Eye!”

My father explained to me that my rhyming name for my new hero was self-contradictory. A GI was an American soldier, an official agent of the US government, whereas a “private eye” was a private individual, a lone hero in the fictional tradition. If dad had been more of a libertarian, he would have said that the military agent is paid by coercively extracted taxes and operates by state privilege, whereas the private detective is an agent of the market, authorized only by private contracts, and liable to the same restrictions as any individual citizen. My father doesn’t talk that way, even now, but he would acknowledge that description as making the same point.

So after GI Private Eye, I grew up with an awareness of the distinction between heroes like James Bond, who was funded and sanctioned by the government, and heroes like Philip Marlowe, who was funded by private clients and sanctioned only by his personal code of conduct.

Aston Martin vs the Batmobile

Now, a few years later, my father was making a different but related point about James Bond, this time inspired by my love of another toy: my Corgi Aston Martin DB5, James Bond’s super spy car from the movie Goldfinger. “Look, Dad, isn’t this car cool?”

Ever philosophical, my father saw the car as symbolic, not only of that state-agent/private-individual divide he’d addressed a few years earlier with my GI Joe, but also of a divide in heroic literature. James Bond worked for the queen, he explained, in Her Majesty’s Secret Service. He was a knight for the monarch, and this tricked-out vehicle from MI6’s Q Branch was the 1960s adventure-fantasy equivalent of the nobleman’s armor and mount.

I believe he felt the same about the Batmobile, but there are several important distinctions, some that put the historical emphasis on the “knight” in the Dark Knight, and some that put the “World’s Greatest Detective” more in league with the private eyes of American detective fiction.

For one thing, the medieval knight was a soldier for the king because he could afford to pay for armor, weapons, and a battle horse. He could afford to head off into battle instead of plowing the fields — and he could afford the time required for training between wars. The king didn’t pay him to be a knight. He paid the king for that honor. As far as we can tell, James Bond isn’t paying out of pocket for all those vodka martinis, and he certainly didn’t commission Q Branch for any of his gadgets. 007’s license to kill makes him a hired gun, even if he does restrict his paid murders to those sanctioned by his government.

Batman, on the other hand, pays his own way.

The Dark Knight of Liberty

Like most of the medieval knights, his wealth originally came from privilege more than trade. The Waynes are old money. Even “stately Wayne Manor” suggests aristocracy, and where Superman’s Metropolis is shiningly new and forward looking, gothic Gotham is old, with deep roots in Europe. Frames of Batman on the rooftops harken back to Quasimodo atop Notre Dame.

But while WayneCorp may well have risen on government contracts, Batman is not on the payroll. Bruce Wayne is spending his own money to fund his war on crime. This may put him in the ranks of the feudal warriors, but it sets him apart from agent 007.

Finally, who are the bad guys?

For Bond, they are the enemies of the state — meaning that they are whoever Her Majesty says they are. In both the books and films, they are invariably evil, so James Bond will look like the good guy when he finally defeats them, but ultimately the double-O agents are weapons: the government aims them at its enemies and pulls the trigger. We know full well from history who ends up in the crosshairs.

Even my favorite fictional private eyes, however independent and heroic they may prove to be, don’t go looking for trouble until a client hires them to do so.

But for Batman, the enemy is crime — not mere violators of legislation and statute law, not people who manufacture without regulation, trade without license, or copy digital patterns in violation of copyright. A true comic-book fanboy could probably dig through back issues and show us the exception, but I can’t recall Batman ever even picking on drug users.

For Batman, as for libertarians, a crime isn’t a crime without a victim. And it is the victims Batman is fighting for; they are proxies for the parents he was too young and scared to rescue from the back-alley gunman. In the versions of the backstory that I prefer, Batman can never avenge his parents’ deaths, so even the target of his vengeance is a proxy: not a human criminal but crime itself. And by “crime,” I mean rights violations, violence against person and property.

The Dark Knight may be on a perpetual quest, but it is not for a king; it is for the people.


The Pestilential State Mon, 25 Nov 2013 17:55:39 +0000

The Mongols surrounded the city walls. Genoese merchants hoped to wait them out inside the Black Sea trading city of Caffa. Technically these European merchants were guests of Uzbeg Khan of the Golden Horde. But the Genoese had become unwelcome. They repeatedly disrespected the authority of Islam and the khan himself. They dared to trade in Turkic slaves and had even summoned Italian troops to repel the previous khan’s soldiers. Now, when one of their own had killed a Muslim in the port city of Tana, these foreign "guests" defied the law by giving the murderer sanctuary here in Caffa, then refusing entrance to their hosts and rightful rulers at the edge of the Mongol Empire.

This time, there would be no reinforcements from Italy. Instead, the Mongols would fall to the invisible arrows of a plague that had followed the Silk Road from the arid plains of central Asia. While the Genoese were safe within the city of Caffa, the Mongol bodies piled up outside its walls.

In many respects, this scene was an echo of earlier history. The Greeks had fallen to plague outside the high walls of Troy, if Homer’s telling is right. The Bible says that Sennacherib ended his siege of Jerusalem because "the angel of the Lord went out, and smote in the camp of the Assyrians … and when they arose early in the morning, behold, they were all dead.…" According to the ancient Jewish historian Josephus, the Lord’s weapon was plague.

But the Mongols of the Golden Horde did something unprecedented both in the history of warfare and the history of disease. They piled their dead into catapults and hurled them over the city walls, raining diseased corpses on the besieged Genoese.

Unlike the Trojans and the Jews, the merchants were not on their home turf. And because Caffa was a port city, they could board their ships and flee the Crimea. It seems they brought the plague home with them.

"If this account is correct," writes bacteriologist Mark Wheelis in a paper for the Centers for Disease Control, "Caffa should be recognized as the site of the most spectacular incident of biological warfare ever, with the Black Death as its disastrous consequence."

A century later, the population of Europe was only half the size it had been before the plague came west.

But even if the disease reached the West by way of the late Mongol Empire, causing what Wheelis calls "the greatest public health disaster in recorded history," ultimate blame for the cataclysm may not fall to the Mongol khan or his soldiers. Instead we should look to the conduct of European monarchs — and one in particular.

I tell the rest of the story in today’s featured article at FEE:


Black Death and Taxes

They had more to do with each other than you might think

NOVEMBER 25, 2013

The plague and the Little Ice Age didn’t do Europe any favors. But the excesses of the State amplified the damage.

Watching Illegal TV in Turkey Mon, 14 Oct 2013 11:25:56 +0000 Last month, I wrote in the Libertarian Standard about Twilight Zone creator Rod Serling and the end of the Golden Age of Television and about Serling’s preference for government interference over that of the advertisers.

Last week the Freeman published my article "TV’s Third Golden Age," about our present era in which quality dramas are moving from cable TV to the Internet, where they finally enjoy less interference from both advertisers and government regulation. The Internet is freer than television ever was.

In that article, I also give a little more background on JFK’s assault against the TV industry and how the deregulation trend of the 1970s and ’80s produced TV’s second "golden age." (Can you guess what brought it to an end?)

Because I mention the University of Virginia’s Paul Cantor in the Freeman article (as I did in "The Golden Age at Twilight" and "Price Theory a la Rupert Murdoch" here at TLS, as well as in "Did Capitalism Give Us the Laugh Track?" in the Freeman), I emailed Professor Cantor a link to the article.

Having just returned from the annual meeting of the Property and Freedom Society in Bodrum, Turkey, Cantor wrote this wonderful reply (which I quote with his permission):

This is a terrific article and thanks for sending it to me (and mentioning me in it). I’m glad to see that Thompson seems to be on board with us on these issues. I own his book but haven’t read it yet. It’s nearing the top of my "to read" pile, and you’ve pushed it up a few places. It’s good that we’re not alone on these issues.

As I recall what you wrote about radio, all this could have happened back in the 1920s if a subscriber model had been adopted for radio instead of the broadcasting model. Essentially, we’re finally getting where we should have been in the first place — real consumers for TV. I notice that young people now have no interest in seeing TV as broadcasted. They want direct access and know how to get it. When I was at Hans-Hermann Hoppe‘s recent conference in Turkey, I was amazed at how current the young people from central and eastern Europe were with American TV — maybe one episode behind on BREAKING BAD. When I asked: "Is BREAKING BAD broadcast in your country?" they stared at me as if I were saying: "Do dinosaurs still roam the plains of Poland?" They were getting the show — well, frankly, I don’t know how they were getting the show, but it was definitely online and quite possibly illegal.


The Golden Age at Twilight Mon, 09 Sep 2013 13:12:12 +0000 When I was in 5th grade, the teacher, Mr. Kelly, asked the class if anyone could tell him the definition of the word twilight. I raised my hand, excited to know the answer for once: “A dimension not only of sight and sound but of mind — a journey into a wondrous land whose boundaries are that of imagination…”

“You idiot!” interrupted Mr. Kelly. (Does the setting of New York City in the 1970s explain at all why the teacher talked to his pupils that way?) “That’s the Twilight Zone! — Twilight is the period between sunset and darkness…”

Oh, I thought. So that’s why the show is called the Twilight Zone. It’s an in-between thing.

I wonder if there are kids today who will some day tell a similar story — probably with a less ill-mannered teacher — where they answer the vocabulary question by stating that “twilight” is when high-school vampires are in love with teenage mortals.

When I was a kid, The Twilight Zone was the smartest television show I watched. And I watched a lot of TV. It had already been off the air for a decade, but so had most of my shows. I grew up in the 1970s watching the TV of the 1950s and ’60s on a portable black-and-white television set with antennas made of coat hangers and tinfoil.

I loved the plot twists, and I didn’t mind all the moralizing. Most of the television I watched was preachy — and kids are used to being preached at from all directions, not just their TV viewing — but unlike all the other shows I watched, The Twilight Zone dealt with mind-bending ideas, and its plots weren’t predictable, at least not to me. Each episode ended with a revelation, and I enjoyed trying to guess what it would be, though I seldom guessed right.

The critics had loved it from the beginning — well before the show became popular with viewers — and later critics ranked it as a high point in television history:

In 1997, the episodes “To Serve Man” and “It’s a Good Life” were respectively ranked at 11 and 31 on TV Guide‘s 100 Greatest Episodes of All Time.…

In 2002, The Twilight Zone was ranked No. 26 on TV Guide‘s 50 Greatest TV Shows of All Time. In 2013, the Writers Guild of America ranked it as the third best written TV series ever. (Wikipedia)

The show’s creator, executive producer, and head writer, Rod Serling, was one of the star television writers from the first “Golden Age of Television.”

His successful teleplays included Patterns (for Kraft Television Theater) and Requiem for a Heavyweight (for Playhouse 90), but constant changes and edits made by the networks and sponsors frustrated Serling. In Requiem for a Heavyweight, the line “Got a match?” had to be struck because the sponsor sold lighters; other programs had similar striking of words that might remind viewers of competitors to the sponsor, including one case in which the sponsor, Ford Motor Company, had the Chrysler Building removed from a picture of the New York City skyline. (Wikipedia, “The Twilight Zone”)

In the Golden Age of Television, sponsors not only attached their names to the TV shows they sponsored — Kraft Television Theater, Philco TV Playhouse, Goodyear TV Playhouse, The Alcoa Hour, The Voice of Firestone, The US Steel Hour — they developed shows, produced them, and paid the networks to put them on the air.

Robert J. Thompson, a communications professor at Syracuse University, writes,

This arrangement led to some legendary stories of sponsor interference. Alcoa, manufacturers of aluminum, for example, would not let Reginald Rose set a tragic event in his episode of The Alcoa Hour in a trailer park, where most of the homes are made of aluminum. The Mars company, which sponsored Circus Boy, made it known to those making the show that they didn’t appreciate references in the program to ice cream, cookies, or other treats that competed with Mars’s candy products for the sweet tooth of America’s youth.

And for those of you who’ve read my earlier post “Who destroyed the first golden age of television?” take note of this one:

In “Judgment at Nuremberg,” an episode of Playhouse 90, about the trials of Nazi war criminals, a reference to “gas chambers” was deleted by the sponsor, the American Gas Association. (Television’s Second Golden Age)

Two years before Serling created The Twilight Zone, he wrote a long introduction to a paperback release of his historic teleplay Patterns. (“Many of the scripts for these [1950s TV] plays were collected and sold in book form,” writes Professor Thompson, “a distinction prime-time programs would not enjoy again for many years.”)

In his introduction, Serling reviews the history of television drama and his career in the medium, gives advice to young writers, and voices his regret about the medium’s dependence on commercial interruptions and busybody sponsors.

For good or for bad, the television play must ride piggy-back on the commercial product. It serves primarily as the sugar to sweeten the usually unpalatable sales pitch. It’s the excuse to wangle and hold an audience.

Serling is clearly trying for a measured tone in that introduction. In Submitted for Your Approval, a documentary about his career released 20 years after his death, we get a more candid opinion:

How can you put out a meaningful drama when every fifteen minutes proceedings are interrupted by twelve dancing rabbits with toilet paper?

Still, Serling understood that his career depended on the dancing rabbits:

A sponsor invests heavily in television as an organ of dissemination. That organ would wither away without his capital and without his support. In many ways he hinders its development and its refinement, but by his presence he guarantees its survival. (Patterns, introduction)

In addition to specific cuts and changes, the TV sponsors of the 1950s had informal rules limiting content. While Serling was already known as a writer of television drama, The Twilight Zone made him famous ever after for fantasy and science fiction. In his 1957 introduction to Patterns, you can already see him being pushed in that direction as a reaction to the sponsors’ fiats:

One of the edicts that comes down from the Mount Sinai of Advertisers Row is that at no time in a political drama must a speech or character be equated with an existing political party or current political problems.

Serling’s 1956 teleplay about the US Senate was gutted. Several million television viewers tuned in to his political drama “The Arena,” Serling writes, and

were treated to an incredible display on the floor of the United States Senate of groups of Senators shouting, gesticulating and talking in hieroglyphics about make-believe issues, using invented terminology, in a kind of prolonged, unbelievable double-talk.

“In retrospect,” Serling mused,

I probably would have had a much more adult play had I made it science fiction, put it in the year 2057, and peopled the Senate with robots. This would probably have been more reasonable and no less dramatically incisive.

Serling insists that he did not make trouble: “I’m considered to be a cooperative writer — even now. I don’t get my back up at requests for rewrites.” But he was known in the industry as the “angry young man of Hollywood,” and when he died of a heart attack at age 50, many newspapers “mentioned that he had been a heavy smoker for years and was angry and stressed most of his life” (Wikipedia).

But while he fought television executives and sponsors over what he unfortunately called “censorship” (see my post “censorship schmensorship” on why this label is misleading, at best), he fell short, in the 1950s at least, of proposing government intervention — or any other specific solution:

I don’t really believe there exists a “good” form of commercial. There are some that are less distasteful than others, but at best they’re intrusive.… I make reference to this by way of pointing out a basic weakness of the medium. I do not presume to suggest any antidotes or alternatives. At the moment none seems possible. (Patterns, introduction)

Sadly, by the ’60s, he was willing to call on the state. According to a 1964 article about Rod Serling and “TV censorship,” we learn that Serling

proposed that the Federal Communications Commission “pass muster” in some fashion on the quality of advertising in television. The FCC has never been a “strong arm of the government” because it was afraid of being accused of censorship, he said. (“Serling Rips TV Censorship,” Binghamton Press & Sun-Bulletin, May 1, 1964)

Note the irony of his fighting the “censorship” of private editorial policies within the networks, then dismissing concerns about the real-deal coercive variety from the central government.

There’s another irony to Serling’s shift. You need to note the dates and know a little television history to catch it.

The television industry in which Rod Serling had established his name was dominated by sponsors — this was precisely Serling’s problem with it:

No dramatic art form should be dictated and controlled by men whose training and instincts are cut of an entirely different cloth. The fact remains that these gentlemen sell consumer goods, not an art form. (Submitted for Your Approval)

And yet the era of Serling’s ascendancy is now considered the Golden Age of Television and the TV drama of the era is recognized as an art form at its peak (until the present new golden age of television drama came to surpass it). According to television producer Sherwood Schwartz, the success of that earlier era resulted directly from its domination by the sponsors:

[T]he networks were conduits and they had no control of programming. Sponsors had more power, and the creative people who created the shows had more authority.

Professor Thompson indicates other benefits of the 1950s arrangement:

[S]ingle sponsorship also had advantages. R.D. Heldenfels, TV critic and author of Television’s Greatest Year: 1954, points out that “Unlike the current system, where a terribly low-rated show is pulled after one or two telecasts, a single sponsor willing to wait for good numbers — or to settle for lower numbers because the show increased the sponsor’s prestige — could keep a show going.” Since networks made money as long as the show remained sponsored, the only reason for them to cancel a sponsored series was if the ratings were so low that they threatened to reduce the size of the potential audience for the next show on the schedule. Indeed, many companies were more concerned with prestige than they were with numbers. If not for prestige, why would a company like US Steel have sponsored an anthology? There were no raw US Steel products that a mass audience could buy over the counter and most viewers had no idea where the steel in their automobiles came from. It was even possible that a show would continue to be sponsored based on the tastes of a single executive or company owner. The classical music on The Voice of Firestone played for five years on NBC and another five on ABC to comparatively small audiences because the Firestone family was more concerned with attaching their name to a cultural show than they were with ratings.

Yet here was Serling in 1964, calling for a stronger hand from the FCC and pooh-poohing the idea that such intervention would constitute censorship — this just after the three-year reign of FCC chair and “culture czar” Newton Minow, who

gave networks authority and placed the power of programming in the hands of three network heads, who, for a long time, controlled everything coming into your living room. They eventually became the de facto producers of all prime-time programs by having creative control over writing, casting, and directing. (quoted by Russell Johnson, aka the “Professor,” Here on Gilligan’s Island)

In the famous “vast wasteland” speech before the National Association of Broadcasters in 1961, Minow told the television industry, “You must provide a wider range of choices, more diversity, more alternatives.”

“Yet,” according to University of Virginia professor Paul Cantor,

Minow’s speech resulted in centralizing power in the television industry and thus actually reducing the range of choices in programs.… [H]is words contained clear threats that if the television industry did not voluntarily do what he wanted, the FCC would make sure that it did. (Paul A. Cantor, “The Road to Cultural Serfdom: America’s First Television Czar” in Back on the Road to Serfdom: The Resurgence of Statism, edited by Thomas E. Woods, Jr.)

Rod Serling, the angry young man of Hollywood, clearly preferred the rule of the FCC to the rule of the American sponsors, and in 1964 — after three years under Newton Minow had radically changed the television landscape, and JFK-appointed FCC chair E. William Henry was still “fully committed to Minow’s agenda” (Thompson) — Serling all but advocated an even stronger hand from the federal government to limit commercial interruptions.

Is it possible that the sponsors were requiring ever more commercials in response to their dwindling power in the production end? After all, you don’t have to push Kraft-brand cheese slices as ardently when the anthology showing Rod Serling’s famous “Patterns” is called The Kraft Television Theater.

If that’s right, then Rod Serling is yet another example of the intervention spiral that Ludwig von Mises described: first you call for government intervention, then you fail to see that the intervention created the new problems you dislike, so you call for further intervention, and the cycle repeats.

So why wasn’t Serling afraid of implicit censorship from the FCC?

One unfortunate possibility is that Rod Serling was less vigilant about the FCC because Newton Minow’s agenda was better aligned with Serling’s own politics. Serling’s teleplays were antiwar well before antiwar sentiment took over a later generation. His stories also focused on questions of racial prejudice and sexual equality at a time when the sponsors considered the topics divisive and controversial. Recall that one of the edicts from “Advertisers Row” was that “at no time in a political drama must a speech or character be equated with an existing political party or current political problems.”

But in the early 1960s, the edict from Washington DC reversed the mandate.

Newton Minow was an appointee of the Kennedy administration. “His chief ‘qualification’ for the FCC job,” according to Paul Cantor, “was the fact that he was a personal friend of the president’s brother Robert Kennedy.”

Lacking any grasp of aesthetic criteria, Minow had to employ political criteria in his evaluation of television, and the industry responded accordingly.… [T]he changes in television content in the 1960s chiefly followed a political agenda — greater representation of minorities on shows, especially African-Americans; more dramas devoted to controversial political issues, displaying a deepened social conscience; in particular a number of shows dealing with the issue of civil rights, which not coincidentally was being promoted at the same time by the Kennedy and Johnson administrations.… [T]elevision in the 1960s increasingly fell in line with the program of the Democratic Party. This is exactly what one might have predicted under the leadership of an activist FCC chairman appointed by a Democratic president. (Cantor)

If Rod Serling wanted to push the Democrats’ agenda, then pressure from the federal government for television networks to do exactly that may have felt less like oppression and more like freedom.

Serling may have welcomed the new era of the American culture czar. Minow certainly recognized Serling as a comrade in the crusade. In his speech to broadcasters, Minow had called television a “vast wasteland,” but he listed a handful of exceptions by name. Serling’s Twilight Zone was one of them.

The preachy tone I now hear in the show was a sign of the times. It felt familiar to me because I had grown up on 1960s television. I believe in tolerance and diversity largely because TV taught me to believe in tolerance and diversity. But over time, I came to believe that the tolerance of left-liberalism was a shallow tolerance, a tolerance only for certain forms of diversity — those that aren’t in conflict with the rest of the left-liberal agenda. That agenda was about more than cosmopolitan open-mindedness and acceptance of ethnic and cultural differences; it was about greater centralization of power, the need for coercive intervention, trust in certain elites, and a distrust of local values and local authority.

Serling may have seen a greater number of heroic, middle-class blacks and strong, smart women on television and believed that it was evidence that the medium was advancing. But did he also notice that the stories took fewer and fewer risks? Did he notice that the chorus of social consciousness could sing only one note?

He bridled at the sponsors’ mandate not to offend anyone and bemoaned the television writers’ practice of “pre-censoring,” by which he meant anticipating sponsor reaction and thereby avoiding any risks. And he was right that creativity requires risk-taking. In recent decades we’ve seen the cable-TV drama raised to the level of art precisely because commercial-free cable networks can afford to take chances that commercially supported broadcast networks just can’t.

But the strong arm of Kennedy liberalism, in the form of an activist FCC, drove risk-taking off the air and replaced it with homogeneity and blandness under the guidance of a fearful cartel of network heads who were willing to sing the administration’s preferred lyrics so that they could continue to sell soap. Rod Serling may have played a starring role in the golden age of television drama, but his agenda brought that age to an end.

The Right to Say “I Do” versus the Right to Say “I Don’t” Wed, 28 Aug 2013 12:47:36 +0000 The New Mexico state government has become significantly more gay friendly in the last week or two.

Sadly, one result is that individual freedom in the state is on the wane.

On Monday, a New Mexico judge ruled that the state’s marriage law "doesn’t specifically prohibit gay marriage," and the next day court clerks began issuing same-sex marriage licenses.

I look at the photographs of gay and lesbian couples tying the knot yesterday in Albuquerque, and I feel moved by them. Knowing how they’ve struggled to achieve the moment captured in those pictures, I feel much happier for them than I would for most strangers. And I think of the same-sex couples I know, none of them married by any legal definition, and I wonder if the piece of paper would matter to them.

This is how the state tricks libertarians into supporting the growth of government power.

I’m not suggesting that anyone in the government is actually concerned about the beliefs and political stances of self-described libertarians — we’re far too small a group for the Powers That Be to care what we think — but anyone who believes that individuals have any inalienable rights is, to at least that limited degree, libertarian in their thinking. And it is that libertarian instinct that the political class appeals to for increases in legislation and the growth of the state.

The marriage-law ruling comes one week after another so-called gay-rights case:

"Refusal to photograph New Mexico same-sex couple ruled illegal"

I can’t recall why Robert Anton Wilson stopped supporting the ACLU and started giving his money instead to the Fully Informed Jury Association (FIJA). But if you ever needed evidence that the ACLU is an anti-libertarian organization (whose name should really have the word "liberty" in scare quotes), then this case should be conclusive.

Joshua Block, an attorney with the American Civil Liberties Union, which represented the couple, said the ruling rejected a "frighteningly far-reaching" argument for allowing private companies to discriminate against gays and lesbians.

"The Constitution guarantees religious freedom in this country, but we are not entitled to use our beliefs as an excuse to discriminate against other people," said Louise Melling, also of the ACLU.

As one comrade said recently, "Thank God for the ACLU. Who else would stand up for a gay couple’s right to force a company to provide them services unwillingly?"

The photography case isn’t about gay marriage, but it nevertheless highlights why many libertarians are reluctant to support gay-marriage legislation.

Should gays be allowed to marry? At first glance, that seems like a no-brainer to advocates of individual rights. To a supporter of liberty, the question becomes, "Who has a right to stop them?" In our view, anyone (well, let’s say any mentally competent adult) has a fundamental right to make contracts with anyone else (again, consenting adults, to keep the argument on track). And while it may offend romantic sensibilities — or even personal experience — to think of marriage primarily as a contract between individuals, contract is nevertheless the proper public component of such a private union. (By this same reasoning, we support the rights of polygamists, assuming consenting adults, etc.)

But the state turns the gay-marriage issue into a sort of trick question, because the current legal definition of marriage is both more and much less than a mutually beneficial arrangement between the spouses: it is also a set of coercive obligations imposed on third parties.

Again, the photography case was about antidiscrimination laws, not gay marriage, but the two are linked. Any business that, to use the ACLU’s terminology, "offers services to the public" is already burdened with legislation dictating what it can and cannot do and whom it may and may not employ or serve. Even in jurisdictions where sexual orientation is not yet covered by antidiscrimination laws, those laws could automatically expand to include gays and lesbians once state-sanctioned marriage (under whatever name) is applied to same-sex couples.

So the gay-marriage issue is contentious even within the libertarian movement because it practically requires us to conflate two very distinct questions:

  1. Should any adult be denied the right to "marry" any other consenting adult?
  2. Should other individuals be forced to recognize such unions?

The first question is a no-brainer, and it’s the one most people have in mind when they say they support gay marriage.

The second question is equally straightforward for a libertarian, and yet, in the current context, it conflicts with the answer most of us want to give to question #1.

Antidiscrimination laws are a violation of freedom of association.

Telling me whom I can and cannot hire or whom I must or must not serve professionally is like telling me whom I may or must befriend, date, or marry. Even the most ardent opponent of discrimination would probably scruple to force a black girl to date a white hillbilly, or a Muslim man to marry a Jewish woman. Statists believe it’s their business whom I hire or fire and whose business I must accept. But even they stop short of telling me whom I must invite into my home or into my family.

Even if we want to promote open-mindedness and persuade each other to see past the categories of religion, race, sex, and orientation, very few would be comfortable forcing personal associations on people through coercive legislation. Yet many on the Left advocate tirelessly for such coercion against businesses, without seeing it as the same issue — even when the business is just one individual trying to make a living.

I don’t want the state to discriminate against gays or any other group. But the recent developments in New Mexico will not reduce the problem. The larger and more intrusive the state becomes, the more it has to side with one group against another, feeding on conflict as it sows the seeds for ever more.

Anyone who is serious about liberating gay and lesbian couples should demand that the state get out of the marriage business altogether — and let people associate freely, not under duress, whether or not their choices strike us as enlightened.

Price Theory a la Rupert Murdoch Mon, 19 Aug 2013 13:00:07 +0000 Rupert Murdoch was buying up online properties in the mid-1990s, trying to do with the newly commercial Internet what he had done with FOX television during the previous ten years or so. One of his new acquisitions was a gaming company here in Charlottesville. My friend and I became the two "web guys" for the company. It was my first full-time job in the private sector.

When I was discussing my yet-to-be-submitted laugh-track article with Paul Cantor and I mentioned that it was competition from HBO and FOX that pushed canned laughter into retreat, Professor Cantor recommended the book The Fourth Network: How FOX Broke the Rules and Reinvented Television by Daniel M. Kimmel.

He says it was his main source for this lecture:

"When Is a Network Not a Network?" (It’s a great talk!)

I said, "You know, I used to work for Rupert Murdoch."

"Don’t you realize," Cantor quipped, "at some point EVERYBODY has worked for Rupert Murdoch."

Charlottesville, perhaps like most university towns, is famously left-wing. Rupert Murdoch was infamously right-wing long before FOX News became the unofficial media arm of the Republican Party. So I sensed a definite ambivalence, sometimes defensiveness, among my co-workers about the guy in charge. One thing I remember people saying with great respect, however, was that Rupert Murdoch had built his media empire by paying not what property was worth but rather what it was worth to him.

That struck me as profound at the time.

What they meant was this: Rupert Murdoch didn’t think of a newspaper as just a newspaper, a TV station as just a TV station. He thought about them as nodes in a network he was building. This meant he outbid his competitors who saw these organizations as single units of profit and loss, whereas Murdoch saw them as part of a bigger picture: what would they contribute to the larger plan?

From The Fourth Network:

If there was a single turning point in the history of the FOX network, a moment when the Big Three became the Big Four, it occurred in December 1993 with the simple announcement that NFL football was coming to FOX the following season.

One of the NFL’s attorneys recalled,

"What FOX offered us, the league, was the potential of having a bidder who needed the games, wanted the games, and was willing — in a sense — to overpay for them."

Why would a savvy entrepreneur overpay for anything?

As FOX exec Lucie Salhany later recalled,

"When Rupert took over, Rupert was his most phenomenal. ‘I want it, I have a vision, I’m willing to pay for it.’"

The author comments,

As he had with so many other properties, Murdoch had paid more than the market thought it was worth because he saw a greater opportunity there.

What was his vision? It turns out that 70 percent of the NFL’s viewing audience had never watched FOX.

Salhany again:

"So, in the end, it was actually a bargain to acquire the rights to the NFL to promote the rest of [FOX’s] schedule. It was cheaper. If you went out and spent the same amount of money to promote it, it wouldn’t have been as effective."
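Salhany’s arithmetic can be sketched with some purely hypothetical numbers. None of the figures below come from Kimmel’s book; they only illustrate the structure of the reasoning, in which the premium over the rights’ apparent market value is really a promotion budget in disguise:

```python
# Hypothetical figures illustrating the "overpaying as promotion" logic.
# None of these numbers are from The Fourth Network.

market_value_of_rights = 1.0   # what rival bidders thought the NFL deal was worth ($bn)
fox_bid = 1.6                  # what FOX actually paid ($bn)
premium = fox_bid - market_value_of_rights  # the apparent "overpayment"

# What the premium bought: exposure to viewers who had never watched
# the network (70 percent of the NFL audience, per Kimmel).
equivalent_ad_spend = 1.0      # assumed cost of reaching them with ordinary ads ($bn)

# The deal is a bargain if the premium costs less than buying the
# same promotional reach directly -- Salhany's point.
is_bargain = premium < equivalent_ad_spend
print(is_bargain)  # True under these assumed numbers
```

Change any of the assumed figures and the conclusion can flip, which is exactly why the rival networks, valuing the rights as a standalone profit center, bid less.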

What strikes me now is not how brilliant Murdoch was as an entrepreneur (although he was and is) but how my game-company colleagues worded their praise for the man — and what it revealed about the people giving the praise:

Rupert Murdoch didn’t pay what something was worth but rather what it was worth to him.

Notice the unstated assumption that there is such a thing as value independent of an individual evaluator, almost as if "worth" is objective, but Murdoch had the genius to see it as subjective.

Only when I started to read economics almost a decade later did I come to understand that all value is subjective. Consumers pay whatever they feel will benefit them more than the next best thing they could have done with that money, whatever seems better than the opportunity forgone. The difference between consumers and entrepreneurs is twofold:

  1. Consumers pay for things as direct ends in themselves, goods that will directly satisfy their subjective needs (consumers’ goods); whereas entrepreneurs pay for things as the means toward achieving other ends (capital or producers’ goods).

  2. Consumers can make direct price comparisons between, say, a cup of Starbucks coffee and a cup of diner coffee, or between a MacBook and a Windows laptop, but ultimately the comparison isn’t much of a calculation — it remains a subjective preference for the thing purchased over the forgone option; entrepreneurs, however, need to combine the objective data of recent prices for all their factors of production and compare the result to their personal predictions for what people will pay for the final product.

The successful entrepreneur is the better predictor, the more innovative producer, or both.
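The entrepreneurial side of that comparison can be sketched as a toy appraisal. Every figure here is invented for illustration: the entrepreneur sums the objective market prices of the factors and weighs the total against a personal, subjective forecast of what the final product will earn.

```python
# Toy entrepreneurial appraisal. All figures are invented for illustration.

factor_prices = {            # objective data: recent market prices of inputs
    "broadcast_rights": 500.0,
    "production": 120.0,
    "distribution": 80.0,
}
cost = sum(factor_prices.values())   # total factor bill: 700.0

forecast_revenue = 900.0     # subjective: this entrepreneur's own prediction

# The entrepreneur bids when expected revenue exceeds the factor bill.
# A rival forecasting only 650.0 from the same factors would call the
# same purchase "overpaying."
expected_profit = forecast_revenue - cost
print(expected_profit)  # 200.0
```

The factor prices are the same for everyone; only the revenue forecast differs, and that difference is where entrepreneurial judgment, and profit or loss, lives.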

When my co-workers praised Rupert Murdoch for pricing factors more presciently or using them more innovatively, they were just saying, in essence, that he was a good entrepreneur. Maybe they understood that. Maybe what they meant by "what it’s worth" was the current market consensus on a factor’s future earnings, discounted for time and assuming no innovation.

I certainly didn’t understand that they were merely describing what an entrepreneur does, and admiring Murdoch for doing it so much better than everyone else.

(Cross-posted at
