October 20, 2006  ·  Lessig

HOTI points to Scott Cleland of the respected “The Precursor Blog” who has posted a reply to my FT article. My “thin rule,” Cleland says, is just “thin gruel” (by which I take it he means he doesn’t like my rule; I personally find the best gruel to be thin gruel, and in fact, in the increasingly cold Berlin mornings, I think gruel is a very good start to the day. I wish people would stop picking on gruel, thin or fat.)

The thrust of Cleland’s one-pager is that I’ve been “loose with the facts.” Let’s review the charges (Cleland’s words are bolded below):

Loose Fact #1:

Lessig asserts: “In the U.S. at least, broadband competition is dying.”
Anybody who cares to check, will find that broadband prices are falling, broadband speeds are increasing, consumer choices are increasing steadily, broadband investment and deployment are strong, and innovation is vibrant.

This is Dick Cheney on the war. (Indeed, let’s give it a name: to Chenify.) Put the prices issue aside till the next sentence: are you kidding, Scott? Relative to every nation that should be considered competitive with us in this, we are worse off today than we were four years ago. When Bush said “10th is 10 spots too low” he was right (well, sort of; it’s actually 9 spots too low), and yet now the US is 16th in broadband deployment. The worse things go, the more a certain set simply denies reality.

Why can’t an anti-neutrality advocate begin with what everyone knows is true: US broadband sucks — it is too slow, it is too expensive, and it is too unavailable. The only question is what we are going to do about it.

For those who care to go more in depth on this subject, I have produced two useful one pagers to prove this point: “Debunking the Broadband Market Failure Myth” and “Debunking the Broadband Competition Can’t Work Myth.”

So here’s where I was worried. Though this is an issue I’ve been studying “in depth,” I hadn’t read Cleland’s “one-pagers” before. (I hate the word “debunking.” Sounds way too Marxist for my taste.) So I took a deep breath and clicked on the one-pagers, expecting to find a refutation of the data upon which I had based my understanding that, in fact, prices have not fallen.

That data, again summarized well in Broadband Reality Check, suggests that first, cable prices have increased slightly, and second, while NOMINAL DSL prices have fallen, the $/MB has gone up, since the speed of the offering for the cheapy deals is significantly slower.
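The arithmetic behind that point is easy to sketch. The figures below are invented for illustration (they are not from Broadband Reality Check): a much cheaper introductory tier can still cost more per megabit if it is also much slower.

```python
# Illustrative figures only: how a falling sticker price can hide
# a rising price per megabit of advertised speed.
old_price, old_mbps = 40.0, 1.5     # hypothetical older DSL tier
new_price, new_mbps = 15.0, 0.384   # hypothetical slow "introductory" tier

old_per_mbps = old_price / old_mbps   # about $26.67 per Mbps
new_per_mbps = new_price / new_mbps   # about $39.06 per Mbps

# Nominal price fell ~62%, but the price per megabit rose ~46%.
print(old_per_mbps, new_per_mbps)
```

Both of Cleland’s claims (“prices fell”) and the report’s claim (“$/MB went up”) can be true at once, which is why the nominal numbers alone settle nothing.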

Here’s Cleland’s “debunking”:

Real DSL prices have fallen ~50% as speeds have roughly doubled over the last 2 years; introductory DSL prices have fallen ~70% in ~3 years; average monthly DSL prices fell ~15% from 2004-2005.

Every fact stated here could be consistent with the conclusion that the $/MB has gone up. (Well, almost: you’d have to be a bit charitable in interpreting “as speeds have roughly doubled over the last 2 years” — that’s plainly not true, as the “introductory” packages he points to offer less than 1 MB speeds.) So where’s the beef, Scott? Where is the data to debunk “Broadband Reality Check”? I’d be happy (in the academic sense of that term) to be proven wrong about relying upon the data I relied upon. I’d be even really happy to learn that average $/MB prices for DSL have gone down. But notice the critical fact Cleland didn’t try to “debunk”: that $/MB prices in the US range from 6x those in France to 12x those in Japan. But don’t worry, everything’s great. The war’s great. Broadband in the US is great. The deficit is great. It’s just the best of all possible worlds…

For those who want to get the FCC’s analysis of broadband competition click here; or for the FCC’s analysis of Wireless competition click here.

I love the way anti-regulation types hate everything the government does, except data that supports their argument. Talk about “debunking”: the FCC’s analysis has been the subject of extensive criticism, including by the GAO. The problem is the FCC’s method for counting penetration within a zip code: they conclude that if one person within the zip code has a broadband choice, the whole zip code has broadband choice. As the GAO concluded about this obvious fudge: “the number of providers reported in the ZIP code overstates the level of competition to individual households.”
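A toy example makes the overcounting concrete. All numbers here are invented, and the “maximum choice anywhere in the zip” rule is a simplification of the FCC’s provider-per-ZIP counting, but it shows the shape of the problem:

```python
# Hypothetical zip code of 1,000 households (invented numbers).
# Each entry is how many broadband providers that household can actually buy from.
choices = [1] * 300 + [2] * 650 + [5] * 50

# Simplified FCC-style zip-code method: credit the whole zip with the
# best choice available to *any* household in it.
fcc_reported = max(choices)          # the zip looks like a 5-provider market

# What the typical household actually sees:
median_choice = sorted(choices)[len(choices) // 2]   # most see far fewer

print(fcc_reported, median_choice)
```

Here a zip where most households have one or two options gets reported as a five-provider market — exactly the overstatement the GAO flagged.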

For those who don’t want to be bothered with facts and analysis, but just want anti-business assertions about what imminent peril our way of life faces from continued free and open competition on the Internet click here for SavetheInternet.com, of which Professor Lessig is a Charter Member.

This is the part of this debate that drives me nuts: as if this were a battle between “anti-business” sorts and pro-business sorts. I understand how it’s easy to believe that if you spend your life thinking about other things and only 30 seconds thinking about this issue. But for anyone inside this debate, this claim is the most bogus sort of rhetoric there is.

This is not a pro vs. anti-business debate at all. The whole point of the Network Neutrality argument I’ve advanced (for almost 8 years now) is about what conditions produce the greatest growth in applications and content. The aim is to maximize wealth for the economy as a whole, and not just for the network owners. The whole argument is that a neutral network incentivizes more competition in applications and content than a network controlled by network owners. Think the cell phone network vs the Internet: This is not a battle between pro and anti-business sorts, it is a battle between cell-heads and net-heads.

There are those who continue to function normally in the world who believe, all facts to the contrary, that this is a debate about regulation vs. no regulation. I’ve spilled too many bits over that canard to believe wasting more space here makes any sense. But I wish those anti-regulation sorts would spend some of their effort getting the FCC out of the business of regulating (through property) spectrum.

Loose Fact #2

Professor Lessig asserts: “There are fewer competitors offering broadband connectivity today than there were just six years ago. The median consumer has a choice between just two broadband providers. Four companies account for a majority of all consumer broadband; 10 companies account for 83 percent of the market.”
What Professor Lessig fails to explain was that six years ago we basically had NO broadband competition, because we had a de facto monopoly for wholesale Internet access called dialup, which had lots of resellers of the underlying monopoly service, which Mr. Lessig likes to call competitors.

Yeah, I’m old enough to remember those days. Many, many businesses would try to get me to switch to their service by offering me lower prices and higher quality. I confess, I call that competition. But whatever you call it, we need more of it today.

Over the last six plus years, the free and open Internet that has been unfettered by regulation has created a steady increase in real inter-modal broadband competitors/choices for consumers.
What Mr. Lessig really laments is the decrease in the faux/artificial regulatory-favored Internet Access resellers that basically competed on brand; and the increase in REAL inter-modal competitors that can truly compete on price, speed, innovative features, and mobility among other differentiators that consumers value about competition.

What I “lament” is that the speed of broadband sucks in the US, and the prices are too high. Again, if the policies of the last 6 years had really produced the kind of prices and quality that other competitive nations around the world have, I’d be the first to admit I was wrong. But if you turn off the Cheney channel and look at the sorry (and increasingly sorry) state of broadband in the US, at some point someone has got to ask whether this policy is a mistake. Call me a cut-and-runner, Scott. Because I definitely want to cut and run from the FCC’s policy.

What Mr. Lessig conveniently omits from his assertion that “broadband competition is dying” is the pesky little truth that real broadband prices have fallen by over half over the last three years and that competitive supply is vibrantly increasing.

But then again, there’s the problem of those “pesky” data necessary to support the ultimate claim that needs to be made: that DSL $/MB prices have fallen “by 50%.” Show me the data, Scott, not the made-for-TV sound bites.

Maybe Professor Lessig should take some more classes in economics and antitrust to bone up on the fact that competitiveness of markets are truly measured by effective pricing, by the trend of competitive entry and by the amount of innovation. Only undergrad courses covering antitrust would consider it sufficient to count the number of competitors in a market and then declare a market not competitive. Responsible scholars of competition understand that the competitive facts can vary widely in various markets, and that the number of competitors alone is insufficient data to determine the competiveness of a market. I am sure there are any number of attorneys with “real world” experience in analyzing competition at the DOJ Antitrust Division or at the FTC who would be happy to give Professor Lessig a little tutorial on this before he opines on this topic again on the world stage.

This is no doubt true. A submission to the FCC or to a court about market power with the substance of an 800 word op-ed would be absurd. And indeed, to show fully whether competition is improving or getting worse, you would need to go much more “in depth” than even Cleland’s nippy one-pagers. But really, Scott. This is an op-ed. They don’t allow footnotes.

Loose Fact #3:

Lessig said: “Network owners now want to change this by charging companies different rates to get access to a ‘premium Internet.’” [bold added for emphasis]
This is the way the Internet has operated since it was commercialized in 1995. There have long been three Internet backbone tiers of service. And companies have long paid for a “premium” Internet since they upgraded from dialup to broadband!

So at a debate with George Gilder, Peter Huber made this same move. Look, no one is arguing about the backbone. No one is arguing for regulation of the backbone. This is a debate about last mile broadband, and the effect certain business models for the last mile will have on competition.

What planet has Mr. Lessig been on that he didn’t notice that companies pay for a “premium” Internet every day? Has he ever heard of the Akamai “premium” service which has been used by most all the biggest online companies to get “premium” Internet service?

And of course this is exactly the criticism I was trying to preempt by my original post on this matter. Obviously, companies do whatever they can to make their content on the net run as well as it can. Google must spend millions around the world on caching servers. Everyone spends what they can to get the fastest servers they can.

But again, this is exactly the sort of competition we should celebrate — businesses spending money to add real capacity and functionality to the network, by going to a (relatively) competitive market to add that capacity.

My complaint is not against that. My complaint is about (relatively) uncompetitive markets, and about the consequence of them exercising power over the next YouTubes of the world. No doubt, as they extract rents from these businesses, they make Wall Street happier about them. But as my focus is not the net wealth of a handful of companies, but instead, the wealth of the economy as a whole, what’s good for them is not the end of the matter.

Indeed, this is exactly why my position on Network Neutrality is not as extreme as some. As I testified, for example, I’m all for “consumer tiering” by network providers — where network providers offer higher quality to consumers for more money. That again is the sort of business model that creates an incentive to increase capacity.

“Access tiering” doesn’t. Or at least, I’m looking for the economic analysis to show it does. What I’ve seen so far is that in a relatively uncompetitive market, “access tiering” creates an obvious (and perverse) incentive: with relatively limited competition, if you can charge a premium for a “fast internet,” you don’t have much incentive to make the rest of the Internet very fast at all.

October 20, 2006  ·  Lessig

In Free Culture, chapter 9, I wrote the following:

In addition to the Internet Archive, Kahle has been constructing the Television Archive. Television, it turns out, is even more ephemeral than the Internet. While much of twentieth-century culture was constructed through television, only a tiny proportion of that culture is available for anyone to see today. Three hours of news are recorded each evening by Vanderbilt University – thanks to a specific exemption in the copyright law. That content is indexed, and is available to scholars for a very low fee. “But other than that, [television] is almost unavailable,” Kahle told me. “If you were Barbara Walters you could get access to [the archives], but if you are just a graduate student?”

As Kahle put it, “Do you remember when Dan Quayle was interacting with Murphy Brown? Remember that back and forth surreal experience of a politician interacting with a fictional television character? If you were a graduate student wanting to study that, and you wanted to get those original back and forth exchanges between the two, the 60 Minutes episode that came out after it … it would be almost impossible. … Those materials are almost unfindable. …”

Jeff Ubois has just published a paper about his effort to find out whether Brewster was right. His conclusion: Brewster’s right. As he writes:

I searched for footage of the Quayle/Brown interaction with an eye towards making some general assessments of the accessibility of historic broadcasts, and detailed the results in a paper called Finding Murphy Brown: How Accessible are Historic Television Broadcasts? It’s finally out this week in the peer reviewed Journal of Digital Information….

Copyright restrictions ultimately made it impossible to get the original Dan Quayle speech, or the Murphy Brown episodes in question. In an odd coda to this project, one digital library journal (from which I withdrew this paper) insisted that the correspondence detailing refusals by various organizations to allow access to or use of the Quayle/Brown footage was itself copyrighted, and therefore unsuitable for publication. Those excerpts are included in the current piece. It was disturbing how one effect of copyright law is to chill academic discussions of copyright law.

You can read the paper by linking from the blog entry.

(Thanks, Jeff!)

October 20, 2006  ·  Lessig

So in the comments to my post about the piece in the FT, John Earnhardt, an author on Cisco’s (read: the company that will sell the technology to end network neutrality) “High Tech Policy Blog,” complains about “the rhetoric [I] have used.” In his blog post, titled “How can you tell if a lawyer is lying?” (talk about helpful rhetoric), he writes:

In the FT piece he writes: “Network owners now want to…charg(e) companies different rates to get access to a “premium” internet. YouTube, or blip.tv, would have to pay a special fee for their content to flow efficiently to customers. If they do not pay this special fee, their content would be relegated to the “public” internet a slower and less reliable network. The network owners would begin to pick which content (and, in principle, applications) would flow quickly and which would not.” This is sheer fiction and he knows it. The truth of the matter is that YouTube and Google, the companies he holds up at stalwarts of fair play, apple pie, motherhood and whiskers on kittens actually charge companies to get premium placement on their websites. What’s this you say? Those who own a website or service are allowed to charge money to allow an advertiser to get top placement on their website? I’m shocked and appalled and will be submitted an op-ed to the FT stating the same. What is the difference of a service provider (in his terminology, a “network owner”) of charging a service to get premium placement on their “owned” network? They are not degrading the services of others, but enhancing the service of those who choose to pay for the premium placement.

“What is the difference [between] a service provider … charging a service [fee] to get premium placement on their ‘owned’ network?” Really? The difference is all the difference in the world. No one supporting network neutrality would (or should) say that we should fight discrimination at the edge of the network. That’s the whole point: end-to-end (the bedrock upon which the network neutrality argument rests) is all about facilitating lots of discrimination and preference at the edge; the only place discrimination is a problem is within the network. And again, nothing in my argument is about whether people at the edge of the network are “stalwarts of fair play, apple pie, motherhood and whiskers on kittens” (whiskers on kittens?). The point is not about good vs. evil. The point is about which architectures (whether imposed through technology or business models) will lead to the fastest growth in applications and content. No doubt, some architectures will lead to faster growth in profits for some companies (not to name names); but more profits for some is not the same as faster growth for all.

His second strike is even better:

Here’s another anology: We’re in the throes of campaign season here in the ol’ US of A and television and radio ads play a large role in electing or defeating a candidate. Those candidates who have more money can buy more ads on radio and TV. They can buy them during the most popular shows so that the most amount of voters can see them. If the other candidate has no money and cannot afford to place an ad on television or radio I can only assume that Larry Lessig will offer to pay their way in the name of net neutrality. Why? Because, in his mind, the playing field should be equal for all candidates.

So again, no, whether candidates have money or not is not my concern. They are (in the analogy) at the edge of the network. But let me turn the analogy around. Imagine there are only two television stations in a particular democracy. They both begin to “access tier” — charging different rates to different political candidates. So Dems get a rate 1/2 the rate charged to the GOP; or major parties get a rate that is 1/3 the rate charged to Independents. Does that begin to trouble you?

Now again, as I said in the blog post about the piece, everything here hangs upon market power. So in a truly competitive market for last mile broadband, I wouldn’t care as much (Barbara van Schewick says there’s still a reason to care). But in a world of limited competition, the games the networks can play will both stifle innovation at the edge, and reduce the incentive network owners have to increase performance for all.

October 20, 2006  ·  Lessig

So there’s an important distinction developing among “user generated content” sites — the distinction between sites that permit “true sharing” and those that permit only what I’ll call “fake sharing.”

A “true sharing” site doesn’t try to exercise ultimate control over the content it serves. It permits, in other words, content to move as users choose.

A “fake sharing” site, by contrast, gives you tools to make it seem as if there’s sharing, but in fact all the tools drive traffic and control back to a single site.

In this sense, YouTube is a fake sharing site, while Flickr, (parts of) Google, blip.tv, Revver and EyeSpot are true sharing sites.

Fake Sharing Sites

YouTube gives users very cool code to either “embed” content on other sites, or to effectively send links of content to other sites. But never does the system give users an easy way to actually get the content someone else has uploaded. Of course, many have begun building hacks to suck content off of the YouTube site. (On the Mac, I’ve used TubeSock to do that). But this functionality — critical to true sharing — is not built into the YouTube system.

True Sharing Sites

By contrast, every other major Web 2.0 company does expressly enable true sharing.

  • Flickr, for example, makes it simple to download Flickr images. (See, e.g., here.)
  • blip.tv explicitly offers links to download various formats of the videos it shares. (See, e.g., here.)
  • EyeSpot (a fantastic new site to enable web based remixing of video and audio) permits the download of the source and product files. (See, e.g., here.)
  • Revver (the site that enables an ad-bug to be added to a video so the creator gets paid when each video is played) builds its whole business model on the idea that content can flow freely on the Net. (See, e.g., here.)
  • And even Google increasingly enables access to the content it creates and collects. Its fantastic Book Search project enables people to download (funnily formatted) PDFs of public domain books. (I know this link used to work, but now that I’m in Germany, Google is obviously not permitting me access to the work because it is so insanely hard to know whether it is in the public domain anywhere else.) And I am told (though I’ve not yet seen how to do it) that Google Videos can be downloaded to a machine.

This difference in business models, I suggest, should be a focus of those keen to push the values of Web 2.0. Though Tim O’Reilly’s canonical statement of those values implies this freedom is necessary, it doesn’t really expressly say so. The freedom to access the content seems, in my view, related to the Web 2.0 principle that “the service automatically gets better the more people use it.” Or at least the right to access it if the author chooses (another Web 2.0 principle: Some Rights Reserved) seems essential for this ethic to make sense. As O’Reilly puts it, “Design for ‘hackability’ and ‘remixability’” — precisely what hoarding content doesn’t do.

If YouTube is a trend, this is a depressing turn. No doubt, that amazing company has a billion things to think through (including what to do with more than a billion dollars). But one thing it really needs to keep in focus is a very important part of its success: that it was seen to respect the ethics of the web. Why post on YouTube rather than Google Video? At least some did so because YouTube was “cooler.” Whether it continues to be as cool depends critically on the values it practices.

UPDATE: Joi has a fantastically thoughtful followup on this.

October 19, 2006  ·  Lessig

I wrote this piece for the FT, arguing the phenomenal success of YouTube is yet another argument for Network Neutrality. The data in the piece comes from this great report, Broadband Reality Check.

One point the compactness of 800 words didn’t let me make fully: Obviously, everyone spends tons of money to make their content flow more quickly than the competitor. But the question is whether the market in which they spend that money is, in a word, healthy. If there’s lots of competition, then that expenditure is efficient. If there’s not, then it is a barrier. Or that, at least, is the argument.

October 16, 2006  ·  Lessig

TechWorld (a UK publication) has an article about a “leaked” letter from the Initiative for Software Choice (ISC) (apparently MSFT funded) about, as the article puts it, the “potentially dire effects if too much encouragement was given to open source software development.”

Nothing weird there. What is weird is, first, that such a letter has to be “leaked” (aren’t submissions to the EC a matter of public record?), and, second, the way in which the letter is made available on the TechWorld website. TechWorld gives you a link to the letter. The link states: “You can view the entire letter here.” And indeed, the link means what it says. You can ONLY view the letter. The PDF is locked so that it can’t be printed.

Is it really the case that copyright law would forbid a letter written to a government agency from being printed on a user’s computer?

Note, this is a simple restriction to get around (but is that legal?): If you’ve got access to Acrobat Professional, you can save a version and turn off the password security (apparently without the password, as I did).

(Thanks, Marten!)