Lawyers: To Save Newspapers, Let's Destroy Pretty Much Everything Else Good

from the yeah,-that'll-work dept

A bunch of people have been submitting an opinion piece from the Washington Post that is basically one of the most stunning sets of suggestions for what Congress could do to “save” newspapers. If I didn’t know any better, I’d think it was satire, because the suggestions are so mind-bogglingly bad and dangerous that it’s hard to believe anyone wrote them with serious intent. Also, it’s worth noting that the Washington Post didn’t bother to detail the rather massive conflicts of interest of both lawyers, who have apparently represented numerous big name newspapers. And what is it that these big newspaper journalists keep telling us about how it’s the “blogs” that hide conflicts of interest? Anyway, let’s dive into the meat of the argument:

The Internet innovators that have thrived online enabled their own success as early as 1996 by securing immunity from defamation and other liability caused by user postings on their sites. Two years later, they persuaded Congress to add another exemption, this one for user postings that violate copyright law. These safe harbors have allowed companies from Yahoo to YouTube to prosper from the content they carry with little concern of being held accountable for it.

First, it’s rather troubling that two lawyers could so fundamentally misunderstand the safe harbor rules put into both the CDA and the DMCA. The claim that it was the internet companies that somehow sought out these rules is laughable and ignores the history of both laws in question. Both the CDA and the DMCA were massive expansions of the law that purposely restricted internet communications. The two safe harbor provisions were tiny carve-outs designed to (reasonably) make explicit what should have been obvious: if someone breaks the law, the liability should be on the person who broke the law and not on the tool or service used to do so. That’s called common sense. These safe harbors weren’t, as implied by these lawyers, some massive gift to internet companies. They were a small protection for internet companies worried about two laws that criminalized a tremendous amount of communication, ensuring that liability falls on the actual wrongdoer rather than on the tool.

Bring copyright laws into the age of the search engine. Taking a portion of a copyrighted work can be protected under the “fair use” doctrine. But the kind of fair use in news reports, academics and the arts — republishing a quote to comment on it, for example — is not what search engines practice when they crawl the Web and ingest everything in their path.

Publishers should not have to choose between protecting their copyrights and shunning the search-engine databases that map the Internet. Journalism therefore needs a bright line imposed by statute: that the taking of entire Web pages by search engines, which is what powers their search functions, is not fair use but infringement.

That would be a massive reinterpretation of copyright law, and would effectively destroy much of what makes the internet useful. This proposal would make it illegal to index the web. It would outlaw search engines. Yes, for the sake of saving some outdated newspaper businesses, these lawyers wish to make it so that before a search engine can index any website, it needs to negotiate permission. This would kill the internet.

Federalize the “hot news” doctrine. This doctrine protects against types of poaching that copyright might not cover — the stealing of information not by direct copying but simply by taking the guts of the content. While the Internet has made news vulnerable to pilfering because of the ease of linking from one site to the next, the hot-news doctrine has limited use because it is only recognized in a few states.

Now that many news aggregator sites have taken “linksploitation” to a commercial level by selling ads wrapped around the links they post, Congress has the incentive it needs to pass a federal law protecting hot news. Such a law would give publishers an additional source of legal leverage outside of copyright to demand fair compensation for the content they create.

The “hot news” doctrine, considered by many to be one of the worst legal decisions ever made when it comes to intellectual property, needs to be reversed, not federalized. It is the one case in the US where “facts” can be considered protected information, and that’s bad for everyone. Suggesting an expansion of the hot news doctrine shows a fundamental misunderstanding of First Amendment rights, copyright, the internet and communications.

Eliminate ownership restrictions. Media insolvency is a greater threat today than media concentration. Congress should abolish caps on ownership of broadcast stations and bars on newspaper and television ownership in the same market. These outdated rules belong to an era when the Web was a home for spiders.

The above suggestion might be the only one in all of this that makes any sense. Of course, when combined with the other suggestions, it becomes a horrible idea. These lawyers would effectively kill off all forms of competition to newspapers… and then let the big news organizations combine? Why?

Use tax policy to promote the press. Washington state is taking a lead in the current crisis with legislation signed into law this week to slash business taxes on the press by 40 percent. Congress could provide incentives for placing ads with content creators (not with Craigslist) and allowances for immediate write-offs (rather than capitalization) for all expenses related to news production.

We’ve already discussed how silly Washington state’s new rule is, but are these lawyers really saying that Congress should specifically pick winners and losers in the online classifieds space? How does that not offend the basic concepts of what Congress is supposed to do? How could two lawyers suggest this with a straight face?

Grant an antitrust exemption. Congress first came to journalism’s defense with antitrust relief in 1970, when it permitted endangered newspapers to combine their business operations without fear of antitrust suits if their newsrooms remained independent.

So because newspapers are too clueless to survive, they need to be granted monopoly rights? Sorry, don’t buy it. The whole thing is stunning in just how brazen it is in basically stating that (a) newspapers are more important than all of the internet and (b) just kill off that pesky internet and everything will be fine. Usually, when industries try to work on regulatory capture (getting regulators to put in place laws that favor them) they at least try to couch it in language that pretends it’s for the public good. To outright suggest killing off the internet in favor of newspapers is incredibly shameless.

In responding to this, Jeff Jarvis highlighted a comment made by Dale Harrison that’s worth repeating:

A lesson worth remembering is that at the turn of the 20th century people had a transportation problem… and the solution turned out not to be a “faster horse”… but a Ford.

And one should note that the Ford didn’t arise out of the “Horse Industry Revitalization Act”.

I think the future of the media business will look as different as Ford and Toyota’s operations look from horse traders and blacksmiths.

Imagine what the passage of such ill-conceived legislation would have done to the car industry a century ago.

It would have strangled the nascent auto industry at birth, delaying its inevitable rise, while sheltering a dying industry and only postponing its inevitable demise… doing great damage to both. Newspapers need to be encouraged to adapt to the future, not retreat behind legislative walls hoping the future will go away.

The newspaper industry’s troubles go to the very core of their historical business model.

What’s historically given value to editorial content is the relative scarcity of distribution versus readers. Newspapers have enjoyed natural localized economic monopolies that allowed each of them to exercise monopoly control over the amount of content (and advertising) they allowed into their local marketplaces.

Monopoly constraint of distribution and supply will always lead to prices (and profits) significantly above open market rates. Newspapers then built costly organizational structures commensurate with that stream of monopoly profits (think AT&T in the 1970’s).

The dynamics of content replication and distribution on the Internet destroys this artificial constraint of distribution and re-aligns advertising (and subscription) prices back down to competitive open market rates. The often heard complaint of Internet ad rates being “too low” is inverted… the real issue is that traditional ad rates have been artificially boosted for enough decades for participants to assume this represents the long-term norm.

An individual reader now has access to essentially an infinite amount of content on any given topic or story. All those silos of isolated editorial content have been dumped into the giant Internet bucket. Once there, any given piece of content can be infinitely replicated and re-distributed to thousands of sites at zero marginal costs. This breaks the back of old media’s monopoly control of distribution and supply.

The core problem for the newspapers is that in a world of infinite supply, the ability to monetize the value in any piece of editorial content will be driven to zero… infinite supply pushes price levels to zero!

What this implies is that no one can marshal enough market power to monetize the value of content in the face of such an infinite supply and such massively fragmented distribution. Pay-walls, lawsuits and ill-conceived legislation won’t allow the monopoly conditions to be re-constructed.

There are certainly ways to make online news profitable… and many of us are working to develop such approaches… but I can assure you they don’t involve inventing a “faster horse”…

Indeed. It’s time for Congress to stop passing laws that block innovation in the hope that legacy industries will magically come up with faster horses.

Comments on “Lawyers: To Save Newspapers, Let's Destroy Pretty Much Everything Else Good”

38 Comments
Steve R. (profile) says:

Re: Copyright and the abuse of Freedom

Excellent point.
————-
Thanks to William Stepp at Against Monopoly for finding the quote below from John Perry Barlow. This quote was written in 1994!

“The greatest constraint on your future liberties may come not from government but from corporate legal departments laboring to protect by force what can no longer be protected by practical efficiency or general social consent.”

Dave says:

Wow

Publishers should not have to choose between protecting their copyrights and shunning the search-engine databases that map the Internet.

Wow… they sure want their cake and eat it too!

I wish Google would have the balls to come out and say, “OK… if we’re such a leech on the news media, we’re going to remove all links to your content for 1 month.” I wonder how long it would take the news media to backpedal and make Google out to be a demon for doing exactly what they are currently asking for.

Anonymous Coward says:

Re: Wow

“Publishers should not have to choose between protecting their copyrights and shunning the search-engine databases that map the Internet. Journalism therefore needs a bright line imposed by statute: that the taking of entire Web pages by search engines, which is what powers their search functions, is not fair use but infringement.”

I do not know about you, but I have never searched for something and received a whole page unless I was redirected to said page.

Have they not heard of robots.txt?

If they do not want the whole page indexed, then make a summary page for every page and only allow those summary pages to be indexed.
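As an aside, the robots.txt mechanism the commenter is pointing to is easy to demonstrate. Below is a minimal sketch using Python’s standard-library urllib.robotparser, with a hypothetical publisher domain and hypothetical /articles/ and /summaries/ paths, showing how a compliant crawler would read a file that keeps full story pages out of its crawl while leaving summary pages open.

```python
# Minimal sketch, not any real publisher's configuration: how a compliant
# crawler interprets a robots.txt that blocks full article pages while
# allowing summary pages. The domain, paths and user agent are hypothetical.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /articles/
Allow: /summaries/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())  # parse the rules directly; no fetch needed

for url in ("https://example-paper.test/summaries/quake",
            "https://example-paper.test/articles/quake-full-story"):
    verdict = "crawlable" if parser.can_fetch("ExampleBot", url) else "blocked"
    print(f"{url} -> {verdict}")
```

A crawler that honors the file indexes the summary page and skips the full story, which is exactly the middle ground the commenter describes.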

Anonymous Coward says:

This isn’t about killing the search business, although they are pretty blind not to see that’s what would happen.

Google pretty much gives you two choices. Either you let us search your site or you don’t. But the newspapers don’t like either of those choices. They want choice number three, which is Google pays them to allow Google to search their site. That’s what this is about. Google makes money on search, and some of those search results go to newspaper’s websites, and so they want a cut.

Anonymous Coward says:

Re: Re:

Techdirt rails against the misuse of antitrust investigations into organizations like Google, which has not actually engaged in monopolistic behavior. It has acted as a successful corporation does; that is, it continues to be one.

Most posts about antitrust law are against politicians who confuse market dominance with monopolistic behavior.

ScaredOfTheMan says:

Slow slow slow

Nothing will save them… this whole hot news thing… Ha! I was informed of the LA earthquakes within 50 seconds via Twitter. Then, by following #earthquake, I got all sorts of real-time reports. A quick check of all the “news outlet” sites during that time turned up nothing on it. Yes, it was not a fully edited, polished presentation, but it worked, complete with photos.

Finally, the LA Times had the earthquake in the coming soon section.

So if one of the pillars of their future success is protecting the expediency of news, they are in big trouble. You can’t beat the crowd, the FREE crowd.

Anonymous Coward says:

google doesnt just search and index news sites they aggregate it into google news. the real point is news sites cant get indexed unless they let google use their material for google news. no opt out possible on google news unless they opt out of all search. google appears to be abusing their near monopoly on search to do evil things.

ChurchHatesTucker (profile) says:

Re: Re:

google doesnt just search and index news sites they aggregate it into google news. the real point is news sites cant get indexed unless they let google use their material for google news. no opt out possible on google news unless they opt out of all search. google appears to be abusing their near monopoly on search to do evil things.

So, have them opt out of Google and set up their own search engine. Problem solved, right?

Mike (profile) says:

Re: Re:

google doesnt just search and index news sites they aggregate it into google news. the real point is news sites cant get indexed unless they let google use their material for google news. no opt out possible on google news unless they opt out of all search. google appears to be abusing their near monopoly on search to do evil things.

Uh, this is simply untrue. The only content that is included on Google News comes from cases where Google has done a *pay* deal with the sources. As far as I know, it’s only AP and AFP who have done such deals.

JEDIDIAH says:

Re: No Indexing? So What?

If a particular news outlet is really so valuable that it has some inherent value worth protecting with draconian measures, then it really should have its own reputation independent of Google.

Any news outlet worth saving or worth subscribing to should have no problem turning its back on Google and surviving on its own reputation.

People will happily bypass Google and go straight to the source if it’s really worth it. However, that’s the real problem here. Either these news organizations aren’t worthwhile, or they fear as much themselves.

Phoenix says:

Umm… the newspapers CAN elect to opt out of having their news indexed if that is what they want. Using robots meta tagging, newspapers can allow Google to index their site but can declare specific news pages to be off-limits. The fact that newspapers are not doing this is a kind of agreement to display their material in the public domain.
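For readers who haven’t run into it, the per-page control Phoenix mentions is a one-line robots meta tag in a page’s HTML head (e.g. <meta name="robots" content="noindex">). Here is a minimal sketch, using Python’s standard-library html.parser and a made-up story page rather than any search engine’s real logic, of how a crawler that respects the tag would decide whether a page may be indexed.

```python
# Minimal sketch (not any search engine's actual implementation): honoring a
# per-page "noindex" robots meta tag, which lets a site stay crawlable while
# keeping individual story pages out of the index. The sample HTML is made up.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            content = attrs.get("content") or ""
            self.directives.update(d.strip().lower() for d in content.split(","))

def may_index(html: str) -> bool:
    """Return True unless the page carries a 'noindex' robots directive."""
    p = RobotsMetaParser()
    p.feed(html)
    return "noindex" not in p.directives

story_page = """<html><head>
  <title>Quake rattles Los Angeles</title>
  <meta name="robots" content="noindex, follow">
</head><body>Full story text…</body></html>"""

print(may_index(story_page))                                  # False: stay out of the index
print(may_index("<html><body>Headlines only</body></html>"))  # True: fine to index
```

Robots.txt controls whether a crawler may fetch a URL at all; the meta tag controls whether a fetched page ends up in the index, which is why the two are typically combined for the headline-visible, story-hidden setup described here.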

The real issue with newspapers is that their product is really bad. They went way over the top with advertising while allowing their differentiator – quality of reporting – to go into the toilet. When I finally cancelled my newspaper subscription, it was because they had no local content left (or I couldn’t find it). Page after page of mattress and furniture ads in between crappy little ‘social’ news stories.

Local papers are not competing with the Internet for national/global news, they are competing with other news companies that can deliver news to me using the Internet. My local paper cannot and should not try to compete with the ability of the BBC to provide me with coverage of a UK story. However, my local paper could absolutely compete with any other news organization for local news stories if they were smart enough. First and foremost, it starts with good reporting and good content. No business model will sustain a bad product. Next, the newspapers need to innovate. How about collaborating with Amazon and putting out a Kindle version of the news? The Kindle could be subsidized to some degree by my news subscription.

Newspapers are dying from poor execution, ignorance, and a failure to innovate. Regulation won’t help that one bit.

Anonymous Coward says:

Re: Re:

again read closely. if you use robots.txt to block pages, those pages are not in google search. the choice is either no google search and no google news, or yes to both. there is no middle ground. no way to say yes to search and no to google news. so either you are not in google at all or too much in google and google uses your content to make a news site.

ChurchHatesTucker (profile) says:

Re: Re: Re:

again read closely. if you use robots.txt to block pages, those pages are not in google search. the choice is either no google search and no google news, or yes to both. there is no middle ground. no way to say yes to search and no to google news. so either you are not in google at all or too much in google and google uses your content to make a news site.

Again, read closely. Opt out of Google and then set up your own search engine. This doesn’t take an act of Congress. It’s not even collusion, it’s a business model. (e.g., one paper sets this up, promises to pass along a cut for the traffic delivered, other papers join of their own volition, etc.)

No, they want to make up a ‘third option’ which is “forcing Google to cut them a check, without doing anything on their end.”

Phoenix says:

Re: Re: Re:

@18 – Not necessarily. Robots.txt can specify per-page control. You can have your home page indexed – including story headlines – and you can put the story content on non-indexed pages. This way, your site is visible and your headlines are visible, but the story content is not. Non-current news can be moved to a fully indexed archive page, ensuring that the site maintains its value as a reference site.

Aaron Martin-Colby (profile) says:

Comment

I don’t really agree with Harrison that the previous price of advertising was artificially held high by monopoly.

Back then, there was a fundamental limitation on the dissemination of information, as such the machinery and infrastructure to distribute that information was incredibly valuable.

Instead, I see it as simply a change in the market that resulted in a change in the value of a service or good. Or, to be more specific, back then information was NOT an infinite good; it was very much finite.

I also disagree that newspapers had a monopoly. Yes, the business model of the newspaper had a monopoly on information dissemination, but no single entity had a monopoly, even in small markets. I live in Rhode Island, where our population has been less than one million for most of history. We had two big dailies, the Journal and the Bulletin, and a good baker’s dozen worth of smaller local papers. We also had the NY Times, Boston Globe, LA Times, and some other nationals available at most news stands. That’s no monopoly.

As such, I find his comment about revenue being above market rates inconsistent. Market rates are defined by what people are willing to pay. If everyone is willing to pay a price for newspaper advertising, that is the market rate. Yes, it is above-market in the market of easy, infinite distribution. But at the time, that market didn’t exist.

It was not an artificial restraint. That’s like saying that there was an artificial restraint on information distribution in pre-historic times because we didn’t have the “true” market of the internet, and all we could do was grunt at each other.

This was merely the market at the time. That market had inherent channels through which information could be disseminated. An artificial restriction requires the possibility that the market could be another way, and a powerful force preventing it. This wasn’t the case. There was as much competition as the dynamics of the market allowed at the time, advertising rates were true market rates, and the new market of the internet does not negate the validity of the old market.

The problem as I see it is that newspapers were being paid not for the content, but for the service of organizing, arranging, and distributing content. At the time, this was very valuable. In this perspective, the content never had value, only the access to it did. So to clarify my earlier point, the information was always infinite, it was the access that was finite.

Now access is easy, and the service provided by papers is no longer needed. Newspapers need to find some service associated with the content that they can provide that is of value.

Dale Harrison says:

Re: Comment

My observation was that papers had localized geographic monopolies…not an absolute monopoly. This was a function of the logistics of printing and distributing millions of tons of paper.

The local paper always had the fresh news…at least until the early 90’s when companies like NYT began to use digital transmission of printing plate data to regional presses. This was a natural monopoly…nothing artificial or conspiratorial about it.

Dale Harrison
dale.harrison@inforda.com

But whatever the source of monopoly power, the newspapers could still restrict the total amount of editorial and advertising content in a market…intentional or not.

This will always lead to a market-clearing price above what it would be if there were intensive widespread competition.

If you have only one car dealership in town, you will never get as good a bargain as you would if there were 100 dealerships in town…each fighting for your business.

With the Internet, we just went from there being 1 or 2 dealerships in each town to there now being an infinite number of dealerships in town.

That’s why ad rates online do not and never will match the old monopoly-level rates of the historical print-only era.

Aaron Martin-Colby (profile) says:

Re: Re: Comment

Hi Dale, thanks for the response.

“This was a natural monopoly…nothing artificial or conspiratorial about it.”

I agree, but your original wording was,

“The dynamics of content replication and distribution on the Internet destroys this artificial constraint of distribution and re-aligns advertising (and subscription) prices back down to competitive open market rates.”

I added the emphasis on your comment about the artificiality of that constraint and, in my reading, the nature of the market at the time. I further read this interpretation to be correct with your statement about prices coming back down, after being held up from their natural level. This made me think your argument was that the old market was somehow incorrect, only to be corrected now with this new market.

I fully agree with your general assertion that online advertising will never match paper (with the current model, anyways), I just felt you were disregarding the validity of the old market vis a vis the new market.

Felix Pleşoianu (user link) says:

(…) these lawyers wish to make it so that before a search engine can index any website, it needs to negotiate permission. This would kill the internet.

Ahem…

First, there is Internet outside the US as well. Imagine Google moving its base of operations to a tax haven. Or Europe. Mwahaha.

Second, even if that restriction became real, Google would still be free to index anything under a Creative Commons or similar license, and anything in the public domain. Guess who’d win: people like me and you, Mike. Oh wait, you don’t actually have your content under an explicit open license. But that’s easy to change.

P.S. Google did actually remove some newspapers from its index following a court decision. It happened a few years ago in Belgium. Those newspapers’ resolve lasted about two weeks IIRC.

Steve R. (profile) says:

Publishers Don't Have Copyright Protection

It seems to me that Sanford and Brown have made a couple of fundamental mistakes.

First, in terms of copyright, a publisher does NOT directly have copyright protection. A publisher is a distributor, not a creator. True, a content creator can enter into a contract with a publisher that gives the publisher some copyright privileges, and yes, publishers can create some content. Nevertheless, content creators (with our computer technology) today should simply fire their publisher and market the content themselves. But I guess the publishers want to fool you into thinking that they are entitled to copyright protection, which they are not.

Second, our legal system is supposed to be based on a level playing field. If we follow what Sanford and Brown are advocating, every profession, and I mean every profession, would be entitled to protective legislative welfare. The free market would be non-existent.

batch says:

Let them have their way

then, when the US population goes batshit crazy over what happens, people will learn the hard way to Quit Lobbying for Ridiculous Laws and Protection of Business Models. Then this problem will quit rearing its ugly head every couple of years, until a new generation in a hundred years has forgotten all about it and the cycle repeats. As it was with the horse and car, so it is with the newspaper and the internet.

Devonavar (user link) says:

Databases

Maybe you can untie this knot for me. These lawyers are basically complaining that giving people the ability to find whatever information they need is a bad idea. Search engines are bad = there are situations where some information should not be accessible.

It’s not that much of a stretch to apply this to the ICBC case in your previous post where ICBC’s ability to search their database for jury members’ claim histories was potentially disrupting the justice process. Google essentially turns the internet into a massive searchable database, and many privacy advocates (including you) have recognized that individually harmless bits of data can be quite harmful when aggregated. I think there’s a parallel between the harm these lawyers see in Google’s news aggregation and the harm you (and I) see in ICBC’s aggregation of claim data.

So, which is it? Is unfettered access to data (and potentially powerful analysis within that data) a problem or not? I’m inclined to accept the proliferation of data as a fact and try to work around it. To me the biggest problem is that ICBC’s data is proprietary (and perhaps the push towards transparency of government data would help here). Making ICBC’s database publicly available would be much fairer, since the defence would (theoretically) have the same information about the jurors available to it.

But, this doesn’t solve the problem of tainting the jury pool. In fact, it probably makes the problem worse — the more information is available about the potential jurors, the farther from a “random” jury of peers the jury becomes.

So … how do you deal with this? Are certain types of publicly available information a problem? Can we do anything about it? Maybe we need to reform the justice system to deal with this new world where everyone’s sins (and accomplishments) are on public display.

Bettawrekonize (profile) says:

“the suggestions are so mind-bogglingly bad and dangerous”

I’ve always been saying on message boards that if we don’t stand up for our rights, they will likely be taken away. Never take your rights for granted. There are powerful entities always working to take your freedoms away (and to make up some fake justification for it), and this will always be true. We must always defend our rights.

Tom Anderson (user link) says:

technically naive lawyers

These lawyers are technically naive. If newspapers really want to present only limited versions of stories to webcrawlers, they just need to separate the content into public and private pages, and configure .htaccess and robots.txt accordingly.

Newspapers can also hire companies to track down illegal use of their stories.

The problem, however, is that people don’t want to read only “oldspapers”, stories that were all written yesterday. They also want access to new stories, stories that are breaking news. If newspapers can’t keep up with the information age, then they should remake themselves into
