How The UK Gov't Extrapolated 136 Self-Reported File Sharers Into 7 Million

from the nice-work dept

A year ago, Julian Sanchez took the time to dig into the numbers that the US gov’t was using to explain the “cost” of “piracy” to the economy and found that they were completely bogus: based on an offhand mention from decades ago, backed by no actual research, and then twisted, pumped up and given a government “seal of approval.” It looks like something similar has happened in the UK. Joseph Young points us to an investigation into the UK’s oft-repeated figure of 7 million illegal file sharers, the number being used to justify kicking people off the internet. A dive into the details reveals a massive extrapolation. That 7 million figure is based on 136 people responding to a survey paid for by music industry lobbying group BPI and conducted by Jupiter Research (now owned by Forrester). Not surprisingly, the research director who ran that study is also someone who has claimed in the past that music can’t be free and that without copyright there’s no way for musicians to get paid.

Think those survey questions were unbiased?

Either way, the survey reached 1,176 net-connected households, of which 136 (11.6%) said they did file sharing. The researchers then just decided that 11.6% was too low, and bumped it up to 16.3%. Why? “To reflect the assumption that fewer people admit to file sharing than actually do it.” Fine. But how was the number picked? They won’t say. So… you’ve basically added 40% to the number there for totally unclear reasons. Then… the extrapolation gets more ridiculous. Jupiter Research said that there were 40 million people online in the UK, and applied its made-up 16.3% figure to that population. Only problem? The 40 million number is made up too. The real number was 33.9 million. Thus, their “estimate” was first boosted by about 40% and then by another 20%. And all that on top of likely leading questions to appease the music industry lobbyists paying for the study. And this is what the UK gov’t is basing its decisions on?
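To make the inflation concrete, here is a quick back-of-the-envelope sketch (in Python) of the arithmetic described above; the inputs are simply the figures from the article, and the 16.3% and 40 million are the surveyors’ own unexplained numbers:

survey_size = 1176              # net-connected households surveyed
admitted_sharers = 136          # households that said they did file sharing

measured_rate = admitted_sharers / survey_size     # about 11.6%
adjusted_rate = 0.163                              # the bump, justification never given
claimed_online = 40_000_000                        # population figure Jupiter used
actual_online = 33_900_000                         # the real figure cited above

print(f"measured rate: {measured_rate:.1%}")                               # 11.6%
print(f"rate bump: +{adjusted_rate / measured_rate - 1:.0%}")              # +41%
print(f"population bump: +{claimed_online / actual_online - 1:.0%}")       # +18%
print(f"their estimate: {adjusted_rate * claimed_online / 1e6:.1f}m")      # 6.5m
print(f"without the bumps: {measured_rate * actual_online / 1e6:.1f}m")    # 3.9m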



Comments on “How The UK Gov't Extrapolated 136 Self-Reported File Sharers Into 7 Million”

MC says:

Everyone uses statistics to prove their point.

“Either way, the survey reached 1,176 net-connected households, of which 136 (11.6%) said they did file sharing. The researchers then just decided that 11.6% was too low, and bumped it up to 16.3%. Why? “To reflect the assumption that fewer people admit to file sharing than actually do it.” Fine. But how was the number picked? They won’t say. So… you’ve basically added 40% to the number there for totally unclear reasons.”

I love statistics. I can make the same change sound dramatic (“added 40%”) or minimal (“4.7 percentage points”). Both are 100% accurate. It’s always funny how people usually choose the framing that makes their case sound stronger.

I do agree that the use of any arbitrary number (and this one does sound arbitrary) in a survey/poll pretty much negates the validity of said poll, and I would throw out any data/results derived from it.

GJ (profile) says:

Re: Everyone uses statistics to prove their point.

If there were 40 million people online, then 16.3% is 6.5 million people. If there were 33.9 million people online, and the surveyed number of 11.6% is used, then that is 3.9 million people.

So that shows that, out of the extrapolated 7 million file sharers, 3.1 million were invented by the surveyors without any stated reason.

So the argument is not about how statistics are used or misused, the argument is that these surveyors are liars.

–GJ–

Spyder (profile) says:

Re: Everyone uses statistics to prove their point.

Math fail!

Adding “4.7 percentage points” may be correct, but it is completely meaningless. Adding 4.7 percentage points to 0.00000001% vs. adding it to 95.3% completely changes how much it affects the results. Mike correctly stated how the final result was inflated by showing how it was changed; you failed.

MC says:

Re: Re: Everyone uses statistics to prove their point.

I never said Mike was incorrect, just pointing out how stats are used. The 4.7 percentage point increase is relative to total internet users (not just file sharers), so it’s an accurate statement.

If you had 1 file sharer out of 100, and that number increased to 2, the number of file sharers doubled (a whopping 100% increase). Can you see the difference in connotation? We saw a 100% increase in file sharers, or we saw an additional 1% of all internet users sharing files.
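As a small illustration of that framing point, here is the article’s adjustment expressed both ways (purely illustrative, using the 11.6% and 16.3% figures above):

measured = 11.6    # percent of surveyed households admitting to file sharing
adjusted = 16.3    # percent after the surveyors' bump

print(f"sounds small: +{adjusted - measured:.1f} percentage points")       # +4.7
print(f"sounds big:   +{(adjusted - measured) / measured:.0%} relative")   # +41%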

Enrico Suarve says:

Job Done

As a comment on the original article notes, that means that just by unfudging the figures, file sharing has dropped from 5.6m to 3.9m, a 30% drop.

That’s an impressive reduction, and all that’s required to achieve it is practically zero spend (OK, Mandy might have to fork out a few quid to get to a press briefing or two) and for our glorious, unelected, twice-expelled current First Secretary of State and Secretary of State for Business, Innovation and Skills to tell the truth.

A 30% drop in a few minutes, for free, requiring no legislation and no pissing off voters – that’s got to be tempting?…

Unfortunately it would also mean Mandy telling the truth – see the rub?

Alternatively he could just resign in disgrace, he’s good at that.

Clairebear says:

Error

OK, stat lovers, here's how it works. There was a reason for their perceived madness.

Accuracy is the difference between the measurement and the true value. It can be broken into two parts:
Systematic error (or bias) is a fixed error that is common to all measurements that are made, and doesn’t vary from one measurement to the next.
Random error is error that varies unpredictably from one measurement to the next.
To get an accurate result, the researchers included a correction for the systematic error of people not being honest about it… whatever that correction may be.
A measurement is a sum of the true value and a number of irrelevant factors (errors).
We can write this as

M = T + F1 + F2 + F3 + F4 + …

where M is the measurement, T is the true value, and F1, F2, … are the effects of the various other factors.

Hope this helps.
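A toy sketch of that decomposition (every number below is invented purely for illustration): the systematic term shifts each measurement by the same amount, while the random term varies from one measurement to the next. The article’s complaint, of course, is not that correcting for bias is illegitimate, but that the size of the correction was never explained.

import random

# Toy model of M = T + F1 + F2 + ...: a measurement is the true value plus a
# fixed systematic bias plus a random error that changes between measurements.
random.seed(0)

true_rate = 0.12          # hypothetical true file-sharing rate, invented for this example
systematic_bias = -0.02   # under-reporting: the same fixed shift in every survey

for i in range(5):
    random_error = random.gauss(0, 0.005)   # unpredictable, different each time
    measurement = true_rate + systematic_bias + random_error
    print(f"survey {i + 1}: measured {measurement:.3f} vs true {true_rate:.3f}")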
