Assessing the Evidence: “Piracy and Movie Revenues: Evidence from Megaupload”

by Julia Jenks · November 30, 2012, 14:11 Pacific Time

An abstract released recently by researchers in Europe has gotten some blogger attention for suggesting that box office revenue for some films may be down since Megaupload shut down in January.  Today, Julia Jenks, head of research here at the MPAA, breaks down some of the serious methodological gaps in the abstract and explains why its flimsy findings raise far more questions than they answer.

Assessing the Evidence: “Piracy and Movie Revenues: Evidence from Megaupload”

Researchers at the Munich School of Management and the Copenhagen Business School1 recently posted a two-page summary abstract on the Social Science Research Network entitled “Piracy and Movie Revenues: Evidence from Megaupload” that has caught the attention of some bloggers.  Independent review of the academic literature has shown that the vast majority of it, particularly the literature published in the top peer-reviewed journals, finds evidence that piracy harms media sales (for more on that literature, see “Assessing the Academic Literature Regarding the Impact of Media Piracy on Sales”2).  Nevertheless, some bloggers have focused on a seemingly contrary conclusion from this new abstract regarding box office revenue in the periods before and after the Megaupload website shutdown in January 2012.

The reality is that it is impossible to evaluate the validity of the approach or the reliability of the conclusions based solely on the abstract, which does not fully present the methodology or results of the study. In fact, in its present form, this summary abstract raises more questions than it answers, including:

Are the conclusions being presented and interpreted correctly? From the two-page abstract it is unclear, for example, which results are or are not statistically significant, how the variables in the statistical tables are defined, and whether and how the results differ for films shown on more than 500 screens, which the authors suggest experienced a positive box office effect after the shutdown.  Specifically, the regression tables seem to indicate that the Megaupload shutdown caused an increase in box office revenue for movies shown on more than 500 screens (a large number of films), but the tables are unclear and could also be read as showing an increase in sales for all films after the shutdown.3
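The distinction at stake here — an effect only for wide releases versus an effect for all films — can be made concrete with a simple difference-in-differences calculation. The figures below are entirely invented for illustration; the abstract does not report numbers at this level of detail:

```python
# Hypothetical 2x2 table of average box office revenue (in $M).
# All figures are invented purely to illustrate the calculation.
avg_revenue = {
    ("wide",   "pre"):  40.0,   # >500 screens, before the shutdown
    ("wide",   "post"): 46.0,   # >500 screens, after the shutdown
    ("narrow", "pre"):   2.0,   # <=500 screens, before the shutdown
    ("narrow", "post"):  1.5,   # <=500 screens, after the shutdown
}

def did(table, treated, control):
    """Difference-in-differences: (treated post - pre) minus (control post - pre)."""
    treated_change = table[(treated, "post")] - table[(treated, "pre")]
    control_change = table[(control, "post")] - table[(control, "pre")]
    return treated_change - control_change

# A positive value says wide releases gained *relative to* narrow ones --
# a different claim from "all films gained after the shutdown". Which of
# these two claims the abstract's tables support is exactly what is unclear.
effect = did(avg_revenue, "wide", "narrow")
print(effect)  # 6.5
```

The point of the sketch is that the same table of averages supports two different readings depending on whether one looks at the relative difference or at the raw post-shutdown changes in each group.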

Which system was used for “matching” like movies? The abstract’s conclusions rest on the assumption that it is possible to create a “matched” control group of movies from the period before January 2012 (pre-shutdown) that accurately predicts the box office performance similar movies in the post-January 2012 period (post-shutdown) would have had if the Megaupload shutdown had not happened.  This is an extremely difficult proposition even with the most sophisticated econometric techniques, particularly for specialty films4; if it were easy, there would be no box office surprises.  In this case, it is impossible to assess the validity of the control group without information about the matching technique and methodology, and the actual matching factors.  The only factor visible, genre, is a very weak one.  In fact, it is well known in both the industry and the peer-reviewed academic literature that box office revenue is affected by a myriad of observable and unobservable characteristics (e.g., audience taste).
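To see what “matching” involves, the simplest version pairs each post-shutdown film with the pre-shutdown film closest to it on whatever observable features the researchers chose. The films and features below are invented, and the abstract does not say which covariates were actually used — which is precisely the gap noted above:

```python
import math

# Invented pre-shutdown films: name -> (log screens, log budget, genre code).
# Which features to match on is exactly what the abstract leaves unspecified.
pre_shutdown = {
    "Film A": (7.9, 4.3, 1),
    "Film B": (5.5, 2.1, 2),
    "Film C": (8.1, 4.6, 1),
}

def match(features, pool):
    """Return the pre-shutdown film whose features are nearest (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(pool, key=lambda name: dist(features, pool[name]))

# A hypothetical post-shutdown film is paired with its nearest pre-shutdown
# neighbor; unobservable drivers of revenue (word of mouth, audience taste)
# never enter the distance calculation at all.
control = match((8.0, 4.5, 1), pre_shutdown)
print(control)  # Film C
```

Matching on observables can only ever be as good as the observables: two films identical on screens, budget, and genre can still perform very differently at the box office, which is why the choice of matching factors matters so much.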

How does the research account for box office trends independent of the Megaupload shutdown? The “matching movies” approach seems to assume that the only thing that changed box office performance over the period studied, which covers the last five years, was the Megaupload shutdown. The authors do state that they tested some alternative shutdown dates, but they provide no information on how this was performed or whether it adequately accounted for other changes in box office revenue over the last five years that are unrelated to the shutdown.  Box office trends that are independent of Megaupload being shut down and not accounted for in the estimation would lead to a different set of conclusions.
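This concern can be illustrated numerically. When revenue follows a trend that has nothing to do with the shutdown, comparing average revenue before and after any cutoff date, including placebo dates, produces an apparent “effect.” The series below is invented for illustration:

```python
# Invented weekly revenue that simply grows over time -- by construction
# there is no shutdown effect anywhere in this series.
weeks = list(range(20))
revenue = [100 + 3 * w for w in weeks]  # steady upward trend

def naive_effect(cutoff):
    """Mean revenue after the cutoff week minus mean revenue before it."""
    before = [r for w, r in zip(weeks, revenue) if w < cutoff]
    after = [r for w, r in zip(weeks, revenue) if w >= cutoff]
    return sum(after) / len(after) - sum(before) / len(before)

# Every cutoff -- including placebo dates where nothing happened -- shows
# the same positive "effect", because the trend drives the difference.
for cutoff in (5, 10, 15):
    print(cutoff, naive_effect(cutoff))  # each prints an effect of 30.0
```

A placebo test is only informative if it is designed to distinguish an event effect from an underlying trend like this one, which is why the lack of detail about how the alternative-date testing was performed matters.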

As currently presented, the conclusions in this abstract are neither clear nor compelling.  We hope that when the final paper is released, these and other related questions will be addressed and a detailed methodological description provided, so that it will be possible to interpret the conclusions presented and evaluate their reliability.


1Christian Peukert, a Ph.D. student at Ludwig-Maximilians-University Munich, Institute for Strategy, Technology and Organization, and Jörg Claussen, a post-doctoral researcher at Copenhagen Business School - Department of Innovation and Organizational Economics.

2The abstract erroneously cites the academic paper “Assessing the Academic Literature Regarding the Impact of Media Piracy on Sales” as stating that “privacy [sic] negatively impacts sales.”

3Films shown on more than 500 screens are a large and important universe.  According to Box Office Mojo, the data source used, all 100 of the top 100 films, and more than 150 films in total, released in the U.S. in 2011 were shown on more than 500 screens.* Given that the total sample in the study (1,344 movies over roughly five years) works out to about 270 films per year, films shown on more than 500 screens may actually constitute a majority of the sample in certain years.  *Box Office Mojo actually reports “theaters,” not “screens,” but since the Munich paper uses Box Office Mojo and presents the information as “screens,” we use the same nomenclature.

4e.g. Films showing on 500 or fewer screens.

Categories: Content Protection, Copyright
