Manila Principles on intermediary liability
When user content is threatened with removal from the Internet, it's unlikely that anyone is going to put up more of a fight than the user who uploaded it. That's what makes it so critically important that the user is informed whenever an Internet intermediary is asked to remove their content from its platform, or decides to remove it on its own initiative.
Unfortunately, this doesn't consistently happen. In the case of content taken down for copyright infringement under the DMCA or its foreign equivalents, the law typically requires the user to be informed. But for content that allegedly infringes other laws (such as defamation, privacy, hate speech, or obscenity laws), or content that isn't alleged to be illegal but merely violates the intermediary's terms of service, there is often no requirement that the user be informed, and some intermediaries don't make a practice of doing so.
Another problem is that even when intermediaries do pass on notices about allegedly illegal content to the user who uploaded it, the notice might be inaccurate or incomplete. This led to a situation in Canada where ISPs were passing on misleading notices from US-based rightsholders, falsely threatening Canadian users with penalties that don't even apply under Canadian law.
As a result of the failure to accurately inform users about why their content is being targeted for removal, users remain confused about their rights, and may fail to defend themselves against removal requests that are mistaken or abusive. The ultimate result of this is that much legitimate content silently disappears from the Internet.
To help with this, EFF and our Manila Principles partners have this week released a tool to help intermediaries generate more accurate notices to their users, when those users' content is threatened with removal. An alpha release of the tool was previewed at this year's RightsCon (on the first anniversary of the launch of the Manila Principles), and yesterday at the Asia-Pacific Regional Internet Governance Forum it was finally launched in beta.
The tool is simply a Web form that an intermediary can complete, giving basic details of what content was (or might be) removed and why, and what the user can do about it. When the form is submitted, the tool crunches the form data and produces a draft notice that the intermediary can copy, review, and send to the user. (Note that the form itself doesn't send anything automatically, and the form data is not stored for longer than required to generate the draft notice.)
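For readers curious about the mechanics, here is a minimal sketch of how a stateless notice generator of this kind might work. It is not the actual tool's code; the field names and the wording of the generated notice are illustrative assumptions, and a real notice would need to reflect the applicable law and the intermediary's own policies.

```typescript
// Illustrative sketch only: field names and notice wording are assumptions,
// not the Manila Principles tool's actual implementation.
interface NoticeForm {
  intermediaryName: string;   // who hosts the content
  contentDescription: string; // what was (or may be) removed
  reason: string;             // e.g. an allegation of defamation, or a terms of service violation
  actionTaken: string;        // e.g. "removed" or "scheduled for removal"
  appealInstructions: string; // what the user can do in response
}

// A pure function: it turns the submitted form data into draft notice text.
// Nothing is stored; the intermediary copies the text, reviews it, and sends it.
function draftNotice(form: NoticeForm): string {
  return [
    `Notice from ${form.intermediaryName}`,
    ``,
    `Content affected: ${form.contentDescription}`,
    `Action taken: ${form.actionTaken}`,
    `Reason: ${form.reason}`,
    ``,
    `What you can do: ${form.appealInstructions}`,
  ].join("\n");
}

// Example usage with made-up details
console.log(draftNotice({
  intermediaryName: "Example Hosting Co.",
  contentDescription: "a blog post uploaded on 1 June",
  reason: "an allegation of defamation received from a third party",
  actionTaken: "temporarily removed pending review",
  appealInstructions: "reply to this notice within 14 days if you wish to contest the removal",
}));
```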
We don't expect most large intermediaries to need this form, since they have staff to write their own notices to users. Further information to help users restore content taken down for terms of service violations by several of these large platforms, including Facebook, Twitter, and YouTube, is available on onlinecensorship.org.
But bearing in mind that small businesses and hobbyists can also be intermediaries who host other users' content, this form may provide a useful shortcut for them to generate a draft notice that covers most of the important information that a user needs to know. The form remains in beta, and we welcome your suggestions for improvement!