Major Internet platforms such as Facebook, Twitter and YouTube are taking proactive measures to keep offensive content off their services. According to the Motion Picture Association, online services can use similar systems to proactively remove pirated content too. That would be even easier since it doesn't raise the same speech concerns, the group's senior vice president notes.

The entertainment industries are becoming increasingly frustrated by major Internet platforms that are, in their view, not doing enough to tackle online piracy.
While legitimate user-generated content platforms respond to takedown requests, as they are legally required to do, most don't go any further, despite repeated calls from industry groups for help.

Over the past several years, the Motion Picture Association (MPA) has made some progress, partnering with several intermediaries, including payment providers and advertising companies. However, it has struggled to persuade major user-generated platforms and social media sites to be more proactive.

This frustration is fueled by more recent developments, which have seen these same platforms take voluntary action against hate speech, fake news, violence, and other offensive content that populates social media timelines.

Twitter, for example, took action against more than half a million accounts over “hateful content” during the first half of the year, helped by ‘artificial intelligence’. YouTube and Facebook also report that they are doing more to proactively detect hate speech, while other online services are taking voluntary action as well.

The MPA has followed this trend. The group recently brought the topic up during a hearing of the House Energy and Commerce Committee on “Fostering a Healthier Internet to Protect Consumers.” The hearing dealt with an ongoing examination of Section 230 of the Communications Act.

Section 230 shields online services from liability. However, Congress also intended it to encourage these platforms to take reasonable steps to deter undesirable behavior. While Section 230 doesn’t apply to copyright, the MPA’s SVP and Senior Counsel, Neil Fried, chimed in with written testimony for the record.

Fried notes that the liability protections are similar to those of the DMCA, which centers on copyright. The complaint that Internet services are not doing enough to prevent harmful content from spreading also mirrors the MPA’s complaint that they do too little to prevent copyright infringement.

The MPA’s Senior Vice President highlights these hate-speech enforcement efforts and acknowledges that there are complex issues to address – especially where the content in question is not illegal by definition and free speech concerns come into play.

“A few companies have recently developed systems to proactively identify posts promoting hate and violence, and have invoked their terms of service to terminate accounts of those engaged in such activity, although not before wrestling with concerns over the impact on expression,” Fried writes.

However, that’s not much of a problem when it comes to copyright, the MPA believes.

“If online intermediaries and user-generated content platforms can proactively identify such content and terminate service in these cases, surely they can terminate service and take other effective action in cases of clearly illegal conduct, which present brighter lines and don’t raise the same speech concerns,” Fried adds.

Fried suggests that online services should use the same tools they employ to detect hate speech and other harmful content to proactively remove pirated content too. Copyright infringement is prohibited under these companies’ terms of service, so they would have room to do so.

While Fried is right that copyright infringement is more clearly defined than harmful content, dealing with it proactively is not without challenges. Unlike with harmful content, some people may have the right to post certain copyrighted works while others do not. And fair use is hard for an algorithm to capture as well.

The MPA nonetheless hopes that online platforms will cooperate. In addition, it wants to see if current liability exemptions can be overhauled, using legislation to motivate Internet companies to do more.

This was also made clear to the House Energy and Commerce Committee. And while possible legal fixes are being considered, the US should not include such liability provisions in new trade agreements, the MPA’s SVP notes.

“In the meantime, as Congress reexamines online liability limitations, the United States should refrain from including such limitations in future trade agreements, which runs the risk of freezing the current framework in place,” Fried writes.

This follows an earlier recommendation from the House Judiciary Committee. Last month the Committee urged lawmakers not to include DMCA-style safe harbors in trade agreements while alternatives are being discussed.