‘Wolf of Wall St.’ and Section 230 of U.S. Code

The real-life facts behind the case that eventually led to the Wolf of Wall Street movie starring Leonardo DiCaprio also led to a new section of the U.S. Code that has had global implications in shielding social media companies from liability for what they publish. (Associated Press)

Before becoming the subject of Martin Scorsese’s film The Wolf of Wall Street, Stratton Oakmont was a Long Island brokerage house. Its Wikipedia article states: “It defrauded many shareholders leading to the arrest and incarceration of several executives, and the closing of the firm in 1996.” But before it closed, it featured in some epochal anonymous posts on an online bulletin board hosted by Prodigy Communications. (Online bulletin boards were social media before that term was invented, functioning like a primitive Reddit.)

The anonymous posts accurately described Stratton Oakmont’s operations and presciently predicted jail time for its founder. Stratton Oakmont sued for libel, demanding $200 million not from the unknown person who authored the posts but from Prodigy for permitting them to be posted. The judge who heard the case ruled that because Prodigy employed moderators with editorial authority, it was legally the publisher of the posts, making it potentially liable for any and all defamatory material posted on its websites.

Ron Wyden, now a U.S. Senator from Oregon but then a member of the House of Representatives, learned of the case and introduced a bill to protect Prodigy and other internet start-ups. His bill, enacted as title 47, section 230 of the United States Code, states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker” of any other person’s posts.

Section 230, as this famous law is known, protected internet companies from the likes of Stratton Oakmont. But it did much more than that, too. By freeing internet companies from the risk of legal liability for material posted on their sites, it made human moderators unnecessary — which is to say, wasteful, from the point of view of corporate profit-maximization. As the novelist Neal Stephenson told PC Magazine, the absence of moderation made social media “scalable.” Companies like Facebook and YouTube could massively increase in size without adding payroll. They could charge into new markets without hiring staff who spoke the language. Editorial decisions are now made by computer algorithms created for the sole purpose of keeping users glued to the site. In plain terms, Stephenson says, that “almost always means [the content] is more emotional, it’s less factually based, it’s less rational.”

Section 230’s effects don’t end there. Traditional news providers bear the costs not just of reporting stories but of pre-publication vetting by lawyers, not to mention the ever-present possibility of defending against ruinous lawsuits. Internet companies are spared those costs. Instead, they can distribute work created by others, skimming off advertising dollars at no legal risk to themselves. Section 230 thus functions as a hidden government subsidy to some of the richest companies in the world.

The New York Times recently documented the way child porn remains permanently available online, and not just on the dark web. Well-known companies “have the technical tools to stop the recirculation of abuse imagery by matching newly detected images against databases of the material,” the Times reports, but choose not to “take full advantage of the tools.” A spokesperson for Dropbox explained that scanning for child porn was “not a ‘top priority’” of the company. If its executives faced long prison terms as the publishers of child porn, the company’s priorities just might change.

Section 230 served a valuable purpose in the early days of the internet. But those days are gone, just like Prodigy. In a rational world, the law would long since have been updated to meet changing circumstances, for example by treating platforms as legal “publishers” once they choose to disregard a formal request to delete material. Unfortunately, rational change may no longer be politically possible. The companies whose fortunes are based on Section 230’s exemption from legal consequences — companies that are, in a literal sense, above the law — spread vast sums around Washington, D.C., to protect the goose that lays their golden eggs.

Social media companies have other weapons with which to punish would-be regulators, as well. According to a recent investigation by the political newsletter Popular Information, Facebook is curiously selective in its enforcement of policies against “coordinated inauthentic behavior,” including the use of dummy accounts to promote a post, increasing its prominence and expanding its reach. One particular organization that loudly defends Facebook against the threat of regulation seems to get away with actions that cause other accounts to be deleted.

Regardless of the merits of the specific case analyzed by Popular Information, its article reveals the ease with which any social media platform can enforce strict policies against political enemies while making exceptions for friends. Any politician seeking to modify Section 230 is liable to find out what “coordinated inauthentic behavior” really means.

Joel Jacobsen is an author who recently retired from a 29-year legal career. If there are topics you would like to see covered in future columns, please write him at legal.column.tips@gmail.com.
