Before becoming the subject of Martin Scorsese’s film The Wolf of Wall Street, Stratton Oakmont was a Long Island brokerage house. Its Wikipedia article states: “It defrauded many shareholders leading to the arrest and incarceration of several executives, and the closing of the firm in 1996.” But before it closed, it was the subject of some epochal anonymous posts on an online bulletin board hosted by Prodigy Communications. (Online bulletin boards were social media before that term was invented, functioning like a primitive Reddit.)
The anonymous posts accurately described Stratton Oakmont’s operations and presciently predicted jail time for its founder. Stratton Oakmont sued for libel, demanding $200 million not from the unknown person who authored the posts but from Prodigy for permitting them to be posted. The judge who heard the case ruled that because Prodigy employed moderators with editorial authority, it was legally the publisher of the posts, making it potentially liable for any and all defamatory material posted on its service.
Ron Wyden, now a U.S. Senator from Oregon but then a member of the House of Representatives, learned of the case and introduced a bill to protect Prodigy and other internet start-ups. His bill, enacted as title 47, section 230 of the United States Code, states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker” of any other person’s posts.
Section 230, as this famous law is known, protected internet companies from the likes of Stratton Oakmont. But it did much more than that, too. By freeing internet companies from the risk of legal liability for material posted on their sites, it made human moderators unnecessary — which is to say, wasteful, from the point of view of corporate profit-maximization. As the novelist Neal Stephenson told PC Magazine, the absence of moderation made social media “scalable.” Companies like Facebook and YouTube could massively increase in size without adding payroll. They could charge into new markets without hiring staff who spoke the language. Editorial decisions are instead made by computer algorithms created for the sole purpose of keeping users glued to the site. In plain terms, Stephenson says, that “almost always means [the content] is more emotional, it’s less factually based, it’s less rational.”
Section 230’s effects don’t end there. Traditional news providers bear the costs not just of reporting stories but of pre-publication vetting by lawyers, not to mention the ever-present possibility of defending against ruinous lawsuits. Internet companies are spared those costs. Instead, they can distribute work created by others, skimming off advertising dollars at no legal risk to themselves. Section 230 thus functions as a hidden government subsidy to some of the richest companies in the world.
The New York Times recently documented the way child porn remains permanently available online, and not just on the dark web. Well-known companies “have the technical tools to stop the recirculation of abuse imagery by matching newly detected images against databases of the material,” the Times reports, but choose not to “take full advantage of the tools.” A spokesperson for Dropbox explained that scanning for child porn was “not a ‘top priority’” of the company. If its executives faced long prison terms as the publishers of child porn, the company’s priorities just might change.