
Coming Soon To A Podcast, An App Store And A Metaverse Near You… Content Moderation Rules – Forbes

Back in 1993, freelance journalist Julian Dibbell published his now-famous article in the Village Voice entitled “A Rape in Cyberspace.” It drew attention to the need for community standards to prevent users of virtual world services from forcing online avatars into unwanted, grotesque and violent sexual acts. Those community standards evolved over decades into the complex, detailed content moderation standards now in place at social media companies like Facebook, Twitter, and YouTube.
But social media platforms are well on their way to becoming legacy services as companies jostle for position in the emerging marketplace for metaverse services. The handwriting on the wall appeared clearly this week, when Meta, the company formerly known as Facebook, reported that its traditional social media service lost roughly 1 million daily active users in the most recent quarter — its first-ever drop. Meta is leaving social media for the metaverse.
A key question is whether regulators will follow the company into the metaverse through legislation establishing content moderation rules. The answer is probably yes, and that is a good thing.
This question arose in less futuristic terms this week when the content moderation wars hit podcast distributors. Musicians including veterans such as Neil Young and Joni Mitchell pulled their music from Spotify in protest over Joe Rogan’s podcast episodes featuring substantial amounts of Covid-19 misinformation. Forced to choose between its $100 million investment in exclusive access to Rogan’s shows and the music catalogues of a few older rock stars, Spotify chose to go with its future as the number one distributor of podcasts.
But in doing so, Spotify also began to adopt procedures and processes that have become standard fare for social media companies: it disclosed its standards for Covid-19 misinformation and announced it would add warning labels to podcasts that contain misinformation. It is only a matter of time before regulatory content moderation controversies reach Spotify and the other podcast distributors.
Indeed, they are already here. In today’s Senate Judiciary Committee markup of the Open App Markets bill, Senator Ted Cruz proposed an amendment that would prevent app stores, which are prime distributors of podcasts, from discriminating against apps based on political or religious viewpoint. The amendment, which Senator Cruz said was designed to respond to the app stores’ decisions last January to remove the Parler app heavily used by conservatives, was narrowly defeated on a 10-12 roll call vote.
A political non-discrimination rule for social media, app stores and podcast distributors is enormously controversial and deserves far more thought than a hasty amendment during a markup on a bill focused on competition issues.
But transparency and due process requirements make sense for both legacy social media companies and the newer distributors of apps, podcasts and metaverse products. Australia explicitly included app distribution services in its new online safety law that imposes additional procedural safeguards.
U.S. legislators could include app stores in content moderation legislation through the simple expedient of mimicking the extraordinarily broad definition of “interactive computer service” in Section 230 of the Communications Decency Act.
As legal scholar Eric Goldman has pointed out, court cases going back to 2013 have held that app stores are “interactive computer services” under Section 230. By the same reasoning, distributors of podcasts and metaverse products would be covered by Section 230 as well.
As a result, a new content moderation bill imposing transparency and due process requirements would automatically cover services such as distributors of apps, podcasts and metaverse products if it relied on the definition of “interactive computer service” in Section 230. The bipartisan Platform Accountability and Consumer Transparency Act (PACT Act) introduced by Senators Brian Schatz and John Thune does exactly that.
Critics of regulation often invoke the “pacing problem” in arguing against a strong government role in shaping the development of digital industries. Technology, they say, moves quickly and government moves slowly, so regulators will never be able to keep pace with the industries they seek to regulate.
But legislators seeking to throw a regulatory net around digital industries can answer this pacing problem by granting digital regulators broad powers to update their rules in light of developments in technology and business models. Otherwise, they will create a regulatory net for the past, and the industries of the future will slip through it.

