Sovereign Hardware and Free Speech
The Apple-Google mobile duopoly is beginning to moderate our speech within apps. It's time to start exploring sovereign hardware.
In the last few days we’ve seen President Trump and his affiliated corporate entities banned from Facebook, Twitter, and numerous other platforms. While banning the President of the USA from social media is unprecedented, these types of account suspensions and bans have been commonplace for years. I understand both sides of the argument, and I am not surprised that Trump was ultimately banned.
More interesting and troubling, in my opinion, is Apple’s and Google’s move to ban Parler from their respective app stores.
Digging into Parler
Let’s dig into Apple’s official letter to Parler:
Our investigation has found that Parler is not effectively moderating and removing content that encourages illegal activity and poses a serious risk to the health and safety of users in direct violation of your own terms of service, found here: https://legal.parler.com/documents/Elaboration-on-Guidelines.pdf
This makes sense: Parler should absolutely moderate illegal activity that poses a serious risk. Let’s read on.
Content of this dangerous and harmful nature is not appropriate for the App Store. As you know from prior conversations with App Review, Apple requires apps with user generated content to effectively moderate to ensure objectionable, potentially harmful content is filtered out. Content that threatens the well being of others or is intended to incite violence or other lawless acts has never been acceptable on the App Store.
Whoa! How did we jump from “illegal” and “serious risk” to “objectionable” and “potentially harmful”? How do we define these terms? Apple cited two specific guidelines in its letter to Parler:
Guideline 1.1 - Safety - Objectionable Content
We found that your app includes content that some users may find upsetting, offensive, or otherwise objectionable. Specifically, we found direct threats of violence and calls to incite lawless action.
Guideline 1.2 - Safety - User Generated Content
Your app enables the display of user-generated content but does not have sufficient precautions in place to effectively manage objectionable content present in your app.
Naturally, I next decided to read through the official App Store Review Guidelines. Relevant quotes below:
1.1 Objectionable Content
Apps should not include content that is offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste, or just plain creepy. Examples of such content include:
1.1.1 Defamatory, discriminatory, or mean-spirited content, including references or commentary about religion, race, sexual orientation, gender, national/ethnic origin, or other targeted groups, particularly if the app is likely to humiliate, intimidate, or harm a targeted individual or group. Professional political satirists and humorists are generally exempt from this requirement.
1.1.2 Realistic portrayals of people or animals being killed, maimed, tortured, or abused, or content that encourages violence. “Enemies” within the context of a game cannot solely target a specific race, culture, real government, corporation, or any other real entity.
1.1.3 Depictions that encourage illegal or reckless use of weapons and dangerous objects, or facilitate the purchase of firearms or ammunition.
1.1.4 Overtly sexual or pornographic material, defined by Webster’s Dictionary as "explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings."
1.1.5 Inflammatory religious commentary or inaccurate or misleading quotations of religious texts.
1.1.6 False information and features, including inaccurate device data or trick/joke functionality, such as fake location trackers. Stating that the app is “for entertainment purposes” won’t overcome this guideline. Apps that enable anonymous or prank phone calls or SMS/MMS messaging will be rejected.
1.2 User Generated Content
Apps with user-generated content present particular challenges, ranging from intellectual property infringement to anonymous bullying. To prevent abuse, apps with user-generated content or social networking services must include:
A method for filtering objectionable material from being posted to the app
A mechanism to report offensive content and timely responses to concerns
The ability to block abusive users from the service
Published contact information so users can easily reach you
Apps with user-generated content or services that end up being used primarily for pornographic content, Chatroulette-style experiences, objectification of real people (e.g. “hot-or-not” voting), making physical threats, or bullying do not belong on the App Store and may be removed without notice. If your app includes user-generated content from a web-based service, it may display incidental mature “NSFW” content, provided that the content is hidden by default and only displayed when the user turns it on via your website.
With regard to 1.1, Parler may be violating some of the guidelines – some content on Parler may be considered “offensive, insensitive, upsetting” (1.1) and “Defamatory, discriminatory, or mean-spirited” (1.1.1). But I could say the same about Twitter or Facebook, both of which are filled to the brim with content that could be considered offensive and upsetting.
With regard to 1.2, Apple is specifically asking Parler to filter “objectionable material.” Parler differs from other social media companies in that it gives users tools to filter their own feeds. Parler does, however, state in its Community Guidelines:
Parler will not knowingly allow itself to be used as a tool for crime, civil torts, or other unlawful acts. We will remove reported member content that a reasonable and objective observer would believe constitutes or evidences such activity. We may also remove the accounts of members who use our platform in this way.
Based on Apple’s guidelines and its letter to Parler, it seems that Apple recognizes that Parler employs a moderation process – but that Parler does not have “sufficient precautions in place.” Apple is essentially stating that if Parler does not employ an Apple-ordained moderation process to remove “objectionable content,” then the app must be removed from the App Store.
Why I Am Terrified
In my opinion, Apple’s and Google’s actions are Orwellian and terrifying. Why? Because my personal definition of “objectionable content” is different from yours, and because that definition varies widely by country, culture, and epoch.
For example, would Apple have booted an app that allowed Copernicus and Martin Luther to express their deeply heretical views?
Sure, it’s an extreme example. And there are indubitably numerous individuals on Parler who are clearly inciting violence. But my concern lies more with Apple’s and Google’s subjective guidelines, and with the raw power they possess to set the rules, change the rules, and enforce the rules inconsistently – without oversight of any kind. I am concerned with the precedent.
The Public Square
The Internet is now our public square. And I am worried that Apple and Google are beginning to do more than simply decide what apps are allowed to be installed on our iOS and Android devices – they are beginning to decide what we are allowed to say within these apps.
Parler is not the first instance of this. As I wrote last December:
Most troubling and surprising to me, however, is Apple’s recent involvement in targeted content censorship within apps. Two months ago, Apple requested that the popular Telegram app delete channels that citizens of Belarus were using to organize and track members of law enforcement.
Apple confirmed this to Forbes and others:
"Apple confirmed its requests that Telegram delete individual posts, pointing to its App Store rules for apps hosting user-generated content. They require those programs to provide systems for 'filtering objectionable material,' 'to report offensive content,' and 'to block abusive users from the service.'"
This is a massively slippery slope, and it especially worries me because Apple operates in so many countries. If oppressive governments are able to work with Apple to censor anti-government speech, Apple could end up playing a key role in suppressing democracy across the world.
I believe Apple should simply refuse to cooperate with oppressive governments – but this is an unlikely scenario, as Apple has extremely close ties to, and dependence on, China, a current perpetrator of genocide against the Uyghurs.
We should be scared and outraged. Essentially, the companies that make our hardware and operating systems – our portals to the Internet – are now deciding what we users are allowed to say. These companies are regulating our speech without any oversight.
Are We Stuck?
It may seem hopeless. iOS and Android have an effective duopoly on mobile platforms, and the vast majority of users are trapped within their ever-growing walls. But there is hope: a growing movement towards self-sovereign money via Bitcoin, and towards a decentralized Internet.
If you want to escape the Apple-Google duopoly, here’s what I recommend:
- Buy Bitcoin. Purchase hardware wallets and learn to take sovereignty over your own money. Our company, Foundation Devices, is soon launching what we think will be an excellent option.
- Practice running your own infrastructure. Set up a Bitcoin node on a Raspberry Pi (I use Umbrel, a beginner-friendly option) and buy a NAS from Synology or another vendor. For a taste of what talking to your own node looks like, see the sketch after this list.
- Familiarize yourself with Linux. I’m currently experimenting with Pop!_OS. For novice users accustomed to a Mac, elementary OS is a good option.
- Experiment with alternative mobile operating systems. Install GrapheneOS or CalyxOS (I’m currently running the beginner-friendly CalyxOS) on a Pixel phone. If you’re a Linux power user, consider purchasing a PinePhone or preordering a Purism Librem 5.
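To make the “run your own infrastructure” suggestion concrete, here’s a minimal sketch of what it looks like to ask your own Bitcoin node a question instead of trusting someone else’s server. It uses Bitcoin Core’s standard JSON-RPC interface; the URL, port, and credentials below are placeholders for whatever your own node (Umbrel or otherwise) is configured with, not values tied to any particular product.

```python
# Minimal sketch: query a local Bitcoin Core node for its sync status over
# JSON-RPC. Host, port, and credentials are placeholders -- replace them
# with the values your own node is configured to use.
import requests

RPC_URL = "http://127.0.0.1:8332"   # default Bitcoin Core mainnet RPC port (assumes a local node)
RPC_USER = "your-rpc-user"          # placeholder credential
RPC_PASSWORD = "your-rpc-password"  # placeholder credential


def check_node():
    payload = {
        "jsonrpc": "1.0",
        "id": "sovereignty-check",
        "method": "getblockchaininfo",  # standard Bitcoin Core RPC method
        "params": [],
    }
    response = requests.post(
        RPC_URL, json=payload, auth=(RPC_USER, RPC_PASSWORD), timeout=10
    )
    response.raise_for_status()
    info = response.json()["result"]

    # Print a few fields from the node's own view of the chain.
    print(f"chain: {info['chain']}")
    print(f"blocks: {info['blocks']} / headers: {info['headers']}")
    print(f"verification progress: {info['verificationprogress'] * 100:.2f}%")


if __name__ == "__main__":
    check_node()
```

If the block height it prints matches what public block explorers show, you’re verifying the chain on your own hardware rather than taking a third party’s word for it. That’s the whole point.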
You don’t need to make the switch today. Embarrassingly, I still use Apple products as my daily drivers. But it’s so important to experiment and familiarize yourself with sovereign, censorship-resistant, open hardware. There may come a time, sooner than you think, when you’ll need to make the switch to preserve your freedoms.