How did you find this article? Where did you click through from, or how were you notified about the latest copy of The Nav that you currently hold in your hands?
Open your phone. Go to settings. Scroll down, and then down a little further. Let’s talk about screen time. How much time have you spent on your phone today? This week? How much time online? Click “See all activity.” What are your most-used apps? Mine are Instagram—totalling three hours and 45 minutes so far this week—and Facebook (one hour and 36 minutes). And it’s only Tuesday.
The point being, social media takes up a lot of our time, and a lot of our mental space.
This becomes relevant not only to our personal lives but also to our larger social and political realms when world events, such as elections, approach.
Several months ago, Facebook announced in a blog post titled “Helping to Protect the 2020 US Elections” that it would be taking responsibility for its impact and influence as a major world platform.
The first step is “fighting foreign interference.” This includes defending against fraudulent accounts and protecting the accounts of electoral candidates and their teams. It will be made possible through “Facebook Protect,” a program candidates can enroll in. Facebook will support enrolled candidates both by helping them develop stronger safety protections and by monitoring their accounts for hacking.
The second part of Facebook’s plan is increased transparency. This includes changes such as publicly displaying the owner of a Facebook page, so there is no confusion about who runs it. Facebook will also label all state-controlled media as such, to make it clearer who is funding advertisements online. The final stage of this step is making political advertisements on Facebook easier to understand, including allowing users to see how much money presidential candidates are spending on their Facebook ads. Additionally, users will be able to see how these ads were geographically distributed and targeted.
The final part of Facebook’s plan to protect the upcoming elections is summarized as “reducing misinformation.” This includes adding a fact-checking feature, which will notify users if they attempt to share a post on Facebook or Instagram containing content that third-party fact-checkers have proven false. Facebook also plans to reduce voter suppression by prohibiting information shared to mislead voters—including, but not limited to, false information about who is eligible to vote, where to vote, or how to vote. Facebook is also looking to remove any content that threatens violence as a means of discouraging people from voting.
In addition to these steps, Facebook reminds users of its hate speech policies, and that posts suggesting voting is useless or pointless counter the commitments Facebook made in response to the civil rights audit released in the summer.
These changes come shortly after presidential candidate Sen. Elizabeth Warren ran an advertisement suggesting that Facebook founder Mark Zuckerberg had endorsed Donald Trump in the 2020 election. She quickly clarified that the statement was untrue, but argued that by allowing paid misinformation on his platforms, Zuckerberg had “given Donald Trump free rein to lie on his platform—and then to pay Facebook gobs of money to push out their lies to American voters,” turning Facebook into a “disinformation-for-profit machine.”
Days before releasing these new protective steps, Zuckerberg noted in a speech at Georgetown University that he had considered banning political ads on his platforms altogether, but instead decided to “err on the side of greater expression.”