3 Simple Fixes for Social Media Disinformation — Dislike, Throttle, and Mix

realdanielbyrne
9 min read · Nov 22, 2020

Trump is Gone, but the Social Media Vulnerabilities he Exposed Cannot Continue Unchecked

Rudy Giuliani evangelizes to the cult of Trump at a news conference in the parking lot of a landscaping company on 11/7/20. — Credit Getty Images, Brian R. Smith

I saw a post on Reddit the day after the election was called for Biden that read, “Everyone is celebrating but, like, I don’t know man. It kinda feels like a horror movie where the monster just died but there’s still twenty minutes left in the runtime.”


“It is a consolation or a misfortune that the wrong kind of people are too often correct in their prognostications of the future” — Robbie Ross

So it proved. Yet even though we could all see it coming, many still underestimated the depths to which the Trump machine would go to achieve its goals. Everyone underestimated, yet again, Trump's stubbornness and never-quit attitude.

This trait in and of itself is admirable, but his extremely poor execution on every major and minor challenge of his presidency disqualifies him from holding the position ever again. Still, it is astonishing that a serial liar, a cheat, and by every metric an incompetent chief executive could muster such nationwide support, even if he did lose in the end. So how did he do it?

Ben Decker, the founder of Memetica, a consultancy that studies how information travels, said, “Stop the Steal is a highly coordinated partisan political operation intent on bringing together conspiracy theorists, militias, hate groups and Trump supporters to attack the integrity of our election.” It is not a grassroots movement.

In short, Trump’s campaign executed a master clinic on spreading disinformation so far and so wide that the truth became indistinguishable from fiction. He cleaned house and replaced top officials at every federal department with incompetent sycophants who would blindly defend every failed policy. The once-sane Republican party merrily went along, parroting controversial ideas or at least declining to shoot them down. Fox News’ news department aired all the news conferences filled with false claims, and Fox News’ opinion anchors zealously championed the demonization of anything Trump was against. However, all of that had been done before, and not just by Trump in the previous election. Previous presidents, senators, and elected officials of every ilk, conservative and liberal alike, have used mass media to their advantage.

The differentiator for Trump, however, was his exploitation of social media’s unchecked mechanisms for sowing the seeds of doubt and capturing its prey in a cocoon of disinformation. The hapless prey, the otherwise doting grandparents, the pleasant, privileged, but dimly lit suburbanites, and any who abdicate critical thinking to what’s best for them in the moment, are helpless and cannot wrench themselves from the dopamine-fueled web of lies that social media can spin.

The Social Dilemma. This Netflix documentary explores the dangerous human impact of social networking.

The recent Netflix documentary The Social Dilemma exposed the dangerous consequences of social media sites and captured tech experts’ alarm at their own creations. The fact is that the tools of social media that data scientists and software engineers like myself put into play on platforms such as Facebook, Twitter, and Instagram, ostensibly to improve advertising revenue, could ultimately be complicit in the untimely death of American democracy.

To fix the problem, we must first understand it. Far from difficult or unsolvable, I believe the answer is simple and staring us right in the face. We need look no further than the social rules of civil discussion, honest debate, and truth-telling that have codified themselves in real-life (IRL) conversations.

The spoken word and the written word are governed by codes of conduct. One does not spew racist vitriol at work or at Thanksgiving dinner, because social norms keep those impulses in check. Discussions at the water cooler or in a church group rarely end up in a shouting match, and debates in prominent political institutions and in respected courts of law are governed by decorum and rules devised around the concept of fairness.

These norms took centuries to develop. Online discourse, however, is new, and those rules of decorum have not yet been established or codified in this medium. Online, individuals can be perversely rewarded with likes, badges, reputation points, karma, followers, and so on, sometimes merely for the amplitude of their outrage. They can then legitimize fringe belief systems like white supremacy and birtherism with repeated false claims that slowly garner more and more followers and likes until they become an undeniable force.

Studies of human psychology suggest that there are two basic types of people.

  1. Those who subscribe to a brand, a product, or a belief because others are doing it. The larger the following, the stronger the pull on those in group 1.
  2. Those who expressly prefer the opposite from the majority. These are the contrarians, the renegades, and the scofflaws. These are the people who used Apple back in its early days to thumb their noses at the establishment, and who now use Linux to do the same today.

The larger of the two groups is group 1, and generally there is nothing wrong with that. Typically, if a product, idea, or political movement has received a large amount of public support, it is because it is good. It is usually the safe bet to go with the vehicle that got the five-star crash rating, good fuel economy, and best-in-class sales.

In Influence: The Psychology of Persuasion, Dr. Robert Cialdini argues that humans operate on the principle of social proof: we look to others to decide what to believe, especially when we are uncertain. That tendency can be subverted, though, if a person with nefarious intentions sets out to gain followers. The larger the number of followers, the stronger the legitimacy that individuals in group 1 ascribe to the indoctrinator.

Traditional broadcast television and major print publications have mostly kept fake news in check by maintaining institutional and editorial norms, a motivation to preserve integrity, and an interest in appealing to a broad demographic to grow readership. The norms governing social media today, however, allow coercive tactics to proceed mostly unchecked. To that end, I propose a simple three-step approach to fix the problem of social media amplifying disinformation.

Provide a Dislike Button and Show Counts of Both Likes and Dislikes on the Post

Today, if a person sees a post with 1000 likes, they might think it was a legitimate and worthwhile post, basing that belief on the level of social proof the post has garnered.

However, if that same person sees the same post with 1000 likes and 1 million dislikes, they might be more likely to interpret it as a fringe idea.

By not allowing users to vote down an untrue, misleading, or inflammatory post, Facebook, Twitter, and Instagram are effectively promoting the fringe theory instead.

Imagine you are a user who comes across a post you know from firsthand knowledge is false and misleading, yet it has already garnered 1000 likes. Your only recourse as a voice of reason is to defend the valid position in the comments, and to do so belligerently, because cutting through the noise of all those likes means amping up the dissent to counteract the post's social support. In this way, social media is constructed to favor the negative comment simply by not providing a dislike button.

Social media is constructed to favor the negative comment simply by not providing a dislike button.
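To make the first fix concrete, here is a minimal Python sketch of how visible dislike counts change the read on a post. The `Post` class, the approval thresholds, and the labels are all my own illustrative assumptions, not any platform's actual logic.

```python
from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    dislikes: int

def social_proof_label(post: Post) -> str:
    """Rough read on a post's standing once both counts are visible.

    Thresholds are illustrative assumptions, not platform policy.
    """
    total = post.likes + post.dislikes
    if total == 0:
        return "no signal"
    approval = post.likes / total
    if approval >= 0.75:
        return "broadly endorsed"
    if approval <= 0.25:
        return "fringe"
    return "contested"

# The article's example: 1000 likes alone looks legitimate,
# but 1000 likes against 1 million dislikes reads as fringe.
print(social_proof_label(Post(likes=1000, dislikes=0)))          # broadly endorsed
print(social_proof_label(Post(likes=1000, dislikes=1_000_000)))  # fringe
```

The point of the sketch is that the same 1000 likes carry entirely different social proof once the dislike count is shown alongside them.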

Throttle Posts with High Numbers of Dislikes from Appearing in News Feeds

The current crop of social media news-feed algorithms flattens the hierarchy of credibility that exists in other media and in real-life conversations. Any fringe idea from any producer with the right social engineering, clickbait titles, and images can generate engagement and stick to the top of our feeds longer than the relevance and credibility of the idea would normally dictate.

Society at large, by contrast, has never amplified the false, the misleading, and the racist as much as social media does now, because social norms organically restrict this type of behavior in other social situations.

The fix is for social media sites to throttle the exposure of fake-news posts by dividing the number of likes by the number of dislikes and using that ratio as a weight in the algorithm that determines how often a post is shown.

The smaller the ratio of likes to dislikes, the more the algorithm throttles how likely the post is to show up in a feed. This is the social-distancing approach to curtailing the spread of disinformation.

Here are a few examples:

1000 likes / 1000 dislikes = 1
1000 likes / 10,000 dislikes = 0.1
1000 likes / 1 dislike = 1000

The first example, with a like/dislike (LD) ratio of 1, would be a post that is open for debate, and thus neither throttled nor promoted more than the average post.

The second example, with an LD ratio of 0.1, would be a post that is widely considered fake news, and thus throttled to 10% of normal.

The third example, with an LD ratio of 1000, would be a post that is trending and thus should be promoted.

This proposed algorithm has its analogues in real life. For instance, conversations around the water cooler do not routinely turn to how the world is really flat. Those conversations are kept to quiet back rooms filled with the small number of people who subscribe to those fringe beliefs.

However, social media as it is structured today gives a bullhorn to the flat-earthers, amplifying and lending legitimacy to ideas without merit. Throttling these fringe ideas, just as they are throttled in real life, will help bring normalcy to online discourse.
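The throttling rule above can be sketched in a few lines of Python. The cap on the boost and the handling of zero dislikes are my assumptions; the article's proposal specifies only that the like/dislike ratio weights the feed algorithm.

```python
def throttle_weight(likes: int, dislikes: int, cap: float = 10.0) -> float:
    """Weight applied to a post's baseline feed score.

    A ratio of 1 leaves the post at its normal exposure, a ratio of 0.1
    throttles it to 10% of normal, and large ratios boost it. The cap
    keeps a post with a single stray dislike from earning a 1000x boost;
    the cap value and the zero-dislike fallback are illustrative choices.
    """
    if dislikes == 0:
        dislikes = 1  # avoid division by zero for undisputed posts
    return min(likes / dislikes, cap)

# The article's three examples:
print(throttle_weight(1000, 1000))    # 1.0  -> shown at the normal rate
print(throttle_weight(1000, 10_000))  # 0.1  -> throttled to 10% of normal
print(throttle_weight(1000, 1))       # 10.0 -> promoted (capped from 1000)
```

Capping the promotion side is a deliberate asymmetry: the goal of the proposal is to dampen disinformation, not to let a briefly popular post dominate every feed.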

Mix Posts of Differing Opinions in News Feeds

This third and final leg of my proposal attempts to break the echo-chamber effect, which was recently confirmed in an academic paper by Matteo Cinelli et al. A less scientific but more real-life example of how quickly a social media feed can be skewed to reinforce our belief system can be seen in articles by McKay Coppins at The Atlantic and Ryan Broderick at BuzzFeed, in which each created a fake online profile, started clicking on fringe conspiracies, and documented how their feeds became inundated with fake news.

Compounding the echo chamber is the “illusory truth” effect: a tendency to believe a thing after seeing it over and over again. The effect is well established; a quick search on Google Scholar confirms as much. There was a general belief, though, that it did not work on obviously false or improbable statements like “the world is flat.” However, in a paper published in 2019, Lisa Fazio et al. show that repetition increases the illusory truth effect regardless of the plausibility of the statement!

Our results indicate that the illusory truth effect is highly robust and occurs across all levels of plausibility. Therefore, even highly implausible statements will become more plausible with enough repetition. https://psyarxiv.com/qys7d

This phenomenon is never more evident than in the wholly irrational belief in some of the obviously manufactured stories circulating in conservative social media circles.

My proposal, then, is to engineer the feed algorithms to insert alternative opinions and actual facts into the news feed adjacent to the questionable ones. This will at least provide an anchor to reality, a “kick” for the hapless soul trapped in an alternate reality.

It is important that these kicks be placed in the feed immediately before or after the disinformation posts. Simply posting a link to a list of fact checks is not a suitable alternative, since most people will not click through.
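As a sketch, placing the kicks directly adjacent to disinformation might look like the following. The `disputed` flag and the `counterpoint_for` lookup are hypothetical stand-ins for whatever signal and fact-check source a platform actually uses.

```python
from dataclasses import dataclass

@dataclass
class FeedItem:
    text: str
    disputed: bool = False

def mix_in_counterpoints(feed, counterpoint_for):
    """Return a new feed with a counterpoint inserted immediately after
    each disputed post, rather than hidden behind a fact-check link.

    `counterpoint_for` maps a disputed item to its in-feed rebuttal.
    """
    mixed = []
    for item in feed:
        mixed.append(item)
        if item.disputed:
            mixed.append(counterpoint_for(item))
    return mixed

feed = [
    FeedItem("Cute dog photo"),
    FeedItem("The election was stolen!", disputed=True),
    FeedItem("Local news: bridge reopens"),
]
# Hypothetical rebuttal source; a real system would pull from fact-checkers.
counterpoint_for = lambda item: FeedItem(f"Fact check: no evidence for '{item.text}'")

for item in mix_in_counterpoints(feed, counterpoint_for):
    print(item.text)
```

The design point is placement: the counterpoint occupies the very next slot in the feed, so the reader encounters it without any extra click.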

The Inception “Kick”, which wakens you from your alternate reality.

Conclusion

In short, to combat the viral spread of disinformation, social media needs to bring conversational norms online. Social media must take the bullhorn away from those who seek to cause harm and spread discord, because that is what society does in other media and in real-life conversations.

Social media companies can do this by allowing members of online communities to vote not just to elevate topics but also to temper them. Twitter, Facebook, and Instagram can then use the like/dislike ratio to throttle highly misleading posts or relegate them to private groups. Finally, they must insert alternative opinions and facts into the feed, adjacent to disinformation posts, so that the mere repetition of false information is tempered by reality.

I tweeted my proposal at Jack Dorsey a while back but never got a response. However, Mashable reports that Twitter is now considering the proposal. I like to think it was because I proposed the idea, but in reality no idea is ever original. Either way, I do not care about authorship. I simply want to free society from this problem by whatever means necessary.
