Shared Belief

Published: January 30, 2025

Last modified: February 6, 2025

9 minute read

I recently read the book Sapiens by Yuval Noah Harari. I found it quite enlightening, and I think some of the theories it discusses frame the issues we are having with social media quite well. My interest in Harari’s books was piqued after his guest appearance on the Your Undivided Attention podcast, which is the biggest inspiration for my interest in social media and ethical technology.

Abstract Thought

Looking back to before humankind diverged from other animals to adapt across climates and spread worldwide takes us to around 70,000 BCE, when humans lived primarily in hunter-gatherer societies. We formed small tribes that could cooperate towards a shared goal: the survival of the group through the survival of each individual. Dunbar’s number suggests that, without some form of social technology allowing us to cooperate on grander scales, the size of hunter-gatherer tribes would have been severely limited by the number of individuals we can come to know and trust (around 150, by the modern estimate).

It was around this time, though, that humankind went through the cognitive revolution, and began to think abstract thoughts about our environment, ourselves, each other, and the world we live in. Through our abstract thinking capability, we developed the subsequent ability to hold shared beliefs: I know that my tribe holds beliefs XYZ, and I do too, and that’s what makes us, us. Now, instead of needing to get to know a person well enough that they occupy one of the ~150 slots under my Dunbar’s number, I can extend a certain level of trust towards them if I know that they also hold beliefs XYZ - that means they’re one of us. The horizon of who I can include within my tribe widened by an order of magnitude with the development of abstract thinking.

Social Technology

Social technologies were then developed through this capability of abstract thought. That set of beliefs XYZ could be considered a form of social technology - a collection of abstract ideas about how the world works that come together and let me more effectively communicate and cooperate with strangers. Eventually came written language as a way to encapsulate our abstract thoughts and share them with others across the boundaries of space and time, encouraging the spread of shared beliefs that may have only ever been passed down person-to-person.

The next obvious example of a social technology that widens our social horizons is religion (what is religion if not a set of beliefs held by a collective of people?). Different sets of beliefs broadened horizons to differing degrees - I don’t know enough here to go into too many details, but roughly 43.9% of Australians said they follow some denomination of Christianity in the 2021 ABS Census. Religion as a social technology is clearly extremely effective.

Linking this to more modern meanings of social technology, I think the next obvious example is the printing press. While not a collection of beliefs itself, it is instead an amplifier of beliefs, and is likely a large part of why Christianity is so common in the modern world - the printing revolution began after the adoption of the movable-type printing press in Germany in the 1440s, which was part of the Christian Holy Roman Empire. Mass-copying a belief allows that belief to spread over a greater area without relying on individuals or the few written copies available to effectively convey the ideas. It thereby allows us to broaden our horizons and trust people who are significantly removed from our small group of well-known individuals: I know that I am part of a Christian nation, so if I were to travel to the other side of my country, I have at least a base level of trust in the people there, assuming that we hold the same set of base Christian beliefs.

Us vs Them

While the horizon of who we consider to be part of our tribe, us, broadened, so did our ability to abstractly think of the opposing force, them. Nature is a battle of survival of the fittest - therefore, our ability to form cohesive groups that cooperate effectively against threats is a fundamental aspect of what makes us human. Us vs them thinking is baked into our physiology due to its importance in our evolutionary history.

As our understanding of us and them gets wider, so too do our wars - we began the 20th century with the widespread adoption of the radio, a social technology that broadened our understanding of us by another order of magnitude. We also began the 20th century with the First World War.

Modern Social Media

My impression is that every advancement in social technology has resulted in a broadening of our horizons of us - until the advent of modern social media (or rather, the invention of the modern social media recommendation algorithm). This is because the incentive structure behind the social technology is different this time from every other time. Previous social technologies were tools that a few people capable of using them (because they could afford to, or were in the right place at the right time) wielded with the incentive of spreading a set of beliefs and achieving a greater level of collaboration towards a common goal. The incentive behind modern social media is instead purely one of profit.

We are more connected than ever before, in a finer level of detail than ever before - the barriers to reach are lower than ever, where I can reach out and individually contact a person on the other side of the world if I want to. I think this is an incredibly good thing. The bad thing is that social media’s profit incentive results in a recommendation algorithm designed to keep us on the platform for as long as possible as the number one priority, so the content we are being shown - which forms our understanding of the world we live in and the issues that are important to us - is more fractured than ever before.

Social media recommendation algorithms are a godlike technology that leverages our evolutionary predisposition to us vs them thinking to make us, in a sense, physiologically addicted to their platforms. Their recommendations tailor content to each individual in a way that was previously impossible, amplifying our beliefs in a twisted feedback cycle that is unique to each person. It is commonly known that every person’s algorithm is different - it’s obvious the moment you scroll through the Instagram feed on someone else’s phone.

This godlike, hyper-personalized, rage-inducing feedback cycle is causing the shared beliefs that have held society together at ever-broader horizons to unravel. Everyone’s heard that society is returning to tribalism, but comparatively few are talking about how social media is causing this shift.

What Needs to Change?

The genie is out of the bottle. We can’t go back in time to ~2010 and slow down the development of social media algorithms to a pace where we can keep up with regulation and control for externalities that are becoming obvious to us now. Even if we could, there are many benefits that have come as a result of social media that would be harmful if taken away from us now. As I said above, we’re more connected than ever before, and that’s a good thing. The barriers to reach are lower than ever before, and that’s also a good thing. The ability to organise ourselves around specific niche beliefs is more powerful than ever before, which is a good thing.

I’ve previously posted about incentives that are an artifact of how social media is owned and some alternative ownership structures that may result in more beneficial outcomes for humanity. I ended that post by saying that it is easier to approach the problem from a different angle, looking at the potential desirable outcomes that social media recommendation algorithms could be trained to reinforce. I think this perspective of our shared societal belief informs that point of view rather well. If we agree that social media algorithms in their current form are fragmenting our shared beliefs, well, how can we measure that? Once we can measure it, can we then measure the impact specific content has? Once we can measure the impact that specific content has, it’s not too large a challenge to proactively rate content based on how it either fragments or reinforces our shared beliefs (this kind of pattern recognition is one of the more well-developed applications of “AI” - think of the cameras in the scales at supermarkets that guess which vegetable you are weighing so they can flag whether you are scanning garlic through as a potato to save money). Once we can proactively rate content on this measure, we can train a social media recommendation algorithm to recommend content that brings us closer together, instead of sending us back to tribalism.
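To make the rate-then-recommend idea concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the `cohesion_score` function stands in for whatever trained classifier would actually measure fragmentation vs reinforcement of shared beliefs (a crude keyword heuristic here, purely for illustration), and `recommend` shows the re-ranking step - ordering a candidate feed by that score instead of by predicted engagement.

```python
# Hypothetical sketch: re-rank a feed by a "cohesion score" rather than
# engagement. The scorer below is a toy keyword heuristic standing in for
# a trained classifier - it is illustrative, not a real measure.

DIVISIVE = {"enemy", "destroy", "outrage", "traitor"}
COHESIVE = {"together", "community", "shared", "common"}

def cohesion_score(text: str) -> float:
    """Crude proxy: +1 per cohesive word, -1 per divisive word, normalised
    by post length so longer posts aren't automatically favoured."""
    words = text.lower().split()
    if not words:
        return 0.0
    raw = sum(w in COHESIVE for w in words) - sum(w in DIVISIVE for w in words)
    return raw / len(words)

def recommend(posts: list[str], top_k: int = 2) -> list[str]:
    """Rank candidate posts by cohesion score instead of predicted
    engagement, returning the top_k to show in the feed."""
    return sorted(posts, key=cohesion_score, reverse=True)[:top_k]

feed = [
    "They are the enemy and will destroy us",
    "Our community came together for a shared goal",
    "A normal post about gardening",
]
print(recommend(feed))
```

The real difficulty, of course, lives entirely inside the scoring function - but the point of the sketch is that once any credible score exists, swapping it into the ranking objective is the easy part.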

Criticisms of Sapiens

In interest of confirming the veracity of my sources, I briefly looked into the critical response to Sapiens and whether these ideas of shared belief are supported by the scientific community.

The most compelling criticisms I have found about Sapiens concern its approach to the topics of history and anthropology, which is that of a pop science book. The scope is massive (“a brief history of humankind”), and Harari generalizes a lot in the text.

The main idea I am extracting from Sapiens, though, is the concept of humankind’s ability to hold shared beliefs (which is explained by Harari in the first chapters of the book), together with the cognitive revolution, which is really what separates us from other animals on the planet.

On Harari’s ideas of belief and the cognitive revolution, I could only find one criticism that I thought was particularly valid. It points out that the truly unique ability of humankind is not specifically our ability to hold a shared belief, but our ability to think abstractly in the first place - I can imagine a horse with six legs just as easily as I can imagine something more abstract, like a company that I work for, which structures my efforts and the efforts of others in a concerted way.

Despite this criticism, I still believe that the concept of shared belief is at the very least a useful abstraction for us to frame our ability to cooperate, even if it is a downstream consequence of our ability to think abstractly in general.
