Crafting Collective Meaning in a Digital World

An opinion piece for a book club in the AFK world

We live in a post-modern world where “truth” is dying. And maybe it has always been like that: “fake news” is not a new thing – Mark Antony killed himself over fake news that Cleopatra had done the same. Yet truth is a useful concept. Why is it useful? And how can we know whether something is “true”?

In “Atoms, Institutions, Blockchains” [1], Josh Stark analyses the concept of a “cast”: a prediction that is likely to hold true over time. The reliability of a cast comes from its “hardness” - the underlying force that makes the prediction stable. Consider three casts, each backed by a different source of hardness. When an engineer says “this bridge will stand for 100 years”, the prediction is hardened by the unchanging physical properties of materials - what Stark calls “atom-hardness”. When a bank promises “your $100 deposit will be available next year”, the cast relies on “institutional hardness” - the stability of banking systems, regulations, and governments working together predictably. And when someone claims “this Bitcoin can never be spent without knowing a certain private key”, they are relying on blockchain technology - a new form of hardness that combines mathematical certainty with distributed verification. These sources of hardness serve as fixed points that make our future more predictable, allowing human coordination at scale. Without somewhat reliable predictions, everything in our society, from economic trade to day-to-day social life, becomes much more fragile.

Science represents our best attempt to rely purely on “atom-hardness”. In its ideal form, scientific claims are backed by experiments that anyone can repeat anywhere, directly interacting with physical reality. This direct interaction with atoms makes scientific predictions particularly reliable - when you drop an object, gravity will act the same way regardless of who’s watching or what they believe. But modern science often strays from this ideal. Much of what we call “science” today, especially in academic institutions, actually relies more on “institutional hardness”. Instead of directly verifiable experiments, we increasingly depend on trusting authority figures, peer review processes, and complex webs of institutional credibility. This shift creates a peculiar opacity - scientific claims become harder to verify independently. When we can’t verify claims ourselves, science becomes more like faith, relying on authority rather than observation.

I find the concepts of “cast” and “hardness” very helpful for understanding modern social changes and for reading online texts. For example, in his Noema interview [2], Byung-Chul Han argues that truth and shared narratives used to be sources of “hardness”, providing stability and a centripetal force that held society together. These weren’t just any stories - they were backed by powerful institutions like churches and governments. But the hardness of those institutions is now severely weakened, and casts about social behavior, meaning, and even events like plagues lack a base to stand on. This makes it much harder for society to agree on what is true and meaningful.

The traditional global institutions that created collective meaning for our societies may be fading, but this digital age might offer new opportunities at a smaller scale. The essay “First we shape our social graph; then it shapes us” [3] sits on the other side of Han’s analysis. It emphasizes the importance of actively shaping one’s “milieu” - the particular set of influences that surround us - because it powerfully shapes who we become. The essay argues that we are constantly internalizing the culture around us, even unconsciously. This process of internalization can be understood as a new source of hardness, a precursor to institutions.

While atoms and institutions provide reliable foundations for large-scale predictions, there’s another source of hardness that sits in the middle and operates on a more personal scale. We could call it “psycho-physiological habits”: the predictable patterns in human behavior and social interactions. For example, the prediction “I will check my phone at least five more times today” is remarkably reliable, based on my direct experience and understanding of my habits and routines. Similarly, small-scale social customs allow us to make reliable predictions such as “these two siblings will meet for Sunday lunch as they always do”. Their source of hardness is local and personal - it’s harder to recognize from the outside, but often more reliable in our day-to-day lives than institutional promises. When you discover these patterns of human behavior, you can make surprisingly accurate predictions about what people will do next. The online advertising industry knows this very well: it has become expert at identifying and exploiting these behavioral patterns.

Surveillance capitalism, as defined by Shoshana Zuboff in her 2019 book [4], has become the dominant force shaping our digital environment. Zuboff argues that surveillance capitalism operates by claiming human experience as raw material and transforming it into behavioral data, which is then used to predict and modify future actions. This process relies on identifying and exploiting predictable patterns in human behavior, creating “casts” about human behavior that are strongly evidence-based, continuously re-analyzed, and algorithmically updated to be as accurate as possible. The online advertising industry excels at exploiting these habits: by tracking our online behavior, routines, and preferences, it can craft highly accurate “casts” about our future actions and desires, leading to extremely personalized ads and targeted content. One of Zuboff’s key points is the awe-inspiring power and influence that can be achieved by exploiting the “hardness” of these habits. Such systems not only predict our actions but also nudge us toward desired behaviors, often without our conscious awareness. This power dynamic highlights the dangers of leaving unchecked the corporations that run the digital infrastructure shaping our lives.
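To make the mechanism concrete, here is a deliberately minimal sketch of behavioral prediction from logged activity. The event log, the action names, and the most-frequent-action heuristic are all my own illustrative assumptions - real systems use far richer models - but the principle is the same: habits are hard enough to support accurate casts.

```python
from collections import Counter

# A hypothetical log of one user's tracked actions (invented for illustration).
log = ["open_app", "scroll", "like", "scroll", "scroll", "share", "scroll"]

def cast_next_action(events: list[str]) -> str:
    """Cast the next action as the most frequent past action.

    A crude stand-in for the statistical models that surveillance
    capitalism builds from behavioral data.
    """
    return Counter(events).most_common(1)[0][0]

print(cast_next_action(log))  # → scroll
```

Even this one-line heuristic yields a reliable cast, because the “hardness” lives in the habit itself, not in the sophistication of the model.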

A kindred book, published two decades earlier, is Code and Other Laws of Cyberspace [5], which argues that computer code is a new kind of regulation and legislation over our lives. In it, Lawrence Lessig proposes the term “West Coast Code” for the software and algorithms that increasingly govern our digital lives (and this was in 1999). Like institutional hardness, “West Coast Code” shapes our online behavior by establishing rules, constraints, and incentives within digital platforms. Consider how social media algorithms, driven by engagement metrics, amplify content that triggers emotional responses, regardless of its veracity. This process, inadvertently or intentionally, reinforces the hardness of casts tied to specific narratives, ideologies, or even conspiracy theories. As a result, “West Coast Code” has the power to shape our collective understanding of truth and reality, much like institutional hardness but with potentially greater reach and speed.
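A toy sketch can show how “West Coast Code” regulates in practice. The field names and scoring weights below are invented for illustration - no real platform publishes its ranking function - but any ranker that optimizes purely for engagement will exhibit the same behavior: emotionally charged content rises to the top, with no term in the formula for veracity.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    angry_reactions: int  # proxy for a strong emotional response

def engagement_score(post: Post) -> float:
    # Shares and angry reactions weighted most heavily: the code
    # optimizes for engagement; accuracy appears nowhere in the formula.
    return post.likes * 1.0 + post.shares * 3.0 + post.angry_reactions * 5.0

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Calm, well-sourced report", likes=120, shares=5, angry_reactions=2),
    Post("Outrage-bait rumor", likes=40, shares=30, angry_reactions=50),
])
print(feed[0].text)  # → Outrage-bait rumor
```

A dozen lines of “law”, yet they decide what millions of people see first - which is exactly Lessig’s point about code as regulation.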

The internet serves as a vast ecosystem where information flows at unprecedented speed and scale, fundamentally reshaping how culture propagates across society. Digital phenomena like “Memestocks” exemplify this dynamic, spreading like cultural wildfire and transcending their online origins to influence real-world behaviors and events. The GameStop stock surge of 2021 demonstrated this power vividly – a conversation that began in an online forum transformed into a global financial movement that challenged Wall Street’s traditional power structures. Beneath this visible current of viral content lies a deeper force: the algorithmic architecture of major technology platforms. These sophisticated systems, controlled by a handful of corporations, quietly shape our digital experiences and, by extension, our cultural narratives, consumption patterns, and social behaviors in ways both profound and often imperceptible.

We need to craft a better social understanding of how the Internet is shaping us - and by “Internet” I mean not just the technology, but also the social, political, and cultural forces involved in it. It is crucial to be aware of these forces and to engage in critical discussions about how they shape our digital and physical lives. We can’t just passively accept the laws of cyberspace as they are given to us; we need to actively participate in shaping the Internet’s future. And we are on the verge of a gigantic opportunity: until now, only armies of product engineers could create world-scale applications, but AIs that run on your own machine are becoming increasingly good at crafting code. This is a crucial step toward a liberation of the means of information processing - the creation of code that helps humans find meaning in data.

This is a moment that calls for active participation and creative thinking. We can champion systems that promote freedom, fairness, and human potential. We must move beyond passive consumption to become active architects of our societies. “Tuning the algorithm” is not enough, and the time for action is now – start building, start creating, shape the world you wish to see before market forces do it for you. Let us embrace this responsibility with courage and creativity, in this new chapter of the collective creation of meaning.


  1. “Atoms, Institutions, Blockchains”, a 2022 essay by Josh Stark. https://stark.mirror.xyz/n2UpRqwdf7yjuiPKVICPpGoUNeDhlWxGqjulrlpyYi0

  2. “All That Is Solid Melts Into Information”, a 2022 interview with Byung-Chul Han in Noema Magazine. https://www.noemamag.com/all-that-is-solid-melts-into-information/

  3. “First we shape our social graph; then it shapes us”, a 2022 essay by Henrik Karlsson. https://www.lesswrong.com/posts/tpewFnmFKnKk5xijh/first-we-shape-our-social-graph-then-it-shapes-us

  4. “The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power”, a 2019 book by Shoshana Zuboff. https://en.wikipedia.org/wiki/The_Age_of_Surveillance_Capitalism

  5. “Code and Other Laws of Cyberspace”, a 1999 book by Lawrence Lessig on the structure and nature of regulation of the Internet. https://en.wikipedia.org/wiki/Code_and_Other_Laws_of_Cyberspace