Terence McKenna never wrote a line of code, but he thought like a systems architect. His subject was consciousness, culture, and language. His method was the same recursive decomposition that engineers use to understand complex systems: find the abstraction layers, trace the dependencies, question the defaults.

Language as protocol

McKenna’s core claim: the world is made of language. Not literally – but in the way that a data model shapes everything built on top of it. Choose the wrong abstraction early, and you spend years working around its constraints. The same principle applies to the linguistic and conceptual frameworks we inherit.

Different cultures model time differently. Linear time optimizes for progress and productivity. Cyclical time optimizes for sustainability and reflection. Neither is objectively correct, but each creates different possibility spaces. They’re not descriptions of reality – they’re protocols that determine what kinds of experience are expressible.
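The claim can be made concrete, if playfully, in code. A toy sketch (all type names hypothetical, not from McKenna): two clock types implementing incompatible "protocols" for time. Each makes some statements natural and leaves others inexpressible.

```go
package main

import "fmt"

// LinearClock only moves forward: "progress" is expressible,
// "return to the beginning" is not.
type LinearClock struct{ t int }

func (c *LinearClock) Tick() int { c.t++; return c.t }

// CyclicClock wraps around: "return" is expressible,
// unbounded "progress" is not.
type CyclicClock struct{ t, period int }

func (c *CyclicClock) Tick() int { c.t = (c.t + 1) % c.period; return c.t }

func main() {
	lin := &LinearClock{}
	cyc := &CyclicClock{period: 4}
	for i := 0; i < 6; i++ {
		fmt.Println(lin.Tick(), cyc.Tick())
	}
}
```

Neither type is wrong; each defines a different space of representable histories, which is the sense in which a model of time is a protocol rather than a description.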

Switching programming paradigms works the same way. Moving from object-oriented Java to composition-based Go isn't a matter of learning new syntax – it means rebuilding your mental model of how programs should be structured. Patterns that feel natural in one paradigm become anti-patterns in the other. McKenna suggested that conscious examination of our cultural programming requires the same kind of cognitive refactoring: not just learning new ideas, but unlearning the structural assumptions that make certain ideas invisible.
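A toy sketch of what that refactoring looks like on the Go side (hypothetical types, not any real codebase): where the inheritance habit reaches for a base class, idiomatic Go assembles a type from small behaviors by embedding.

```go
package main

import "fmt"

// Small, single-purpose behaviors.
type Logger struct{ prefix string }

func (l Logger) Log(msg string) { fmt.Println(l.prefix + msg) }

type Store struct{ data map[string]string }

func (s *Store) Put(k, v string) { s.data[k] = v }
func (s *Store) Get(k string) string { return s.data[k] }

// Service subclasses nothing. It is assembled from the pieces
// it needs via embedding, and can be recomposed freely.
type Service struct {
	Logger
	Store
}

func main() {
	svc := Service{
		Logger: Logger{prefix: "[svc] "},
		Store:  Store{data: map[string]string{}},
	}
	svc.Put("answer", "42")
	svc.Log("stored: " + svc.Get("answer"))
}
```

The unlearning is in the `Service` definition: there is no "is-a" hierarchy to slot it into, only a "has-these-behaviors" composition – a structural assumption replaced, not just a keyword swapped.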

Debug mode

McKenna framed altered states of consciousness as developer tools – a way to inspect the system while it’s running. The analogy is surprisingly precise.

In distributed systems, the most dangerous bugs are invisible during normal operation. They surface under specific conditions: unusual load, edge-case data, timing-dependent race conditions. Normal waking consciousness might have the same limitation – a mode of operation optimized for routine function, not for inspecting the architecture underneath.

The point isn’t to run permanently in debug mode. That’s like running production with verbose logging – you’d drown in noise. But periodic deep inspection can reveal assumptions baked into the system that you’d never notice from the application layer. Working across different cultural contexts does something similar: patterns you assumed were universal turn out to be implementation choices. How meetings are structured, how conflict is handled, how decisions are made – configurable parameters, not hardcoded requirements.
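The logging analogy in miniature (a hypothetical level-gated logger, not a real library): the debug lines exist in the code all along, but stay silent until the level is raised – inspection on demand, not as the default mode of operation.

```go
package main

import "fmt"

type Level int

const (
	Info Level = iota
	Debug
)

type Logger struct{ level Level }

// Debugf emits a debug line only when the level allows it,
// and reports whether anything was printed.
func (l *Logger) Debugf(format string, args ...any) bool {
	if l.level < Debug {
		return false // compiled in, but silent in normal operation
	}
	fmt.Printf("DEBUG: "+format+"\n", args...)
	return true
}

func main() {
	log := &Logger{level: Info}
	log.Debugf("cache miss for %q", "user:42") // invisible by default

	log.level = Debug                          // enter debug mode temporarily
	log.Debugf("cache miss for %q", "user:42") // now it surfaces
	log.level = Info                           // and drop back out
}
```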

Culture as legacy code

McKenna framed culture as an operating system that most people run without ever reading the source. Not a conspiracy – accumulated convention. The same way a large codebase accumulates patterns that made sense when they were written but constrain what’s possible now.

Engineers call this technical debt. A decision optimized for a past context becomes a limitation in the current one. It persists because the cost of changing it exceeds the immediate pain of working around it. Cultural conditioning works identically. Mental models inherited from family, education, and social context served specific historical purposes. Some remain useful. Others are vestigial – maintained by inertia, not by fitness.

Consciousness, like software, is modifiable. The default configuration is not the only configuration. You can examine the assumptions, trace where they came from, and decide which ones to keep. McKenna called this gaining “root access” to your own mind.
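In code, the equivalent move is mundane: defaults are values someone once chose, and a well-designed system lets you inspect and override them. A sketch with hypothetical fields:

```go
package main

import "fmt"

// Config holds hypothetical runtime settings.
type Config struct {
	Timeout int // seconds
	Region  string
}

// DefaultConfig is a starting point someone once chose, not a mandate.
func DefaultConfig() Config {
	return Config{Timeout: 30, Region: "us-east-1"}
}

func main() {
	cfg := DefaultConfig()
	fmt.Printf("inherited defaults: %+v\n", cfg) // examine the assumptions

	cfg.Timeout = 5 // keep what works, override what doesn't
	fmt.Printf("after review: %+v\n", cfg)
}
```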

Emergent complexity

Complexity accelerates in waves, punctuated by architectural shifts. McKenna observed this pattern in cultural evolution. It’s visible in software too – mainframes to PCs, monoliths to microservices, deterministic to probabilistic (LLMs). Each transition creates emergent behaviors that weren’t predictable from the previous paradigm.

The current wave is no different. Large language models are changing the nature of writing. Global supply chains create interdependencies no single organization fully maps. These are system-level behaviors – properties of the interactions between components, not of the components themselves.

McKenna’s speculation was that consciousness itself might be approaching a phase transition. Whether or not you buy that claim, the pattern recognition is sound: when enough components interact with enough complexity, the system develops properties that didn’t exist at lower scales. This is as true for human culture as it is for distributed software.
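That scale-dependence is easy to demonstrate at toy scale. In Conway's Game of Life, a "glider" – a five-cell pattern that travels across the grid – is a property of rule interactions, not of any cell; the update rule says nothing about motion. A minimal sketch on a small toroidal grid:

```go
package main

import "fmt"

const N = 8

// step applies Conway's rules on a toroidal N×N grid:
// a cell is alive next tick if it has exactly 3 live
// neighbors, or is alive now with exactly 2.
func step(g [N][N]bool) [N][N]bool {
	var next [N][N]bool
	for y := 0; y < N; y++ {
		for x := 0; x < N; x++ {
			live := 0
			for dy := -1; dy <= 1; dy++ {
				for dx := -1; dx <= 1; dx++ {
					if dx == 0 && dy == 0 {
						continue
					}
					if g[(y+dy+N)%N][(x+dx+N)%N] {
						live++
					}
				}
			}
			next[y][x] = live == 3 || (g[y][x] && live == 2)
		}
	}
	return next
}

func main() {
	// A glider. Every 4 steps the pattern reappears shifted
	// by (1,1): "travel" emerges from purely local rules.
	var g [N][N]bool
	for _, c := range [][2]int{{1, 0}, {2, 1}, {0, 2}, {1, 2}, {2, 2}} {
		g[c[1]][c[0]] = true
	}
	for i := 0; i < 4; i++ {
		g = step(g)
	}
	for y := 0; y < N; y++ {
		for x := 0; x < N; x++ {
			if g[y][x] {
				fmt.Printf("(%d,%d) ", x, y)
			}
		}
	}
	fmt.Println()
}
```

No cell knows about gliders; the traveling pattern exists only at the level of the whole grid – the same shape of claim McKenna was making about culture and consciousness.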

The recursive observation

McKenna applied systems thinking to the system doing the thinking. Layers of abstraction built on more fundamental processes. Emergent properties that can’t be reduced to their components. Default configurations that can be inspected and modified.

Everyone runs cultural programming. The question is whether you’ve ever read the source.