By Nate Hagens
20 February 2026
Why is it that most people, day-to-day, feel more pro-social than the picture of humanity currently formed by our media, our institutions, and the recent Jeffrey Epstein files? Part of the answer is that once systems scale, a small minority with dark-triad traits can shape institutions and outcomes for broader society.
But is that it? I’ve started to think a lot more about scale itself, and about the patterns and drivers that show up once populations move from small, to large, to the currently huge: over 8 billion. The behaviors of individuals in small groups have decidedly different dynamics than those of large groups of humans in the millions or billions. The 21st-century interconnected crises that we discuss on this platform are not emerging because humans suddenly became stupid or lazy, or especially malicious at civilization scale.
The same traits that helped our tribal ancestors coordinate, survive, and adapt may destabilize the systems that we depend on. And I’m beginning to view this story at a species level: humanity at small scales is Dr. Jekyll, but humanity morphs into Mr. Hyde once scale exceeds some inflection point.
I’m going to outline three distinct layers of the more-than-human predicament. First are the problems and symptoms. Second, the recurring systemic patterns that produce the symptoms. And third, the deeper forces that are driving it all by locking those patterns into place. If we don’t separate these layers in our conversations, we end up talking past each other quite often, even when we agree on the values and the stakes. We often attempt to fix symptoms while quietly reinforcing the same dynamics that produce them, and that is not the best path forward.
I’m going to say some things that may sound provocative at first. Many of our familiar labels, like capitalism or patriarchy or colonialism, describe real harms and real phenomena, but they’re often describing downstream expressions. I’m trying to point at the deeper machinery underneath.
I’ll start with the symptom layer before we go underneath to the patterns and the drivers. Here they are: global heating, biodiversity loss, and soil degradation, with seven of nine planetary ecological boundaries currently exceeded; rising inequality and poverty alongside all-time stock market highs; geopolitical tensions and war; polarization, attention fragmentation, and psychological strain; widespread depression, anxiety, loneliness, and more. And now AI and its impact on all those other things.
These are usually treated as separate issues with different vocabularies and different experts championing each of them but, in practice, they arise together and accelerate together. And though we don’t often see it, their solutions often push against one another. This list, and you all know it’s much longer than what I just summarized, is what I’m going to call here “the symptoms” in the diagnostic sense because calling them symptoms is a way for us to admit that something deeper is driving them. A final caveat worth naming explicitly in this preamble is that there are biophysical constraints under all these phenomena that are non-negotiable since energy and materials have biophysical costs.
Net energy matters more than gross. Feedbacks in the natural world are delayed and often initially unknown. Decline, in both natural and human systems, tends to be much faster than the ascent on the way up. These phenomena and many others are features of physics and ecology. Every human system, whether markets, technologies, or institutions, is bounded by those constraints, whether we design with them in mind or learn about them the hard way later.
This sets up more interesting questions. Why these symptoms and problems? Why do they show up together, and why now? Why, when we zoom out across history and across societies, do many of these same symptoms keep showing up? Maybe they are systemic patterns: the second layer.
Once you start looking for them, these symptoms and problems show up again and again across cultures and different political systems. Maybe these patterns are not accidents. But neither are they human moral failures. I am objectively describing human behavior in large groups, not complaining about it or assigning blame; just describing.
I’m going to name six systemic patterns here. The list is definitely not exhaustive but, when combined, they explain much of what we are seeing right now in 2026. The first pattern is power-law concentration, found in many systems in both the natural and human worlds. Outcomes do not distribute evenly. In fact, they do the opposite: they concentrate. In ecosystems, a small number of species capture most of the energy flow. In forests, a few of the large trees intercept most of the sunlight and hold most of the biomass. In river basins, most water ends up moving through a small percentage of channels.
This is akin to something many of you have probably heard of: the 80/20 rule, where 20% of the inputs in a given situation are responsible for 80% of the outcomes. This same geometry expresses itself in human systems. A small number of firms capture most of the profits. A small number of social media platforms capture most of the attention.
A small number of countries capture most of the surplus energy and materials. Sociologists call this the Matthew Effect, after a biblical passage about the rich getting richer. It rests on the idea that initial advantages tend to compound due to feedback loops. Those who start with more tend to end up with more, and those who start behind fall further behind. Not because of malice, but because systems amplify early differences. Once scale enters the picture, the concentration only compounds from there. And as we’ll see, once this power-law concentration sets in, it reshapes everything downstream.
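This compounding-of-early-advantage dynamic can be sketched in a few lines of code. To be clear, this is a toy model of my own construction, not anything from the essay or the research literature: 100 agents start out nearly equal, and each round one unit of new surplus goes to an agent chosen with probability proportional to what they already hold.

```python
import random

random.seed(42)

# Toy Matthew-effect model: 100 agents start nearly equal; each round,
# one unit of new surplus is allocated in proportion to current holdings
# (preferential attachment), so small early differences compound.
wealth = [1.0 + random.random() * 0.01 for _ in range(100)]

for _ in range(500):
    winner = random.choices(range(100), weights=wealth)[0]  # richer = likelier
    wealth[winner] += 1.0  # one unit of new surplus per round

wealth.sort(reverse=True)
top_20_share = sum(wealth[:20]) / sum(wealth)
print(f"Top 20% of agents hold {top_20_share:.0%} of the total")
```

Run it with different seeds: the top quintile reliably ends up holding well above its proportional 20% of the total, purely from compounded luck, with no agent behaving badly.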
The second pattern is overshoot and depletion. Overshoot is an ecological phenomenon that describes what happens when a population grows beyond the long-term carrying capacity of its environment. This usually happens because the population found a way to draw down stored stocks instead of living within regenerative flows that renew on a daily or seasonal cycle.
Think of accessing groundwater instead of rainfall, or using fossil hydrocarbons instead of daily sunlight. For a while, it all looks like success, but eventually the bill arrives. The system can stay at that scale only by continuing to draw down stored stocks. Depletion follows when those stocks are reduced faster than the flows that rebuild them. And when that happens, the system can no longer support the scale it once achieved. Commonly used examples of overshoot are the reindeer on St. Matthew Island or the Atlantic cod fishery off North America’s east coast, but there are lots of them.
Third are arms races. Arms races happen when one person or player or corporation or nation upgrades, forcing everyone else to upgrade just to maintain their position. Once one actor adopts a powerful new capability, others feel they must respond in kind. Not because they want to escalate, but because standing still, or ignoring the new capability on the game board, becomes too risky. This shows up very clearly in militaries and in corporate competition. In AI, in finance, in resource extraction, even actors who strongly prefer restraint eventually end up participating, because opting out carries huge and sometimes existential penalties. Arms races are not caused by bad actors, only by the very social-primate fear of falling behind, and they lock systems into trajectories that are really hard to exit. This explains a lot.
The fourth pattern is rebound effects and the Jevons paradox. When we make something more efficient, there’s a hope that we will use less of it, but in large, interconnected economic systems, efficiency lowers cost, which often expands use and demand among those who previously couldn’t afford it. It also frees up surplus to be spent elsewhere in the system. Modern cars get more fuel efficient, but then we drive farther and in bigger cars. Faster logistics and cheap shipping made it effortless to buy one small thing at a time.
So we end up placing more orders and more brown trucks show up in the driveway. Efficiency improvements from technology and innovation don’t automatically slow systems down. And quite often they actually speed them up and these rebound effects get bigger as the nodes in the system increase.
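The rebound arithmetic can be made explicit. Here is a back-of-envelope sketch under a standard constant-elasticity demand assumption; the 25% efficiency gain and the elasticity values are made-up illustrations, not estimates for any real market.

```python
# Back-of-envelope rebound/Jevons arithmetic; all values are illustrative.
# An efficiency gain lowers the effective cost per unit of service.
# Under constant-elasticity demand, service demanded scales as cost**(-elasticity).
def fuel_use_after_efficiency_gain(baseline_fuel, efficiency_gain, elasticity):
    """Fuel used after a fractional efficiency gain, with constant-elasticity demand."""
    cost_ratio = 1.0 / (1.0 + efficiency_gain)         # cost per unit of service falls
    service_ratio = cost_ratio ** (-elasticity)        # demand responds to lower cost
    return baseline_fuel * service_ratio * cost_ratio  # fuel = service * fuel-per-service

base = 100.0
for elasticity in (0.3, 1.0, 1.5):
    new_use = fuel_use_after_efficiency_gain(base, efficiency_gain=0.25, elasticity=elasticity)
    print(f"elasticity {elasticity}: fuel use {new_use:.1f} (baseline {base:.0f})")
```

When demand for the service is inelastic (elasticity below 1), some of the savings survive; at elasticity 1 the savings are fully eaten; above 1, total fuel use actually rises, which is the Jevons case.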
The fifth pattern is the tragedy of the unregulated commons. Many of the most important challenges we face require coordination across large groups. Climate, oceans, weapons control, AI alignment, financial stability, etc. The problem is that the short-term incentives facing individual humans, firms, and even nations rarely align with the outcomes that would benefit the long-term wellbeing of society or the biosphere as a whole.
And here’s the key point. In an unregulated system, acting responsibly feels like a cost. If others don’t also act responsibly, defection and selfishness in many domains actually feel rational to us in the short run. The result is that trust erodes, making further defection even more likely, with each actor trying to protect themselves in isolation. This ends up degrading the shared system that everyone depends on. This is the tragedy of the unregulated commons expressed at scale. Even when most participants understand and care about the problem, systems drift towards outcomes that none of us explicitly want or choose.
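That drift toward outcomes nobody wants can be sketched with a toy harvest model. Every parameter here is invented for illustration: ten harvesters share a logistically regenerating stock, and we vary how many of them "defect" by taking 50% more than the agreed quota.

```python
# Toy unregulated-commons model; every parameter is an illustrative assumption.
# A shared stock regenerates logistically. Cooperators take an agreed quota;
# defectors take 50% more. We vary the number of defectors and watch the stock.
def run_commons(n_defectors, seasons=200, n_players=10):
    stock = 1000.0                 # shared resource, carrying capacity 1000
    quota, defect_take = 2.0, 3.0  # per-season harvest per player
    for _ in range(seasons):
        harvest = (n_players - n_defectors) * quota + n_defectors * defect_take
        stock = max(0.0, stock - harvest)
        stock += 0.1 * stock * (1.0 - stock / 1000.0)  # logistic regrowth
    return stock

for d in (0, 3, 10):
    print(f"{d} defectors -> stock after 200 seasons: {run_commons(d):.0f}")
```

With these numbers, full cooperation settles at a healthy equilibrium, a few defectors degrade it, and universal defection pushes the total harvest past the stock's maximum regeneration rate, so it collapses to zero, even though each defector only took 50% extra. No individual choice looks catastrophic; the aggregate is.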
The sixth pattern is complexification/simplification. As systems grow more complex, optimized, and interconnected, they also become more fragile, because they require more energy, tighter coordination, and fewer interruptions just to keep functioning at that high level. This is sometimes called complexification, a concept made famous by historian Joseph Tainter.
Joe was an early guest on this podcast; if you want the longer version, I recommend his episode. When shocks arrive or resources tighten, complexity becomes a liability. Systems respond by shedding what they can no longer support. In nature, ecosystems lose species and complex life.
In the human system, institutions narrow and cut back on services. Economies lose diversity and optionality in jobs, goods, and services. Simplification is what happens when a system can no longer afford its current complexity, and this has shown up again and again in both natural and human systems. Given the scale and interdependence of the modern human project, any simplification ahead is unlikely to be minor, which is why I insert the adjective “Great” before I use it.
These are the six systemic patterns: power-law concentration, overshoot and depletion, arms races, rebound effects, tragedy of the unregulated commons, and complexification/simplification.
These patterns are the “what happens”. They don’t explain why they keep happening. To understand why they keep appearing across cultures and technologies, we need to drill down to a deeper level. But before I go there, there’s a critical context shift I want to highlight.
Many podcast guests on The Great Simplification have said it in various ways. John Gowdy, Lisi Krall, Luke Kemp, Joe Tainter, and many others describe a distinct phase change in how human behavior expressed itself before and after settled agriculture. For most of our pre-agricultural history, the traits that made humans successful were the same traits that kept us in balance with each other and with our natural surroundings. In small groups, our instincts worked as checks and balances rather than as accelerants. Social status was earned face-to-face every day. Power existed, but it was visible and bounded by the tribe, and resource use was constrained by our muscles, our time, and our proximity to the things we were doing. The consequences of our actions arrived quickly enough that we could learn from mistakes.

Small scale in our ancestral environment acted like a throttle, or a muzzle, on aggregate human behavior. It limited how far our errors could travel. But larger scale changed this. As a stable climate and agriculture enabled human systems to grow larger, faster, and more interconnected, our successful adaptations didn’t disappear.
They stayed with us and were amplified. And when that amplification outgrew restraint, the entire character of the human and global ecosystem changed. Again, this was not a moral failure. It was a profound phase shift for our species, Homo sapiens: a transformation from Dr. Jekyll at small tribal scales to Mr. Hyde at larger population scales.
What about the third layer, the deep drivers underneath these patterns? The first deep driver is the maximum power principle. Across biology, ecology, and human societies, there is a recurring tendency for systems that capture and use energy more effectively to outcompete those that don’t. When in competition, plants that grow faster shade others out. Animals that secure more energy reproduce more successfully. Societies that mobilize energy at scale tend to dominate territory, production, and influence, and to win wars. For most of human history, this tendency operated inside pretty tight constraints: small groups, low energy density, little to no storage, and immediate feedback on actions.
So maximizing power expressed itself as local adaptation, not as a runaway economic Superorganism. And this matters, I think, because humans didn’t suddenly, with the advent of agriculture, say “we should now maximize power”. We were always doing that.
What changed was that the historic brakes on our behaviors and actions were removed. And once fossil fuels entered the picture, this biological tendency went into overdrive. And suddenly our success wasn’t constrained by our muscles, or land, or the seasons. It was constrained only by our access to dense, concentrated energy and our ability to build more machines and networks that could deploy it.
Economic growth, military power and technological expansion all now became expressions of energy throughput. And here’s the key. Maximum power never asks whether the system is wise. It doesn’t ask whether outcomes are sustainable. It doesn’t choose for long-term stability. Such questions or goals aren’t really even on the menu.
Maximum power only rewards speed, scale, and momentary advantage. No one designed it that way, but systems that didn’t follow this path were outcompeted by those that did. And once societies organized around that logic, the maximum power principle acted like gravity: everything else tended to bend toward it. Power concentrates, feedback weakens, social group identities harden, and people’s time horizons shrink. You might be asking, how does this relate to the 80/20 rule? I suggest that maximum power explains the drive toward more throughput, and the 80/20 pattern describes how gains and leverage concentrate.
The second deep driver is hierarchy drift. In small groups, hierarchy was temporary and contextual. Someone in the group led a hunt, someone else knew plants, someone mediated conflict, social status moved based on the situation, and human authority shifted with the task. Importantly, if someone abused power, the costs were immediate and personal.
But that changes at larger scales, because surplus can be stored. Power can then persist beyond a season or even a generation. Control detaches from an individual human’s contribution and accrues to a broader class. And once that happens, the selection pressures shift. Systems stop selecting primarily for competence and begin selecting for strategies that win status and control. Power can be held onto rather than renegotiated daily, as it was in small groups. In small groups, coercive behavior is really expensive: people see it and they resist. But at scale it can be profitable, because the costs are borne outside the core group, and at a later time. So the traits that would’ve gotten you pushed out of your camp start to look like leadership.
If the metrics reward domination, persuasion, and risk-taking without associated accountability, a small minority can steer outcomes once they control the socioeconomic choke points. Their control over resources leads to control over narratives, because the ability to insulate oneself from negative feedback is critical to maintaining control.
If a whole bunch of negative consequences were to show up immediately, the group would handicap or even eliminate the current power holders. This is what I mean by hierarchy drift. Luke Kemp’s book, Goliath’s Curse, describes a similar phenomenon. Over time, people in control of institutions become less exposed to the consequences of their actions.
Once this dynamic sets in, it reinforces itself. Those with power shape the rules; rules that favor accumulating more power. Gradually, and then suddenly, the group begins selecting for dominance rather than stewardship. Cue the agricultural, industrial, and perhaps artificial intelligence revolutions.
In this layer, patriarchy and colonialism typically show up as downstream expressions of hierarchy drift. They’re real. But hierarchy drift isn’t unique to modern capitalism or any single political system. You see it historically in empires, corporations, bureaucracies, and religious institutions: anywhere that scale and stored surplus allow power to persist. Power then stops being a temporary tribal role and starts behaving more like property. Again, you don’t need evil intentions for this to happen.
A third deep driver, which I think synergizes with hierarchy drift, is delayed and distorted feedback. In small human systems, feedback used to be fast and pretty obvious. If you over-hunted your area, the game disappeared. If you poisoned the stream, people in your village got sick. So actions and consequences stayed closely tethered, allowing for learning and course correction. But again, scale broke that relationship.
As societies grow larger and more complex, actions and their consequences get routed through channels instead of being felt directly. Causes become harder to trace, and then responsibility is harder to assign. For example, we burned fossil fuels yesterday and today and most of the warming will show up decades and centuries in the future.
We degrade soils now and see yield losses generations down the line. We throw something in the garbage and we have no idea where the trash ends up or what pollution it causes in Indonesia. And when the true signal arrives, the system that caused the damage is often already locked in. This is how systems drift into overshoot without anyone intending it.
Furthermore, distortion of the signal matters as much as delay. Information travels upward through layers of institutions, incentives, and narratives before it reaches the decision makers. The real signals get softened, and at times filtered out entirely. What reaches the top is rarely the full picture, because it’s been sanitized, simplified, and, as we see everywhere today, made purposely optimistic in most situations.
Because modern systems are tightly coupled, small distortions propagate into large errors, and decisions get made using inaccurate information. By the time the need for intervention becomes obvious, the cost of course correction is much higher. So delayed feedback removes the natural brakes that once kept human behavior within bounds, and distorted feedback removes the clarity needed to respond effectively even when the problems are visible.
Together, delayed and distorted feedback allow accumulation, pollution, and expansion to continue long past the point where restraint would’ve been feasible and appropriate. You can see how this force interacts directly with what I just described as hierarchy drift, because the further decision makers are from consequences, the weaker the corrective signals become.
The fourth core driver is how humans tend to divide the world into “us and them”. That instinct is, of course, ancient; in small ancestral groups it ensured our survival. At those scales it often supported essential cooperation, which is why oxytocin isn’t only a love chemical but an ingroup-versus-outgroup turbocharger. At large scales the expression of this instinct changes: as societies grow beyond face-to-face relationships, identity replaces what trust and feedback used to do. Group membership becomes a shortcut for judgment, and soon loyalty starts to matter more than accuracy.
As systems scale, competition between groups intensifies: nations, companies, political identities, ideologies, platforms. Each group optimizes for its own position inside a shared system, and costs get pushed outward. Responsibility then gets diluted, which channels directly into the arms races and tragedy-of-the-unregulated-commons dynamics I mentioned above.
Global heating is a clear and obvious example as are fisheries and nuclear weapons build-up. Most humans are smart. We see the collective danger, but instead of wisdom or coordinated response, the first two questions quickly become “says who” and “who’s gonna pay for it”. So decisions increasingly get filtered through identity before they’re filtered through evidence.
This “us and them” instinct also shapes how information is processed. Signals that threaten a person’s group identity get discounted and warnings from outside the group are pretty much ignored or dismissed, even when scientifically grounded. This makes it impossible to comprehend the nuance, depth, and complexity of the human predicament. Thus, entire societies can understand a problem in theory and still fail to act.
The fifth core driver is how humans value time. As biological animals, we discount the future heavily, and this made sense for most of our history as a species. If you lived in a small group with high mortality and real uncertainty, prioritizing the near term helped you survive. That prioritizing still exists, but at a large societal scale this time bias has big downsides.
Modern systems prioritize short-term gains over long-term viability. Quarterly earnings and election cycles matter more than ecosystem health or infrastructure maintenance, and immediate brain stimulation matters more than reading or working out or meditating. The result? Decisions that look rational in the moment accumulate into outcomes that, over time, are unstable.
This shows up everywhere. Overfishing outweighs restraint, because the fish that you leave today might be caught by someone else tomorrow. Extraction outcompetes stewardship, because the payoff to leaders is faster today and more certain at the individual level. Steep discount rates feel normal and even logical, but at the system level they slowly hollow out the future.
This time mismatch creates a skew towards short-term optimization, even when most people increasingly understand the long-term consequences. This is different from delayed feedback I mentioned earlier. Delayed feedback is about when the consequences arrive. Time bias is about how we value the consequences versus the cost, and together they form a powerful social trap.
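The arithmetic of steep discounting is worth seeing directly. The million-dollar damage figure and the discount rates below are hypothetical illustrations, not forecasts; the formula is just standard exponential present-value discounting.

```python
# How steep discounting hollows out the future: simple present-value arithmetic.
# The damage figure and rates are illustrative assumptions, not forecasts.
def present_value(future_amount, annual_rate, years):
    """Standard exponential discounting: today's value of an amount arriving later."""
    return future_amount / (1.0 + annual_rate) ** years

damage_in_2075 = 1_000_000.0  # hypothetical ecological damage 50 years out
for rate in (0.00, 0.02, 0.07):
    pv = present_value(damage_in_2075, rate, years=50)
    print(f"discount rate {rate:.0%}: avoiding it is 'worth' ${pv:,.0f} today")
```

At a 7% discount rate, a common hurdle rate in finance, avoiding a million dollars of damage 50 years out is "worth" only about $34,000 today, which is exactly how decisions that look rational in the moment quietly trade away the future.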
Last, but definitely not least, is the sixth driving force, one that’s been normalized for so long it feels invisible to most of us. It’s how narrowly we define the human relationship with nature at large scales.
Simultaneously, as power solidifies and accumulates with increasing scale, our sense of being part of nature tends to shrink, because power is not only power over other nations, competitors, and other humans, but also dominion over the natural world. Dr. Jekyll loved nature. Mr. Hyde does too, but primarily as cheap inputs and waste sinks on spreadsheets.
In physical terms, functioning ecosystems are the most valuable asset on any civilization’s balance sheet. Our institutions today just don’t know how to count them, so when energy, materials, or human attention are allocated, the living world is culturally and economically treated as static and infinitely available.
The living world is also treated as politically silent. And once that boundary is in place, it’s hard to remove, because drawdown of shared resources becomes rational and growth requires ecosystem liquidation. At civilization scale, the living world absorbs this damage quietly, until it no longer can.
When the pushback finally arrives, it will likely arrive at scale, too. Where does this leave us other than a bit depressed?
So we have symptoms: global heating, biodiversity loss, and soil degradation, with seven of nine planetary ecological boundaries currently exceeded; rising inequality and poverty alongside all-time stock market highs; geopolitical tensions and war; polarization, attention fragmentation, and psychological strain; widespread depression, anxiety, loneliness; and AI.
We have six systemic patterns creating these symptoms:
- Power Law Concentration
- Overshoot and Depletion
- Arms Races
- Rebound Effects and Jevons Paradox
- Tragedy of the Unregulated Commons
- Complexification/Simplification
We have six deep drivers creating the six patterns:
- Maximum Power Principle
- Hierarchy Drift
- Delayed and Distorted Feedback
- Us-and-Them Instinct
- Steep Time Discounting
- Narrow Definition of the Human Relationship with Nature
Perhaps you might disagree with the specific symptoms or patterns or drivers I’ve outlined, but I do think separating the more-than-human predicament into these layers is useful, because it helps us avoid arguing at the wrong level.
The truth is, despite our situation, none of the forces I’ve described here are bugs in the human system. They’re extensions of traits that helped us survive and cooperate for most of our history. And this is where the Jekyll and Hyde framing is important. I think, in small groups with visible consequences and tight feedback, our instincts kept us more or less in bounds.
Power was negotiated, status was earned, and mistakes stayed local. That was our Homo sapiens Dr. Jekyll phase. Then scale increased, energy abundance skyrocketed, tools began to outlast human relationships, and those same instincts stopped being self-limiting. Power could be accumulated instead of renegotiated.
Complexity and distance obscured the consequences from lived experience. This is our late-stage Mr. Hyde phase: not evil, but what happens when ancient instincts operate in systems that move faster and reach farther than our moral intuitions were shaped for. The ground we’re standing on today is one where Mr. Hyde has slowly but firmly taken the wheel.
Okay, this has been long, and perhaps a bit of a downer, but I want to close with a few questions to consider. Are we trying to solve the right layer of the problem, or are we repeatedly optimizing the symptoms while reinforcing the forces underneath?
What if what I’ve presented here is generally correct? What would it mean to design systems that recognize and assume human fallibility rather than human virtue? Systems that work even when people are tired and scared and status seeking and short-term oriented.
If many of our hardest problems come from instincts that once helped us survive, what does responsibility mean at a civilizational scale? Not blame or guilt, but responsibility. How do we build systems where our ancestral Jekyll traits can persist even when scale is very large? We’ve seen partial answers in Mondragon cooperative enterprises, in common-pool resource management, and in those cultures that deliberately shorten feedback loops.
But are those answers necessarily, fundamentally local, or might there be a way to carry them forward without scaling them up into Mr. Hyde? What does agency look like when no one is really in control? What kind of agency is still available to us?
Maybe understanding these forces doesn’t give us a steering wheel, but it does give us better brakes.
When do we choose to slow down and how, or do we only respond after the crash?