
Social Systems are not People

It's common to conceptualize states, governments, companies, etc. as though they were individuals with human-like motivations, relations, even emotions. Perhaps this is understood as approximate. It's clearly an anthropomorphization: humans intuitively understand humans, and tend to understand other systems by pretending they are like humans.

But it should be obvious that they aren't. It can't be taken for granted that the behavior of a system-as-a-whole will resemble the individual behaviors of its components. For groups of humans to behave like individual humans would be a remarkable thing, and, like most remarkable things, it generally is not the case.

The currently popular humanist conception, I think, is that there exists a "rational agent" structure to which all "intelligent" systems will converge, including humans and social systems. This would feel more plausible to me if humans really bore any resemblance to rational agents. If there is such a convergence, it's clearly incredibly weak.

What of irrational agents? Oftentimes, these things are so broadly defined that any reasonably complex dynamical system displays "agential" behavior. The intention is for "agential" behavior to refer to goal-seeking or teleological processes. But defining something as goal-seeking is only meaningful if you can tell me what that goal is (or give me some way to determine it). Are Newtonian particles rational agents with a desire to slide down potential gradients?
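
To make the rhetorical point concrete: in the overdamped limit (inertia negligible, friction coefficient \( \gamma \)), Newton's law for a particle in a potential \( V \) reduces to gradient descent,

\[ \dot{x} = -\tfrac{1}{\gamma}\,\nabla V(x), \]

so that, read teleologically, the particle "wants" to minimize \( V \). This is exactly the move that lets almost any dissipative system be described as goal-seeking.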

Arguably, most everyday systems are goal-seeking in the cybernetic-autopoietic sense of tending to perpetuate themselves. This is a basic consequence of the "0th Law": that which can persist, persists, and that which cannot, does not. Hence, the world is full of self-reinforcing processes*, be they simple stable states, oscillators, or organisms. In hostile environments, self-preservation requires complex defenses, so a kind of "intelligence" is built up.
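
A minimal cartoon of this, assuming nothing more than a one-dimensional flow: in

\[ \dot{x} = x(1 - x), \]

both \( x = 0 \) and \( x = 1 \) are fixed points, but only \( x = 1 \) resists small perturbations. Wait long enough and the state you find is the one that can persist; the other exists only on paper.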

I suppose, then, that this complex self-preserving tendency is the primary resemblance of social systems to individuals. That complexity serves the additional function of making prediction difficult, leaving us free to invent a variety of unverifiable post-hoc explanations, projecting our humanist intuitions without resistance.

* There is subtlety with regard to what exactly it means to "reinforce" "oneself". I would like to understand this better.