
The Importance of Decoupling and the
Emergence of Part-Whole Relationships

The importance of the concept of decoupling has been sinking in for me more and more lately. It seems to be one of the de facto strategies for managing complexity. It's arguably a primary focus of computer science, on both a hardware and a software level. But it goes even beyond that: when physicists use a Fourier transform to solve the Klein-Gordon equation, they are using a decoupling strategy. When a group of workers splits apart to focus on separate tasks, they are using a decoupling strategy. When you write using paragraphs rather than one huge wall of text, you are using a decoupling strategy.

Complexity seems to generally be a matter of nonlinearity1. In a linear system, you can consider each "part" individually; in a nonlinear system, you can't, because the parts interact nontrivially. In other words, they are coupled. Decoupling, then, is finding a clever change-of-basis, a new perspective, so that the system can be represented linearly (or "more linearly", I guess?). The ability to break something down into parts and consider them individually is extremely useful; it allows for specialization, and it reduces the number of factors you have to consider at once. Computers would not be anywhere near as advanced as they are if we couldn't separate them into discrete parts, and then have people specialize in optimizing the functionality of each of those parts in isolation.
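To make the change-of-basis idea concrete, here's a toy sketch of my own (not from any particular source): two variables whose updates depend on each other, and a rotated coordinate system in which each new variable evolves entirely on its own.

```python
# Two coupled variables: x' = a*x + b*y, y' = b*x + a*y.
# In the original basis, updating x requires knowing y, and vice versa.
# In the new basis u = x + y, v = x - y, each variable updates alone.

def step_coupled(x, y, a=0.5, b=0.3):
    # One update in the original (coupled) coordinates.
    return a * x + b * y, b * x + a * y

def step_decoupled(u, v, a=0.5, b=0.3):
    # The same update after the change of basis:
    # u simply scales by (a + b), and v by (a - b).
    return (a + b) * u, (a - b) * v

x, y = 2.0, 1.0
u, v = x + y, x - y

for _ in range(5):
    x, y = step_coupled(x, y)
    u, v = step_decoupled(u, v)

# The change of basis commutes with the dynamics: tracking (u, v)
# independently gives the same answer as tracking the coupled pair.
assert abs(u - (x + y)) < 1e-9
assert abs(v - (x - y)) < 1e-9
```

The numbers (a = 0.5, b = 0.3) are arbitrary; the point is only that the "clever perspective" turns one two-dimensional coupled problem into two one-dimensional independent ones.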

As the number of degrees of freedom in a system increases linearly, the number of states the total system can take on increases exponentially2. In Brain of the Firm, Stafford Beer called these high-variety situations "unthinkable systems". Sometimes it's called State Explosion. Sometimes it's called "spaghetti code" ("spaghetti" implying everything is "tangled up" haphazardly, i.e., coupled). The point is that, in nonlinear systems with many parts, complexity becomes overwhelming in a hurry. In linear (decoupled) systems, it only increases... linearly.
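The exponential-versus-linear gap is easy to put numbers on. A quick illustration (hypothetical components, just to show the growth): with n binary components, a fully coupled system forces you to reason about every joint state, while fully decoupled components can be considered one at a time.

```python
# Coupled: 2**n joint states must be considered together.
# Decoupled: each of the n binary components can be analyzed alone,
# for a total of 2*n states considered.

for n in (10, 20, 30):
    joint = 2 ** n        # the "variety" of the whole system
    separate = 2 * n      # total states across independent parts
    print(f"n={n}: coupled={joint}, decoupled={separate}")
```

At n = 30, that's over a billion joint states versus sixty separate ones, which is roughly what "unthinkable system" is pointing at.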

So we understand the importance of decoupling. The question is, how do we do it? Software designers will give you many different answers for different situations. Is there some unified principle that all these answers could be derived from? Is there a general strategy for effective decoupling?

1 Linearity in this sense is sometimes called Separability. Nonlinearity is sometimes called Synergy.
2 In cybernetics, the number of states is called the variety of the system.
(Using footnotes is a decoupling strategy!)

Probing Intuitions

Systems have states, and they evolve over time according to some function which maps one state onto the next (or times onto states, if you prefer). I wrote about this more thoroughly in a previous post. This model, at its core, considers only the state of the system as a whole, not intrinsically considering it to be composed of parts.

But that isn't how we think of the world. We think of it in terms of objects, subsystems, each with individual states evolving according to their intrinsic tendencies, plus their interactions with other objects. How does one make this jump from "system-as-a-whole" to "series-of-interacting-parts"? It seems that there's more than one way to do it. I could, for example, consider my left hand, the chair I'm sitting on, and Jupiter to be "one object", and then analyze their collective behavior and interactions with other objects. But that would be silly, and unenlightening, because those three things have very little to do with one another.

"Having to do with one another" is basically another way to say that things are coupled. The question of how best to consider a system as composed of parts is secretly the same as the question of how to effectively decouple its degrees of freedom. In the general case, we want to be able to identify which degrees of freedom are deeply related, and which are shallowly related or just unrelated. The sets of deeply related ones can be conceptually bundled together as "objects" with shallower relationships between them. In other words, we decouple by looking for the subsystems which interact strongly (or complex-ly) on the inside, but weakly (or straightforwardly) on the outside. That is my suspicion, anyway.
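That suspicion can be sketched as code. This is my own toy version of the strategy, not an established algorithm: given pairwise coupling strengths between degrees of freedom, bundle together everything connected by strong couplings and treat the weak links as "outside" interactions.

```python
# Group degrees of freedom into "objects": connected components of the
# graph whose edges are couplings stronger than some threshold.

def find_objects(coupling, threshold):
    """Return groups of indices linked by strong couplings."""
    n = len(coupling)
    seen, objects = set(), []
    for start in range(n):
        if start in seen:
            continue
        # Flood-fill along links stronger than the threshold.
        stack, group = [start], set()
        while stack:
            i = stack.pop()
            if i in group:
                continue
            group.add(i)
            stack.extend(j for j in range(n)
                         if coupling[i][j] > threshold and j not in group)
        seen |= group
        objects.append(sorted(group))
    return objects

# Degrees of freedom 0 and 1 interact strongly, as do 2 and 3,
# but the two pairs only interact weakly with each other.
coupling = [
    [0.0, 0.9, 0.1, 0.0],
    [0.9, 0.0, 0.0, 0.1],
    [0.1, 0.0, 0.0, 0.8],
    [0.0, 0.1, 0.8, 0.0],
]
print(find_objects(coupling, threshold=0.5))  # → [[0, 1], [2, 3]]
```

A sharp threshold is of course a crude stand-in for "deeply versus shallowly related", but it shows the shape of the idea: objects fall out of the coupling structure rather than being given in advance.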

In the physical world, this has a lot to do with position in space, because the physical world is local, so it's harder for things to interact from a distance. Hence it's obvious I shouldn't consider my left hand and Jupiter as two parts of the same object: there is a huge spatial separation between them. Locality is a decoupling strategy (sort of).

Now, my left hand may rest on top of my keyboard, but despite the lack of spatial separation, I still consider them to be separate objects. This is, intuitively, because my hand and the keyboard are not "stuck together". That means that, if I move my hand, the keyboard will not move with it: their positions in space are largely decoupled, so they are considered separate objects. On the other hand, if I moved the left side of the keyboard, the right side would likely move with it: their positions are tightly coupled, so they are considered two parts of the same whole.

What about the keys on the keyboard? They can all move up and down independently of the keyboard as a whole, so they are sort of considered separate objects. But if I push the whole keyboard, all the keys will move with it. It immediately seems that there is a hierarchical relationship between the keyboard and the keys concerning their status as distinct objects.
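The hand/keyboard/keys intuition maps neatly onto a part-whole structure in code. A small illustration of my own (names are made up): moving a whole carries its parts along, while a part resting on something is free to move alone.

```python
# A toy composite: "objects" that may contain sub-objects.
# Moving the whole moves every part; parts not contained in a
# whole are decoupled from its motion.

class Part:
    def __init__(self, name, position, parts=()):
        self.name = name
        self.position = position   # position along one axis, for simplicity
        self.parts = list(parts)

    def move(self, dx):
        # Moving the whole carries every contained part with it.
        self.position += dx
        for part in self.parts:
            part.move(dx)

keys = [Part(f"key{i}", position=i) for i in range(3)]
keyboard = Part("keyboard", position=10, parts=keys)
hand = Part("hand", position=10)   # rests on the keyboard, not attached

keyboard.move(5)                   # the keys move with the keyboard...
print([k.position for k in keys])  # → [5, 6, 7]
hand.move(2)                       # ...but the hand moves alone
print(keyboard.position)           # → 15
```

Since a `Part` can contain further `Part`s, the same structure nests indefinitely, which anticipates the scale recursion discussed next.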

So let's consider scale. My left hand might intuitively be considered an "object". But so would my body as a whole. And so would the cells inside of it, and the particles inside those. It seems that there is a recursive structure here. Objects are composed of objects, which are composed of objects3. After all, since they are simply subsystems, they are systems in their own right, and can themselves be broken into subsystems. There seems to be a sort of scale symmetry here.

3 Much like how trees are composed of trees.

Breakdown

In software design, what I'm calling "objects" are usually examples of abstractions. They are considered "abstract" because they are bundles of "concretions"; that's to say, they are larger-scale objects, as opposed to the built-in elements of the programming language, which are arbitrarily considered as the "concrete" stuff4. This concept of scale recursion is clearly relevant.

Also relevant, of course, is decoupling. Abstractions tend to be "leaky"; that's to say, programmers are not always so good at keeping things decoupled.

Deleuze for Developers offers an explanation of Deleuze and Guattari's concept of "territory" which I quite like5. "Abstractions" are "assemblages", and when they "leak", they are being "deterritorialized". The point here is that the ideal decoupling, the sensible division of the system into separate objects, is not static. It changes with time, of course, and tracking these changes opens up a huge space of analysis. Borders are broken down, objects melt together, or evaporate, scientific disciplines split apart or cross-over, languages colexify and cultures blur together, everything is taken apart and put back together as capitalism deterritorializes and entropy increases...

I will stop here, because this post is pretty long and dense already, and because we've been teetering on the edge of my understanding for a while now. I don't really know these subjects well enough to speak about them.

4 ...even though those things are also abstractions built from machine code, which is an abstraction built from the hardware, which is an abstraction built from rigid bodies and electricity, which are abstractions built from atoms and electromagnetism, which are abstractions built from quantum field theory, which is an abstraction built from... something that hasn't been discovered yet, probably. Although most would probably consider QFT more "abstract" than computer hardware...
5 I don't know whether this view of assemblages and deterritorialization is faithful to D&G's intentions.

Another note: Similar to, but distinct from, the concept of decoupling is decentralization. That is, not "putting all your eggs in one basket", but rather spreading your system out such that there's no one well-defined "heart" that can be attacked to take the whole thing down. I think I would describe decentralization as more of a "coupling strategy", intentionally taking advantage of the complexity of coupled systems for the sake of stability. I suppose it's similar to using a complex math problem for encryption, in this sense.