back

Reductionism = "Gainy" Compression

"Reductionism" here is referring to the scientific tendency to "reduce" phenomena to simpler underlying laws*. Newton's laws explain (essentially) all planetary motion in a single equation, for example. The point is to take some set of phenomena you observe in the world, and write down a simple theory which captures all of its behavior. In doing so, you are often able to explain a much broader range of phenomena than just that which inspired the theory. As you integrate more and more phenomena into the theory, it at first becomes more complicated. But eventually, you notice even deeper patterns, and the theory becomes simple again, while becoming capable of an even greater breadth of explanatory power. The theories become simultaneously simpler, deeper, and broader, despite the seeming conflict between these qualities.

This is an interesting characteristic of scientific progress. Good theories make predictions not originally accounted for by the theorizer, and when those predictions are verified by experiment, there's a sense that the equations are smarter than the person who wrote them down. Einstein didn't know about black holes when he developed his theory of General Relativity. Only later was it discovered that black holes can exist as a consequence of the equations, and later still were they actually observed.

By "Compression" I mean data compression; as a practical example, the string "1111111111111111111111111111111111111111" can be compressed as "40 ones", because "40 ones" captures the same information but with fewer characters. We can achieve this by noticing a pattern in the data. The string of 1's has a lot of redundancy, so you can get the same point across by describing the pattern rather than the data itself. Notice how, in doing this, we are abstracting away from the raw data.

That would be a "lossless" compression, the kind that represents the exact same information as the original data. "Lossy" compression, on the other land, loses information, only approximating the original data. Jpeg's are an example of lossy compression (as opposed to png's, which are lossless). They don't exactly preserve each pixel on the grid; rather, they find a simpler (compressed) pattern of pixels which looks roughly the same as the original image.

My point here is that Reductionism is sort of like "gainy" compression: You compress a bunch of observations into some simple law, only to find that the law explains many more observations than you intended. The development of scientific theories can be understood as noticing and capturing patterns in data; AKA, compression. (Notice the conceptual similarity between "reduction" and "compression" in ordinary contexts.) Doing so abstracts our theory away from the raw experience we are trying to describe, a tendency which amplifies as science progresses and becomes more esoteric, with abstractions layering on abstractions, drilling deeper and deeper toward some mythical theory-bedrock. This kind of compression is neither lossless nor lossy: It is "gainy". You get out more than you put in.

The effectiveness of Reductionism can ultimately be attributed to the fact that nature is so damn compressible.

* I am not referring to Reductionism in the sense of linearity; that is, the idea that a whole can be fully understood simply by understanding each of its parts in isolation.