Does information really diminish? Looking at social media, information certainly seems to grow: new content is endlessly added. Given this enormous growth in content, what, then, is diminishing? Let's illustrate: you witness an incident and describe it to somebody, who in turn passes it on, and so forth. Your original account swells the more it is retold. It gains details and color; perhaps it even gets juicier. But the information about what really happened diminishes at each step. That is the Law of Diminishing Information (LDI):
Secondhand information is never better than firsthand.
As Chico Marx said: "Who ya gonna believe, me or your own eyes?" The old-fashioned news media exemplify the firsthand case: a reporter tells you what he sees for himself. This information tends to be true. Information on social media can be true too, but more likely by accident.
The LDI is statistical in nature. You tell someone the PIN code is 1234. The listener hears it as 1254 and repeats that. The next recipient happens to hear it as 1234 again, and the correct information is restored. Yet with overwhelming probability, a mishearing leads to more errors rather than to a correction. We end up with a sequence of random digits, carrying no information whatsoever about the original PIN.
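The decay can be sketched with a small simulation. The parameters here are assumptions for illustration: each retelling mishears each digit with probability 0.2, replacing it with a random other digit. The fraction of chains whose first digit still matches the original falls toward the pure-chance level of 1/10 as the story is retold:

```python
import random

random.seed(0)

def retell(digits, p=0.2):
    """One retelling: each digit is misheard with probability p,
    becoming a uniformly random different digit."""
    out = []
    for d in digits:
        if random.random() < p:
            out.append(random.choice([c for c in "0123456789" if c != d]))
        else:
            out.append(d)
    return "".join(out)

def match_rate(steps, trials=20000, pin="1234"):
    """Fraction of independent chains whose first digit still
    matches the original after `steps` retellings."""
    hits = 0
    for _ in range(trials):
        msg = pin
        for _ in range(steps):
            msg = retell(msg)
        hits += (msg[0] == pin[0])
    return hits / trials

for n in (1, 5, 20):
    print(n, round(match_rate(n), 3))
```

After one retelling about 80% of chains are still correct; after twenty, the rate is indistinguishable from guessing a digit at random, matching the claim that the sequence ends up carrying no information about the original PIN.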
The same statistical principle governs the mechanics of large numbers of particles. In physics, the LDI is known as the second law of thermodynamics. In quantum mechanics, this statistical phenomenon is called decoherence. Claude Shannon's communication theory builds on a special case of the LDI in which information is measured by entropy (i.e., disorder). When information is instead measured by Fisher information (i.e., sharpness), the LDI leads to Schrödinger's wave equation in quantum mechanics. The LDI also applies to automatic control theory, to von Neumann's theory of games, and to Einstein's special theory of relativity.