Information theory, sparked by Claude Shannon, is used to quantify textual meaning. At its heart lies the notion of entropy, which, in information theory (as opposed to thermodynamics), means the minimum number of symbols needed, on average, to convey a given message. It studies ways of encoding symbolic text into a system of different symbols, most often binary digits.
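To make that concrete, Shannon's entropy is H = -Σ p(x) log₂ p(x): the average number of bits per symbol that any encoding must spend. Here is a minimal sketch in Python, assuming a zeroth-order model in which each character is drawn independently from the text's own frequency distribution; the function name and sample strings are my own, for illustration:

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Average bits per character needed to encode text,
    treating each character as an independent draw from the
    text's own frequency distribution (a zeroth-order model)."""
    counts = Counter(text)
    total = len(text)
    return -sum(n / total * log2(n / total) for n in counts.values())

# A repetitive string carries fewer bits per symbol than varied prose:
print(shannon_entropy("aaaaaaab"))                # ~0.54 bits per character
print(shannon_entropy("perfection is achieved"))  # ~3.7 bits per character
```

A zeroth-order model is of course an underestimate of the structure in real language: it knows nothing of spelling, grammar, or context. Shannon himself, using human readers as predictors, estimated that English carries roughly one bit per character once that knowledge is accounted for.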
The beauty of information theory is that it chooses to talk about meaning as being composed of other symbols, and by seeing it in that way it can say a great many productive things about how to convey that meaning. It neatly dodges the philosophical question of what meaning is. All of the standard information-theoretic examples involve mathematical models of coin tosses, balls hidden under boxes, and the like. But when the meaning in question is language, not data, and literary language at that, how is it possible to define the entropy of a sentence that forces you to reconsider your interpretation of an entire novel? It calls to mind the old Antoine de Saint-Exupéry adage:
"Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away."
The practice of writing has always been concerned with discovering the different potential ways to express your ideas and choosing the best one among them. Perhaps great works of literature approach the entropy of the ideas they convey. If there is nothing left to take away, and you have achieved Saint-Exupéry's perfection, Claude Shannon's entropy, then you have a work packed dense with meaning, brimming with ideas. When a text would suffer from any further revision, that text is done.