I have been interested in Claude Shannon's information theory for some time. In A Mathematical Theory of Communication, Shannon pointed out that a stream of symbols (information) can be quantitatively measured by its entropy, or randomness. The stuff that isn't entropy is redundant and can be removed (via compression) without losing the essential nugget of information. Shannon provided a very precise way of thinking about entropy and its counterpart, redundancy.
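Shannon's measure is simple enough to sketch in a few lines of Python. This is a minimal illustration of his per-symbol entropy formula, H = -Σ p·log₂(p), applied to the character frequencies of a string (the function name is mine, not from any particular library):

```python
import math
from collections import Counter

def shannon_entropy(text):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A repetitive (redundant) string carries fewer bits per symbol
# than a varied one:
print(shannon_entropy("aaaaaaab"))   # low entropy, about 0.54 bits/symbol
print(shannon_entropy("abcdefgh"))   # high entropy: 3.0 bits/symbol
```

The repetitive string compresses well precisely because most of it is redundancy; the varied one is nearly all entropy.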
Much of the information we deal with in everyday life contains a good deal of redundancy. The English language, for example, can be mostly understood if you remove the vowels. Cnsdr ths sntnc, cn y rd t? If you remove the consonants, on the other hand, it is much harder to understand. So the vowels are somewhat redundant (compared to the consonants). Popular music also contains a great deal of redundancy, as does a lot of art.
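The vowel-stripping trick above is easy to mechanize. A tiny sketch (my own throwaway helper, nothing standard) that produces exactly that kind of consonant skeleton:

```python
def strip_vowels(sentence):
    """Drop vowels, keeping the (less redundant) consonant skeleton."""
    return "".join(ch for ch in sentence if ch.lower() not in "aeiou")

print(strip_vowels("Consider this sentence, can you read it?"))
# → Cnsdr ths sntnc, cn y rd t?
```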
One of the purposes of adding redundancy to a stream of information is to make it easier for humans (and computers, in some cases) to digest. Although the sentence without vowels can be read, it is harder to read. On a noisy transmission channel, the redundancy enables the reader to correct errors that may have been introduced into the stream of information.
When I read the first few sentences of a novel in a bookstore, I am often unconsciously judging its entropy - some novelists write in a style which is unnatural to me, and contains more entropy than I would prefer. I find this prose dense and tiring to read. Other novelists write in a style which is too simple.
Much of the software I write, such as the kaleidoscopes, employs random numbers to generate content. The random numbers provide a nugget of entropy, and the rest of the software modifies that random kernel in various ways. A kaleidoscope program adds symmetry (a kind of redundancy). A music-writing program might add repetition or a matching contrapuntal melody (also a kind of redundancy). Kaleidoscopic images and other art can be beautiful because they achieve a balance of entropy and redundancy.
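The entropy-plus-symmetry idea can be sketched in a few lines. This is not the actual kaleidoscope code, just a minimal illustration of the principle: a handful of random seed points (the entropy) are rotated around the origin to produce a many-fold symmetric pattern (the redundancy):

```python
import math
import random

def kaleidoscope_points(n_seeds, folds):
    """Turn a few random points (entropy) into a symmetric pattern
    (redundancy) by rotating each seed around the origin."""
    points = []
    for _ in range(n_seeds):
        r = random.random()                      # the random nugget
        theta = random.random() * 2 * math.pi / folds
        for k in range(folds):                   # symmetry multiplies it
            a = theta + k * 2 * math.pi / folds
            points.append((r * math.cos(a), r * math.sin(a)))
    return points

pts = kaleidoscope_points(n_seeds=5, folds=6)
print(len(pts))  # 30 points generated from only 5 random seeds
```

Five seeds become thirty points: the extra twenty-five carry no new information, which is exactly what makes the result look patterned rather than noisy.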