Both order and disorder, at their extremes, carry little useful information. Think of a white wall, a mural and an old worn wall, for example:
• The white wall carries little information; it basically just says it is a wall, white, and of a certain size.
• The old, worn and cracked wall, with random patterns all over the place, carries little actual information either, just lots of noise.
Those are the extremes. The mural, on the other hand, carries all the extra information painted on the wall: colours, shapes and so on.
Higher amounts of useful information exist in between complete order and complete chaos. However, the information capacity of any system is "the number of distinguishable states a system can have" (from New Scientist, further quoted and attributed in “A Brief History Of Information” on my site). In other words, the system's potential entropy.

"Entropy is the measure of a system's energy that is unavailable for work. Since work is obtained from order, the amount of entropy is also a measure of the disorder, or randomness, of a system. The concept of entropy was proposed in 1850 by the German physicist Rudolf Clausius and is sometimes presented as the second law of thermodynamics." (2003)

But since entropy is a measure of disorder, it is important to understand that the potential storage capacity of any system is the same as the potential entropy of the system, not its actual entropy.

Imagine a large grid with lots of squares, each of which can be any colour. The more squares there are (and thus the opportunity for a noisy mess), the more opportunity there is for a highly informative design, even though the informative design may only 'use' a small part of the possibilities of the colourful grid.
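The grid idea can be sketched in a few lines of Python. This is a minimal illustration, not from the original text: it assumes a grid of `squares` cells, each of which can take one of `colours` values, so the number of distinguishable states is `colours ** squares`, and the potential entropy (capacity in bits) is the base-2 log of that.

```python
from math import log2

def grid_capacity(squares: int, colours: int) -> int:
    """Number of distinguishable states the grid can hold."""
    return colours ** squares

def grid_capacity_bits(squares: int, colours: int) -> float:
    """The grid's potential entropy, in bits: log2 of its state count."""
    return squares * log2(colours)

# A tiny 4-square, 2-colour grid already has 2**4 = 16 states (4 bits);
# capacity grows with the *possible* mess, not with what is drawn on it.
print(grid_capacity(4, 2))       # 16
print(grid_capacity_bits(4, 2))  # 4.0
```

Note that the capacity depends only on the number of squares and colours, never on which design is actually painted, which is exactly the potential-versus-actual entropy distinction above.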

What is information for one person may well be a mess to another; the normal, everyday definition of information may more properly be thought of as knowledge. "It is important to note that "information" as understood in information theory has nothing to do with any inherent meaning in a message; it is rather a measure of the predictability of each transmitted message" (2003).
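That "predictability, not meaning" point can be made concrete with a small sketch (my own illustration, using the standard Shannon entropy formula, with made-up example strings). The function looks only at symbol frequencies; it is completely blind to what the message means.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average bits per symbol, computed purely from symbol
    frequencies -- nothing here knows what the message *means*."""
    counts = Counter(message)
    total = len(message)
    # Each symbol with probability p contributes p * log2(1/p) bits.
    return sum((c / total) * log2(total / c) for c in counts.values())

# A perfectly predictable message carries zero bits per symbol...
print(shannon_entropy("aaaaaaaa"))  # 0.0
# ...while a less predictable one carries more, meaningful or not.
print(shannon_entropy("abcdabcd"))  # 2.0
```

Gibberish and poetry score identically if their symbol statistics match, which is why information theory's "information" is not the everyday kind.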

It is surprising to see such a hard-nosed, basic-physics view of information. In the academic world of psychology, though, information has a more human-focused meaning: “information is defined as the reduction of uncertainty”, as quoted from Wickens & Hollands (2000). But who are they quoting? Our hero, Claude Shannon, with Weaver, from ‘The Mathematical Theory of Communication’.

The upshot? It's easy to spin off into abstract-land and get lost when analysing information, trying to get to the essence of ‘pure’ information, whatever that is. But you know what? Information, by its very nature, by its essence, has to be useful (in some way, for some purpose, by someone, at some time; there is no absolute definition of usefulness here). Otherwise it's just noise. Can't get more pragmatic than that!

So, information = something which is useful to someone/something. This will be useful to keep in mind, repeatedly, when trying to work out a system for allowing people to gather information from a long series of discourses.

“Information: noun.
1: the communication or reception of knowledge or intelligence.
2 A: (1): knowledge obtained from investigation, study, or instruction
(2): intelligence, news
(3): facts, data.
B: the attribute inherent in and communicated by one of two or more alternative sequences or arrangements of something (as nucleotides in DNA or binary digits in a computer program) that produce specific effects.
C: (1): a signal or character (as in a communication system or computer) representing data.
(2): something (as a message, experimental data, or a picture) which justifies change in a construct (as a plan or theory) that represents physical or mental experience or another construct.
D: a quantitative measure of the content of information; specifically: a numerical quantity that measures the uncertainty in the outcome of an experiment to be performed.” (2003)