A Brief History Of Information. A continuously evolving article.

 

This article traces information from the beginning through the ways we have transformed it and given it new dimensions: first through language, then writing, early computers, networked computers and beyond. It is by no means intended to be anything but a rough overview highlighting what I feel are the salient periods.

 

The first information : The first information was binary. Before anything else could be determined or described about the universe, before there was a potential for abstraction or even interaction, there was only a universe which could proclaim: "I am." Before the big bang, there was nothing, nothing in terms of our current universe anyway. Immediately as the big bang happened, there was something: Off. Then on. Without any further elaboration.

Since then the universe has blossomed from a soup of elementary particles to unimaginably many rich worlds with living creatures: us, endowed with the capability to invent new worlds at will in the dimensions of our minds, culture and tools.

Since then information has grown in scale and complexity, much like the physical world which embodies and represents it has grown from physics to chemistry to biology and so on. The stages are not steps from reality to abstraction, but different levels of reality depending on scale. So information has moved, from elementary interactions at quantum scales to the macroscopic scales we inhabit to the worlds our minds and imaginations give rise to.

Information and the physical world are often treated as mind and brain, where information is an abstraction of the physical world and exists in a separate world from the physical world it represents: the tired argument of a mind outside the brain, Descartes' dualism.

This way of looking at information I will dismiss simply as a matter of scale.

The very same principles which separate physics from chemistry, and a single car from city road planning, are the principles of scale which give rise to mind from matter and information from interactions.

 

~

 

Information's natural state : Information's natural state used to be one of motion, of activity. Information is generated by interactions, information is interaction, as without comparison, without a context, without interaction, there is nothing.

There is no temperature with only hot. There is no darkness without light. In quantum physics a particle has no identity until it meets another and its superposed wave function collapses.

It is like that with all information. Information cannot exist without context.

Information is not an abstraction; it is not something separate from the physical world, as it is often discussed as being.

And for those snoozing through this paper, thinking it is way too esoteric and over-analytical for its own good, it is important to point out that information has to be useful to exist. Not that it should be useful, but that it has to be useful. Useful in a specific context. If it is not, it either does not exist, or it becomes negatively useful. As in noise.

The first information was incidental. A cratered moon records the history of the impacts but it does so by accident, as a by-product of the events, not for the events.

 

~

 

Life : Life appeared and information could be stored in useful form. Life is different from non-life in that it stores information about how to replicate something, how to replicate itself. Information now served a purpose, where before it had only existed as a record, an imprint. Not to say that these imprints, these records, didn't have any effect.

Life by itself is pretty much a steam train chugging along expanding where conditions are ripe and faltering where they are not. Simple life just goes on and on as best it can.

 

~

 

Consciousness & Language : Consciousness & language are hard to separate. Language is dynamic. "Language: A system of conventional spoken or written symbols by means of which human beings, as members of a social group and participants in its culture, communicate." (Encyclopedia Britannica) Writing is not dynamic, which is pretty much the whole point of writing, locking information down and counting on it staying that way.

 

~

 

Writing : Along came one of the greatest inventions of mankind: writing, the freezing of information.

Sumerian bookkeepers needed a way to keep track of agricultural goods. They began with tokens for simple bookkeeping purposes, which developed into written tablets on which the graphs of the script stand for morphemes of spoken Sumerian. The tokens are thought to date back to as early as 8000 BC, about the time that hunter-gatherer societies were giving way to an agricultural way of life.

That was all well and good for many uses, but many syllables were left out. The first writing system consistently based on the sound structure of a language was Linear B, a Mycenaean Greek orthography developed around 1400 BC. Writing was becoming more like regular speech.

The final stage in the evolution of writing systems was the discovery of the alphabetic principle, the system of building the syllables from their consonantal and vowel sounds. According to the British linguist Geoffrey Sampson: "Most, and probably all, 'alphabetic' scripts derive from a single ancestor: the Semitic alphabet, created sometime in the 2nd millennium (BC)." (Encyclopedia Britannica) Even closer to speech.

But not quite there. Writing: "Form of human communication by means of a set of visible marks that are related, by convention, to some particular structural level of language. This definition highlights the fact that writing is in principle the representation of language rather than a direct representation of thought..." (Encyclopedia Britannica)

Since writing doesn't have the precision or accuracy of objective representation, or even a verbatim representation of human perspective, not only is written language inaccurate at the time of writing, it only gets worse as time goes by and language moves on. The context slowly gets stretched away from the information until, at one point, it breaks down entirely and the writing can no longer be understood.

So information could be frozen, but not in a perfect, complete form, as that would require freezing and including all of the information's context, which would have to be all information everywhere; it would have to be everything, and if that were done the information would of course cease to exist.

Information could survive, with writing, in an incomplete form pretty much permanently, degraded only by its physical media and the outside world's ability to access it, altered through changes in language, customs and culture as well as physical accessibility. From the human perspective, though, written information becomes, for most practical purposes, solid.

"In the Phaedrus, Plato argued that the new arrival of writing would revolutionize culture for the worst. He suggested that it would substitute reminiscence for thought and mechanical learning for the true dialect of the living quest for truth by discourse and conversation." Marshall McLuhan 1954 "From "Essential McLuhan"Edited by Eric McLuhan & Frank Zingrone p 285.)

The only time it would thaw was when someone would read it and re-introduce it into the dynamic environment of their minds.

 

~

 

Computers : This changed again when the frozen information could be manipulated in chunks, with the advent of computers. Information got translated from the smooth, continuous, analog world into chunky bits (binary digits: on/off). Complicated language with complicated meanings formerly stored and expressed in writing and speech also got chunked, with nothing more than zeros and ones to embody them.

But these lifeless bits were machine-manipulable, and this is where the magic lies.

With George Boole's "algebra of logic", 'boolean' logic, all mathematics could be reduced to, and expressed in terms of, sets with the notation x and y.

The concept Boole used to connect the two heretofore different thinking tools of logic and calculation was the idea of a mathematical system in which there were only two quantities, which he called "the Universe" and "Nothing" and denoted by the signs 1 and 0. Although he didn't know it at the time, Boole had invented a two-state (binary) system for quantifying logic that also happened to be a perfect method for analyzing the logic of two-state physical devices like electrical relays or vacuum tubes.
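To make the two-state idea concrete, here is a minimal sketch, in Python purely as an illustration (Boole of course worked on paper), of how the connectives of logic reduce to arithmetic on the two quantities 1 and 0:

# A minimal sketch of Boole's two-valued algebra: the connectives of logic
# expressed as arithmetic on the quantities 1 ("the Universe") and 0 ("Nothing").

def NOT(x):
    return 1 - x          # negation

def AND(x, y):
    return x * y          # conjunction

def OR(x, y):
    return x + y - x * y  # inclusive disjunction

# Every truth table can now be worked out mechanically, which is exactly
# what makes the algebra a fit for two-state devices such as relays.
for x in (0, 1):
    for y in (0, 1):
        print(x, y, AND(x, y), OR(x, y), NOT(x))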

The century ticked over from the 19th to the 20th.

And along came Alan Turing who, at the age of twenty-four, when confronted with the problem of formally stating what is computable (Hilbert's Entscheidungsproblem), created the theoretical basis for computation. He invented the concept of the general computer, also called the Turing Machine, which worked on a single continuous stream of binary, on/off, pieces of data. The machine would only be aware of one symbol at a time as it entered, the machine's current state, and a set of rules or algorithms which had previously been supplied to it.
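As an illustration of the idea, here is a toy Turing machine in Python. The rule table and the bit-flipping task are hypothetical examples of my own, not anything Turing specified; the point is only that the machine sees one symbol at a time, knows its current state, and follows rules supplied to it in advance:

# A toy Turing machine: it sees one symbol at a time, knows only its current
# state, and follows a fixed rule table supplied in advance. The rules here
# are a hypothetical example that flips every bit and halts at a blank.

rules = {
    # (state, symbol read) -> (next state, symbol to write, head movement)
    ("flip", "0"): ("flip", "1", 1),
    ("flip", "1"): ("flip", "0", 1),
    ("flip", " "): ("halt", " ", 0),
}

def run(tape):
    tape, state, head = list(tape) + [" "], "flip", 0   # blank marks the end
    while state != "halt":
        state, tape[head], move = rules[(state, tape[head])]
        head += move
    return "".join(tape).strip()

print(run("100110"))   # prints 011001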

Assembly line manipulation became possible. Databases could be organized and re-organized, yielding new information in their relationships at every turn. Programs and procedures could be devised with the confidence that the machines would tirelessly follow them to the letter. Impossibly boring manipulations with potentially exciting results became first promised, then practical, then routine. The rush of building the logical machines had started. First big powerhouses, then 'personal computers' and now networked computers and other digital devices which compute and communicate with computers.

The computer had been set on a trajectory into the future, becoming steadily more capable and powerful. The issue started to become: what could we get out of this? How could they help us learn, communicate and make decisions? How could they augment our intellect? Enter Douglas Engelbart who, as a radar operator during World War II, had stared at a screen or two. He promptly set out to invent the mouse, windows, hypermedia (hypertext) and most of the rest of the human-computer interfaces we still use today, putting more of the man in the machine and more of the power of the machine into the man. First demoed in 1968. Still today we see only a trickle of what he invented.

The primitive PCs of the 80s and early 90s empowered the individual. Enabled us to do, well, more. More of what was previously segregated into specialist fields. We could publish magazines from home! We could solve amazing equations, do our own complex financial planning! Design like there was no tomorrow. Write print-quality letters. We became, in effect, our own secretaries.

"As technology advances, it reverses the characteristics of every situation again and again. The age of automation is going to be the age of "do it yourself"." McLuhan in 1957.

 

~

 

Computation : Speed matters, and computers double in processing capacity every 18 months.

That's Moore's Law, named after Gordon Moore (co-founder of Intel in 1968). "Moore is widely known for "Moore's Law," in which he predicted that the number of transistors that the industry would be able to place on a computer chip would double every year. In 1995, he updated his prediction to once every two years. While originally intended as a rule of thumb in 1965, it has become the guiding principle for the industry to deliver ever-more-powerful semiconductor chips at proportionate decreases in cost." From his Executive Bio.

"In 1965, Gordon Moore was preparing a speech and made a memorable observation. When he started to graph data about the growth in memory chip performance, he realized there was a striking trend. Each new chip contained roughly twice as much capacity as its predecessor, and each chip was released within 18-24 months of the previous chip. If this trend continued, he reasoned, computing power would rise exponentially over relatively brief periods of time. Moore's observation, now known as Moore's Law, described a trend that has continued and is still remarkably accurate. It is the basis for many planners' performance forecasts. In 26 years the number of transistors on a chip has increased more than 3,200 times, from 2,300 on the 4004 in 1971 to 7.5 million on the Pentium II processor." (http://www.intel.com/intel/museum/25anniv/hof/moore.htm)

Ray Kurzweil takes it further, stating that it has been this way for a lot longer than we first thought: this trend did not start when it was noticed, in the sixties.

To understand the implications of this exponential growth, let's go back to 1984, the birth of the Macintosh, a year dear to me, a year when computers, and we, got liberated from the text-based interface. Let's say we put one dollar in the bank back then. And let's say this bank gave interest in line with the speed of computer evolution. We are in 1999 (at time of writing anyway), that was 1984, so that's 15 years, right? OK, that's seven and a half times our dollar has doubled in value. It started at $1 and in 24 months it was worth $2, then 4, 8, 16, 32, 64, 128, finally to be worth $256 at the next count ($192 now).

So guess what kind of money we are looking at for the next couple of years? Have a look at the table below, which is based on numbers taken from "The Age Of The Spiritual Machine" by Ray Kurzweil.

Current new machines costing a thousand US dollars or so have the processing power of an insect brain. In ten years we will be able to spend the same amount of money and get the processing power of a mouse brain (with a bank balance then at 8,192 dollars). We will be able to buy the equivalent processing power of a human brain in 2023 (with over half a million in the bank). In 2060 we will be able to get a machine with the processing power of all human brains. Our bank balance would be 137,216 million dollars at that point!
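The figures in that table can be reproduced with a short sketch: one dollar in 1984, doubled every two years. The capacity milestones are simply Kurzweil's as quoted above, attached here for illustration only; their exact column placement is my reading of the table:

# One dollar in 1984, doubled every two years, with Kurzweil's capacity
# milestones attached for illustration. (The table below omits the year 2044,
# so its final columns lag this strict schedule by one doubling.)
milestones = {2000: "insect brain", 2010: "mouse brain",
              2022: "human brain", 2060: "all human brains"}

balance = 1
for year in range(1984, 2061, 2):
    print(f"{year}  ${balance:,}  {milestones.get(year, '')}")
    balance *= 2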

                     
Computer processing capacity since 1984, compared to interest in a bank doubling every two years.

Year:      1984   1986   1988   1990   1992   1994   1996   1998   2000    2002
$:            1      2      4      8     16     32     64    128    256     512
Capacity:                                                         insect

Year:      2004   2006   2008   2010    2012    2014    2016     2018     2020     2022
$:        1,024  2,048  4,096  8,192  16,384  32,768  65,536  131,072  262,144  524,288
Capacity:                      mouse                                      human brain

Year:      2024   2026   2028   2030   2032   2034   2036   2038   2040   2042
$:          1 m    2 m    4 m    8 m   16 m   33 m   67 m  134 m  269 m  536 m

Year:         2046     2048     2050     2052      2054      2056      2058      2060
$:         1,072 m  2,144 m  4,288 m  8,576 m  17,152 m  34,304 m  68,608 m  137,216 m
Capacity:                                                         all human brains


When will it end? Every decade someone predicts the slowdown of Moore's Law, but so far it has just kept on going.

This is important as we cannot afford to look at the last decades as a fluke and think that the evolution of computers has stabilized or that it soon will.

Seth Lloyd, a physicist based at MIT, has studied how much further Moore's Law has to go within the limits of our current understanding of science. His findings were reported in New Scientist magazine: "The Last Computer", 2 September 2000, http://www.newscientist.com.

"People have been claiming the law is about to break down every decade since it was formulated,' Seth Lloyd, says. 'But they've all been wrong. I thought, let's see where Moore's law has to stop and can go no further."

"To begin with, he wasn't too concerned with the details of how the ultimate computer might work- those can be sorted out by the engineers of the future. Instead he stuck to considering basic physical quantities such as energy, volume and temperature (Nature vol 406). The speed of a computer Lloyd realized, is limited by the total energy available to it. The argument for this is rather subtle. A computer performs a logical operation by flipping a '0' to a '1' or vice versa. But there is a limit to how fast this can be done because of the need to change a physical state representing a '0' to a state representing a '1'."

To go to the ultimate in computers, we need to start small, as small as we can. "In the quantum world any object, including a computer, is simply a packet of waves of various frequencies all superimposed. Frequency is linked to energy by Planck's constant, so if the wave packet has a wide range of energies, it is made up of a large range of different frequencies. As these waves interfere with one another, the overall amplitude can change very fast. On the other hand, a small energy spread means a narrow range of frequencies, and a much slower change in state."

"Because a computer can't contain negative energies, the spread in energy of a bit cannot be greater then its total energy. In 1998, Norman Margolus and Lev Levtin of MIT calculated that the minimum time for a bit to flip is Planck's constant divided by four times the energy." "Lloyd had built on Margolus's work by considering a hypothetical 1-kilogram laptop. Then the maximum energy available is a quantity famously given by the formula E=mc2 If this mass-energy were turned into a form such as radiant energy, you'd have 1017 joules in photons, says Lloyd."

As for memory, the article jumps straight to Boltzmann's constant. "What limits memory? The short answer is entropy." "Entropy is intimately connected to information, because information needs disorder: a smooth, ordered system has almost no information content." "Entropy is linked to the number of distinguishable states a system can have by the equation inscribed on Boltzmann's headstone: S = k ln W. Entropy (S) is the natural logarithm of the number of states (W) multiplied by Boltzmann's constant (k). Equally, to store a lot of information you need a lot of distinguishable states." "To register one bit of information you need two states, one representing 'on,' the other 'off.' Similarly, 2 bits require 4 states and so on. In short, both the entropy of a system and its information content are proportional to the logarithm of the number of states."
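The memory-side relation can be illustrated just as briefly: entropy S = k ln W for W distinguishable states, and log2(W) bits of information in those same states (a sketch of my own, using only the quantities quoted above):

import math

# The relation quoted above: entropy S = k ln W for W distinguishable states,
# and log2(W) bits of information in those same W states.
k = 1.381e-23                      # Boltzmann's constant, in joules per kelvin

for W in (2, 4, 1024):
    print(W, k * math.log(W), math.log2(W))   # states, entropy, bits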

So we have a very fast, very dense computer with a lot of energy. About a billion degrees hot.

Estimated time before we reach the ultimate computer, within the realm of the laws of physics? Moore's Law has about 200 years left. According to our current understanding of science.
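The 200-year figure is roughly what you get, by my own back-of-envelope arithmetic, starting from a few billion operations per second today (the desktop figure used later in this article), doubling every 18 months, and stopping at the roughly 5 x 10^50 operations per second sketched above:

import math

# Rough check of the 200-year figure: start from ~3 billion operations per
# second, double every 18 months, and stop at the ~5e50 operations-per-second
# limit of the ultimate computer sketched above.
current = 3e9
limit = 5.4e50

doublings = math.log2(limit / current)   # about 137 doublings
print(doublings, doublings * 1.5)        # about 206 years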

 

~

 

The Internet : Digital information gets connected.

 

 

Bandwidth growth. Modems are beginning to be replaced by fast, cheap and always-on Internet connections such as DSL, cable modem and satellite. As of the close of 2000, 3.1 million people are connected to the Internet by DSL in the US. As McLuhan noted, speed changes things: speeding up still images makes movies. Fast, cheap and always-on Internet connections change the medium in a similarly profound way. No longer just a research and communications tool, the Internet becomes an active assistant and notifier, the conveyor of streamed interactive entertainment and, as Sun has been preaching for years, the network becomes the computer.

 

Everybody-to-everybody communication. When it is as cheap to communicate with someone on the other side of the world as it is with someone in the same building, when it costs as much to send one message as 500, the nature of communication and information changes.

 

Everybody becomes an interactive publisher. When any individual with an Internet connection can publish on the World Wide Web and have the published site available to anyone else with an Internet connection, everyone becomes a publisher and broadcaster. But being found by someone actively looking is not the same as publicizing or marketing the information. The multipoint nature of the communication is between the information and the access points, not between the people.

 

~

 

resulting in: The Information Explosion : "More information has been produced in the last 30 years than in the previous 5,000. About 1,000 books are published internationally every day, and the total of all printed knowledge doubles every eight years" according to Peter Large as quoted by Richard Saul Wurman in Information Anxiety.

Information waits for no man.

Vannevar Bush raised the alarm in The Atlantic Monthly as early as 1945: "Thus far we seem to be worse off than ever before - for we can enormously extend the record; yet even in its present bulk we can hardly consult it."

And the wired world certainly hasn't slowed down the information onslaught, as Evan I. Schwartz's mind-boggling numbers bring it home in his book Webonomics: "Roy Williams, a researcher at the California Institute of Technology's Center for Advanced Computing Research, estimates that all the information from all of human history stored on paper in the world today amounts to about 200 petabytes. A byte roughly equals a printed character. So a petabyte is about one quadrillion (or thousand trillion) characters. That figure includes all the paper in corporate filing cabinets, all government archives, all homes, all schools, universities, and libraries." "By the year 2000, Williams estimates, the amount of online information that will have accumulated in just the few decades leading up to the new millennium will be about two and a half times that amount now on paper."

As Marshall McLuhan put it on The Best of Ideas on CBC Radio in 1967: "One of the effects of living with electric information is that we live habitually in a state of information overload. There's always more than you can cope with".

Richard Saul Wurman, in his book What-If, Could Be, puts it in maybe even apter but bleaker terms: "Everyone spoke of an information overload, but what there was in fact was a non-information overload."

Trying to stay afloat is not the answer, you need to start swimming in the information.

 

~

 

the next step, Liquid Information : Information has changed.

Originally, information could only exist through direct interaction. Later it could be frozen, or stored, through writing. Then it could be thawed in chunks, processed by early computers. With even a regular desktop computer being capable of 3 billion calculations a second and storing tens of billions of bits, chunks hardly seems the correct term anymore.

When the information then gets melted, when it gets digitized, it doesn't revert to its earlier state; it becomes liquid. It doesn't behave in any previously inherent ways; it gathers a new, relative identity.

Previously its identity was in relation to everything else in the physical world. A book is a book.

In cyberspace there are new relationships, relationships with fewer physical constraints. Depending on the forces it interacts with, information can go anywhere, be processed in any way and change into anything. With computers, you can make music from a picture, paint a picture from sound. You can treat any information in any way to interact with any other information.

Digital information has characteristics far beyond its name, the ones and zeros, just as you have characteristics beyond your atoms, even your genes.

For further elaboration on information getting liquid, please have a look at Analog, Digital, Liquid.

 


©1995-2001 The Liquid Information Company    www.liquidinformation.org