INFORMATION: THE THIRD FUNDAMENTAL QUANTITY

Picture of the Human Brain and a Computer Chip

adapted from articles by

Professor Werner Gitt

Used with the kind permission of the Creation Science Movement, 50 Brecon Avenue, Cosham, Portsmouth, England, PO6 2AW.

In science and technology, energy and matter have long been considered basic and universal factors. But now the “information factor” in things has come to be recognized as being equally fundamental and far-reaching in its scientific significance. Information confronts us at every turn: in data processing, in communications engineering, in systems control, in biological communication systems, and in the information-based processes of living cells. It is an interdisciplinary concept of central importance to manufacturing technology, biological engineering, linguistics, etc. Because of rapid developments in computer technology in recent years, the new field of study known as information science has attained a significance that could not have been foreseen a few decades ago. Information has become known as the third fundamental and universal quantity.

The remarkable information content of living things is arguably the best evidence for special creation. Therefore, we shall analyze the elements involved in the transfer of information to see why the multitude of life forms on earth could only have been created by a purposeful intelligence.

The Levels of Information

There are four levels of information that combine to produce any deliberate effect. Information rides upon a set of symbols (the statistical level), organized by a code (the syntactical level), to give a meaningful message (the semantical level) that elicits a response (the purposive level). These relationships are illustrated in Figure 1.

The Levels of Information: Statistics

The most basic requirement for information to be passed from a transmitter to a receiver is a set of symbols. For example, a printed message rides upon the alphabet. Generally, the longer the message, the more information it is capable of containing (though, of course, it is possible to write much while saying little)! Not all “languages” employ alphabetic letters, however. Whether it is the language of chemical formulae, musical scoring, circuit diagramming, hieroglyphics, Morse code, the genetic code, or signing for the deaf, each has its own set of symbols.

With his paper entitled “A Mathematical Theory of Communication” (1948), Claude E. Shannon was the first to devise a mathematical definition for the concept of information. His measure of information, given in “bits” (binary digits), possessed the advantage of allowing quantitative statements to convey relationships that had previously defied mathematical description.

In the simplest case, when there are just two symbols occurring with equal frequency, the information content of one of these symbols can be assigned a unit of one bit. According to Shannon, the statistical information content of a chain of symbols is then a quantitative expression given in bits. Information density - the amount of information in a unit volume - is a measure of the efficiency of storage and transmission of information.
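
To make this concrete, here is a minimal sketch in Python (my own illustration, not part of the original article) that computes the Shannon information content of a chain of symbols, estimating each symbol's probability from its observed frequency:

    import math
    from collections import Counter

    def shannon_bits(message: str) -> float:
        """Total statistical information content of a symbol chain, in bits.

        A symbol occurring with probability p carries -log2(p) bits, so the
        chain as a whole carries the sum of -log2(p) over all its positions.
        """
        counts = Counter(message)
        n = len(message)
        return -sum(c * math.log2(c / n) for c in counts.values())

    # Two symbols occurring with equal frequency: one bit each, as in the text.
    print(shannon_bits("01"))          # 2.0 bits in total
    print(shannon_bits("AACGTTGCAT"))  # a short chain over four symbols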

Figure 1: The levels of information

Highest Information Density

The highest information density known is that of DNA molecules, which comprise the genes of living cells. This chemical medium for information storage occupies a volume of only 1.068 x 10^-20 cm^3 per helical turn, with a diameter of 2 nm (10^9 nm = 1 metre) and a helix pitch of 3.4 nm (see Figure 2). Each turn contains ten nucleotides - ten chemical letters - giving an information density of 9.4 x 10^20 letters per cm^3. Since the information content of each of the four different DNA nucleotides is two bits, the statistical information density of DNA is 1.88 x 10^21 bits per cm^3.

Compare this with the highest information density in man-made silicon chips (1990). A 1-Mbit DRAM (see cover) permits the storage of 1,048,576 bits in an area of approximately 0.5 cm^2. With a thickness of about 0.05 cm, the 1-Mbit DRAM has a storage density of 4.2 x 10^7 bits per cm^3. The storage density of DNA, the information carrier of living things, is therefore 4.5 x 10^13 times that of the most advanced silicon chip!
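
These figures can be checked with a back-of-the-envelope calculation. The following sketch (my own check, not from the original article) recomputes the two densities and their ratio from the dimensions stated above:

    import math

    # DNA: one helical turn approximated as a cylinder 2 nm across, 3.4 nm high.
    radius_cm = 1e-7                                  # 1 nm
    pitch_cm = 3.4e-7                                 # 3.4 nm
    turn_volume = math.pi * radius_cm**2 * pitch_cm   # ~1.068e-20 cm^3

    letters_per_cm3 = 10 / turn_volume                # ten nucleotides per turn
    dna_bits_per_cm3 = 2 * letters_per_cm3            # two bits per nucleotide

    # 1-Mbit DRAM (1990): 2**20 bits in roughly 0.5 cm^2 x 0.05 cm.
    dram_bits_per_cm3 = 2**20 / (0.5 * 0.05)

    print(f"DNA:   {dna_bits_per_cm3:.3g} bits per cm^3")        # ~1.87e+21
    print(f"DRAM:  {dram_bits_per_cm3:.3g} bits per cm^3")       # ~4.19e+07
    print(f"ratio: {dna_bits_per_cm3 / dram_bits_per_cm3:.3g}")  # ~4.5e+13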

To illustrate the remarkable storage efficiency of DNA, consider the following: the sum total of knowledge currently stored in the libraries of the world is estimated at 1 x 10^18 bits. If this information could be stored in DNA molecules, one percent of the volume of a pinhead could hold it. If, on the other hand, it were to be stored by megachips, we would need a pile of them higher than the distance between the earth and the moon.
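
As a rough check (again my own sketch; it assumes the DNA density computed above, the 10^18-bit library estimate, and the chip dimensions given earlier), one can work out the DNA volume these bits would occupy and the height of the corresponding pile of megachips:

    LIBRARY_BITS = 1e18          # estimated contents of the world's libraries
    DNA_BITS_PER_CM3 = 1.88e21   # statistical information density of DNA
    CHIP_BITS = 2**20            # one 1-Mbit megachip
    CHIP_THICKNESS_CM = 0.05

    dna_volume_cm3 = LIBRARY_BITS / DNA_BITS_PER_CM3              # ~5.3e-4 cm^3
    pile_km = LIBRARY_BITS / CHIP_BITS * CHIP_THICKNESS_CM / 1e5  # cm -> km

    print(f"DNA volume needed:   {dna_volume_cm3:.2g} cm^3")  # a tiny speck
    print(f"height of chip pile: {pile_km:.3g} km")           # ~4.8e5 km
    # The mean earth-moon distance is about 3.84e5 km, so the pile overtops it.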

Figure 2: The DNA molecule

DNA achieves this efficiency in two ways. First, it utilizes three dimensions to store information, whereas the chip has only a two-dimensional storage framework. Second, the two circuit states of a chip allow only binary coding, whereas the four different nucleotides of DNA give a quaternary code, in which each state carries two bits of information. Even the most advanced VLSI technology does not yield a level of performance anything like that achieved by a single DNA molecule.
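
The quaternary-versus-binary point can be illustrated with a toy encoder (my own sketch; the particular two-bit assignments are arbitrary): each of the four nucleotide letters carries two bits, so a chain of n letters packs 2n bits.

    # Arbitrary two-bit assignment for the four nucleotides (quaternary code).
    TO_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}
    FROM_BITS = {bits: base for base, bits in TO_BITS.items()}

    def dna_to_bits(seq: str) -> str:
        return "".join(TO_BITS[base] for base in seq)

    def bits_to_dna(bits: str) -> str:
        return "".join(FROM_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

    encoded = dna_to_bits("GATTACA")
    print(encoded)               # 10001111000100 - seven letters, fourteen bits
    print(bits_to_dna(encoded))  # GATTACA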

Clever as it is, Shannon's definition of information relates solely to the statistical relationships of symbol chains; it ignores their “higher” aspects. The theory makes it possible to offer a quantitative description of those characteristics of languages that are intrinsically based on frequencies, but whether a chain of symbols is syntactically correct and has meaning are issues not addressed in Shannon's work.

The Levels of Information: Syntax

In any written message, the grouping of letters to form words and the stringing together of words to make sentences are processes governed by various rules of construction. In addition to its letters, a written language has a lexicon of words, conventions of word order, and rules of sentence grammar. These rules must be complete and unambiguous, they must be known to the transmitter and receiver alike, and they must be strictly followed if information is to be transmitted effectively. Thus, at the syntactical level of information transfer, the symbols of information are used to define the lexicon and the grammar of communication - the code, if you will.
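
As a toy illustration (my own sketch, with an invented four-word lexicon and a single word-order rule), syntactic correctness can be checked mechanically once transmitter and receiver share the same conventions:

    # A shared lexicon and one permitted word order, known to both parties.
    LEXICON = {"the": "ART", "dog": "NOUN", "cat": "NOUN", "sees": "VERB"}
    GRAMMAR = {("ART", "NOUN", "VERB", "ART", "NOUN")}  # allowed sentence shapes

    def is_syntactic(sentence: str) -> bool:
        words = sentence.lower().split()
        if any(w not in LEXICON for w in words):
            return False  # unknown symbol: not in the lexicon
        return tuple(LEXICON[w] for w in words) in GRAMMAR

    print(is_syntactic("The dog sees the cat"))  # True
    print(is_syntactic("Dog the sees cat the"))  # False: word order violated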

In transferring information, the code is every bit (please excuse the pun) as necessary as the symbols it employs. A code is the product of a mental process. If a basic code is found in any system, it can be concluded that the system originated from a mental concept and did not arise by chance.

The syntactical structure of natural languages is generally much more complex than that of artificial ones, such as computer languages. Anyone familiar with the rigours of computer programming knows how ludicrous it would be to suggest that programs could be written with a series of accidental, haphazard keystrokes. All the more ridiculous, then, is the idea that the far more complicated genetic code arose by chance.

Proteins are the basic building blocks of living organisms. They constitute important substances such as enzymes, antibodies, haemoglobins, hormones, etc. Proteins are both organ- and species-specific. In the human body alone, there are at least 50,000 different proteins!

Remarkably, every one of the proteins in living things is made up of just twenty different amino acids strung together in various orders, and each amino acid is coded for by a chain of only three nucleotides on the DNA helix. Surely, the genetic code, like all other codes, could only have originated from a mental concept - it simply could not have arisen by chance.
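
To show the triplet scheme at work, here is a minimal sketch (my own illustration; it includes only a handful of the sixty-four codons in the standard genetic code) that reads a nucleotide chain three letters at a time:

    # A small excerpt of the standard genetic code: each three-letter codon
    # selects one of the twenty amino acids, or signals a stop.
    CODON_TABLE = {
        "ATG": "Met", "TGG": "Trp", "GAA": "Glu", "AAA": "Lys",
        "GGC": "Gly", "TTC": "Phe", "TAA": "STOP",
    }

    def translate(dna: str) -> list:
        """Read codons left to right, stopping at a stop codon."""
        protein = []
        for i in range(0, len(dna) - 2, 3):
            amino = CODON_TABLE[dna[i:i + 3]]
            if amino == "STOP":
                break
            protein.append(amino)
        return protein

    print(translate("ATGGAAGGCTAA"))  # ['Met', 'Glu', 'Gly']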

The Levels of Information: Semantics

Chains of symbols and syntactical rules are the necessary preconditions for representing information, but the ultimate nature of a message is not found in its symbols or its code; rather, it is found in what the message means at the semantical level. It is the meaning to both the sender and the recipient that turns a sequence of symbols into information.

According to Norbert Wiener, a pioneer of cybernetics and information theory, information cannot be of a physical nature, even though it is transmitted by physical means: “Information is information, not matter or energy. No materialism which does not admit this can survive at the present day.” Semantical information, therefore, defies a mechanistic approach to understanding.

A computer is merely a syntactical device that knows no semantical categories. People are needed to distinguish between data and knowledge, between algorithmically-conditioned branches in a program and deliberate decisions, between numerical values and meanings, between the formal processes in a decision tree and individual selection, between the results of operations in a computer and a truly creative thought process, and between the accumulation of data and genuine learning.

Meanings always represent mental concepts and are distinct from matter and energy. They originate from an intelligent source. It is by means of language that information may be stored and transmitted on physical carriers, but the information itself is invariant with changes in the type of transmission system (acoustical, optical, electrical, etc.) and the kind of storage system (a brain, a book, a computer, etc.). The reason for this invariance lies in its non-material nature.

The Levels of Information: Purpose

The highest level of information transfer concerns its purpose. Information is transmitted in order to elicit a response from the recipient. The purposive aspect of information requires that the statistical, syntactical, and semantical levels all work together to trigger the desired response. In spoken language, sentences are strung together to formulate requests, complaints, statements, questions, instructions, etc. With its song or dance, a bird seeks to gain the attention of a potential mate or to warn others of its claim to territory. And a robot on an assembly line will repeatedly go through the motions it has been programmed to perform.

The predictability of the response to a given transmission of information depends on the nature of the system. The responses elicited in a computer program, a mechanized manufacturing operation, or a biological organ are predetermined: there is but one response for a given input of information. The translation of a foreign language and instinctive behaviour are examples of responses with limited degrees of freedom: certain outcomes may be likely, but they are not guaranteed. In humans, one finds flexible, creative, original responses, because mankind enjoys a great amount of freedom in dealing with information.

Information in Living Organisms

Life exists in an immense variety of forms. For all its apparent simplicity, even a mono-cellular organism is more complex and purposeful in its design than any product of human invention. Although matter and energy are fundamental and necessary properties of life, they do not in themselves imply any basic differentiation between animate and inanimate systems. One of the prime characteristics of all living organisms, however, is the information they contain that enables them to perform all of life's functions, including the transmission of genetic information to the next generation.

Without a doubt, the most complex information-processing system in existence is the human being. Each person processes 1 x 10^24 bits of information daily, some of it consciously (e.g., language and other information-controlled, voluntary movements) and some of it unconsciously (e.g., the information-controlled, involuntary functions of the organs). This astronomically high figure is a million times greater than the 1 x 10^18 bits of human knowledge stored in all the world's libraries.

Conclusion

Information is a mental - not a material - quantity. The study of the nature of information at its statistical, syntactical, semantical, and purposive levels decisively rules out any notion of a materialistic origin for information systems. Therefore, the originator of the information-processing systems present in all living organisms must have been a purposeful Creator of unimaginably great intelligence.
