The Talk.Origins Archive: Exploring the Creation/Evolution Controversy

Information Theory and Creationism

Werner Gitt

Copyright © 2005-2006
[Posted: July 14, 2005]


Information According to Werner Gitt

Introduction

Information theorist and creationist Werner Gitt, author of the book In the Beginning was Information, attempts to create a system for dealing with the semantic aspects of information. Gitt defines the following empirical principles:

  1. No information can exist without a code.

  2. No code can exist without a free and deliberate convention.

  3. No information can exist without the five hierarchical levels: statistics, syntax, semantics, pragmatics, and apobetics.

  4. No information can exist in purely statistical processes.

  5. No information can exist without a transmitter.

  6. No information chain can exist without a mental origin.

  7. No information can exist without an initial mental source; that is, information is, by its nature, a mental and not a material quantity.

  8. No information can exist without a will.

Here, syntax means an established convention for formatting data (Gitt insists it must be consciously established); semantics means meaning; pragmatics means the structure of communication by the transmitter to achieve specific reactions in the receiver; and apobetics means purpose.

Gitt, in his paper "Information, Science and Biology" (published in AiG's periodical Technical Journal 10(2):181-187, 1996), states that these principles are an extension of Shannon's work:

On the basis of Shannon's information theory, which can now be regarded as being mathematically complete, we have extended the concept of information as far as the fifth level. The most important empirical principles relating to the concept of information have been defined in the form of theorems.

Gitt's Theorems

In addition, Gitt proposes the following theorems:

Theorem 1: The statistical information content of a chain of symbols is a quantitative concept. It is given in bits (binary digits).

Theorem 2: According to Shannon's theory, a disturbed signal generally contains more information than an undisturbed signal, because, in comparison with the undisturbed transmission, it originates from a larger quantity of possible alternatives.

Theorem 3: Since Shannon's definition of information relates exclusively to the statistical relationship of chains of symbols and completely ignores their semantic aspect, this concept of information is wholly unsuitable for the evaluation of chains of symbols conveying a meaning.

Theorem 4: A code is an absolutely necessary condition for the representation of information.

Theorem 5: The assignment of the symbol set is based on convention and constitutes a mental process.

Theorem 6: Once the code has been freely defined by convention, this definition must be strictly observed thereafter.

Theorem 7: The code used must be known both to the transmitter and receiver if the information is to be understood.

Theorem 8: Only those structures that are based on a code can represent information (because of Theorem 4). This is a necessary, but still inadequate, condition for the existence of information.

Theorem 9: Only that which contains semantics is information.

Theorem 10: Each item of information needs, if it is traced back to the beginning of the transmission chain, a mental source (transmitter).

Theorem 11: The apobetic aspect of information is the most important, because it embraces the objective of the transmitter. The entire effort involved in the four lower levels is necessary only as a means to an end in order to achieve this objective.

Theorem 12: The five aspects of information apply both at the transmitter and receiver ends. They always involve an interaction between transmitter and receiver.

Theorem 13: The individual aspects of information are linked to one another in such a manner that the lower levels are always a prerequisite for the realisation of higher levels.

Theorem 14: The apobetic aspect may sometimes largely coincide with the pragmatic aspect. It is, however, possible in principle to separate the two.

Gitt's Conditions

Gitt offers two necessary conditions and two sufficient conditions for information to exist:

NC1: A code system must exist.

NC2: The chain of symbols must contain semantics.

SC1: It must be possible to discern the ulterior intention at the semantic, pragmatic and apobetic levels (example: Karl v. Frisch analysed the dance of foraging bees and, in conformance with our model, ascertained the levels of semantics, pragmatics and apobetics. In this case, information is unambiguously present).

SC2: A sequence of symbols does not represent information if it is based on randomness. According to G. J. Chaitin, an American informatics expert, randomness cannot, in principle, be proven; in this case, therefore, communication about the originating cause is necessary.

Where Gitt Goes Wrong

A striking contradiction is readily apparent in Gitt's thinking: he holds that his view of information is an extension of Shannon, even while he rejects the underpinnings of Shannon's work. Contrast Gitt's words

(4) No information can exist in purely statistical processes.

and

Theorem 3: Since Shannon's definition of information relates exclusively to the statistical relationship of chains of symbols and completely ignores their semantic aspect, this concept of information is wholly unsuitable for the evaluation of chains of symbols conveying a meaning.

with Shannon's statement in his key 1948 paper, "A Mathematical Theory of Communication":

The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem.

It becomes very difficult to see how he has provided an extension of Shannon, who purposely modeled information sources as producing random sequences of symbols (see the article Classical Information Theory for further information). It would be more proper to state that Gitt offers at best a restriction of Shannon, and at worst an outright contradiction.
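
To make the contrast concrete, the following short sketch (in Python; the sample sentence is chosen here for illustration and is not drawn from Gitt or Shannon) computes Shannon's per-symbol entropy. Because the measure depends only on symbol frequencies, a meaningful sentence and a permutation of its characters score identically:

    from collections import Counter
    from math import log2

    def entropy_bits_per_symbol(s: str) -> float:
        """Shannon entropy H = -sum(p * log2(p)) over observed symbol frequencies."""
        counts = Counter(s)
        n = len(s)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    meaningful = "in the beginning was information"
    scrambled = "".join(sorted(meaningful))  # same symbols, meaning destroyed

    # Both calls print the same number: the statistical measure consults
    # only symbol frequencies, never meaning.
    print(entropy_bits_per_symbol(meaningful))
    print(entropy_bits_per_symbol(scrambled))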

In SC2 Gitt notes that Chaitin showed randomness cannot be proven (see Chaitin's article "Randomness and Mathematical Proof"), and that the cause of a string of symbols must therefore be known to determine whether information is present; yet in SC1 he relies on discerning the "ulterior intention at the semantic, pragmatic and apobetic levels." In other words, Gitt allows himself to make guesses about the intelligence and purpose behind the source of a series of symbols, even though he does not know whether the source of the symbols is random. Gitt is trying to have it both ways here. He wants to assert that the genome fits his strictly non-random definition of information, even after acknowledging that randomness cannot be proven.

(There is a deeper problem here, in that Chaitin is discussing algorithmic randomness and not statistical randomness. Algorithmic randomness for a given string depends on the selection of a reference computer (see Algorithmic Information Theory). Chaitin shows that you cannot prove a string is incompressible, or algorithmically random, on a given reference computer. Now a string may be laden with meaning yet algorithmically random on a given computer. It may also be meaningless yet highly compressible. Statistical randomness is a different concept, at least for finite-length strings. While it is possible to apply statistical tests to long strings, there are classes of deterministic programs called pseudo-random number generators, or PRNGs, of great importance to cryptography, that meet statistical tests for randomness. In other words, neither type of randomness can be proven, and Gitt appears to be confusing the two.)
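
The distinction can be illustrated with a minimal Python sketch (using zlib as a stand-in compressor, which only roughly approximates algorithmic compressibility): a seeded PRNG stream is produced by a tiny deterministic program, yet it looks statistically unremarkable and barely compresses, while a meaningless run of identical characters compresses almost completely. Neither observation says anything about meaning:

    import random
    import zlib

    # Output of a deterministic PRNG: generated from a short program plus a
    # seed, so its algorithmic information content is small, yet it looks
    # statistically random and a general-purpose compressor cannot shrink it.
    rng = random.Random(42)
    prng_bytes = bytes(rng.randrange(256) for _ in range(10000))

    # A meaningless but highly ordered string: trivially compressible.
    run_of_as = b"A" * 10000

    def compression_ratio(data: bytes) -> float:
        """Compressed size divided by original size (zlib as a rough proxy)."""
        return len(zlib.compress(data)) / len(data)

    def mean_byte(data: bytes) -> float:
        """Crude statistical check: a uniform random byte stream averages ~127.5."""
        return sum(data) / len(data)

    print("PRNG stream :", round(compression_ratio(prng_bytes), 3), round(mean_byte(prng_bytes), 1))
    print("Run of 'A's :", round(compression_ratio(run_of_as), 4), round(mean_byte(run_of_as), 1))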

Gitt describes his principles as "empirical", yet he provides no data to back this up. Similarly, he proposes fourteen "theorems", yet fails to demonstrate them. Shannon, in contrast, offers the mathematics to back up his theorems. It is difficult to see how Gitt's "empirical principles" and "theorems" are anything but arbitrary assertions.

Neither do we see a working measure for meaning (a still-unsolved problem Shannon wisely avoided). Since Gitt cannot define meaning precisely enough to measure it, his ideas amount to little more than arm-waving.

By asserting that data must have an intelligent source to be considered information, and by assuming genomic sequences are information fitting that definition, Gitt defines into existence an intelligent source for the genome without going to the trouble of checking whether one was actually there. This is circular reasoning.

If we use a semantic definition for information, we cannot assume that data found in nature is information. We cannot know a priori that it had an intelligent source. We cannot make the data have semantic meaning or intelligent purpose by simply defining it so.

