Information

The existence of symbols, or a set of symbols commonly described as a code or scheme, is direct evidence against non-directed processes as the origin of life. By far the most interesting subject in intelligent design is the use of symbols and codes.

The enigma is the "origin". The existence of patterns requires an interpreter. A symbol is used to express the existence of something other than the medium in which the content is stored. What is the origin of the information? If undirected material processes are not capable of generating a "code", what is? Based on all observations, is there a common origin or cause for all codes?

Codes are arbitrary in their use of physical assets.

Binary data captures a small electronic (electromagnetic) charge, and bits are stored in sets of eight (or more) to represent something they are not.

The fact that the code could be anything is the same evidence that it must be only one thing: one set condition.

One scheme.

And from these arbitrary codes comes ASCII, where the decimal value for the upper-case "A" is sixty-five and the "Tab" key is set as number nine. The choices in the "code" trace back to a single man named "Bob". Bob's scheme. Bob used bits to code for an explicit character. The character is not the bits or the byte; it is represented in Bob's scheme.

Is there a physical necessity for the Tab key to be represented by the number nine?

The answer, of course, is that Bob chose which decimal values would represent each character. The Tab key would be represented by the first bit and the fourth bit (1 + 8 = 9). There was no material necessity for this selection. It is interesting that the Carriage Return is the number thirteen. Very appropriate.
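These assignments are easy to check for yourself; in Python, the built-in `ord` returns a character's code point:

```python
# Verify the ASCII assignments discussed above.
assert ord("A") == 65    # upper-case "A" is sixty-five
assert ord("\t") == 9    # Tab is nine
assert ord("\r") == 13   # Carriage Return is thirteen
assert ord("0") == 48    # the digit zero is forty-eight

# Nine in binary sets only the first and fourth bits (values 1 and 8).
print(format(9, "04b"))  # → 1001
```

Nothing in the electronics forces these pairings; the interpreter simply honors the agreed table.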

The setting of the lower numeric values as "controls", prior to defining numerals and alpha characters, was his design. There is no physical necessity for this delineation between control characters and alphanumeric characters bound to binary represented as decimals. Look up the scheme and check it out. Did you see what the number zero is in his design?

Fortunately, "Bob" took the hard work out of it for us. He was also one of the first to recognize the usefulness of an "escape" character or condition, very useful in stabilizing programmatic sequences, and to provide a mechanism to stop all programming requests, preventing runaway processes from overtaking the entire system.

In order for the code to succeed we have to do almost nothing. We just have to agree. We must agree with Bob. And the agreement is "implicit": you may never have known the electronic code, but it is used every day. Today and forever, every electronic device coded for the alphanumeric system generates one big fat "48" for the number zero. The fifth and sixth bits, side by side, neither knowing that the other exists.

Bob Bemer could have used different values; however, since he could have chosen anything, it is in the actual selection (freely chosen) that the specific symbolic transformations function as a standard. No matter what the standard is, it is the external agreement (not bound by the code) that causes the code to be an effective form of information transmission. Again, there is no physical necessity or constraint for the structural order.

ASCII and DNA are both symbolic schemes.

Schemes require a schemer.

Symbolic linking of a physical resource to a non-physical characteristic is unique to intelligence.

Binary data such as a bit or a byte are not aware of their environment. Consider the number "0" (48) and consider the byte. 48 is determined by the state of seven bits. Each bit in succession is doubled and added to the overall final total (64, 32, 16, 8, 4, 2, 1); 0110000 equals 48.
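The doubling-and-summing described above can be written out directly; a minimal sketch in Python:

```python
# The character "0" (ASCII 48) expressed through seven bit weights.
weights = [64, 32, 16, 8, 4, 2, 1]
bits    = [0, 1, 1, 0, 0, 0, 0]   # the pattern 0110000

# Multiply each bit by its weight and sum: 32 + 16 = 48.
total = sum(w * b for w, b in zip(weights, bits))
print(total)          # → 48
print(chr(total))     # → 0  (the character, not the quantity)
```

The sum is pure arithmetic; that 48 *means* the character "0" comes only from the agreed scheme.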

The bit is an electronic trap of either on or off, unaware it is part of a byte; and the byte is unaware it is used as a code. A two-bit, four-bit, or eight-bit byte can all be used as long as the standard is agreed to. The scheme.

The digital code is not only arbitrary but it is also neutral.

No single bit can flip another bit or open the trap. The physical characteristics are not random and work only in stasis. Random changes (especially radiation and magnetism) most frequently corrupt the original digital message. These alterations cause deleterious effects to the message.

Adenine, Guanine, Thymine, and Cytosine are the building blocks of DNA: a four-symbol system rather than a two-symbol one. We use our binary system, at two bits per base, to replicate or transcribe DNA in software. A triplet of nucleotide bases codes for a single amino acid (of twenty); these amino acids are strung into linear chains that fold into the shapes that comprise the molecular machines. The sugar-phosphate backbone is neutral and does not favor specific amino acids (base preferences), in exactly the same way a bit doesn't favor a byte.

The arbitrary condition and neutrality of binary data is exactly the same for DNA.

Nucleotides are not aware that they are part of a nucleotide triplet and do not alter their neighboring codons based on physicality. The triplet isn't aware that it is part of "codifying" an amino acid, such as the combination of Adenine, Adenine, and one more Adenine coding for the amino acid Lysine. A = 65.
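The triplet-to-amino-acid mapping can be sketched as a lookup table, just like an ASCII table. The `CODON_TABLE` below is an illustrative partial table holding only a few of the sixty-four standard entries:

```python
# A minimal sketch of the standard genetic code as a lookup table.
# Only a handful of the sixty-four codons are shown for illustration.
CODON_TABLE = {
    "AAA": "Lysine",
    "AAG": "Lysine",
    "ATG": "Methionine",   # also the usual start codon
    "TGG": "Tryptophan",
}

def translate(codon: str) -> str:
    # Nothing in the chemistry of the triplet forces this mapping;
    # it is read off the table, just as "A" is read off Bob's scheme.
    return CODON_TABLE[codon]

print(translate("AAA"))  # → Lysine
```

The lookup is the same move in both schemes: a symbol is resolved by consulting an agreed table, not by any property of the medium.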

Contemporary research has also identified the likelihood that triplets combine with two other triplets to make a triplet of triplets. The juxtaposition of the triplets alters the percentage of development and the rate of production.

If your mind is not blown and your jaw agape then you are not paying attention...

A code within a code.

"If it is

arbitrary,

then you know it is not"

Any system that is 100% arbitrary (could be anything) and is also neutral (no physical constraint determines significance; non-material) requires "intelligence or design" for its origin, which means the arbitrary system is obviously not arbitrary at all.

Intelligence created the scheme and intelligent beings can identify signs of intelligence.

From a preservation perspective, it is critical that data assets are abstracted from the physical layer, where physical forces could alter the desired specified complexity. Magnetism alters binary data without programming, and radiation can alter biological data. Mutation or degradation of a message is consistent with known laws of decay. Forward thinking, or projection, prepares the design to maintain stasis even with potential problems such as message loss. Fortunately, error-checking algorithms protect data integrity for the most part. Stasis is the normative state of life.

The law of biogenesis is a powerful and consistent force.

Binary data and biological data share many common traits. They are arbitrary, neutral, and error-checked. They also share a common origin.

Only an agent can use a physical resource to code for a non-physical entity. Such an agent is commonly called an "oracle" within evolutionary informatics. The oracle, in all cases, is the programmer.

Evolutionary biology never offers a reason for believing that an arbitrary, neutral coding system could form out of material processes. The forces that impact the evolutionary or material model, "chance, necessity, energy, and mass", operate within a nearly infinite set of possible combinations; yet no formal model exists for a non-directed method to arrive at a code or a scheme.

Formal sciences like ProtoBioCybernetics are evidence of "intelligent design" in biological systems.

I suspect all computer programs used to illustrate "evolutionary biology", like AVIDA, actually end up demonstrating "intelligent design" more than they demonstrate the likelihood of "pre-existent" entities mutating and "naturally" selecting into more complex ones.

Information is not physical. One interesting bit of evidence is that we use computer systems to model DNA, and we have stored binary data directly in DNA. This shows that the physical layer is used and "translated", yet the message remains constant regardless of the actual medium utilized. Stasis provides certainty about the message.
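Storing binary data in DNA can be illustrated with a toy mapping of two bits per base. The `BASES` mapping below is an assumption for illustration only; published DNA-storage schemes use more elaborate encodings (for example, to avoid error-prone repeated bases), but the medium-independence point is the same:

```python
# Illustrative only: a toy round-trip between bytes and the four DNA
# bases at two bits per base. The 00->A, 01->C, 10->G, 11->T mapping
# is an arbitrary choice made for this sketch.
BASES = "ACGT"

def bytes_to_dna(data: bytes) -> str:
    out = []
    for byte in data:
        for shift in (6, 4, 2, 0):       # four 2-bit groups per byte
            out.append(BASES[(byte >> shift) & 0b11])
    return "".join(out)

def dna_to_bytes(seq: str) -> bytes:
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

encoded = bytes_to_dna(b"0")             # the character "0", ASCII 48
print(encoded)                           # 48 = 00110000 → "ATAA"
assert dna_to_bytes(encoded) == b"0"     # the message survives the medium
```

The same "48" passes through copper, silicon, or (in this sketch) nucleotides unchanged; only the carrier differs.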

The ENCODE project has substantially negated the claim that large portions of the nucleotide chains are junk, in the same way organs once dismissed as useless or vestigial were not. Junk DNA is not, nor has it ever been, junk.

The genome has been wrecked over time, but the design of non-coding areas for control is part of a stable formalism.

The first time I ever heard the term "Surprise Effect" was listening to a wonderfully compassionate and very knowledgeable scientist named Dr. A. E. Wilder-Smith. He authored the following thesis, which is an easy-to-understand model comparison between Evolution and Creation.

Thesis: "The formulae of the Doctrine of Creation (DOC) are more valid than the formulae of the Evolution Theory (ETH)."

(Wilder-Smith on Biogenesis and Speciation)

ETH: Inorganic matter + Energy + Time = Biogenesis

DOC: Inorganic Matter + Time + Energy + Extrinsic Information = Biogenesis

Evoluted Speciation

ETH: "Simple" Cell + Time + Energy = ?

DOC: "Simple" Cell + Time + Energy + Extrinsic Information = Evoluted Speciation

Based on these models, it is easy to identify the likely cause of a "Surprise Effect". The concept of Evoluted Speciation is most closely linked to "Epigenetics", and is not meant to affirm the evolutionary model.

Evolution has no method for generating "new information", which means what it produces is not really "new information", or a surprise, at all.