Wednesday, June 30, 2010

Evolution as Intelligent Design

As shown earlier, Gaussian adaptation, GaA, may be used to maximize manufacturing yield. The biological analogue of technical manufacturing yield is mean fitness. A plausible definition of mean fitness, P, as a mean of probabilities is

P(m) = integral{ s(x) N(x – m) dx }

where s(x) is the probability that an individual with the array of n quantitative (Gaussian distributed) traits xi, i = 1, 2, …, n, will survive, and N is the Gaussian probability density function, p.d.f., with mean m. This definition may not be very suitable for breeding programs. Nevertheless, it seems very useful in many philosophical discussions.
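The definition of P as a mean of probabilities can be made concrete with a small Monte Carlo sketch. Everything here is illustrative: the survival function s, the interval of acceptability and the parameters are hypothetical choices, and only a single trait is used.

```python
import random

def mean_fitness(s, m, sigma, samples=100_000, seed=1):
    """Monte Carlo estimate of P = integral{ s(x) N(x - m) dx }
    for a single Gaussian-distributed trait with mean m and std sigma."""
    rng = random.Random(seed)
    return sum(s(rng.gauss(m, sigma)) for _ in range(samples)) / samples

# Hypothetical survival function: the individual survives iff its trait
# lies in the region of acceptability [-1, 1].
s = lambda x: 1.0 if -1.0 <= x <= 1.0 else 0.0

P = mean_fitness(s, m=0.0, sigma=1.0)  # close to 0.68 for these choices
```

With the Gaussian centred in the middle of the region, the estimate approaches the mass of the standard normal between -1 and 1.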

A pocketful of theorems makes it plausible to use GaA as a simple second-order statistical model of the evolution of quantitative traits, provided that those traits are Gaussian distributed, or nearly so. This opinion has thus far not been accepted by the scientific community, but nobody has told me that any one of the theorems I refer to is wrong or that it can't be applied to evolution.

Together those theorems show a duality between mean fitness and phenotypic disorder (average information, diversity): evolution may carry out a simultaneous maximization of mean fitness and average information. An alternative interpretation is that a more disordered gene pool is more spread out over a region of acceptability and thus carries more information in the art of survival.

The definitions of phenotypic disorder, average information and diversity, H, are assumed to be equivalent and are valid for all statistical frequency functions pi (i = 1, 2, …, n) with sum(pi) = 1.

H = -sum{ pi log(pi) }.
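A few lines of code show how H measures disorder: a uniform frequency function (maximal spread) gives a higher H than a concentrated one. The two example distributions below are arbitrary.

```python
import math

def average_information(p):
    """H = -sum(p_i * log(p_i)) over a statistical frequency function p."""
    assert abs(sum(p) - 1.0) < 1e-9
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# A uniform frequency function is the most disordered for a given size:
h_uniform = average_information([0.25] * 4)           # log(4), about 1.386
h_skewed = average_information([0.7, 0.1, 0.1, 0.1])  # about 0.940, less disorder
```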

According to point 5 below there must also be a balance between order and disorder obtained by a heritable mutation rate such that P is kept at a suitable level. In such a case evolution may maximize average information while keeping mean fitness constant.

1. The central limit theorem: Sums of a large number of random steps tend to become Gaussian distributed. Since the development from fertilized egg to adult individual may be seen as a modified recapitulation of the stepwise evolution of a particular individual, morphological characters (parameters x) tend to become Gaussian distributed. Examples of such parameters are the length of a bone, the distance between the pupils, or even the IQ.
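The tendency the theorem describes is easy to observe numerically. The step distribution below (uniform steps) is an arbitrary choice; any distribution of independent steps with finite variance would do.

```python
import random

# Sum a large number of independent random steps and watch the sum's
# distribution approach a Gaussian, as the central limit theorem predicts.
rng = random.Random(0)
endpoints = [sum(rng.uniform(-1.0, 1.0) for _ in range(100))
             for _ in range(10_000)]

# Each uniform step has variance 1/3, so the sum of 100 steps should be
# roughly Gaussian with mean 0 and variance 100/3, about 33.3.
mean = sum(endpoints) / len(endpoints)
var = sum((e - mean) ** 2 for e in endpoints) / len(endpoints)
```

A histogram of the endpoints would show the familiar bell shape.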

2. The Hardy-Weinberg law: If mating takes place at random, then the allele frequencies in the next generation are the same as they were for the parents. Thus, the centre of gravity m of the offspring phenotypes coincides with that of the parents, m*.
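For a single locus with two alleles the law is a one-line computation. The function name and the example frequency 0.3 are illustrative only.

```python
def offspring_allele_frequency(p):
    """Under random mating, genotypes AA, Aa, aa occur with frequencies
    p^2, 2pq, q^2; the offspring's frequency of allele A then equals the
    parents' frequency, as the Hardy-Weinberg law states."""
    q = 1.0 - p
    aa, ab = p * p, 2.0 * p * q
    return aa + 0.5 * ab  # each Aa individual carries one A out of two

f = offspring_allele_frequency(0.3)  # unchanged, about 0.3
```

Algebraically, p^2 + pq = p(p + q) = p, so the frequency is preserved exactly.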

3. The theorem of Gaussian adaptation:
a. The gradient of the mean fitness of a normal p.d.f. with respect to its centre of gravity, m, is

gradient P(m) = P * inverse(M) * (m* – m),

where M is the moment matrix of the Gaussian and m* is the centre of gravity of the phenotypes of the parents. The necessary condition for a maximum of mean fitness is m* = m (selective equilibrium).

b. The gradient of phenotypic disorder (entropy, average information, diversity) with respect to m, assuming P constant, points in the same direction as gradient P(m).

c. A Gaussian p.d.f. may be adapted for maximum average information/phenotypic disorder to any s(x) at any given value of P. The necessary conditions for a maximum are:

m* = m and M* proportional to M.

When m* = m at selective equilibrium, the gradients of mean fitness and of average information (phenotypic disorder, diversity) both vanish, and thus the two may be simultaneously maximal.

See also Kjellström, G. & Taxén, L. Stochastic Optimization in System Design. IEEE Trans. on Circ. and Syst., vol. CAS-28, no. 7, July 1981.
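The adaptation loop implied by the theorem can be sketched in one dimension. This is a minimal illustration, not the full algorithm: only the centre of gravity m is moved toward the centre of gravity m* of the survivors, while the moment matrix M (here just sigma) is held fixed; the survival function s and all parameters are hypothetical.

```python
import random

def gaussian_adaptation(s, m, sigma, generations=50, pop=1000, seed=2):
    """One-dimensional sketch of GaA: repeatedly sample phenotypes from a
    Gaussian, keep the survivors under s, and move the mean m to the
    survivors' centre of gravity m* until m* and m coincide."""
    rng = random.Random(seed)
    for _ in range(generations):
        survivors = [x for x in (rng.gauss(m, sigma) for _ in range(pop))
                     if rng.random() < s(x)]
        if survivors:
            m_star = sum(survivors) / len(survivors)  # centre of gravity m*
            m = m_star  # the maximum condition of the theorem is m* = m
    return m

# Hypothetical region of acceptability centred at 2.0:
s = lambda x: 1.0 if 1.0 <= x <= 3.0 else 0.0
m_final = gaussian_adaptation(s, m=0.0, sigma=1.0)  # drifts toward 2.0
```

At equilibrium the Gaussian sits symmetrically inside the region, so m* = m and the gradient of mean fitness vanishes, as point 3a requires.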

4. A Theorem about Disorder: The normal distribution is the most disordered distribution among all statistical distributions having the same second order moment matrix, M.
See also Middleton, D. An Introduction to Statistical Communication Theory. McGraw-Hill, 1960.
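A quick numerical check of the theorem for one dimension: among distributions with the same variance, the Gaussian has the largest differential entropy. The uniform distribution is used here as an arbitrary comparison case.

```python
import math

sigma = 1.0  # any common standard deviation

# Differential entropy of a Gaussian with variance sigma^2:
h_gauss = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

# Differential entropy of a uniform distribution with the same variance
# (its width is sigma * sqrt(12), since var = width^2 / 12):
h_uniform = math.log(sigma * math.sqrt(12.0))

more_disordered = h_gauss > h_uniform  # True: the Gaussian wins
```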

5. The theorem of efficiency: All measures of efficiency satisfying certain simple relevant postulates are asymptotically proportional to -P*log(P) when the number of statistically independent parameters tends towards infinity.

Kjellström, G. On the Efficiency of Gaussian Adaptation. Journal of Optimization Theory and Applications, vol. 71, no. 3, December 1991.
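A short scan over P confirms where -P*log(P) peaks, which is also why P = 1/e, about 0.37, appears as the point of maximum efficiency later in this post.

```python
import math

def efficiency(P):
    """Asymptotic efficiency measure proportional to -P * log(P)."""
    return -P * math.log(P)

# Scanning P over (0, 1) locates the maximum near P = 1/e:
best = max((k / 1000.0 for k in range(1, 1000)), key=efficiency)
```

Setting the derivative -(log(P) + 1) to zero gives P = 1/e exactly.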

6. The second law of thermodynamics (entropy law): The disorder will always increase in all isolated systems. But in order to avoid considering isolated systems an alternative formulation will be used: A system attains its possible macro states in proportion to their probability of occurrence. Then, the most probable states are the most disordered.

The most important difference between the natural evolution and the simulated evolution in my PC is that the natural one is able to test millions of individuals in parallel, while my PC has to test one at a time. This means that when evolution replaces one generation of a population of one million individuals with a new one in one year, the same operation will take one million years in my PC. In spite of this I find the simulated evolution very efficient. As shown earlier, maximum efficiency is achieved when P = 1/e, about 0.37.
