Winston Ewert is a software engineer and researcher with a passion for applying his skills as a computer scientist to uncovering the mysteries of life. He obtained a Bachelor of Science degree in Computer Science at Trinity Western University, a Master of Science degree in Computer Science at Baylor University, and a Doctor of Philosophy in Electrical and Computer Engineering at Baylor University. He works primarily in the field of intelligent design, exploring the implications of computer simulations of evolution, developing the theory of specified complexity, and understanding genomes as examples of sophisticated software.
This paper expands on the dependency graph model of life, applying it to amino acid sequences with special attention to the prestin gene.
This paper introduces the concept of a dependency graph as an alternative account of the nested hierarchy pattern. It shows that the dependency graph can explain the same patterns as the tree of life, but can also explain deviations from those patterns which are difficult for evolutionary theory to account for.
I contributed a chapter to this book making the case that intelligent design made successful predictions about computer simulations of evolution while evolutionary theory was unfalsifiable.
This book puts together much of the work I did while at the Evolutionary Informatics Lab.
This paper evaluates free lunches proposed to exist in coevolution scenarios. We showed that the co-evolutionary scenarios considered rely on partial queries: instead of determining the quality of a solution all at once, the quality is evaluated only piece by piece. The "free lunch" produced by these scenarios arises solely from this use of partial queries; performance remains bounded by the limitations of the no free lunch theorems.
I developed a user friendly desktop interface for using the Stylus evolutionary simulation.
This paper explores what happens when typical evolutionary simulations are run with biologically realistic mutation rates. They cease to work. Typical evolutionary simulations only work because they have extremely high mutation rates.
This paper applies the concept of algorithmic specified complexity to images. It shows how it can capture the difference between random noise, simple patterns, and images. It particularly explores how background knowledge can be used in the compression of an image thereby providing a specification.
This paper applies the concept of algorithmic specified complexity to the game of life. It shows how we can differentiate between patterns which arise by chance and those which are deliberately designed.
This paper explores the mathematical implications of an omniscient oracle. It shows that there can only be one omniscient oracle who must exist outside of time.
This paper evaluates various computer simulations claimed to have evolved irreducible complexity. It develops the idea of irreducible complexity into four specific requirements and shows that none of the previously claimed examples meet them.
This paper introduces algorithmic specified complexity, a version of specified complexity that utilizes Kolmogorov complexity. It provides a metric for evaluating the information content of an object and evaluating whether or not it could plausibly have been produced by a random process.
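As a sketch of the metric being described, the definition commonly given in the algorithmic specified complexity literature combines a probability term with conditional Kolmogorov complexity (the notation here, P for the chance hypothesis and C for the background context, is my gloss, not a quotation from the paper):

```latex
% Algorithmic specified complexity of an outcome x:
%   P(x)      -- probability of x under the chance hypothesis
%   C         -- background context (observer knowledge)
%   K(x | C)  -- conditional Kolmogorov complexity of x given C
\mathrm{ASC}(x, C, P) = -\log_2 P(x) - K(x \mid C)
```

Informally, an object scores high when it is improbable under the chance hypothesis yet simple to describe given the context.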
This paper evaluates Gregory Chaitin's Metabiology project. We show that the project depends on ignoring the immense improbabilities involved.
This paper combines swarm intelligence with evolutionary algorithms to produce interesting behaviors.
This paper explores the behavior of agents following simple rules, examining a number of examples of complex behavior that arise from the interactions of such agents.
This paper provides the fundamental theorem for algorithmic specified complexity demonstrating that it is improbable to observe an event with high algorithmic specified complexity.
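A sketch of the statement, as I understand the result (the symbols follow the definition of ASC with chance hypothesis P and context C):

```latex
% Fundamental bound: events with high algorithmic specified
% complexity are improbable under the chance hypothesis P.
\Pr_{x \sim P}\!\left[\,\mathrm{ASC}(x, C, P) \ge \alpha\,\right] \le 2^{-\alpha}
```

That is, each additional bit of ASC at least halves the probability of observing such an event by chance.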
This paper evaluates certain free lunches that had been discovered in relative search performance. These are cases where a search algorithm is evaluated only relative to another search algorithm instead of against an absolute standard. Despite the no free lunch theorems, one algorithm can then appear better than another over all possible problems. We show that this is due to these relative metrics not obeying transitivity.
This paper evaluates a model put forward in a paper entitled There's Plenty of Time for Evolution. The authors put forward a very simplistic model of evolution to argue that evolution can work rapidly. We showed that the success of the model derives from unrealistic assumptions about the independence of all mutations.
This paper evaluates a simulation by Dave Thomas which evolved Steiner trees. He loudly proclaimed that it destroyed intelligent design and proved evolution. We showed that the simulation contained various points of fine tuning that were necessary for it to be successful.
This paper considers the "methinks it is like a weasel" program utilized by Richard Dawkins in his book, The Blind Watchmaker. This program evolves the phrase "methinks it is like a weasel" by a process of mutation and selection. Its success is due to the use of a "hamming oracle" which tells evolution how close it is to the target.
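A minimal sketch of the kind of program described, assuming a simple mutation-plus-selection loop; the population size and mutation rate here are illustrative choices, not Dawkins's originals. The key feature is the Hamming oracle: fitness is the count of characters already matching the target, which steers the search directly toward it.

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def hamming_score(candidate):
    # The "Hamming oracle": reports how many positions match the target.
    return sum(a == b for a, b in zip(candidate, TARGET))

def evolve(pop_size=100, mutation_rate=0.05, seed=0):
    rng = random.Random(seed)
    parent = "".join(rng.choice(ALPHABET) for _ in TARGET)
    generations = 0
    while parent != TARGET:
        # Each offspring copies the parent, mutating each character
        # independently with probability mutation_rate.
        offspring = [
            "".join(c if rng.random() > mutation_rate else rng.choice(ALPHABET)
                    for c in parent)
            for _ in range(pop_size)
        ]
        # Selection keeps whichever string the oracle scores highest
        # (the parent is retained so fitness never regresses).
        parent = max(offspring + [parent], key=hamming_score)
        generations += 1
    return generations

print(evolve())  # prints the number of generations needed to reach the target
```

Without the oracle's per-character feedback, blind search over 27^28 strings would be hopeless; with it, the loop converges quickly, which is exactly the point the paper makes about where the program's success comes from.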
This paper considers Avida, especially the work presented in the 2003 paper The Evolutionary Origin of Complex Features. We showed how the evolutionary process was assisted by the chosen initialization, selection of instruction set, and favoring of prerequisites. These instances of prior knowledge are utilized to create active information, which explains the success of Avida.